Reading a law professor's letter to the NYTimes, I noticed a line of thought that has encroached on our society: that truth, fairness, and objectivity are within reach through data analytics. The author, arguing against using data scores to calculate sentences, said:
“Data-driven predictions grounded in legitimate factors might be about as accurate as current profiling schemes. There is no persuasive evidence that the current troubling variables add much predictive value, once criminal conduct is already taken into account. But even if they do improve accuracy, this gain doesn’t justify sacrificing fairness.”
In turn, she tried to weigh traditional and data-driven methods where justice and fairness are concerned. The underlying tone is that there is a correct sentence, and that judges should pursue it whenever possible.
This is where the human hunt for fairness and objectivity goes astray. A correct sentence does not exist, no matter how hard we try or how smart our algorithms become. Using data-driven decision-making tools should not lead us to play God's (or Gods') role.
If we accept that data is not truth and that we are not God, then the seemingly unfair conclusion "that people should be imprisoned longer because they are poor" rests on a fallacy. One way or the other, judges make decisions based on some reference point, be it a visit to Disneyland or to a prison, be it the risk score of the convicted. There is no faultless human decision.