There was a great story in Slate yesterday called “Errors in Judgment — Were hundreds of criminals given the wrong sentences because lawyers messed up a basic work sheet?”.
The background: the state of Maryland established a worksheet that graded the severity of a convict’s crime and his risk to society, and was intended to make sentencing more consistent and the administration of justice a little less arbitrary.
The problem? An all-too-common problem with anything to do with information and analysis: human error. Despite the high stakes (“months and years of freedom gained or lost”), researcher Emily Owens found that the system generated errors in 1 of every 10 trials, even though there were multiple opportunities for the data to be reviewed and corrected.
And what did people find so hard about the worksheet? Simply looking at the right number! From the article (click on the little plus sign by the third-to-last paragraph):
“The work sheet generated separate “scores” for the felon and his crime. The recommended sentence was then read off a table with offender and offense scores corresponding to the rows and columns of a grid. More than 90 percent of errors resulted from the person completing the work sheet entering the figure from a cell next to the correct one. (Using, say, a ruler to get to the correct cell would have prevented this.) The remaining errors came mostly from incorrect choice of criminal statute in calculating the offense score and from a handful of math errors (in operations that were literally as simple as adding two plus two).”
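As an aside, here is roughly what that lookup looks like when a program, rather than a reader with a ruler, does it. This is only a sketch: the scores and sentence ranges below are invented placeholders, not Maryland’s actual grid, and the names are mine.

```python
# Hypothetical sketch: a sentencing-grid lookup done programmatically instead of
# by eye. The scores and sentence ranges are invented placeholders, not
# Maryland's actual matrix.

# Keys are (offender_score, offense_score) pairs; values are the grid cells.
SENTENCING_GRID = {
    (0, 1): "probation",
    (0, 2): "0-6 months",
    (1, 1): "0-6 months",
    (1, 2): "6-18 months",
    # ... remaining cells omitted
}

def recommended_sentence(offender_score: int, offense_score: int) -> str:
    """Return the grid cell for the given scores, or fail loudly.

    Indexing by key means there is no 'adjacent cell' to misread:
    a wrong score is a visible wrong input, not a silent off-by-one lookup.
    """
    try:
        return SENTENCING_GRID[(offender_score, offense_score)]
    except KeyError:
        raise ValueError(
            f"No grid cell for offender={offender_score}, offense={offense_score}"
        )
```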
Morals of the story for business intelligence deployments:
- You can never ~~overestimate~~ underestimate the “information competency” of your users
- Eliminate manual processes where possible (“the Commission had already been at work developing an automated worksheet with the explicit goal of eliminating errors.”)
- Build in checks and balances and collaboration – the answer to the “people problem” is more people to review the decision-making process (“multiple levels of evaluation helped to undo some of the damage”); one way to automate that cross-check is sketched below
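For the last moral, a minimal sketch of what an automated cross-check could look like: two people fill in the worksheet independently and the system refuses to proceed until their entries agree. All names and fields here are invented for illustration; this is not how Maryland’s automated worksheet actually works.

```python
# Hypothetical "checks and balances" sketch: independent double entry with a
# forced reconciliation step before the result is accepted.

from dataclasses import dataclass

@dataclass(frozen=True)
class Worksheet:
    offender_score: int
    offense_score: int

def reconcile(preparer_a: Worksheet, preparer_b: Worksheet) -> Worksheet:
    """Accept the worksheet only if both independent entries match."""
    if preparer_a != preparer_b:
        raise ValueError(
            "Worksheets disagree; send back for review: "
            f"{preparer_a} vs {preparer_b}"
        )
    return preparer_a

# Usage: two independent preparers each submit their own numbers; a mismatch
# surfaces the error before sentencing instead of after.
agreed = reconcile(Worksheet(2, 5), Worksheet(2, 5))
```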
Comments
6 responses to “Blind (and Incompetent) Justice Thanks to Spreadsheet?”
Glad I don’t live in Maryland anymore. I don’t know what Delaware, my home for 37 years, uses, but I have seen friends and family ground up in the criminal injustice system. My grandson, 21, studied criminal justice and became sickened at how it really works. If you have the money, you get a good lawyer who gets you off or makes a sweetheart deal. If you have a public defender, he assumes you’re guilty and tries to make a deal. If you go to trial, hope you don’t get certain judges. At least, that’s how it works in Sussex County where I live. Get a ticket, get it switched to the court of common pleas, and make a deal with the DA at a reduced charge. Justice is not blind, humans make mistakes, and money is a BIG part of justice. Sad, isn’t it?
Laws are written not by scientists or engineers but by people with humanities degrees. These lawyers never spell out the crime and punishment in a clear, concise way. Example: unintended kick – punishment, 4 lashes. Etc. The legal profession wants its rewards, no matter how absurd the law is.
Mike: Read the article. I quote:
There is an adversarial system. The defense attorneys failed to correct the mistakes made by the prosecutors.
“You can never overestimate the “information competency” of your users”
Think about that one for a moment. You just demonstrated, once again, how fallible human users actually are.
“Build in checks and balances and collaboration”
No, what you need to build in is an adversarial system. Give the worksheet to the convicted person, his lawyer, and the DA; they have opposing interests, yet they all need to come up with the same answer.
You can never overestimate the “information competency” of your users?
Wouldn’t that be underestimate?
– Peace
Neat illustration, huh? But I fixed it anyway…