How maths can get you locked up
Mon, 17 Oct 2016 - Are algorithms used to give US criminals computer-generated "risk scores" as fair as they first appear?
Criminals in the US can be given computer-generated "risk scores" that may affect their sentences. But are the secret algorithms behind them really making justice fairer?

If you've seen the hit Netflix documentary series Making A Murderer, you'll know the US state of Wisconsin has had its problems delivering fair justice. Now there's another Wisconsin case that's raised questions about how the US justice system works.

In the early hours of Monday 11 February 2013, two shots were fired at a house in La Crosse, a small city in the state. A witness said the shots came from a car, which police tracked down and chased through the streets of La Crosse until it ended up in a snow bank. The two people inside ran off on foot, but were found and arrested.
One of them was Eric Loomis, who admitted to driving the car but denied involvement in the shooting. In court, he was sentenced to six years in prison - and this is where the maths comes in.

In deciding to lock Loomis up, the court noted that he had been identified as an "individual who is at high risk to the community" by something called a Compas assessment. The acronym - which stands for Correctional Offender Management Profiling for Alternative Sanctions - is very familiar to Julia Angwin of ProPublica, an independent investigative journalism organisation in the US.

"Compas is basically a questionnaire that is given to criminals when they're arrested," she says. "And they ask a bunch of questions and come up with an assessment of whether you're likely to commit a future crime. And that assessment is given in a score of one to 10."
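Because the real scoring rule is a trade secret (as the article explains below), any concrete example can only be a guess. Purely as a hypothetical sketch of the general shape of such a system - a weighted questionnaire collapsed into a score of one to 10 - a toy Python version might look like this, with every question and weight invented:

```python
# Hypothetical illustration only: the real Compas algorithm is a trade
# secret, so the questions, weights and scoring rule below are invented.

def risk_score(answers, weights):
    """Collapse yes/no questionnaire answers into a score from 1 to 10."""
    raw = sum(weights[q] * answers[q] for q in weights)
    max_raw = sum(weights.values())      # highest possible raw total
    band = int(raw / max_raw * 10) + 1   # map the raw total onto ten bands
    return min(max(band, 1), 10)         # clamp to the 1-10 range

# Invented answers (1 = yes, 0 = no), loosely echoing the question
# categories the article mentions.
answers = {
    "prior_arrests": 1,
    "family_member_arrested": 1,
    "high_crime_neighbourhood": 0,
    "friends_in_gang": 0,
    "unstable_work_history": 1,
    "agrees_hungry_person_may_steal": 0,
}
# Invented weights: nobody outside the company knows the real ones.
weights = {
    "prior_arrests": 3,
    "family_member_arrested": 1,
    "high_crime_neighbourhood": 2,
    "friends_in_gang": 2,
    "unstable_work_history": 1,
    "agrees_hungry_person_may_steal": 1,
}
print(risk_score(answers, weights))  # prints 6
```

Even a toy like this makes the later dispute concrete: change one hidden weight and a defendant's score shifts, with no change in their answers.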
Image caption: A prisoner stands in an isolation cell in the Dane County Jail in Madison, Wisconsin.
Angwin says the questions include things like: "Your criminal history, and whether anyone in your family has ever been arrested; whether you live in a crime-ridden neighbourhood; if you have friends who are in a gang; what your work history is; your school history. And then some questions about what is called criminal thinking, so if you agree or disagree with statements like 'it's okay for a hungry person to steal'."

A risk score might be used to decide if someone can be given bail, if they should be sent to prison or given some other kind of sentence, or - once they're in prison - if they should be given parole. Compas and software like it are used across the US.

The thinking is that if you use an algorithm that draws on lots of information about the defendant, it will help make decisions less subjective - less liable to human error and bias or racism. For example, the questionnaire doesn't ask about the defendant's race, so in theory that means no decisions influenced by racism.
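To make concrete how a one-to-10 score might feed into bail, sentencing or parole decisions, here is an equally hypothetical sketch of the bucketing step; the cut-offs are invented, not Compas's actual bands:

```python
# Hypothetical illustration: these cut-offs are invented, not Compas's
# real risk bands.

def risk_band(score):
    """Bucket a 1-10 risk score into a coarse label for a decision-maker."""
    if not 1 <= score <= 10:
        raise ValueError("score must be between 1 and 10")
    if score <= 4:
        return "low"
    if score <= 7:
        return "medium"
    return "high"

print(risk_band(3))  # low
print(risk_band(8))  # high
```

A defendant disputing a score, as Angwin describes below, is in effect arguing about both hidden steps at once: the weights that produced the number and the cut-offs that turned it into a label.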
But how the algorithm gets from the answers to the score out of 10 is kept secret. "We don't really know how the score is created out of those questions because the algorithm itself is proprietary," says Angwin. "It's a trade secret. The company doesn't share it."

And she says that makes it difficult for a defendant to dispute their risk score: "How do you go in and say I'm really an eight or I'm a seven when you can't really tell how it was calculated?"

It was partly on that basis that Loomis challenged the use of the Compas risk score in his sentencing. But in July the Wisconsin Supreme Court ruled that if Compas is used properly, it doesn't violate a defendant's rights.