The COMPAS algorithm, used by the American justice system since 1998, analyzed data on defendants and then, based on that information, helped decide, for example, whether to release an offender on bail or keep them in custody. In choosing a measure of restraint, the system took into account age, gender, and position in the criminal career. Over 20 years the «Compas» algorithm assessed more than a million people, but it has recently been judged incompetent, after which it was withdrawn.
Scientists from Dartmouth College decided to check how accurate the system is and whether it can be trusted. To do this, they recruited freelancers, ordinary people without any legal training, to make their own predictions from short profiles, giving the subjects information about each defendant's sex, age, criminal record, and several other parameters.
Working only from these short dossiers, the freelancers predicted recidivism with almost 70 percent accuracy, while the program, relying on 137 items of biographical data, lagged behind the humans by five percent. Analysis of the algorithm's judgments also showed that it placed black defendants under suspicion more often.
«An error in such cases can be very costly, so it is worth asking whether this algorithm should be applied to court judgments», says one of the study's authors.
Having studied how the algorithm works, the researchers concluded that the younger the defendant and the more arrests on record, the higher the predicted likelihood of recidivism; experts in the field of AI have since declared the technology unreliable.
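The relationship the researchers describe, with younger age and more prior arrests both raising the predicted risk, can be sketched as a toy two-feature score. The weights and the logistic squashing below are invented purely for illustration and are not COMPAS's actual parameters, which are proprietary:

```python
import math

def recidivism_score(age: int, prior_arrests: int) -> float:
    """Toy risk score in (0, 1): younger defendants and defendants with
    more prior arrests receive a higher score. All coefficients are
    hypothetical, chosen only to illustrate the reported trend."""
    # Linear combination: youth and prior arrests both push the score up.
    z = -0.05 * age + 0.3 * prior_arrests + 1.0
    # Logistic function squashes the raw score into (0, 1).
    return 1 / (1 + math.exp(-z))

# A younger defendant with more arrests scores higher than an older
# defendant with a clean record.
print(recidivism_score(age=20, prior_arrests=5) >
      recidivism_score(age=50, prior_arrests=0))  # True
```

Notably, the Dartmouth researchers found that a simple predictor built on just such a pair of features performed about as well as the full 137-feature system.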
So much has been written about the harm of such systems that one could spend weeks reading the data in the scientific papers.