| AI technology predicts crime a week in advance with 90% accuracy |
Image credit- https://pxhere.com/en/photographer/767067
Use of AI technology in crime:
Artificial intelligence technology that screens crime data can predict where crime will occur in a city the following week with up to 90% accuracy, but there are concerns that such a system could perpetuate stigma.
Artificial intelligence technology can now predict the location and rate of crime in a city a week in advance with up to 90% accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same may be true here, but the researchers who created this AI say it can also be used to expose those biases.
Ishanu Chattopadhyay of the University of Chicago and his colleagues created an AI model that analysed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, and then predicted crime levels in the weeks after this training period.
The model predicts the probability of certain crimes occurring across the city, divided into tiles about 300 metres across, one week in advance with up to 90% accuracy. It has also been trained and tested on data from seven other major US cities, with similar performance.
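The spatial setup described above can be illustrated with a minimal sketch. This is not the authors' model: it simply bins a synthetic event log into ~300 m tiles per week and uses the current week's count as a naive one-week-ahead baseline, to show what "predicting crime per tile, one week in advance" means in practice. All data and function names here are illustrative assumptions.

```python
import random
from collections import defaultdict

random.seed(0)
TILE_M = 300  # tile size in metres (the study uses tiles roughly this size)

# Synthetic event log: (week, x_metres, y_metres) over a 3 km x 3 km area.
# Purely illustrative stand-in for real incident records.
events = [
    (random.randrange(10), random.uniform(0, 3000), random.uniform(0, 3000))
    for _ in range(1000)
]

# Aggregate events into weekly counts per tile.
counts = defaultdict(int)
for week, x, y in events:
    tile = (int(x // TILE_M), int(y // TILE_M))
    counts[(week, tile)] += 1

def predict_next_week(week, tile):
    """Naive persistence baseline: next week looks like this week.
    The actual model learns temporal patterns; this only shows the
    tile-by-tile, week-ahead framing of the prediction task."""
    return counts.get((week, tile), 0)
```

A real system would replace the persistence baseline with a learned model and score its predictions against the held-out weeks after the training period, as the researchers did.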
Previous attempts to use AI technology to predict crime have been controversial because they can perpetuate racial biases. In recent years, the Chicago Police Department has been testing an algorithm that generates a list of people deemed most at risk of participating in a shooting, either as a victim or as a perpetrator. The details of the algorithm and the initial list were kept private, but when the final list was released, it turned out that 56% of the city's black men between the ages of 20 and 29 were on it.
Chattopadhyay admits that the data used by his model will also be biased, but says that efforts have been made to reduce the effect of bias, and that the AI does not identify suspects, only potential sites of crime.
“Law enforcement resources are not infinite. So you do want to use that optimally. It would be great if you could know where homicides are going to happen,” he says.
Chattopadhyay says the AI's predictions would be more safely used to inform high-level policy, rather than directly to allocate police resources. He has made the data and algorithm used in the study public so that other researchers can scrutinise the results.
The researchers also used the data to look for areas where human bias influences policing. They analysed the number of arrests following crimes in Chicago neighbourhoods with different socioeconomic levels. Crimes in wealthier areas resulted in more arrests than those in poorer areas, suggesting bias in the police response.

Lawrence Sherman of the Cambridge Centre for Evidence-Based Policing, UK, expressed concern that the study mixes reactive and proactive policing data: crimes that are recorded because people report them, and crimes that are recorded because police go looking for them. The latter type of data is very prone to bias, he says. "It could be reflecting intentional discrimination by police in certain areas," he says.
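The arrest-rate comparison described above can be sketched in a few lines. This is a hypothetical illustration, not the study's analysis: the neighbourhood records and income threshold are invented, and the point is only to show the kind of arrests-per-reported-crime comparison that could reveal a biased police response.

```python
# Hypothetical per-neighbourhood records: (median_income, crimes_reported, arrests).
# Numbers are made up for illustration only.
neighbourhoods = [
    (95_000, 120, 48),  # wealthier areas
    (88_000, 100, 38),
    (32_000, 240, 40),  # poorer areas
    (28_000, 260, 36),
]

def arrest_rate(records):
    """Arrests per reported crime, pooled across the given neighbourhoods."""
    crimes = sum(c for _, c, _ in records)
    arrests = sum(a for _, _, a in records)
    return arrests / crimes

# Split on an assumed income threshold (illustrative).
rich = [r for r in neighbourhoods if r[0] >= 60_000]
poor = [r for r in neighbourhoods if r[0] < 60_000]

# A higher arrest-per-crime ratio in wealthier areas would be consistent
# with the bias in police response that the researchers describe.
```

In this toy data the wealthier areas see roughly 39 arrests per 100 reported crimes versus about 15 in poorer areas; the study performed the real version of this comparison on Chicago's recorded arrests.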