We Believe In Fighting For Civil Rights

Even computerized policing can be discriminatory 

On Behalf of | Nov 7, 2024 | Civil Rights

There have long been issues with police officers discriminating or allowing bias to influence their decisions. If African-American men are arrested at a higher rate than Caucasian men, for example, it could be because the officers hold inherent biases. They make assumptions about certain groups of people, and those assumptions and biases show up in the arrest rate.

For example, one study found that only 22% of Caucasian men have been arrested by age 18. For African-American men, the rate at the same age is 30%, and for Hispanic men it is 26%. These groups make up a smaller share of the overall population, yet their members are arrested at higher rates, a disparity that points to potential discrimination in the policing system.

Issues with predictive policing

One way that some departments have sought to counter this is predictive policing. Essentially, a computer system analyzes crime data and predicts where crime is most likely to occur. This shapes where officers are sent on patrol and, ultimately, who they may end up arresting. Because a machine makes the determination, the system may appear incapable of bias.

The trouble, however, is that the computer's algorithm is trained on data generated by human officers. If officers in a department discriminate, that discrimination is reflected in the statistics fed into the predictive policing model, and the algorithm in turn becomes biased against the same population groups. This is why some researchers from MIT have claimed that these computer algorithms are racist.
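To see why this happens, consider a purely hypothetical illustration (the numbers, neighborhoods, and allocation rule below are invented for this sketch and are not drawn from any real department's system). If patrols are allocated in proportion to past recorded arrests, a biased historical record never washes out, even when the underlying crime rates are identical:

```python
import random

random.seed(0)

# Two neighborhoods, A and B, with the SAME true crime rate.
# B starts with more recorded arrests because of past over-policing.
TRUE_CRIME_RATE = 0.1            # identical in both neighborhoods
arrests = {"A": 100, "B": 150}   # biased historical record

for year in range(10):
    total = arrests["A"] + arrests["B"]
    for hood in arrests:
        # The model sends patrols in proportion to past recorded arrests...
        patrols = int(1000 * arrests[hood] / total)
        # ...and more patrols mean more of the (equal) crime gets observed
        # and recorded, so the skewed record reinforces itself.
        arrests[hood] += sum(
            random.random() < TRUE_CRIME_RATE for _ in range(patrols)
        )

share_b = arrests["B"] / (arrests["A"] + arrests["B"])
print(f"Share of recorded arrests in B after 10 years: {share_b:.0%}")
```

After a decade of simulated patrols, neighborhood B still accounts for roughly 60% of recorded arrests despite committing exactly half the crime. The "objective" model simply preserves the bias it was fed.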

Those who feel they have been treated unfairly by the police, especially if it may have been a violation of their fundamental rights, need to know what legal options they have.