Add your name to join Color Of Change and call on Securus and LeoTech to suspend the use of AI for predictive policing in prisons, conduct an independent investigation into their use, and commit to regular racial equity audits!
Add Your Name: Stop AI Listening to Prison Phone Calls!
In prisons across the country, artificial intelligence has been introduced to screen and listen in on calls made by people in prison.
AI programs deployed by companies like Securus and LeoTech are listening in on phone calls and collecting personal information. This use of artificial intelligence alarms people in prison: the technology works to recognize and duplicate voices, listen for trigger words, and “detect” and “predict” future criminal activity, disproportionately impacting Black and Brown incarcerated people.
Color Of Change is leading a bold campaign to hold accountable the corporations that are using technology to racially profile us. The programs sold by Securus and LeoTech use algorithms to eavesdrop on phone calls and flag certain keywords, supplying prison officials with information on people in prison suspected of organizing crimes over the phone.
Here is the Petition:
Black and Brown individuals are already overrepresented in the prison system, with 60% of the prison population being people of color. Additionally, 1 in 3 Black men and 1 in 6 Hispanic/Latino men can expect to be incarcerated in their lifetime, compared to 1 in 17 white men.
This technology gives corporations and prison profiteers the capacity to target even more Black and Brown incarcerated populations. We cannot allow algorithms to racially profile us.
Our Demands:
- Suspension of artificial intelligence used for predictive policing in prisons
- An immediate suspension of all applications, algorithms, or technologies that aim to “predict” future crime or flag particular people in prison based on protected class: race, class, gender, and/or ethnicity.
- Racial Equity Audits
- Regular and mandatory independent racial equity audits of predictive policing and algorithmic bias based on protected class.
- Transparency & Accountability
- Commission an independent investigation into the current implications of language-model algorithms, applications, and other technologies, and their perpetuation of biases against protected classes.
- All findings must be published and made accessible to the general public.
