This is The Marshall Project's Closing Argument newsletter, a weekly deep dive into a key criminal justice issue. Want this delivered to your inbox? Sign up for future newsletters.
In July, Tesla fans lined up for hours in Los Angeles to check out the new "retro-futuristic" diner and charging station opened by Elon Musk. Among the attractions was the company's "Optimus" robot, which served popcorn to hungry customers near the humans grilling Wagyu burgers. Fifty miles east in Chino, Delinia Lewis, the associate warden of the California Institution for Women, hopes to one day put AI-powered machines like these to work in her prison doing far more important jobs than slinging snacks. As staffing shortages continue to plague prisons around the country, Lewis believes AI could help close the gap.
"Medication distribution, cell feeding, security searches, package searches for fentanyl, all the hazardous and routine tasks that staff don't want to do," said Lewis. "Why not let the robot do it? Then staff can focus on more intricate parts of the job."
Lewis has written about the use of AI in corrections, and said she is forming a business to supply AI-driven robots for use in corrections settings. While she hopes the tech could be deployed within the next 10 years, the state's budget crisis makes purchasing cutting-edge AI tools tough.
"Who knows when California will be back in the green," Lewis said of the state's budget, "but we're losing staff at a record rate, so the bridge has got to break, and we've gotta really take advantage of technology."
Robots behind bars may be a ways off, but prisons and jails have been rapidly adopting other AI and machine-learning tools. Advocates critical of the technology are concerned about opaque data collection processes, privacy violations and bias.
Prison telecommunications companies were some of the first to dip their toes into AI technology. In 2017, LeoTech began marketing Verus, a phone surveillance tool that records and monitors calls. The company uses Amazon's cloud and transcription services to flag key phrases that might alert staff to "valuable intelligence." At least three states used the tool to monitor phone calls for mentions of coronavirus during the pandemic, in an attempt to track outbreaks, according to The Intercept. While tools like Verus were initially marketed as add-ons to existing phone services, many prison telecommunications giants have since made AI call monitoring a default part of their offerings.
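In spirit, the flagging step described above is simple: transcribe a call, then check the transcript against a watchlist of terms. The toy sketch below illustrates that idea only; the watchlist, transcripts, and function names are invented for illustration and are not how Verus or any vendor's system actually works.

```python
# Toy illustration of keyword flagging on call transcripts.
# FLAGGED_TERMS is a made-up watchlist, not vendor data.
FLAGGED_TERMS = {"coronavirus", "fentanyl"}

def flag_transcript(transcript: str) -> list:
    """Return any watchlist terms that appear in a transcript."""
    words = {w.strip(".,!?").lower() for w in transcript.split()}
    return sorted(FLAGGED_TERMS & words)

# Hypothetical transcripts, invented for this example.
calls = [
    "I heard the coronavirus is spreading on the block",
    "Call me back tomorrow after visitation",
]
for call in calls:
    hits = flag_transcript(call)
    if hits:
        print("ALERT", hits, "->", call)
```

A real system layers speech-to-text, phrase matching, and analyst review on top of this, but the core mechanism — scanning transcribed speech for predefined terms — is what raises the privilege and privacy concerns described below.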
"Given Securus and Global Tel Link are now providing it, it means it's going to be a lot more accessible in a lot more places," said Beryl Lipton, an expert on law enforcement and prison surveillance tools at the Electronic Frontier Foundation.
The use of these tools has led to serious breaches of attorney-client privilege. Over the last five years, lawsuits have been filed in multiple states against Securus, alleging that the company recorded privileged calls. Securus has settled some of the lawsuits and has denied purposely recording protected calls. The controversy hasn't stopped corrections departments from using the technology, or vendors from marketing it. LeoTech has been lobbying in Ohio, where lawmakers passed a budget this year that includes $1 million for the state's prison system to pay for software that will "transcribe and analyze all inmate phone calls" beginning next year, according to Signal Ohio. Florida inked a deal with LeoTech in 2023.
Lipton's primary concern with AI tools in prisons and police departments is how the data they gather is stored, retained, and later fed into other systems.
"Law enforcement and the companies helping them do this are very interested in collecting all the information they possibly can collect on somebody, because they think that's going to help them in solving or preventing a future crime," said Lipton.
While some AI technology is making its way into the system, in some ways the U.S. is playing catch-up with other countries. Last month, the U.K.'s Ministry of Justice laid out its plan to embed AI across prisons, probation services and courts. Some of the agency's goals include integrating AI transcription and document processing tools for probation officers, and the creation of a "digital assistant…to help families resolve child arrangement disputes outside of court."
But the star of the announcement is a new "AI violence predictor" that promises to prevent prison violence by analyzing data, including an incarcerated person's age and previous involvement in violent incidents. If this sounds familiar, you may be thinking of risk assessment tools that have long been used across the U.S., which ProPublica documented nearly 10 years ago to be rife with racial bias and "remarkably unreliable in forecasting violent crime." The older tools typically assess risk by considering a set of weighted variables, such as age and prior convictions, either manually or by using an algorithm. AI-driven "predictors" are like risk assessment tools on steroids, drawing on much larger datasets.
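To make the "weighted variables" idea concrete: a traditional instrument assigns each factor a fixed weight and sums whichever factors apply. The sketch below is a deliberately simplified illustration; the factors, weights, and threshold are invented, and real instruments use validated (and, as ProPublica found, often biased) scoring rules.

```python
# Toy illustration of a weighted-variable risk score.
# All factors and weights here are made up for illustration.
WEIGHTS = {
    "age_under_25": 2,
    "prior_convictions": 3,
    "prior_violent_incident": 4,
}

def risk_score(person: dict) -> int:
    """Sum the weights of whichever factors apply to this person."""
    return sum(w for factor, w in WEIGHTS.items() if person.get(factor))

person = {"age_under_25": True, "prior_convictions": True}
print(risk_score(person))  # 5 under these made-up weights
```

An AI-driven "predictor" replaces the hand-picked factors and weights with a model trained on far larger datasets, but the underlying logic — and the risk of encoding bias from historical data — is the same.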
While today's AI-driven tools are more sophisticated in some ways, the potential for bias and error is still there, and the efficacy of predictive tools has repeatedly been called into question.
"A lot of these predictive tools can create unintended errors where certain communities are underserved or misunderstood because of how the model missed or wrongly accounted for individuals' risks in that community," said Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, who has studied AI surveillance in prisons.
In addition to predicting violence against others, some correctional staff want to use "biometric behavioral profiling" tools alongside AI to prevent in-custody deaths and medical emergencies. The Maricopa County Sheriff's Office, in Arizona, wants to buy wearable technology to track heart rate, body temperature, and other "key indicators," according to AZ Central. Jails in Colorado, Alabama, and elsewhere in Arizona have already begun using similar tools.
Lewis, the associate warden in California, is well aware of the ethical concerns that come with AI tools, and believes criticism will ultimately produce better outcomes.
"I welcome concerns, because that gives us an opportunity to do more research and resolve those concerns," said Lewis. "I don't think it's going to inhibit us, I think it's just going to help us make a more advanced and a better product."