Criminal Justice System Tech Is All in the AI of the Beholder

By Christopher Coble, Esq.

Like any other law enforcement tool, there's good tech and bad tech. Using computer code to clear cannabis convictions after recreational weed is legalized? Good. Using facial recognition software citywide? Not so good. And those are just the City of San Francisco's thoughts on the matter.

Balancing the pros and cons of artificial intelligence's place in the criminal justice system can be tricky, especially in a country that, as the ABA's Jason Tashea notes, "has been the world leader in AI development" but is among the "laggards when it comes to the regulation and oversight of the same technology."

Blind Leading the Blind?

One of the most controversial uses of AI in the criminal justice system has been risk assessment tools that try to predict recidivism using statistical probabilities based on factors such as age, employment history, and prior criminal record. As Tashea writes:

"The challenges and harms of these technologies are well-documented. Facial recognition and risk assessments show racial bias. Complex algorithms are not built to 'explain' their conclusions, which closes a part of an otherwise open court process. Even if AI software is 'explainable,' private companies shield their software from scrutiny by claiming it as a trade secret -- despite being used by a public agency."

The idea of taking bail, parole, and probation decisions out of the hands of fallible -- and possibly racist -- humans and turning them over to ostensibly objective machines is clearly an attractive one. The problems arise when one considers that the creators of these algorithms may write in their own implicit or explicit biases, and that there is little oversight or standardization when it comes to assessing the tech's legality or constitutionality.

"[T]he use of algorithms is not fundamentally the problem," according to Tashea. "The problem is the lack of accountability, effectiveness, transparency and competence surrounding these tools."

So where do law enforcement agencies that want to employ AI turn to answer these tough questions? Unfortunately, it's not the courts. Yet. Thus far, most judges have deferred to the trade secret interests of code-writing companies over the due process rights of criminal defendants. And there is currently no federal agency charged with assessing and auditing AI in the criminal justice realm.

Considering the stakes for criminal defendants and the safety interests of the public, the country will need to come up with some kind of comprehensive plan to monitor the use of tech and AI in the criminal justice system. "It's time for U.S. policymakers to take the road less traveled," says Tashea, "and hold AI deployed in the criminal justice system accountable."
