Case File
d-31400 · House Oversight · Other

Critique of Proprietary Algorithms in Criminal Justice and Their Legal Implications


Date: November 11, 2025
Source: House Oversight
Reference: House Oversight #016383
Pages: 1
Persons: 0
Integrity: No Hash Available

Summary

The passage discusses general concerns about trade-secret algorithms in sentencing and law enforcement, citing academic sources, but provides no concrete names, transactions, dates, or actionable leads. Algorithms are increasingly used in sentencing and parole decisions across U.S. states. Trade-secret algorithms can hinder defendants' ability to challenge evidence. Proprietary algorithms may embed variables that serve as proxies for race.

Tags

predictive-analytics, technology-impact, algorithmic-sentencing, trade-secrets, policy-critique, legal-exposure, criminal-justice-reform, house-oversight, legal-transparency



Extracted Text (OCR)

EFTA Disclosure
Text extracted via OCR from the original document. May contain errors from the scanning process.
insight. There can be no royal road to becoming Goethe. In scientific atlas after scientific atlas, one sees explicit argument that “subjective” factors had to be part of the scientific work needed to create, classify, and interpret scientific images. What we see in so many of the algorists’ claims is a tremendous desire to find scientific objectivity precisely by abandoning judgment and relying on mechanical procedures, in the name of scientific objectivity.

Many American states have legislated the use of sentencing and parole algorithms. Better a machine, it is argued, than the vagaries of a judge’s judgment. So here is a warning from the sciences. Hands-off algorithmic proceduralism did indeed have its heyday in the 19th century, and of course still plays a role in many of the most successful technical and scientific endeavors. But the idea that mechanical objectivity, construed as binding self-restraint, follows a simple, monotonic curve increasing from the bad impressionistic clinician to the good externalized actuary simply does not answer to the more interesting and nuanced history of the sciences.

There is a more important lesson from the sciences. Mechanical objectivity is a scientific virtue among others, and the hard sciences learned that lesson often. We must do the same in the legal and social scientific domains. What happens, for example, when the secret, proprietary algorithm sends one person to prison for ten years and another for five years, for the same crime? Rebecca Wexler, visiting fellow at the Yale Law School Information Society Project, has explored that question, and the tremendous cost that trade-secret algorithms impose on the possibility of a fair legal defense.[4] Indeed, for a variety of reasons, law enforcement may not want to share the algorithms used to make DNA, chemical, or fingerprint identifications, which puts the defense in a much weakened position to make its case. In the courtroom, objectivity, trade secrets, and judicial transparency may pull in opposite directions.

It reminds me of a moment in the history of physics. Just after World War II, the film giants Kodak and Ilford perfected a film that could be used to reveal the interactions and decays of elementary particles. The physicists were thrilled, of course, until the film companies told them that the composition of the film was a trade secret, so the scientists would never gain complete confidence that they understood the processes they were studying. Proving things with unopenable black boxes can be a dangerous game for scientists, and doubly so for criminal justice.

Other critics have underscored how perilous it is to rely on an accused (or convicted) person’s address or other variables that can easily become, inside the black box of algorithmic sentencing, a proxy for race. By dint of everyday experience, we have grown used to the fact that airport security is different for children under the age of twelve and adults over the age of seventy-five. What factors do we want the algorists to have in their often hidden procedures? Education? Income? Employment history? What one has read, watched, visited, or bought? Prior contact with law enforcement? How do we want algorists to weight those factors? Predictive analytics predicated on mechanical objectivity comes at a price. Sometimes it may be a price worth paying; sometimes that price would be devastating for the just society we want to have.
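The proxy-variable worry in the preceding paragraph can be made concrete with a small, purely hypothetical sketch. The Python code below uses synthetic data and invented rates (it models no real sentencing tool or dataset): it shows how a score computed from zip code alone can still split along racial lines when residential segregation ties zip code to race and recorded offenses reflect uneven policing.

```python
# Hypothetical sketch: a "race-blind" risk score that never sees race,
# yet encodes it through a correlated input (zip code). All data and
# rates below are invented for illustration.
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hidden attribute the scoring procedure never receives.
race = rng.integers(0, 2, size=n)

# Residential segregation: zip code matches race 90% of the time.
zip_code = np.where(rng.random(n) < 0.9, race, 1 - race)

# Underlying behavior is identical across groups (20% base rate),
# but offenses in zip 1 are recorded twice as often as in zip 0.
behavior = rng.random(n) < 0.2
detection = np.where(zip_code == 1, 0.8, 0.4)
recorded = behavior & (rng.random(n) < detection)

# Actuarial score: per-zip base rate of recorded offenses.
rate_0 = recorded[zip_code == 0].mean()
rate_1 = recorded[zip_code == 1].mean()
score = np.where(zip_code == 1, rate_1, rate_0)

# Race never entered the computation, yet mean scores differ by race.
for g in (0, 1):
    print(f"race={g}: mean score {score[race == g].mean():.3f}")
```

Run as written, this prints roughly 0.09 for one group and 0.15 for the other, even though behavior was identical by construction; the zip-code feature alone carries the racial signal into the score.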
More generally, as the convergence of algorithms and Big Data governs a greater and greater part of our lives, it would be well worth keeping in mind these two lessons.

[4] Rebecca Wexler, “Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System,” 70 Stanford Law Review XXX (2018).
