
AI tools now draft judicial rulings in American courts, raising alarms over government overreach into impartial justice and the erosion of conservative principles of limited bureaucracy.
Story Snapshot
- Los Angeles Superior Court pilots “Learned Hand” AI to generate draft rulings, mimicking judges’ styles amid massive caseload backlogs.
- Federal judges faced Senate scrutiny after issuing error-filled AI-assisted orders citing fake cases and misquoting laws.
- Over 60% of federal judges use AI, but critics warn of bias, inaccuracies, and threats to unbiased human judgment.
- Mandatory reviews required, yet pilots expand to 10 states, fueling concerns about judicial independence.
Los Angeles Court Launches AI Pilot Program
Los Angeles County Superior Court initiated a pilot in early 2026 using the AI tool Learned Hand. The system summarizes legal filings, analyzes motions, and produces draft rulings tailored to individual judges’ writing styles. Court spokesman Rob Oftring Jr. described the AI as a supplement to human clerks, targeting repetitive civil tasks in a system overwhelmed by backlogs. Superior Court judges handle heavy caseloads and often cite identical case law across summary judgment motions. Officials mandate full judicial review and editing before any draft is adopted, to preserve judicial independence.
Federal Precedents Highlight AI Risks and Errors
U.S. District Judge Henry T. Wingate in Mississippi and Judge Julien Xavier Neals in New Jersey issued orders with AI-generated errors. These rulings misquoted laws, referenced non-existent cases, and contained factual inaccuracies. Senate Judiciary Chairman Chuck Grassley launched inquiries in 2026, demanding transparency on AI use in federal courts. Grassley stated such errors undermine the deliberative judicial process essential to American justice. These incidents prompted calls for stricter oversight amid rising AI adoption.
Stakeholders Clash Over Efficiency Versus Judicial Integrity
Shlomo Klapper, CEO of Learned Hand’s developer, promotes the tool as verifiable “co-intelligence” now active in 10 states, including Michigan’s Supreme Court since summer 2024. LA County Bar President Nathan Hochman warns AI drafts risk embedding bias, predisposing judges before full review. An anonymous LA judge echoed concerns about subtle influences on decision-making. Northwestern’s 2026 survey found over 60% of federal judges using AI, primarily for research, though only 22% employ it daily or weekly. Courts position AI as a “judicial sous chef” for tedium, not final authority.
Broader Adoption Trends and Expert Warnings
Generative AI adoption surged post-2023, evolving from document summaries to tentative rulings. A 2025 federal survey found 30% of respondents using AI for legal research and 15.5% for document review. Experts from Thomson Reuters advocate a seven-step framework emphasizing human oversight and security. Academic tests show AI excels at building timelines but fails at complex opinions without refined prompts. Pilots enforce a “bright line test”: AI cannot perform tasks beyond clerks’ capabilities. Long-term, improved GenAI could produce near-final drafts, but human judgment remains irreplaceable.
Implications for Court Efficiency and Public Trust
Short-term gains include faster backlog clearance and reduced tedium for judges. Economic boosts arise from higher throughput in overburdened systems like LA Superior Court. Risks involve eroded public trust from errors and potential bias in civil resolutions. Litigants face faster outcomes but heightened error exposure. Politically, Senate scrutiny may yield federal guidelines. Socially, inaccuracies challenge faith in impartial justice, a cornerstone of conservative values favoring limited government intervention in fair proceedings.
Sources:
Los Angeles Courts Pilot AI Tool to Help Judges Draft Rulings
Judging AI: Generative AI in Courts
Grassley Scrutinizes Federal Judges’ Apparent AI Use in Drafting Error-Ridden Rulings
Northwestern Study Finds a Significant Number of Federal Judges Are Already Using AI Tools
The Judge’s Guide to AI Adoption Without Compromising Authority