Key findings:
- DOGE used ChatGPT to flag 1,057 of 1,163 NEH grants as "DEI" — including a grant to fix a museum HVAC system — without subject-matter expert review, according to court discovery in ACLS v. NEH
- DOGE staff conducted official business on Signal with auto-delete enabled, potentially violating the Federal Records Act; plaintiffs allege the deletions were intentional
- Across the federal government, DOGE has driven the termination of 15,887 grants totaling approximately $49 billion, per the Center for American Progress tracker
- The NEH case is moving toward summary judgment and could set the first binding precedent on whether AI chatbot output can serve as legal justification for cancelling federal obligations
The Department of Government Efficiency promised to root out wasteful spending with surgical precision. Court documents unsealed last month show DOGE actually handed the scalpel to a chatbot — and then destroyed the operating notes.
Discovery materials released on March 6 in ACLS, AHA, and MLA v. NEH, filed in the U.S. District Court for the Southern District of New York, reveal the process DOGE used to cancel more than $100 million in grants issued by the National Endowment for the Humanities. The method was straightforward: DOGE operatives fed grant descriptions into OpenAI's ChatGPT, asked the model whether each grant qualified as "DEI," entered the chatbot's yes-or-no answers into a spreadsheet, and used that spreadsheet to determine which grants would be terminated.
Of 1,163 grant proposals fed through the process, ChatGPT flagged 1,057 as DEI. In the end, only 42 grants survived the cancellations.
No subject-matter expert reviewed the AI's output before the terminations were executed.
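The filings describe a pipeline of roughly this shape: descriptions in, a bare yes-or-no out, a spreadsheet as the only record. A minimal sketch (the grant IDs, descriptions, and the stub classifier below are all hypothetical stand-ins, not anything from the court record) shows how little such a process preserves for anyone who later wants to audit the decisions:

```python
import csv
import io

def ask_chatbot(description: str) -> str:
    """Hypothetical stand-in for the model call described in the filings.

    A real pipeline would send each grant description to a chatbot and
    parse a bare yes/no answer. Nothing in this design records *why* a
    grant was flagged -- that missing rationale is the audit gap.
    """
    # Placeholder heuristic for illustration only; the actual model's
    # decision logic is opaque, which is precisely the point.
    return "yes"

# Invented example grants, echoing the kinds of projects the article describes.
grants = [
    ("GRANT-001", "Replace the HVAC system in a county history museum"),
    ("GRANT-002", "Digitize photograph collections of Appalachian residents"),
]

out = io.StringIO()
writer = csv.writer(out)
writer.writerow(["grant_id", "dei_flag"])  # the spreadsheet keeps only a binary verdict
for grant_id, description in grants:
    writer.writerow([grant_id, ask_chatbot(description)])

print(out.getvalue())
```

The output is a two-column table of IDs and flags. No reviewer notes, no reasoning, no appeal trail: a termination decision reduced to a single cell.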
What Got Flagged
The grants ChatGPT identified as promoting "discriminatory equity ideology" included a documentary on Jewish women's slave labor during the Holocaust, an archival project documenting the lives of Italian Americans, a project to digitize photograph collections of Appalachian residents, and multiple initiatives to preserve endangered Native American languages.
A $349,000 grant to replace the HVAC system in a North Carolina museum was cancelled after ChatGPT flagged it as DEI. The grant's description mentioned no demographic criteria, no diversity objectives, and no equity framework. It described a broken heating system in a public building.
These were not edge cases caught by an overly cautious filter. They were the central output of a process that replaced human judgment with algorithmic guessing.
The Records Problem
Depositions of two DOGE team members and two senior NEH officials — Adam Wolfson, NEH Assistant Chair for Programs, and Michael McDonald, then-Acting Chair — revealed a second, arguably more serious issue: DOGE staff conducted official government business about the grant cancellations on Signal, an encrypted messaging app not approved for official federal communications, with messages set to auto-delete.
The Federal Records Act requires that records of official decision-making be preserved. Signal's disappearing-message function is, by design, incompatible with that requirement. The plaintiffs allege these deletions were not accidental but intentional — that DOGE operatives chose an auto-deleting communication channel precisely because they knew the decisions they were making would not survive scrutiny.
If that allegation holds, the question is not just whether the grant cancellations were arbitrary. It is whether the people who made them knew they were arbitrary and sought to destroy the evidence.
Scale Beyond the NEH
The NEH case is a window, not the full picture. By January 2026, DOGE had driven the termination of 15,887 federal grants totaling approximately $49 billion across the government, according to the Center for American Progress DOGE cuts tracker. AmeriCorps lost nearly $400 million in active grants, shutting down over 1,000 programs and eliminating more than 32,000 positions, according to The Washington Post. The Department of Justice cancelled 373 grants worth $820 million — funding violence reduction programs, crime victim services, juvenile justice, and community policing in 37 states, per a Council on Criminal Justice analysis.
A survey by the Urban Institute found that one in three nonprofit service providers experienced a government funding disruption in the first four to six months of the cuts, with 21 percent losing a grant or contract outright.
Whether ChatGPT was the mechanism across all of these cancellations or only at the NEH, the discovery materials raise a question that applies universally: what process was used to decide which grants lived and which died? And can anyone produce the receipts?
The Constitutional Question
The plaintiffs' motion for summary judgment argues that DOGE's process violated the equal protection component of the Fifth Amendment's Due Process Clause by using an AI model whose training data and decision logic are opaque to make determinations that affected constitutionally protected speech and scholarship. They further allege Federal Records Act violations for the Signal deletions.
Several federal judges have already issued orders blocking or reversing specific DOGE grant terminations. The NEH case, now moving toward summary judgment, could establish the first binding precedent on whether an AI chatbot's output can serve as the legal basis for cancelling federal obligations.
Who Benefits?
Who benefits from AI-driven grant cancellations?
- DOGE leadership, which can claim billions in "savings" without the slow, accountable work of actual auditing
- Political operatives who want to defund specific research areas but need plausible deniability about targeting
- Not the public, which loses access to community safety programs, cultural preservation, and public services that were funded for documented, reviewed reasons
Who loses?
- Every American community that relied on the 15,887 cancelled grants for public safety, education, and cultural preservation
- The principle of accountable government, which requires that when the state takes something away from a citizen, it can explain why
- Future oversight, because if the Signal messages are gone, no inspector general or congressional committee can reconstruct what happened
The ACLS, AHA, and MLA motion for summary judgment is pending before the U.S. District Court for the Southern District of New York. Bastion Daily will continue reporting as the case develops.