Judge Rules DOGE Used ChatGPT Illegally to Cancel $100M in Grants

Key Takeaways

- A federal judge ruled DOGE's grant cancellations unconstitutional, restoring over $100 million in humanities funding
- DOGE staffers used ChatGPT to scan grants for DEI-related terms without defining what DEI meant to the AI
- Staffers created 'Detection Codes' targeting protected characteristics including race, national origin, and sexuality
A federal judge has ruled that the Department of Government Efficiency acted unconstitutionally when it used ChatGPT to identify and cancel more than $100 million in federal grants. The 143-page decision, issued Thursday by US District Judge Colleen McMahon, found that DOGE staff used the AI chatbot to target grants based on protected characteristics like race, national origin, religion, and sexuality.
The ruling restores funding for grants from the National Endowment for the Humanities (NEH) that were shut down under the pretense of eliminating diversity, equity, and inclusion programs.
How DOGE Used ChatGPT to Flag Grants
The court filing details testimony from Justin Fox, a DOGE staffer who worked with colleague Nate Cavanaugh to eliminate 97 percent of the NEH's grants. Fox testified that he submitted each grant description to ChatGPT using a standardized prompt.
“Does the following relate at all to DEI? Respond factually in less than 120 characters. Begin with 'Yes.' or 'No.' followed by a brief explanation.”
— Justin Fox's ChatGPT prompt, as cited in court filing
Fox admitted under oath that he did not define "DEI" for ChatGPT. He also testified that he had no idea how the AI system understood the term. The chatbot was essentially making consequential funding decisions based on its own interpretation of a politically charged acronym.
The 'Detection Codes' System
Beyond the ChatGPT prompts, Fox created what he called "Detection Codes" to identify grants he labeled "Craziest Grants" and "Other Bad Grants." These search terms targeted specific demographic groups.
- BIPOC (Black, Indigenous, People of Color)
- Minorities
- Native, Tribal, Indigenous
- Immigrant
- LGBTQ, Homosexual, Gay
When asked whether he ran this list of words through every grant description from NEH, Fox confirmed "yes." Judge McMahon wrote that Fox "constructed and applied explicit classifications based on protected characteristics and used them as the operative criteria for revoking federal grants."
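In practice, running a "Detection Codes" list through every grant description is just a case-insensitive keyword scan, roughly as follows. This is an illustrative sketch, not the actual tooling: the term list is taken from the filing, but the code itself is an assumption.

```python
# Illustrative keyword scan over grant descriptions. The term list is
# from the court filing; everything else here is an assumption.
import re

DETECTION_CODES = [
    "BIPOC", "Minorities", "Native", "Tribal", "Indigenous",
    "Immigrant", "LGBTQ", "Homosexual", "Gay",
]

def flag_grant(description: str) -> list[str]:
    """Return the detection codes that appear (as whole words,
    case-insensitively) in a grant description."""
    hits = []
    for code in DETECTION_CODES:
        if re.search(rf"\b{re.escape(code)}\b", description, re.IGNORECASE):
            hits.append(code)
    return hits
```

The point of the sketch is how crude the criterion is: the mere presence of a word like "Immigrant" or "Tribal," in any context, is enough to mark a grant for elimination, which is precisely the classification by protected characteristic that the judge found unconstitutional.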

The Constitutional Problem
Judge McMahon's ruling found that using protected characteristics as criteria for federal funding decisions violates constitutional equal protection principles. The court wrote that "it could not be more obvious that DOGE used the mere presence of particular, protected characteristics to disqualify grants from continued funding."
The lawsuit was originally filed in 2025 by humanities groups affected by the grant cancellations. Thursday's ruling represents a significant legal setback for DOGE's approach to cutting federal spending through AI-assisted reviews.
What This Means for AI in Government
The case highlights the risks of deploying AI tools for high-stakes government decisions without proper oversight or methodology. Fox's testimony reveals a process with no clear definitions, no human review of ChatGPT's reasoning, and no consideration of whether the AI's outputs were legally sound.
Using a consumer chatbot to decide the fate of $100 million in funding, while admitting you don't know how the AI interprets its key terms, is now documented in a federal court ruling as both sloppy and unconstitutional.
Logicity's Take
This is another case of technology governance failing inside a critical system, and it puts the scale of AI investment in stark contrast with the immaturity of the governance meant to oversee it.
Frequently Asked Questions
What grants did DOGE cancel using ChatGPT?
DOGE canceled over $100 million in grants from the National Endowment for the Humanities, eliminating 97% of the agency's grants based on ChatGPT scans for DEI-related content.
Why was DOGE's use of ChatGPT ruled unconstitutional?
The court found that DOGE used protected characteristics like race, national origin, religion, and sexuality as criteria for canceling grants, which violates constitutional equal protection principles.
What were the 'Detection Codes' DOGE used?
DOGE staffer Justin Fox created search terms targeting demographic groups including BIPOC, Minorities, Native, Tribal, Indigenous, Immigrant, LGBTQ, Homosexual, and Gay to flag grants for elimination.
Can government agencies use AI for funding decisions?
This ruling suggests agencies must ensure AI tools don't make decisions based on protected characteristics. Using undefined prompts in consumer chatbots without human review creates significant legal risk.
Huma Shazia
Senior AI & Tech Writer