Pentagon to Anthropic: Build the Surveillance Machine, or Else
United States – February 26, 2026 – The Pentagon is pressuring an AI firm to loosen safeguards, and it smells like mass surveillance by contract.
The newsroom lights are too bright and the coffee tastes like burnt compliance training. My phone keeps buzzing like a cheap ankle monitor. And out of the static comes the familiar sound of Washington clearing its throat: a federal agency wants a new power, a private vendor is in the way, and someone is trying to turn a contract clause into a constitutional workaround.
Congress is urged to probe a Pentagon-Anthropic fight over AI limits
Axios reports that advocacy groups are urging Congress to investigate a dispute between the Department of Defense and Anthropic over how the Pentagon can use frontier AI. This is not a vibes fight. It is a fight over whether the government gets advanced AI for mass domestic surveillance and fully autonomous weapons, and whether a company can keep restrictions in place without getting kneecapped by the state. The Pentagon is expected to decide by Friday whether to keep a reported $200 million contract with Anthropic. The point of the ask is simple: drag it into the hearing room with documents and sworn testimony.
Common Cause published the coalition letter laying out the allegation in plain ink: Defense Secretary Pete Hegseth is pressuring Anthropic to remove red lines against mass domestic surveillance and fully autonomous weapons, with consequences threatened if it does not comply by February 28, 2026. The letter says those consequences could include branding Anthropic a supply chain risk or forcing tailor-made changes through the Defense Production Act.
Translation: They want the AI without the guardrails
When the Pentagon says it needs models for “all lawful purposes,” read it as: we will decide what “lawful” means, in-house, behind closed doors, and you will not ask what we are doing with the tool.
That is the bureaucratic version of a blank check with invisible ink. The coalition letter frames the dispute as the Pentagon trying to reserve the right to violate the law and Americans’ constitutional rights, and wanting systems “free from usage policy constraints” that might limit military applications.
Axios also notes lawmakers reacting like human beings for once. Sen. Mark Warner said he is “deeply disturbed” and pointed to broad public opposition to AI-facilitated surveillance and unsupervised autonomous weapons. Sen. Chris Coons warned that demanding “complete obedience” from a private company to surveil Americans or build self-firing weapons is a chilling concept.
Here is the mechanism: Procurement becomes policy
Congress moves slowly, so agencies route around it with procurement, classification, and vendor lock-in. Then, once the system is built, they point at the system and say it is now the baseline reality, so the law must adapt.
The letter spells out the pressure tool. If Anthropic refuses, the government can threaten to label it a supply chain risk, a label typically used for foreign adversaries. That flips a political dispute into a compliance crisis. Partners panic. The holdout caves, or it gets replaced by a more obedient model shop. The letter also argues the Pentagon is trying to “set the tone” for every AI company negotiating with the military. This is not one contract. It is a template.
Follow the money: A $200 million contract is gravity
A $200 million contract is not just a check. It pulls engineers, roadmaps, infrastructure, and executive priorities toward the buyer. For the Pentagon, frontier models offer scale and speed, plus the ability to sift oceans of data with fewer humans asking pesky questions about warrants, targeting thresholds, bias, error rates, and accountability. If you can “connect dots” across metadata, location data, data broker dossiers, and open-source feeds, you do not need to change the law to change lived reality. You just need the pipeline.
The coalition letter claims other frontier AI firms have accepted the Pentagon’s “all lawful purposes” standard for certain systems, and says xAI formally agreed to deploy Grok in classified systems with no conditions attached. The market signal, if true, is loud: obedience is bankable.
The quiet part: The Pentagon does not want to be told “no” by the Constitution, so it is trying to be told “yes” by a contract.
Mic drop: If the Pentagon wants new powers, it can come to the hearing room and ask for them in plain language, under bright lights, with watchdogs and courts and voters watching. No more policy-by-procurement. Audit the contracts, strengthen reporting requirements, fund independent oversight, and organize like your privacy is on the line, because it is.