Former judges side with Anthropic and raise concerns about Pentagon’s use of supply chain risk label

NewsVine_Mariyam

I tried drawing attention to why this situation is so important, and all I got in response was the platitude "the customer is always right" (the customer here being the federal government, à la Trump, Hegseth et al.). That response simply shows the person is clueless about the actual situation.

We’re not talking about a disagreement over features or pricing. We’re talking about Anthropic having written two explicit red lines into its terms of service:
  • no domestic surveillance of U.S. citizens
  • no autonomous lethal decision-making without human authority
Those weren’t hidden. Those were known constraints.

So if the government enters into a contract with a company that has those boundaries—and then pressures them to remove them—this isn’t about customer preference.

It’s about whether a vendor is allowed to maintain basic ethical safeguards without being punished for it.

Ending a contract is one thing.

Labeling a company a “supply chain risk” after it refuses to cross those lines is something else entirely. Quite frankly, it's straight retaliation.

Nearly 150 retired federal and state judges filed an amicus brief on Tuesday supporting AI company Anthropic in its lawsuit against the Trump administration for designating it a “supply chain risk,” CNN has learned.
The former judges, appointed by both Republicans and Democrats, join a growing list of Anthropic supporters that includes industry organizations and former senior national security government officials, as well as Microsoft and staffers from competing AI companies.
The amicus brief underscores concerns raised in the tech, legal and national security community over the precedent the situation could set regarding government influence over private companies. For Anthropic, the stakes could be significant; the “supply chain risk” label could affect the company’s contracts with the vast ecosystem of private-sector firms that do business with the military.
“More fundamentally, as a practical matter, no one is trying to force the Department to contract with Anthropic,” the judges wrote. “Instead, Anthropic is asking only that it not be punished on its way out the door.”
The Pentagon “misinterpreted the statute and violated the necessary procedures” when it labeled Anthropic a “supply chain risk,” they also wrote.
The Defense Department designated Anthropic a “supply chain risk” earlier this month after negotiations over the use of the company’s AI models in classified systems broke down. The Pentagon wanted to use Claude in “all lawful” cases, but Anthropic refused to back down over two key red lines: AI’s use in autonomous weapons, and AI’s use in mass surveillance of American citizens.
The “supply chain risk” label is usually given to companies associated with foreign adversaries and has never been given to an American company in modern times. It means companies with military contracts must ensure that any use of Anthropic’s tools is kept separate from that work.
In addition to the “supply chain risk” designation, President Donald Trump ordered all federal agencies to stop using Claude.
Anthropic CEO Dario Amodei said the company had “no choice but to challenge it in court.” In response to Anthropic’s lawsuit last week, White House spokesperson Liz Huston said the president “will never allow a radical left, woke company” to dictate how the military operates.
Anthropic’s chief financial officer said in a legal filing that the company is at risk of losing “hundreds of millions” in revenue in 2026 because of the government’s action.

 