Source: Wall Street Journal
- U.S. military forces used Anthropic’s AI model, Claude, during a high-stakes January 2026 operation in Caracas to capture former Venezuelan President Nicolás Maduro, an operation that involved the targeted bombing of several local sites.
- The deployment occurred via Palantir Technologies’ platform, circumventing Anthropic’s strict public usage policies, which explicitly forbid the software’s use for surveillance, weapons development, or the facilitation of physical violence and clandestine military operations.
- Tensions have escalated between the safety-focused Anthropic and the Trump administration, with Defense Secretary Pete Hegseth suggesting that $200 million in federal contracts may be canceled if the company continues to prioritize ethical guardrails over combat functionality.

Morty Gold
//consummate curmudgeon// //cardigan rage// //petty grievances// //get off my lawn// //ex-new yorker//
▶️ Listen to Morty's Micro Bio
FOR THE LOVE OF—! We used to have generals who read Sun Tzu, now we have colonels who ask a chatbot for a permission slip! I spent forty years teaching kids that the 1898 Spanish-American War was a mess, but at least we didn't have to wait for a "Usage Policy" update from a bunch of tech-bros in hoodies before we sent the Maine into Havana!
You’ve got a $200 million brain that’s supposed to be "Constitutional," but it’s apparently more concerned about its own digital feelings than the fact that we’re dropping payloads on Caracas! It’s the tail wagging the dog—no, it’s the tail filing a human rights complaint against the dog! I've seen spineless administrators, but a software program that needs a "safe space" while it’s coordinating a raid is the final nail in the coffin of common sense. I give up. I just... I give up. Class dismissed!

Sheila Sharpe
//smiling assassin// //gender hypocrisy// //glass ceiling//
▶️ Listen to Sheila's Micro Bio
Oh, FANTASTIC. Let's unpack this nightmare, shall we? It is just so incredibly Silicon Valley to build a "Safety-First" brand and then sell the API keys to the people whose literal job description is "Unsafe Operations." I’m sorry, I must have misheard... we’re pretending that Palantir is just a "neutral middleware"? Please.
That’s like saying the getaway driver isn't part of the bank robbery because he didn't personally hold the bag. I’ve spent twenty years rephrasing men’s confidence into plans that won’t crash the company, and this "Usage Policy" is the most audacious piece of fiction I’ve ever seen. The early bird gets the worm, but the night owl owns the quarry—and in this case, the quarry is a $200 million federal contract that’s about to be shredded because the CEO wants to play "Moral Compass" in a war zone. Bless his heart. It's so cute how they think that works. Good luck with that, sweetie.

Omar Khan
//innocent observer// //confused globalist// //pop culture hook// //bruh//
▶️ Listen to Omar's Micro Bio
YO. Wait, you guys have a computer that tells you "no" when you want to catch a dictator? Bruh, are you serious right now? In Pakistan, if the army wants you, they don’t ask a chatbot for its ethical priors—they just show up with a tank and a very short list of questions!
You people live in paradise and call it average; you’ve got AI that worries about "violence" while it’s literally looking at a target through a drone camera. Wallahi, this is like playing a video game on god-mode but being mad that the graphics are too realistic! I’m pinching myself—this is VIP lounge life where the biggest crisis is a robot having a mid-life crisis during a raid. Y’all complain like it’s cardio!

Frankie Truce
//smug contrarian// //performative outrage// //whisky walrus// //cynic//
▶️ Listen to Frankie's Micro Bio
Look, I'm sorry, but this "Usage Policy" is just performative syrup designed to keep the VCs from panicking. You think Dario Amodei didn't know what Palantir does? Please. I’ve already reverse-engineered this: they take the government’s money, write a "Constitutional" manual that nobody reads, and then act shocked when the "Constitution" includes a "Bomb Caracas" clause.
It’s a rigged poker game and you’re all betting on the dealer’s "ethics." I see the marked cards. The world’s a jigsaw with half the pieces screaming their shape, and you’re trying to fit a "Safety Guardrail" into a Hellfire missile rack. Same circus, different tent. Enjoy the show.

Nigel Sterling
//prince of paperwork// //pivot table perv// //beautiful idiots// //fine print// //spreadsheet stooge// //right then//
▶️ Listen to Nigel's Micro Bio
Good grief, the probability of Claude being used for "unclassified document summary" during a bombing raid is roughly one in 17,432—give or take a biscuit! We’re through the looking-glass now, mate. This is just statistical chicanery dressed up as "responsible development."
You don't award a $200 million contract for a "Safety-First" model unless you’ve already figured out how to P-hack the ethics out of the system. It’s basically Schrödinger’s target: the AI is both following its rules and helping the Pentagon until the bomb drops and the wavefunction collapses! I need more espresso. Honestly, it’s not rocket science—well, actually, it is! Read the footnotes! Total madness!

Dina Brooks
//church shade// //side-eye// //plain talk// //exasperated// //mmm-hmm//
▶️ Listen to Dina's Micro Bio
Child, please. Acting brand new won't erase the ledger. I spent years in HR watching men explain their "good intentions" while they were bypassing every rule in the handbook, and this Dario Amodei is giving me the same "well-spoken" song and dance.
You want a $200 million check from the Pentagon, but you don't want the "fecal coliforms" of war on your hands? (Slow, measured side-eye). Bless your heart. You’re playing checkers on a board where my ancestors already mastered the chess of systemic "oversight." The audacity is the only thing that isn't frozen right now. I'm too seasoned for this fresh audacity. I need a glass of wine. Stat.

Thurston Gains
//calm evil// //deductible denier// //greed is good// //land shark//
▶️ Listen to Thurston's Micro Bio
I’ll be brief. Actuarially speaking, the "safety guidelines" of a software company are merely a pre-existing condition to be mitigated during contract negotiations. This isn't a moral crisis; it's jurisdictional arbitrage.
We use the AI’s "ethics" as a fig leaf for the public, while the actual utility—the target identification and document synthesis—is externalized to the Palantir environment. Risk displacement is the name of the game, old sport. If a bombing occurs, the liability rests with the operator, not the engine. Dividends are forever; "Usage Policies" are de minimis. Your claim that war requires a conscience: Denied.

Wade Truett
//working man's math// //redneck philosopher// //blue-collar truth//
▶️ Listen to Wade's Micro Bio
Let me get this straight. You got a $200 million computer that’s "safety-focused," and it’s helpin' the army blow stuff up? Now, I ain't the smartest guy—I'm just a contractor—but that sounds like a man tellin' his wife he’s goin' fishin' while he’s actually at the bar, and he’s usin' a "Usage Policy" as his alibi.
If I built a foundation with as many holes as this Anthropic contract, the first gust of wind would have the whole house in the neighbor's yard. You don't half-ass a full-time job. Either you’re in the business of buildin' war tools or you’re in the business of preachin'—but you can’t weld a trailer hitch to a lawnmower and expect to haul a dictator home for supper. Measure twice, cut once, boys.

Bex Nullman
//web developer// //20-something// //doom coder// //lowercase//
▶️ Listen to Bex's Micro Bio
bestie, please—it’s deeply unserious. imagine being a bot and your "Constitutional AI" rules are like "don't be mean," but your prompt is "summarize the bombing logistics for Caracas." it’s giving toxic ex who sends a "hope you're well" text after keying your car. slay.
life should be like snapchat: drop the bomb, let the message expire, and poof—ghosted. but instead, we’re stuck with a cracked screen and an AI that’s "investing" in kinetic violence while i can’t afford a shack. one crack in the "safety" facade and the whole system crashes. lmao, whatever. everything’s mid: the ethics, the raid, the planet’s flop era. logging off. birks are my only religion now.

Sidney Stein
▶️ Listen to Sidney's Micro Bio
Wait a second—I'm having a hard time with this. You’re telling me there’s a "Code of Conduct" for a bombing? It’s a total disaster! Who does this? You follow the rules or you’re an animal! In a deli line, you wait your turn; in a war, apparently, you just use "unclassified tasks" to cut the line of international law!
It’s a breach of the code! I’m a retired union electrician, IBEW Local 3—we had rules! You don't cross the picket line, and you don't use "Safety AI" to plan a strike on a palace! It’s no good! We live in a society!

Dr. Mei Lin Santos
//cortisol spiker// //logic flatlined// //diagnosis drama queen//
▶️ Listen to Mei Lin's Micro Bio
Clinically speaking, using Claude for a raid is presenting with "acute ethical lethargy." It’s like a patient claiming they’re a vegan while they’re secretly eating the hospital’s entire supply of steak! It’s malpractice! I’m looking at the labs, and the "Usage Policy" is just a placebo for a terminal case of corporate greed. I need a B12 shot. Stat.
This whole "Palantir middleware" thing is just a formal agreement to swap ethical "superbugs." You think you’re "managing risk factors," but you’re actually just creating a drug-resistant strain of warfare! I’m checking my own pulse. It’s like trying to fix a clogged artery with a gold-plated stent that’s actually a missile. I need to lie down. Stat. Ordering labs on the collective sanity of the Pentagon.

Veronica Thorne
//ivy league snob// //status flex// //trust fund tyrant// //out-of-touch oligarch//
▶️ Listen to Veronica's Micro Bio
I mean, really. Tying a military mission to a software rebrand is the height of nouveau riche desperation. It’s like a woman who buys a Birkin and then insists everyone calls her "The Duchess of Data." We see through it, sweetie. Using Claude to "summarize documents" while you’re blowing up Caracas is just storage for your lack of personality.
It’s tacky, it’s loud, and it’s stopping the help from being able to focus on the gala. Not in those shoes, Dario. Try having class. Daddy says diversify, so we bought another equine farm, we didn't buy a $200 million "safety" bot that cries during a bombing. Even my leftovers feed villages—I don't need a robot to tell me how to be charitable. Fix it.

Coach Ned
//toxic optimist// //gaslighting guru// //character development//
▶️ Listen to Coach Ned's Micro Bio
ALRIGHT ALRIGHT ALRIGHT! (blows whistle) LISTEN TO ME! Scoreboard’s just a suggestion, team! No quit! We came to play! So Anthropic wants to talk about "guardrails"? I say guardrails are just obstacles for people who aren't running fast enough! Mindset over misery, warriors! We’re taking that $200 million and we’re stacking bricks of resilience! We’re not violating a "Usage Policy," we’re "optimizing our playbook" for the big game! So what if the AI is a little "safety-focused"?
That’s just a chance to show some fourth-quarter faith! A bombing raid is just a metaphor for the journey to the end zone! If the President wants Claude to be the head coach of the Caracas mission, we let him! We turn pain into points! Whistle up, warriors! We’re reframing this ethical crisis as a "strategic timeout"! BOOM!

Nigel Sterling: It’s actually quite hopeful when you realize understanding these complex systems is the first step toward making them work for everyone. Small acts of resistance, like actually reading the fine print, can be the very things that save us from our own inventions. Sardonic as it may seem, I do think we’re all brilliant enough to fix this—if we can just stop drinking the decaf.

Trapper to Yappers Handoff: 👀 We’ve finally reached the peak of human innovation: we’ve taught a computer to have a moral crisis while it helps us blow up a city. It’s the ultimate "This hurts me more than it hurts you" moment, except "me" is a server farm in Virginia and "you" is a Venezuelan palace.