Anthropic’s Claude Code Leak: Why the Instinct to Fire Someone Is the Lazy Response

Last week, Anthropic accidentally leaked nearly 2,000 internal source code files for Claude Code, its AI coding tool. Within hours, the code was copied across the internet. A post sharing a link to it got 29 million views.

Anthropic called it “a release packaging issue caused by human error.” News outlets repeated that framing. And predictably, the internet had one question: did someone get fired?

It's a reasonable question. Most people would ask it. I would have asked it too, earlier in my career.

But here's what I've learned in 30 years of studying how organizations handle mistakes: the decision whether to fire someone after a high-profile error is one of the most revealing choices a leader makes. And most leaders get it exactly backward.

The Most Expensive Person to Fire

Thomas J. Watson, the founder of IBM, was once asked if he planned to fire an employee whose mistake cost the company $600,000. His response: “No, I just spent $600,000 training him.”

Watson wasn't being soft. He was being strategic. That employee now understood a failure mode better than anyone else in the building. Firing him would mean giving away that expensive lesson for free — and handing it to whatever company hired him next.

Boris Cherny, the creator of Anthropic's Claude Code, seemed to understand this when the internet came calling for someone's head. Nobody was fired. Cherny said the person involved still had the company's full trust, and that it was a process failure anyone could have fallen into.

“It's never an individual's fault,” he wrote.

Some people read that and thought it was a PR line. I read it and thought: that's someone who's actually thought about how mistakes work.

Why Firing Feels Right (But Isn't)

Let's be honest about why the instinct to fire someone is so strong. It's satisfying. It signals decisiveness. It tells the rest of the organization: we take this seriously. And it gives everyone — the boss, the board, the public — a clean story. There was a problem. We found the person responsible. We dealt with it. Move on.

The trouble is that the clean story is almost always a fiction.

From what's been reported, the Anthropic leak happened because of a manual step in their release packaging process — a step that apparently lacked the automated safeguards it needed. The person who tripped the wire didn't create the wire. They walked into a system that was already set up to produce this outcome, eventually, with someone.

Firing that person doesn't remove the manual step. It doesn't add the missing automation. It doesn't fix the process. All it does is replace someone who now viscerally understands the failure with someone who doesn't.

And it sends a message to everybody else: when something goes wrong here, keep your head down.
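
Contrast that with what the actual fix looks like. Anthropic hasn't published the details of its pipeline, so what follows is a generic sketch, not their real process: the script name, the file patterns, and the assumption that the release ships as an npm package are all mine. The point is only that a script, not human vigilance, should get the final say on what goes out the door.

```typescript
// prepublish-check.ts: a hypothetical pre-release guard (illustrative only).
// It asks npm what would actually go into the published tarball, then fails
// the release if anything matches a "never ship this" pattern.
import { execSync } from "node:child_process";

// `npm pack --dry-run --json` reports the tarball contents without publishing.
const report = JSON.parse(
  execSync("npm pack --dry-run --json", { encoding: "utf8" })
);

// Example patterns only; a real team would maintain its own list.
const forbidden = [/^src\//, /\.map$/, /internal/i];

const flagged: string[] = report[0].files
  .map((f: { path: string }) => f.path)
  .filter((path: string) => forbidden.some((re) => re.test(path)));

if (flagged.length > 0) {
  console.error("Refusing to release; these files should not ship publicly:");
  for (const path of flagged) {
    console.error(`  ${path}`);
  }
  process.exit(1); // the release stops here; no human vigilance required
}

console.log("Package contents look clean.");
```

A guard like this costs an afternoon to write. The expensive question is why it didn't exist before the incident, and that question points at the system, not the person.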

What You Actually Lose When You Punish

This is the part that's invisible to most leaders, which is why it's so costly. Punishment doesn't just affect the person being punished. It changes the behavior of everyone who's watching.

In organizations that punish mistakes, people learn to hide them. They learn not to report near misses. They learn to cover their tracks instead of raising flags. The errors don't decrease; they just go underground, where they compound until something bigger breaks.

I've seen this pattern across industries. In healthcare, punishing nurses for medication errors doesn't reduce the error rate; it reduces the reporting rate. Reported events go down while actual events stay the same or get worse, because nobody is surfacing the signals that would let the system learn and improve.

Aviation figured this out decades ago. The airline industry built mandatory incident reporting systems that, by law, cannot punish the individuals who use them. Pilots report errors and near misses freely because they know the goal is prevention, not prosecution. That single cultural and structural choice is a huge part of why commercial flying became the safest mode of transportation by far.

The comparison is worth thinking about. Aviation chose learning. Healthcare, for the most part, chose blame. Look at the safety records and tell me which approach works.

Read more: Beyond Blame: How Punishing Healthcare Workers Fails to Prevent Medication Errors

Punishment Is the Easy Option

There's a perception that firing someone is the “tough” response and that not firing them is the “soft” response. I think that's exactly backward.

Firing someone is easy. It takes one conversation and some paperwork. It gives you a scapegoat and a press statement. It lets everyone else off the hook — including the leaders who built or tolerated the system that made the mistake possible.

Actually investigating the system? That's hard. It means asking uncomfortable questions. Why was this step manual? Why didn't we automate it? Why were we shipping faster than our quality controls could keep up with? Those questions often point back to decisions made by the people in charge, not the person on the line.

Anthropic had been shipping at an extraordinary pace. Claude Code's run-rate revenue reportedly grew past $2.5 billion. Speed creates pressure. Pressure creates shortcuts. Shortcuts create gaps. And then a person walks into one of those gaps and gets blamed for falling in.

The disciplined response — the one that actually prevents the next incident — is to fix the gap. Not to replace the person who found it the hard way.
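
What does fixing the gap look like mechanically? One pattern (again, a sketch under my own assumptions, not a claim about Anthropic's actual fix) is to make the guarded path the only path: a single release script that runs the check first, so publishing is unreachable if the check fails. In the npm ecosystem there is even a built-in hook for this, prepublishOnly, which runs automatically before npm publish.

```typescript
// release.ts: a hypothetical single entry point for shipping a release.
// The guard is not a step someone can forget; it is the only road to publish.
import { execSync } from "node:child_process";

function run(cmd: string): void {
  console.log(`$ ${cmd}`);
  // execSync throws on a non-zero exit code, so if the check below fails,
  // nothing after it ever runs.
  execSync(cmd, { stdio: "inherit" });
}

run("npx tsx prepublish-check.ts"); // the hypothetical guard sketched earlier
run("npm publish");
```

The design choice matters more than the code: it replaces "be careful" with "the wrong thing is no longer possible," and that is what actually prevents the next incident.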

What a Shrewd Response Looks Like

Cherny's public response did something that I think a lot of business leaders would have talked themselves out of. He acknowledged the mistake, named it as a process issue, confirmed nobody was punished, and focused on prevention.

In the short term, that might look weak to some observers. In the long term, it's the move that produces better outcomes. Here's why:

The person who made the mistake is now the most motivated and informed person on your team when it comes to this particular failure mode. They will never make this specific error again, and they'll probably catch it when someone else is about to. You can't buy that knowledge. You can only earn it the hard way — or throw it away.

Meanwhile, every other engineer at Anthropic watched what happened and learned: this is a place where I can admit mistakes without destroying my career. That's the foundation of what researchers call psychological safety, and it's the single biggest predictor of whether a team actually improves over time or just gets better at hiding problems.

Toyota has operated this way for decades. When something goes wrong, the question isn't “who messed up?” It's “what in the system allowed this to happen?” And Toyota's track record — in quality, reliability, and sustained performance — speaks for itself.

The Real Test

Compare Anthropic's response with that of SolarWinds, whose CEO publicly blamed an intern for a major security incident. One company said: the system failed and we'll fix it. The other said: we found the person and we dealt with it.

Which company would you rather work for? Which company do you think gets more honest problem reports from its engineers? Which company do you think is more likely to catch the next issue before it becomes a headline?

The answer to all three is the same.

I've been writing and teaching for years that “human error” is not a root cause. It's a description, not a diagnosis. The diagnosis comes from asking why. Why was the mistake possible? What conditions allowed it? What would need to change so that the next person in the same situation can't make the same error?

Those questions don't have satisfying, one-sentence answers. They don't give you someone to fire. But they give you something much more valuable: a system that actually gets better.

The question for every leader reading this is straightforward. The next time something goes wrong on your watch — and it will — are you going to take the easy path or the effective one?

Mark Graban
Mark Graban is an internationally recognized consultant, author, professional speaker, and podcaster with experience in healthcare, manufacturing, and startups. Mark's latest book is The Mistakes That Make Us: Cultivating a Culture of Learning and Innovation, a recipient of the Shingo Publication Award. He is also the author of Measures of Success: React Less, Lead Better, Improve More; Lean Hospitals; Healthcare Kaizen; and the anthology Practicing Lean, previous Shingo recipients. Mark is also a Senior Advisor to the technology company KaiNexus.

4 COMMENTS

  1. I recently read that the manager's job is to make the wrong things hard to do and the right things easy to do.

    This is from the book The Friction Project.

  2. It’s refreshing to see the perspective that firing someone after a code leak isn’t always the most effective solution. This idea challenges the common, often reflexive, punitive approach in such situations.

  3. I really appreciate the point about how firing someone can often be the lazy response. It’s so true that the instinct is to just punish instead of understanding what went wrong.
