Killing Without Conscience: A Christian Look at Anthropic vs. the Pentagon

Written by Tonye Brown · 6 minute read

TL;DR

Anthropic refused to let AI make lethal decisions without human accountability, the Pentagon retaliated, and a federal judge sided with Anthropic. Christians should understand why the moral stakes here are higher than they look.

FaithGPT articles discuss AI in church contexts. Using AI in ministry is a choice, not a necessity, and should never replace the Holy Spirit's guidance.


A technology company turned down a government contract and got labeled a national security threat for it.

That's the short version of what happened between Anthropic and the Pentagon in early 2026. The longer version involves a $200 million military contract, a refusal to allow AI to make kill decisions without human oversight, a Trump administration retaliation order, and a federal judge who said Anthropic was likely to win. It also involves OpenAI quietly taking the contracts Anthropic refused.

If you're a Christian, this story is worth your attention, because the questions underneath it are theological ones: Who bears moral responsibility when a machine takes a life? And what does it mean to serve two masters when one of them is the Department of Defense?

Because complicity is real. Because what you participate in shapes what you are. Because there is a difference between a person who refuses to make a weapon and a person who manufactures one for hire.

Isaiah captures this with clarity:

"So do not fear, for I am with you; do not be dismayed, for I am your God. I will strengthen you and help you; I will uphold you with my righteous right hand." - Isaiah 41:10

The verse is often quoted for personal comfort, but the context is important. God is calling his people to faithfulness when the cost is real. The fact that an enemy will do what you refuse to do does not make refusal pointless. It makes it harder, and it makes it more necessary, because someone has to demonstrate that conscience exists as a category.

Paul, writing to the Corinthians, pressed the same point:

"I have the right to do anything, but not everything is constructive." - 1 Corinthians 10:23

Legality is not the standard. The Pentagon contract was legal. Autonomous weapons development is legal. Mass surveillance under national security authority is, apparently, legal. That tells you exactly nothing about whether any of it is good.

What a Corporate Stand Actually Witnesses To


Anthropic did not just lose a contract. The company took a reputational and financial hit in the short term that most companies would not voluntarily absorb. And something unexpected happened in response: more than a million people per day signed up for Claude in the weeks after the dispute became public, pushing it past ChatGPT and Gemini as the top AI app in more than twenty countries.

People notice when an organization behaves with integrity. They remember it. They reward it.

This is not a prosperity gospel argument. Doing the right thing does not guarantee financial success, and Anthropic's numbers could reverse next quarter. But the response points to something real: there is a hunger for organizations that mean what they say about ethics, that refuse when refusal costs them, that hold a line when powerful institutions are pushing them to cross it.

Matthew 6:24 is unambiguous about what the choice looks like:

"No one can serve two masters. Either you will hate the one and love the other, or you will be devoted to the one and despise the other. You cannot serve both God and money." - Matthew 6:24

Anthropic chose one master in this moment. Whether it holds that choice under continued pressure is a question worth watching. But the choice itself was a witness, and the response from the public suggests the witness landed.

What This Looks Like in Your Own Life

You will probably never face a decision of this scale, but the structure of this situation is one you will encounter.

You will be asked to participate in something that conflicts with your conscience. The participation will be legal. The financial or professional cost of refusal will be real. Someone else will do it if you don't. Your refusal will not stop the thing from happening.

And the question will still be: what do you do?

"Rescue those being led away to death; hold back those stumbling toward slaughter. If you say, 'But we did not know about this,' does not he who weighs the heart perceive it?" - Proverbs 24:11-12

The claim of ignorance does not work as a moral exit. Knowing that a thing causes harm and declining to engage with that fact is not a neutral position before God. Dario Amodei knew what Anthropic's technology could do in the wrong application. He declined to pretend otherwise to keep a government contract. That is the shape of conscience in a high-pressure situation.

Where are you holding the line, and where have you let it slip because the cost of holding it was too high?

Obedience to God Over Obedience to Institutions


The apostles faced a version of this question with no ambiguity at all. In Acts 4, the religious and political authorities ordered them to stop speaking about Jesus. Their response has not been improved on:

"Which is right in God's eyes: to listen to you, or to him? You be the judges! As for us, we cannot help speaking about what we have seen and heard." - Acts 4:19-20

There are moments when faithful witness requires saying no to powerful institutions. The apostles did it. People have done it throughout church history at great cost. Anthropic, in a much smaller and more ordinary way, did it in a business dispute. The principle is the same: there are things that are not negotiable, and holding them costs something real.

This does not mean every corporate decision is analogous to apostolic witness. It means the category of "sometimes you have to say no to the powerful thing" is not obsolete. It applies in boardrooms as well as in prison cells.


Practical Application: Three Questions to Sit With

1. What ethical lines have you drawn in your own work? Not theoretical ones. Real ones. Specific things you will not do regardless of what it costs professionally. If you cannot name any, that is worth examining.

2. Who benefits from your yes? The person who designs the tool and the person who pulls the trigger are not the same person, but they are not entirely separate either.

3. What would it cost you to say no once? Anthropic said no to the U.S. government and took a significant short-term financial hit. The company is still standing and now has a stronger reputation for integrity than its competitors. The cost of integrity is usually survivable. The cost of abandoning it accumulates quietly.



Don't we need AI for defense?

Yes, and that is not what Anthropic refused. The company had an active military contract for classified use. What it refused was the specific application of autonomous lethal targeting and mass domestic surveillance. You can support defense applications of AI and still believe that a human being must authorize the decision to take a life. Those are not in conflict. The question is not whether AI has a role in national defense. The question is whether any decision that ends a human life should be made without a human being who is accountable for it.

