Consent Still Matters When It's Not a Real Photo

Written by Tonye Brown · 7 minute read

TL;DR

The argument that fake images cause no real harm because they are not real photographs fails the neighbor-love test. Fabricated sexual content of a real person treats them as an object available for any use, which is precisely what love of neighbor prohibits.

A Note on AI & Tech in Ministry

FaithGPT articles often discuss the uses of AI in various church contexts. Using AI in ministry is a choice, not a necessity; AI should never replace the Holy Spirit's guidance.

A recurring argument in discussions about AI-generated images of real people goes like this: no actual harm has occurred because the images are fabricated. No real photograph was taken without consent. No real body was filmed. The person's physical privacy was not violated. Therefore, the harm is minimal or nonexistent.

This argument sounds more sophisticated than it is. And it fails the basic test that Jesus gave for ethical reasoning.

> "Love your neighbor as yourself." - Mark 12:31

The Neighbor Love Test



Mark 12:31 records Jesus summarizing the whole law: "Love your neighbor as yourself." The simplest application of this principle to AI-generated sexual content of real people is the question: would you want this done to you?

The answer is universally no. Nobody, when asked to imagine realistic synthetic sexual images of themselves being created and circulated without their knowledge or consent, describes this as harmless. The harm is intuitively obvious from the inside.

The "it's not a real photo" argument is doing something specific: it is trying to isolate a narrow technical definition of harm, physical privacy invasion, and argue that because that narrow harm is absent, the act is permissible. But neighbor love is not assessed by narrow technical definitions. It is assessed by whether you are treating your neighbor as you would want to be treated.

Creating AI-generated sexual content of a real person without their consent treats them as raw material for someone else's use. It uses their identity, their appearance, their recognizable self, as a prop in a sexual scenario they did not choose and did not agree to.

Scripture supplies a second test. Ephesians 4:29 instructs believers to say "only what is helpful for building others up according to their needs, that it may benefit those who listen." The principle extends beyond speech to any form of communication, including images. The standard is: does this build up, or does it tear down?

AI-generated sexual content of a real person does not build them up. It reduces them. It treats their personhood as available for anyone's sexual use without their knowledge or participation. It generates content that, if it reaches them or people who know them, causes genuine documented harm: psychological distress, reputational damage, relational harm, and in severe cases safety risks.

The "it's not a real photo" argument misjudges where the harm lies. The harm that matters most to people is not physical. It is relational, psychological, and dignitary. These harms are real whether the images are photographs or fabrications.

The Dignity Argument



![Illustration](/assets/images/posts/10/interconnected-people.webp)

The Christian tradition has consistently grounded its ethics of sexuality in the dignity of the person, the image of God present in every human being. Genesis 1:27 records that God created human beings in his image, male and female. That image is the ground of human dignity, and that dignity is not conditional on whether the violation of it involves a physical photograph or a synthetic one.

Using someone's likeness in sexual content without their consent is a violation of their dignity regardless of the technology used. The fact that the technology is sophisticated enough to produce convincing synthetic images rather than requiring an actual photograph does not reduce the violation. In some ways it makes the violation more insidious, because it removes the requirement for any physical access to the person while preserving the full capacity to objectify and harm them.

The dignity of the person is not located in their physical body alone. It is located in who they are as an image-bearer. That image-bearing is what is violated when their likeness is used as an object for someone else's sexual gratification.

The Legal Landscape Is Catching Up



The legal systems of most developed countries are moving toward the same conclusion the neighbor-love ethic reaches immediately. The United States has enacted federal legislation making it illegal to distribute non-consensual AI-generated sexual imagery, and the United Kingdom, Australia, and several European countries have similar or stronger laws. Many states and provinces have enacted their own statutes.

The legal frameworks are not ahead of the ethics. They are catching up to it. The reason legislators are moving to criminalize this behavior is that the harm is real and recognized, by victims, by courts, and by the communities that have had to deal with the consequences.

What About Entirely Fictional People?

Entirely fictional people, with no real-world counterpart, raise different questions. The neighbor-love argument specifically addresses the use of a real person's identity and likeness without consent. Images of fictional people do not harm a real person in the same way. Christians should still evaluate what kind of content they are creating and consuming, and whether it reflects the character and values they are called to embody.

If You Encounter This Content

- Do not view, save, share, or comment on it in any way that could amplify it.
- If you have a relationship with the affected person and reason to believe they would want to know, tell them.
- Document where you found it and report it to the platform; in many jurisdictions this content is now illegal and can also be reported to law enforcement.

The goal is to minimize the content's reach and to refuse the premise that a person's dignity and identity are available for someone else's use.



![Illustration](/assets/images/posts/10/ai-accountability.webp)

The "it's not real" argument does not survive contact with the question Jesus asked: how would you want to be treated? The answer settles the ethics before any sophisticated argument is needed.

How the "No Real Harm" Argument Works and Why It Fails



The argument that fabricated images cause no real harm because they are not photographs is doing specific philosophical work. It is drawing a boundary around a narrow category of harm, physical privacy invasion requiring physical access to the person, and arguing that anything outside that boundary is permissible.

This kind of boundary-drawing fails for several reasons.

First, harm is not limited to physical violation. Victims of so-called nudification tools report anxiety, depression, withdrawal from school and social life, and persistent distress about who may have seen the images. These harms are real regardless of whether any photograph was taken.

Second, the argument applies the wrong standard. The neighbor-love standard is not "did I violate a narrow technical definition of physical privacy?" It is "would I want this done to me?" The universality of the answer to the second question tells you everything you need to know about the ethics.

Third, the argument ignores the use of identity. Even if no physical photograph was taken, the fabricated image uses the person's recognizable face and appearance. It uses their identity as raw material. That is a violation of the person regardless of the technology involved.

Fourth, it proves too much. If the "it's not real" argument succeeds, it would also justify using someone's name to spread fabricated stories about them, since no "real" event occurred. The argument would eliminate defamation as an ethical category. Nobody accepts that conclusion, which suggests the argument is flawed at its root.

The Church's Response to Victims It May Not Know About



![Illustration](/assets/images/posts/25/let-gods-word-be-your-guide.png)

One pastoral reality worth naming directly: churches almost certainly have members who have been affected by AI-generated non-consensual imagery, either as victims, parents of victims, or people who have encountered this content in digital spaces. Most of them have not disclosed this, because the topic has not been addressed in ways that made disclosure feel safe.

Creating the conditions for disclosure is not complicated. It requires:

- Teaching clearly that the victim bears no responsibility
- Naming the harm specifically and without minimizing it
- Establishing that the church is a place where this can be discussed
- Having specific pastoral resources available for those who do disclose

The person who has experienced this harm and heard their church address it directly is significantly better positioned than the person who has experienced it in silence. The pastoral investment required to create that environment is modest compared to the benefit.

What This Comes Down To



- The neighbor-love test (Mark 12:31) settles the ethics simply: nobody would want this done to them, which means love prohibits doing it to others.
- The "it's not a real photo" argument fails because it isolates harm too narrowly. Victims experience real distress, damaged relationships, and lasting social consequences regardless of the technology used.
- The image of God (Genesis 1:27) grounds dignity in who the person is, not in whether a physical photograph was used. Using someone's likeness without consent violates that dignity.
- Legal frameworks in most developed countries are now reflecting the same conclusion the ethical argument reaches directly.
- Churches likely have members affected by this who have not disclosed. Creating safe conditions for that disclosure is a concrete pastoral act.
