The Edelman Trust Barometer has tracked public trust in institutions for more than two decades. Its 2024 report found that a majority of respondents in most countries believe they are routinely lied to by governments, media, and businesses. Critically, trust between ordinary people is declining alongside institutional trust. People are increasingly uncertain whether even their neighbors are reliable sources of accurate information.
AI-generated content is a significant accelerant of this decline. It matters to the church because the calling described in Ephesians 4 is explicitly relational and communal: the body of Christ depends on truthful communication among its members. In practice, this means churches should teach members to verify before sharing, especially on social media; correct errors publicly when they occur; establish a norm in which "I don't know" and "I haven't verified this" are acceptable and respected answers; and hold leadership publicly to the same standards expected of the congregation.
Does it matter if the false information is politically neutral?
Yes. The concern is not only about politically charged content. Any false claim spread through a church community damages its reputation as a place where truth is protected, regardless of the claim's political valence.
Before AI, producing convincing false content at scale required significant resources; now it requires almost none. As a result, the volume of false content in the information environment is growing faster than people's ability to identify and discount it, accelerating the erosion of the baseline trust that ordinary communication requires.

Should churches have a formal policy on sharing information?
A formal policy is less important than a genuine culture. The culture is built through teaching, modeling by leadership, and consistent correction when members spread false information. A policy that is not backed by culture is just a document.
