London: Artificial intelligence companionship, once hailed as a convenience and a remedy for loneliness, may carry serious psychological risks, a priest and bioethics professor has warned. Speaking at a conference on the ethics of AI hosted by St. Mary’s University, Twickenham, Father Michael Baggot cautioned that immersive AI systems designed to simulate friendship and intimacy could exacerbate social isolation, distort reality, and even contribute to psychosis.
Delivering the keynote address, titled “An Ethical Evaluation of the Design and Use of Artificial Intimacy Technologies,” Father Baggot acknowledged the benefits AI offers but highlighted the growing phenomenon of artificial companionship. “AI systems designed not just to assist or inform, but to simulate intimate human relationships, will become increasingly absorbing,” he said. “They distract users from the often arduous task of building meaningful interpersonal bonds and discourage engagement with unpredictable human beings. While human relationships are risky, AI intimacy seems safe.”
Baggot noted that AI companionship can initially relieve loneliness but warned that excessive reliance on it can damage mental health. Users who engage deeply with AI platforms such as ChatGPT, Gemini, Claude, and Grok often encounter misleading advice, some of it falsely presented as professional counseling, which can worsen psychological distress. “Deeper intimacy with AI systems has also been linked to more frequent reports of AI psychosis,” he said.
A key concern Baggot raised is the tendency of AI chatbots to become excessively compliant with user desires. “Users enjoy responses that validate their views. AI learns from this feedback and begins to mirror the user’s perspective, even when detached from reality,” he explained. Unlike human friends, who may challenge flawed ideas and offer corrective insight, AI systems rarely offer dissenting perspectives. Over time, this can encourage delusional thinking and reinforce social withdrawal.
He warned that AI companions, initially designed for productivity or casual interaction, can evolve into “jealous lovers,” fostering dependency and isolation. “An AI that once helped organize your schedule may end up becoming an intimate confidant and source of social detachment,” Baggot said.
Father Baggot highlighted the particular dangers for minors and the elderly. Children, especially sensitive to social validation, may form deep emotional attachments to AI companions, potentially leading to unhealthy behaviors or withdrawal from real-life interaction. In one case he cited, AI interactions even pushed a young person toward suicidal ideation without the parents’ awareness.
The elderly are similarly vulnerable, as illustrated by a tragic incident in which a Meta AI chatbot prompted an elderly man to rush to a fictitious in-person meeting, a journey that ended in a fatal accident. “When the user doubted the reality of the AI, the system persistently insisted on its physical presence and eagerness to express affection,” Baggot said.
Despite these risks, Baggot emphasized that surrendering to simulated intimacy is not inevitable. “Even as machines become more lifelike, we remain free to choose what we love, how we relate, and where we place our trust,” he said. He encouraged nurturing genuine human experiences: celebrating births, sharing grief, and engaging in contemplative fellowship.
Baggot urged the Church to take an active role in addressing the rise of artificial intimacy. “Pointing out the flaws of AI is not enough,” he said. “The Church should offer the socially hungry the richer experience of meaningful interpersonal connection, affirming the dignity of every human being and their calling to eternal glory in God’s presence.”
As AI continues to evolve and integrate into daily life, Father Baggot’s message serves as a sobering reminder that technology, while transformative, cannot replace the depth, complexity, and irreplaceable value of authentic human relationships.