In December, the AI Research Group for the Vatican Centre for Digital Culture released Encountering AI: Ethical and Anthropological Investigations, a study of artificial intelligence and its role in human life and society. The book is intended to guide conversation and discernment about the ethical uses of AI for Catholics and non-Catholics. Noreen Herzfeld, the Nicholas and Bernice Reuter Professor of Science and Religion at St. John’s University and the College of St. Benedict, and John P. Slattery, the director of the Carl G. Grefenstette Center for Ethics in Science, Technology, and Law at Duquesne University, are two of the book’s authors. Bishop Paul Tighe is the secretary of the Dicastery for Culture and Education and the founder of the Centre for Digital Culture. They spoke recently with Associate Editor Regina Munch. Their conversation has been edited for clarity and length.
Regina Munch: The book is split into two parts: the first is a Catholic anthropology that addresses questions of personhood and intelligence, and the second is a practical outline of the role that AI is playing or could play in the world. The first section takes up the question of whether AI can have consciousness or an interior life. Something like the Turing test—which looks at whether an AI can imitate human communication—sets the bar fairly low. What’s the difference between imitating thought processes and actually having a thought, and why does that matter?
Noreen Herzfeld: There’s a big difference between imitation and reality. In the later sections of this anthropological discussion, we talked about relationship. One of the big parts of relationship is having empathy for the other person. To truly have empathy, you need to actually feel something, to have emotion, not just mimic it. It turns out that computers are pretty good these days at recognizing emotional cues, and they can also mimic emotional responses or say the right words. But think about it in human relationships. There are people who don’t feel a lot of empathy, and yet they’re often quite socially adept at recognizing emotional cues and giving back the proper response. We call those people sociopaths. A computer that merely imitates human responses is ultimately going to be just as lacking.
RM: The importance of that interpersonal encounter shows up in the title of the book, Encountering AI. Could you say more about that?
NH: We used Francis’s understanding of encounter to center our book because we found that it was applicable to both sections. In the anthropological section, if we set encounter as the center of our relational existence, we then had to ask: What do we need to have an authentic encounter? For one thing, consciousness is a sine qua non; without consciousness, you can’t have a full encounter with another. That’s why we took so much time to explore what consciousness really is and why machines as we know them cannot have it. We also chose the theme of encounter because it was the center of our ethical concerns in the second half of the book.
John P. Slattery: As Pope Francis often says, we are human through our encounter with others. Within that encounter, you can start to see a more pastoral ethic of AI and technology come through. Rather than simply saying it’s a technocratic paradigm that’s turning us all into machines and we have to push back against it, we can recognize that we all live with technology all the time, and it’s not all horrible. It lets us have conversations across continents! In the ethics part of the book, we mined Francis’s work and Catholic social teaching from the past few decades to determine the unique Catholic and Christian approaches to considering AI ethically. Where can that lead us?
NH: AI ethics has suddenly blossomed as a field. But the approach is often legal or somewhat philosophical. What is unique about this book is what John pointed out: this is a religious approach from a specific religious tradition. That adds a new angle to the growing field.
Paul Tighe: So many people working in AI were determined that it be human-centric. But the real discussion emerges when we ask what “human-centric” means. How inclusive is our vision of what is human-centric? The book comes out of a tradition that has reflected long and hard on what it means to be human, and it’s doing that in a way that’s intended to invite others to the table. There’s a real danger in assuming that AI ethics has to be led by experts or left to lawyers and politicians. People have legitimate concerns about AI, and offering them a language and a framework will empower them to bring their insights into the discussion.
RM: As you point out in the book, we all use AI already. We’re all involved in these questions; we don’t really have a choice. How did the anthropological considerations in the book inform the ethical discussions of particular topics?
JS: One of the sections is about work, the economy, and the nature of the person. There’s a lot being written now on the nature of labor and the role of generative AI in work, especially with the actors’ and writers’ strikes. We can talk about the role AI plays in the job market when it allows companies to push solely for greater efficiency, leading to a further loss of human dignity. But there’s also the potential to use AI to take onerous work off people’s shoulders. Whether technology is used for the sake of human dignity or against it rests on whether we have a strong theological anthropology. Otherwise, it’s too easy to say, well, what’s the limit of human dignity? How much more efficiency can we get out of people—do people really need this much sleep? We need to be grounded in an anthropological tradition to be able to make ethical claims.
PT: The people we talked to from Silicon Valley recognize that workers may lose their jobs because of AI, and they want assurance that there will be some sort of universal benefit to compensate them. But our tradition would respond that work isn’t just where you earn your living. It’s where we find our identity and our sense of sociality, where we give expression to our creativity. The dignity of work is rooted in our being made in the image and likeness of God and in communion with God. Pope Francis was recently speaking to people from Silicon Valley about the use of predictive AI to decide parole hearings for people who had been convicted of crimes. Very often, past behavior is the best predictor of how people are going to behave. But our tradition wants to talk about the potential for conversion, the presence of grace for change within people. How do we ensure we keep alive that richer sense of humanity that is nourished by our own theological and philosophical traditions?
RM: One thing that comes up often in the book is the risk of deskilling—that by using AI we might lose the ability to do certain kinds of work, or the interest in doing it. There’s also a kind of ethical deskilling that can happen when we outsource ethical decisions to AI. Could you say more about that risk?
JS: One of the more obvious places where this comes into play is education. Are we putting the tasks of nurturing and educating children in the faith into the hands of technology—which is really into the hands of whatever developer or company is putting out that technology? In doing so, have we lost some of the ability to sit down with students and talk to them about what growing in knowledge means? That doesn’t mean technology-guided education isn’t valuable. But there is an impetus to hand more and more of education over to generative AI because it seems like an easy thing to do, and what that really does is remove our responsibility. Obviously, this is even more extreme in the military, when it comes to wartime decisions of targeting and killing. But in both areas, we can readily give something up and not realize it’s gone until we remember we used to have more of a hand in these decisions.
NH: In education, we have to be aware that especially younger children are extremely mimetic. They’re not just receiving information from their teacher; they’re also taking in their teacher’s enthusiasm and love for a subject. They’ll be watching the patience that the teacher exhibits as they work with other students. We often think of education as just transferring sets of information, but it’s about forming the whole person.
Pope Francis recently coined a new term, at least in English: rapidification. Computers work at a speed that humans do not. We saw in the 2010 stock-market “flash crash” how this can be a problem—when computers start talking to each other, they can move so quickly that humans can’t stay in the loop. In the military, there is a very definite fear that soon computers will be launching cyberattacks that are parried by other computers so fast that a human simply can’t keep up. This is also true in conventional warfare, as strategic decisions are made by computers at a much more rapid pace than we have previously experienced.
JS: Warfare has become less and less personal over the years, and that has led to a separation of causality and responsibility. In some ways, AI is simply continuing down that track, but thankfully people are realizing that we need to pause before we remove all humans from the equation. It’s sometimes assumed that the fewer humans are involved, the less bloody the war will be. But if we’re going to wage a just war, we need to have humans in the loop as moral claimants over wartime actions.
RM: There’s a risk of remaking humans in the image of AI. In the book you talk about how AI is meant to optimize, so it prioritizes appearance over reality. But that’s not what it means to be a good human. How do we make sure that we aren’t led into a life of optimization?
NH: I hope this book will help people to recognize how they are putting appearance above reality. It doesn’t take AI to do that. Just think about the culture of Instagram or TikTok and how much of it is manufactured appearance rather than the actual reality of people’s lives. I’m hoping the book will help people recognize that the center of their lives must be relationships and human encounters. Insofar as technology puts appearance above reality or distances those encounters, we need to fight against it.
JS: In the book, we extend the notion of encounter to integral ecology. Encounter is not just human to human; it’s also human to the non-human natural world. That encounter is central to maintaining our humanity and our relationship with God. This framing does not condemn all technology. Technology is made of the pieces of the Earth; it’s a product of nature and human labor. It’s a matter of owning that responsibility and not forgetting where those resources come from. We have to maintain those small encounters with non-human animals, with plants and nature, as well as with other humans, because they help define who we are as humans in relation to God and other living things.
PT: We judge a lot of our technology according to its utility—whether it functions well and delivers what we want and value. My worry is that we could let those criteria contaminate our ability to recognize people’s intrinsic dignity. We must remember that dignity is not dependent on performance or utility. Noreen mentioned social media, where young people in particular feel they have to project an attractive image of themselves—one that’s not necessarily true, but that will command the attention of others. I think we have to rediscover that sense of the intrinsic worth and value of a person that’s not simply down to their functionality or usefulness.