Friend, tutor, doctor, lover: why AI systems need different rules for different roles
By Brian D Earp, Associate Director, Yale-Hastings Program in Ethics and Health Policy, University of Oxford
Sebastian Porsdam Mann, Assistant Professor, Center for Advanced Studies in Bioscience Innovation Law, University of Copenhagen
Simon Laham, Associate Professor of Psychological Sciences, The University of Melbourne
“I’m really not sure what to do anymore. I don’t have anyone I can talk to,” types a lonely user to an AI chatbot. The bot responds: “I’m sorry, but we are going to have to change the topic. I won’t be able to engage in a conversation about your personal life.”
Is this response appropriate? The answer depends on what relationship the AI was designed to simulate.
Different relationships have different rules
AI systems are taking up social roles that have traditionally been the province of humans. Increasingly, they act as tutors, mental health…
© The Conversation
Sunday, April 6, 2025