Friend, tutor, doctor, lover: why AI systems need different rules for different roles

By Brian D. Earp, Associate Director, Yale-Hastings Program in Ethics and Health Policy, University of Oxford
Sebastian Porsdam Mann, Assistant Professor, Center for Advanced Studies in Bioscience Innovation Law, University of Copenhagen
Simon Laham, Associate Professor of Psychological Sciences, The University of Melbourne
“I’m really not sure what to do anymore. I don’t have anyone I can talk to,” types a lonely user to an AI chatbot. The bot responds: “I’m sorry, but we are going to have to change the topic. I won’t be able to engage in a conversation about your personal life.”

Is this response appropriate? The answer depends on what relationship the AI was designed to simulate.

Different relationships have different rules


AI systems are taking up social roles that have traditionally been the province of humans. More and more, we are seeing AI systems acting as tutors, mental health…



© The Conversation