Computer says "No": The Case Against Empathetic Conversational AI

Findings of the Association for Computational Linguistics: ACL 2023 (2023)

Abstract

Emotions are an integral part of human cognition, and they guide not only our understanding of the world but also our actions within it. As such, whether we soothe or inflame an emotion is not inconsequential. Recent work in conversational AI has focused on responding empathetically to users, validating and soothing their emotions without any real basis. This AI-aided emotional regulation can have negative consequences for users and society, tending towards a one-note happiness defined solely as the absence of "negative" emotions. We argue that we must carefully consider whether and how to respond to users' emotions.

Links

PhilArchive

Similar books and articles

Meeting with the Depicted Other. Marge Paas - 2015 - Philosophy Study 5 (10).
The Empathetic Soldier. Kevin Cutright - 2019 - International Journal of Philosophical Studies 27 (2):265-285.
Violent computer games, empathy, and cosmopolitanism. Mark Coeckelbergh - 2007 - Ethics and Information Technology 9 (3):219-231.
Resisting Empathy Bias with Pragmatist Ethics. William Kidder - 2019 - Contemporary Pragmatism 16 (1):65-83.
Teaching Empathy in Medical Ethics. Deborah R. Barnbaum - 2001 - Teaching Philosophy 24 (1):63-75.
In Defense of the Moral Significance of Empathy. Aaron Simmons - 2014 - Ethical Theory and Moral Practice 17 (1):97-111.

Analytics

Added to PP
2023-04-15

Downloads
146 (#129,467)

6 months
91 (#52,080)

Author's Profile

Alba Curry
University of Leeds

Citations of this work

No citations found.

References found in this work

No references found.