"I don't trust you, you faker!" On Trust, Reliance, and Artificial Agency

Teoria 39 (1):63-80 (2019)

Abstract

The aim of this paper is to clarify the extent to which relationships between Human Agents (HAs) and Artificial Agents (AAs) can be adequately defined in terms of trust. Since such relationships consist mostly in the allocation of tasks to technological products, particular attention is paid to the notion of delegation. In short, I argue that it would be more accurate to describe direct relationships between HAs and AAs in terms of reliance, rather than in terms of trust. However, as media of human action to which tasks are delegated, AAs indirectly mediate trust between users and the other social actors involved in their design, manufacture, commercialisation and deployment. In this sense, AAs mediate social trust. My conclusion is that relationships between HAs and AAs are thus to be understood directly in terms of reliance and indirectly in terms of social trust mediation.

Links

PhilArchive



Similar books and articles

Levels of Trust in the Context of Machine Ethics. Herman T. Tavani - 2015 - Philosophy and Technology 28 (1):75-90.
A Metacognitive Approach to Trust and a Case Study: Artificial Agency. Ioan Muntean - 2019 - Computer Ethics - Philosophical Enquiry (CEPE) Proceedings.
Trusting the (ro)botic other: By assumption? Paul B. de Laat - 2015 - SIGCAS Computers and Society 45 (3):255-260.
Trust without Reliance. Christopher Thompson - 2017 - Ethical Theory and Moral Practice 20 (3):643-655.

Analytics

Added to PP
2020-04-06


Author's Profile

Fabio Fossa
Politecnico di Milano
