Ethics and Information Technology 8 (4):195-204 (2006)

Abstract
After discussing the distinction between artifacts and natural entities, and the distinction between artifacts and technology, the conditions of the traditional account of moral agency are identified. While computer system behavior meets four of the five conditions, it does not and cannot meet a key condition. Computer systems do not have mental states, and even if they could be construed as having mental states, they do not have intendings to act, which arise from an agent’s freedom. On the other hand, computer systems have intentionality, and because of this, they should not be dismissed from the realm of morality in the same way that natural objects are dismissed. Natural objects behave from necessity; computer systems and other artifacts behave from necessity after they are created and deployed, but, unlike natural objects, they are intentionally created and deployed. Failure to recognize the intentionality of computer systems and their connection to human intentionality and action hides the moral character of computer systems. Computer systems are components in human moral action. When humans act with artifacts, their actions are constituted by the intentionality and efficacy of the artifact which, in turn, has been constituted by the intentionality and efficacy of the artifact designer. All three components – artifact designer, artifact, and artifact user – are at work when there is an action and all three should be the focus of moral evaluation.
Keywords: action theory, artifact, artificial moral agent, intentionality, moral agent, technology
DOI: 10.1007/s10676-006-9111-5

References found in this work

On the Morality of Artificial Agents. Luciano Floridi & J. W. Sanders - 2004 - Minds and Machines 14 (3):349-379.
Computers as Surrogate Agents. Deborah G. Johnson & Thomas M. Powers - 2008 - In M. J. van den Hoven & J. Weckert (eds.), Information Technology and Moral Philosophy. Cambridge University Press. pp. 251.

Citations of this work

Mind the Gap: Responsible Robotics and the Problem of Responsibility.David J. Gunkel - 2020 - Ethics and Information Technology 22 (4):307-320.
The Other Question: Can and Should Robots Have Rights?David J. Gunkel - 2018 - Ethics and Information Technology 20 (2):87-99.
In AI We Trust: Ethics, Artificial Intelligence, and Reliability.Mark Ryan - 2020 - Science and Engineering Ethics 26 (5):2749-2767.
