Un-making artificial moral agents
Ethics and Information Technology
Beyond the skin bag: on the moral responsibility of extended agencies
Ethics and Information Technology
Limits to the Autonomy of Agents
Proceedings of the 2008 conference on Current Issues in Computing and Philosophy
Ethics and the Practice of Software Design
Proceedings of the 2008 conference on Current Issues in Computing and Philosophy
Sharing Moral Responsibility with Robots: A Pragmatic Approach
Proceedings of the 2008 conference on Tenth Scandinavian Conference on Artificial Intelligence: SCAI 2008
A Challenge for Machine Ethics
Minds and Machines
The Panopticon reaches within: how digital technology turns us inside out
Identity in the Information Society
Ethics and Information Technology
Can we Develop Artificial Agents Capable of Making Good Moral Decisions?
Minds and Machines
Ethics and Information Technology
The Functional Morality of Robots
International Journal of Technoethics
On the moral responsibility of military robots
Ethics and Information Technology
Negotiating autonomy and responsibility in military robots
Ethics and Information Technology
Lockbox: mobility, privacy and values in cloud storage
Ethics and Information Technology
After discussing the distinction between artifacts and natural entities, and the distinction between artifacts and technology, the conditions of the traditional account of moral agency are identified. While computer system behavior meets four of the five conditions, it does not and cannot meet a key condition: computer systems do not have mental states, and even if they could be construed as having mental states, they do not have intendings to act, which arise from an agent's freedom. On the other hand, computer systems do have intentionality, and because of this they should not be dismissed from the realm of morality in the way that natural objects are. Natural objects behave from necessity; computer systems and other artifacts also behave from necessity once they are created and deployed, but, unlike natural objects, they are intentionally created and deployed. Failure to recognize the intentionality of computer systems, and its connection to human intentionality and action, hides the moral character of computer systems. Computer systems are components in human moral action. When humans act with artifacts, their actions are constituted by the intentionality and efficacy of the artifact, which, in turn, has been constituted by the intentionality and efficacy of the artifact designer. All three components (artifact designer, artifact, and artifact user) are at work whenever such an action occurs, and all three should be the focus of moral evaluation.