You are not Special - The Challenge of Artificial Moral Agents to Human Exceptionalism
Summary
This thesis examines whether artificial systems can genuinely possess moral agency and what this possibility means for human exceptionalism. It asks: Can moral agency remain a foundation for human exceptionalism in light of advancing artificial intelligence?
The thesis argues that moral agency is not an exclusively human trait but an algorithmic process that artificial systems could, in principle, instantiate. After establishing a working definition of moral agency as the ability to recognize and understand moral reasons, deliberate upon them, and act accordingly with sufficient autonomy to be held responsible for the outcome, it reconstructs key objections to machine moral agency. Drawing on William Hasselberger’s Ethics Beyond Computation (2019) and Robert Sparrow’s Why Machines Cannot Be Moral (2021), it frames these objections as the input problem—whether machines could perceive and interpret morally salient features of a situation—and the output problem—whether their actions could bear genuine moral significance.
In response, the thesis develops the Computational Identity Theory of Moral Agency (CITMA), which integrates mind–brain identity theory with the computational theory of mind. CITMA holds that moral reasoning and decision-making are algorithmic in nature; therefore, if such processes can be instantiated by artificial systems, moral agency cannot be restricted to humans and other biological entities.
The final chapter demonstrates that this conclusion exposes the historical and ethical fragility of human exceptionalism. Across history, boundaries drawn to mark human uniqueness have proven porous and unstable. Traits once taken as uniquely human—rationality, language, tool use, creativity, and moral capacity—have repeatedly been reassigned, eroded, or shown to exist in other beings to varying degrees. The challenge of artificial moral agents continues this pattern, compelling us to reconsider how responsibility, accountability, and moral status are distributed. Human exceptionalism has always been fragile. Artificial moral agents force us to confront that fragility once more. If so much of our ethical self-understanding rests on the belief that humans are special, then the challenge posed by AI is not only about machines—it is about us. Who are we, if not special?
Related items
Showing items related by title, author, creator and subject.
Potentially morally injurious events and moral injury symptoms in healthcare professionals: Age, work experience, moral reasoning style and work at a COVID-19 department as predictors.
Peterman, Shanna (2023) Background: Since the COVID-19 pandemic, healthcare delivery systems have been facing ethically challenging situations. As a result, healthcare professionals might experience more potentially morally injurious events (PMIEs), ...
Navigating the Moral Landscape: Moral Identity Threat and Stress Responses: Examining the Relationship between Moral Identity, Group Identification, and Stress Outcomes in Dutch-International Student Dialogues
Boermans, Stijn (2023) This study explores the relationship between moral identity threat, stress responses, and group identification in the context of intergroup discussions between Dutch and international students. Focusing on the interconnected ...
