Call for Papers: A Workshop On Constructing an Ethical or Moral Approach to State Responsibility

March 31, 2023 – April 1, 2023


This workshop will compare and contrast Vulnerability Theory and Human Rights as ways of thinking about state responsibility.

Human rights and human vulnerability theory are both concerned with the achievement of social justice and the role of the State in that endeavor. Their approaches to the State, however, are markedly different: human rights discourse in today's neoliberal climate advocates for a "restrained State", while human vulnerability theory calls for a "responsive State". We are pleased to announce a workshop on human rights and human vulnerability, bringing into dialogue the work of Professor Michael Perry on human rights and Professor Martha Albertson Fineman on human vulnerability.


Call for Papers – A Workshop on Vulnerability and Digital Intimacy

March 24, 2023 at Emory University School of Law

Has human interaction with social robots and other forms of artificial intelligence evolved to the point where such interaction could constitute an "intimate relationship"? If so, how should these interactions be regarded and regulated? Conversely, how might this type of interaction ultimately affect the form, nature, and need for intimacy between humans? This workshop will explore how vulnerability theory can be applied to these and other questions arising from digital intimacy, considering how state and social responsibility for the technological future should be defined and incorporated into an ethical framework for the development and use of AI.

Current scholarship on ethical AI often engages with notions of privacy and trust through a consumer protection framework. Treating users of AI systems as mere "consumers", however, obscures the many social and environmental constraints on consumer choice, including the deep emotional, and sometimes even romantic, attachments that humans may form with AI. Voice assistants such as Siri and Alexa, chatbots, and therapeutic and home robots have increasingly become our companions, caregivers, and confidants. Centering these possible emotional connections instead allows us to view the sharing of information with AI as a potentially intimate or trusting act, rather than a mere conferral of personal data. Manipulation or coercion on the part of a commercialized AI system could then be characterized as a form of "intimate deception" rather than merely consumer fraud. This conceptual shift broadens the traditional consideration of harm in human-AI interaction to include the emotional and psychological harm caused by the betrayal of intimate trust.
