
Human-Robot Interaction and Deception

Andrea Bertolini
2018-01-01

Abstract

The article deals with a phenomenon of increasing social relevance, namely the interaction between robots and human beings. Specific applications - namely robot companions - are researched and designed primarily to interact with humans for the widest range of purposes: to deliver services, to provide treatment and care, to teach and to entertain. After discussing the notion of «robot» in order to exclude the possibility of deeming existing applications «agents», the issue of deception is addressed, starting from the very definition of «artificial intelligence» that emerges from the Turing test. Although current machines are not truly intelligent, possess no consciousness, and are not aware of their own existence, they may still simulate those characteristics, eliciting emotional reactions in human beings. Engineering studies are presented that address the use of such applications with children and the elderly for a wide range of purposes, actively pursuing a form of emotional engagement that might amount to deception. The ethical framework, as well as the legal one, is briefly introduced in order to deny validity both to purely utilitarian stances and to the self-sufficiency of the criterion of freedom of self-determination.

Use this identifier to cite or link to this document: https://hdl.handle.net/11382/525909