Open Access
Expectations towards the Morality of Robots: An Overview of Empirical Studies
Author(s) - Aleksandra Wasielewska
Publication year - 2021
Publication title - Ethics in Progress
Language(s) - English
Resource type - Journals
ISSN - 2084-9257
DOI - 10.14746/eip.2021.1.10
Subject(s) - robot, morality, autonomy, blame, social psychology, psychology, empirical research, moral character, human–robot interaction, social robot, computer science, artificial intelligence, political science, law, epistemology, robot control, mobile robot, philosophy
The main objective of this paper is to discuss people’s expectations towards the moral attitudes of social robots. The conclusions are based on the results of three selected empirical studies that used stories of robots (and humans) acting in hypothetical scenarios to assess the moral acceptability of their attitudes. The analysis indicates both differences and similarities in expectations towards robot and human attitudes. Decisions to remove someone’s autonomy are less acceptable when made by robots than by humans. In certain circumstances, protecting a human life is considered more morally right than protecting a robot’s existence. Robots are also more strongly expected to make utilitarian choices than human agents. However, there are situations in which people make consequentialist moral judgements when evaluating the decisions of both humans and robots. Robots and humans receive a similar overall amount of blame. Furthermore, it can be concluded that robots should protect their own existence and obey people, but in some situations they should be able to hurt a human being. Differences in the results can be partially explained by the nature of the experimental tasks. The present findings may be of considerable use in implementing morality in robots, as well as in the legal evaluation of their behaviours and attitudes.
