Open Access
Live human–robot interactive public demonstrations with automatic emotion and personality prediction
Author(s) -
Hatice Güneş,
Oya Çeliktutan,
Evangelos Sarıyanidi
Publication year - 2019
Publication title -
Philosophical Transactions of the Royal Society B: Biological Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.753
H-Index - 272
eISSN - 1471-2970
pISSN - 0962-8436
DOI - 10.1098/rstb.2018.0026
Subject(s) - robot, personality, psychology, human–robot interaction, human–computer interaction, computer science, applied psychology, cognitive psychology, artificial intelligence, social psychology
Communication with humans is a multi-faceted phenomenon in which emotions, personality and non-verbal behaviours, as well as verbal behaviours, play a significant role, and human–robot interaction (HRI) technologies should respect this complexity to achieve efficient and seamless communication. In this paper, we describe the design and execution of five public demonstrations with two HRI systems that automatically sensed and analysed human participants’ non-verbal behaviour and predicted their facial action units, facial expressions and personality in real time while they interacted with a small humanoid robot. We provide an overview of the challenges faced, together with the lessons learned from those demonstrations, in order to better inform the science and engineering communities in designing and building robots with more purposeful interaction capabilities. This article is part of the theme issue ‘From social brains to social robots: applying neurocognitive insights to human–robot interaction’.
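To make the per-frame pipeline described in the abstract concrete, the sketch below shows one illustrative way such a system could map detected facial action units (AUs, from the Facial Action Coding System) to an expression label and a crude personality estimate. This is not the authors’ model: the threshold, the AU-to-expression rule and the extraversion heuristic are assumptions chosen only to show the shape of a real-time prediction step.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class FramePrediction:
    action_units: List[str]   # AUs detected as active in this frame
    expression: str           # discrete facial-expression label
    extraversion: float       # one Big Five trait score in [0, 1] (toy estimate)

def predict_frame(au_intensities: Dict[str, float]) -> FramePrediction:
    """Toy per-frame predictor: AU intensities -> expression + personality cue.

    In a real system the intensities would come from a vision front end
    (face detection, alignment, AU regression); here they are given directly.
    """
    # Assumed activation threshold of 0.5 (illustrative, not from the paper).
    active = [au for au, v in au_intensities.items() if v > 0.5]
    # AU6 (cheek raiser) + AU12 (lip corner puller) is the classic FACS
    # configuration associated with happiness.
    expression = "happiness" if {"AU6", "AU12"} <= set(active) else "neutral"
    # Hypothetical cue: more smiling -> higher extraversion score.
    extraversion = min(1.0, 0.3 + 0.4 * au_intensities.get("AU12", 0.0))
    return FramePrediction(active, expression, extraversion)

print(predict_frame({"AU6": 0.8, "AU12": 0.9, "AU4": 0.1}).expression)
```

In a live demonstration this function would be called once per camera frame, with predictions smoothed over a short temporal window before being shown to the participant.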
