
Exploring Pupil Position as an Eye-Tracking Feature for Four-Class Emotion Classification in VR
Author(s) -
Jia Zheng Lim,
James Mountstephens,
Jason Teo
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2129/1/012069
Subject(s) - pupil position, eye tracking, virtual reality, artificial intelligence, support vector machine, computer science, computer vision, emotion recognition, emotion classification, pattern recognition, psychology
This paper presents a preliminary investigation of a novel approach to emotion recognition using pupil position in Virtual Reality (VR). We explore pupil position as an eye-tracking feature for four-class emotion classification according to the four-quadrant model of emotions, using 360° videos presented in VR. A total of ten subjects participated in the experiment. A 360° video comprising four emotion-stimulation sessions was presented in VR to evoke the users’ emotions. Eye data were recorded with a Pupil Labs eye-tracker, and emotion classification was performed using pupil position alone. The classifier used in this investigation was the Support Vector Machine (SVM) machine learning algorithm. The results showed that the best accuracy achieved for this four-class classification was 59.19%.
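The abstract does not publish the pipeline itself, but the setup it describes (an SVM over pupil-position features with four quadrant labels) can be sketched as follows. This is a minimal illustration assuming scikit-learn; the feature layout (per-window mean and standard deviation of pupil x/y), the synthetic data, and the RBF kernel are placeholder assumptions, not the authors' actual features, kernel choice, or recordings.

```python
# Minimal sketch: four-class emotion classification from pupil-position
# features with an SVM. Data, features, and hyperparameters are illustrative
# placeholders, not the paper's actual pipeline.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: one row per stimulus window, columns are summary
# statistics of the pupil (x, y) position; labels 0-3 stand for the four
# quadrants of the valence-arousal emotion model.
n_windows = 400
X = rng.normal(size=(n_windows, 4))     # [mean_x, mean_y, std_x, std_y]
y = rng.integers(0, 4, size=n_windows)  # quadrant labels (0..3)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Scale features, then fit a multiclass SVM (SVC uses one-vs-one internally).
scaler = StandardScaler().fit(X_train)
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(scaler.transform(X_train), y_train)

pred = clf.predict(scaler.transform(X_test))
print(f"four-class accuracy: {accuracy_score(y_test, pred):.2%}")
```

On random inputs like these the accuracy hovers near the 25% chance level for four classes, which is why the paper's reported 59.19% on real pupil-position data is a meaningful, if preliminary, result.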