Open Access
An Affect-Based Multimodal Video Recommendation System
Author(s) -
Artūras Kaklauskas,
Renaldas Gudauskas,
Matas Kozlovas,
Lina Pečiūrė,
Natalija Lepkova,
Justas Čerkauskas,
Audrius Banaitis
Publication year - 2016
Publication title -
Studies in Informatics and Control
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.321
H-Index - 22
eISSN - 1841-429X
pISSN - 1220-1766
DOI - 10.24846/v25i1y201601
Subject(s) - computer science , affect (linguistics) , human–computer interaction , multimedia , information retrieval , communication , psychology
People watching a video can almost always suppress their speech, but they cannot suppress their body language or fully manage their physiological and behavioral parameters. Affects/emotions, sensory processing, actions/motor behavior and motivation are linked to the limbic system, which is responsible for instinctive and instantaneous human reactions to the environment or to other people. Limbic reactions are immediate, reliable, time-tested and occur in all people. Such reactions are highly spontaneous and reflect a video viewer’s real feelings and desires rather than deliberately calculated ones. The limbic system is also linked to emotions, usually conveyed by facial expressions and by movements of the legs, arms and/or other body parts. All physiological and behavioral parameters therefore require consideration when determining a video viewer’s emotions and wishes. This is why the Affect-based multimodal video recommendation system (ARTIST), developed by the authors of this article, is well suited to the task. ARTIST was developed and fine-tuned during the TEMPUS project “Reformation of the Curricula on Built Environment in the Eastern Neighbouring Area”. ARTIST analyzes the facial expressions and physiological parameters of a viewer while the viewer watches a video. This analysis enables better control over alternative sequences of film clips for a video, and can even prompt ending the video if nothing suitable for the viewer is available in the database. The system considers a viewer’s emotions (happy, sad, angry, surprised, scared, disgusted and neutral) and chooses rational video clips in real time. The analysis of a video viewer’s facial expressions and physiological parameters can indicate which video clips the viewer would prefer at the moment.
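The abstract describes a real-time loop: classify the viewer’s current emotion, then select the next suitable clip or end the video if the database holds nothing suitable. As a minimal sketch of that selection step only — the catalogue, the emotion-to-clip tagging, and the `ClipSelector` class are all hypothetical, and the actual ARTIST matching logic is not specified in the abstract — it might look like this:

```python
from typing import Dict, Optional, Set

# The seven emotion labels named in the abstract.
EMOTIONS = {"happy", "sad", "angry", "surprised", "scared", "disgusted", "neutral"}


class ClipSelector:
    """Pick the next clip matching the viewer's detected emotion,
    or signal that the video should end if no suitable clip remains.
    A simplified, hypothetical stand-in for ARTIST's clip control."""

    def __init__(self, clips: Dict[str, Set[str]]):
        # clips maps a clip id to the set of viewer emotions it suits.
        self.clips = dict(clips)

    def next_clip(self, emotion: str) -> Optional[str]:
        if emotion not in EMOTIONS:
            raise ValueError(f"unknown emotion: {emotion}")
        # Candidates are the clips tagged as suitable for this emotion.
        candidates = [cid for cid, tags in self.clips.items() if emotion in tags]
        if not candidates:
            return None  # nothing suitable in the database: end the video
        chosen = candidates[0]  # first match; a real system would rank candidates
        del self.clips[chosen]  # do not replay the same clip
        return chosen
```

Returning `None` models the abstract’s behavior of ending the video when the database contains nothing suitable for the viewer’s current state.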
