Multimodal teaching analytics: Automated extraction of orchestration graphs from wearable sensor data
Author(s) -
Prieto L.P.,
Sharma K.,
Kidzinski Ł.,
Rodríguez-Triana M.J.,
Dillenbourg P.
Publication year - 2018
Publication title -
Journal of Computer Assisted Learning
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.583
H-Index - 93
eISSN - 1365-2729
pISSN - 0266-4909
DOI - 10.1111/jcal.12232
Subject(s) - orchestration , wearable computer , computer science , session (web analytics) , variety (cybernetics) , analytics , learning analytics , human–computer interaction , multimedia , data science , artificial intelligence , world wide web , art , musical , visual arts , embedded system
Abstract - The pedagogical modelling of everyday classroom practice is an interesting kind of evidence, both for educational research and for teachers' own professional development. This paper explores the use of wearable sensors and machine learning techniques to automatically extract orchestration graphs (teaching activities and their social plane over time) from a dataset of 12 classroom sessions enacted by two different teachers in different classroom settings. The dataset included mobile eye‐tracking as well as audiovisual and accelerometry data from sensors worn by the teacher. We evaluated both time‐independent and time‐aware models, achieving median F1 scores of about 0.7–0.8 under leave‐one‐session‐out cross‐validation. Although these results show the feasibility of this approach, they also highlight the need for larger datasets, recorded in a wider variety of classroom settings, before automated tagging of classroom practice can be used routinely across multiple teachers.
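As a rough illustration of the evaluation scheme the abstract describes (leave-one-session-out cross-validation with median F1 scoring), the sketch below shows how such a protocol is typically set up. It assumes features have already been extracted from the eye-tracking, audiovisual, and accelerometer streams; the names X, y, and session_ids are hypothetical placeholders, and RandomForestClassifier merely stands in for whatever time-independent classifier the authors actually used.

```python
# Minimal sketch (not the authors' code): leave-one-session-out
# cross-validation with a median F1 summary, as described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import LeaveOneGroupOut

def median_f1_loso(X, y, session_ids):
    """Median macro-F1 over folds, each fold holding out one full session.

    X            -- feature matrix (one row per time window), hypothetical
    y            -- activity / social-plane label per window, hypothetical
    session_ids  -- session identifier per window, used as the CV group
    """
    scores = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=session_ids):
        # Placeholder classifier; the paper's actual models may differ.
        clf = RandomForestClassifier(n_estimators=200, random_state=0)
        clf.fit(X[train_idx], y[train_idx])
        y_pred = clf.predict(X[test_idx])
        scores.append(f1_score(y[test_idx], y_pred, average="macro"))
    return np.median(scores)
```

Grouping folds by session (rather than shuffling windows across sessions) is what makes the estimate reflect generalization to an unseen classroom session, which is the setting the abstract reports.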