Open Access
A Screen-based Multimodal Virtual Classroom Interface for Understanding Behavioral Sensory Responses in Autistic Adolescents: A Pilot Study
Author(s) -
Zhiwei Yu,
Suzannah Iadarola,
Samantha Daley,
Zhi Zheng
Publication year - 2025
Publication title -
IEEE Transactions on Neural Systems and Rehabilitation Engineering
Language(s) - English
Resource type - Journal article
SCImago Journal Rank - 1.093
H-Index - 140
eISSN - 1558-0210
pISSN - 1534-4320
DOI - 10.1109/tnsre.2025.3619484
Subject(s) - bioengineering , computing and processing , robotics and control systems , signal processing and analysis , communication, networking and broadcast technologies
Autism affects at least 1 in 100 children worldwide, with about 90% experiencing sensory processing difficulties. Virtual Reality (VR), which can deliver precisely controlled sensory stimuli, has emerged as a promising tool for studying sensory experiences. However, VR systems that use a head-mounted display may cause discomfort and exacerbate sensory challenges for autistic children. Screen-based VR could offer a viable alternative, but research on designing multimodal sensory delivery systems that simulate real-life experiences remains limited. As a result, the impact of on-screen VR on children's behavioral sensory responses is not well understood. To begin filling this gap, we designed a novel screen-based Multimodal Virtual Classroom Interface (MVCI) that delivers well-controlled visual, auditory, and tactile stimuli closely mimicking a real classroom environment. The pilot study involved 9 autistic adolescents and 17 typically developing (TD) adolescents, all of whom accepted the system (100% acceptance). Quantitative behavioral analysis demonstrated that, even with this small sample, on-screen interaction revealed significant differences (p < 0.05) between the two groups in eye gaze, fine motor movements, and eye-hand alignment. Additionally, several behavioral patterns were strongly correlated with participants' sensory profiles and ADHD symptom severity (p < 0.05, r_s > 0.7). Using a novel Fixation Sequence Modeling (FSM) framework, we predicted participants' near-future performance with high accuracy (97-98% proximity) based on their granular behavioral responses.
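The abstract reports strong Spearman rank correlations (r_s > 0.7) between behavioral patterns and sensory-profile / ADHD measures. As a minimal illustrative sketch of that statistic, the snippet below computes r_s from scratch (ranking with tie-averaging, then Pearson correlation of the ranks). The data and variable names (`fixations`, `sensory`) are hypothetical, not taken from the study.

```python
def ranks(xs):
    """1-based average ranks; tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of positions i..j, converted to 1-based
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's r_s: Pearson correlation computed on the ranks of x and y."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) *
           sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Illustrative data: a per-participant behavioral measure vs. a sensory score.
fixations = [12, 30, 25, 18, 40, 22]
sensory = [15, 33, 29, 20, 45, 19]
print(round(spearman(fixations, sensory), 3))  # → 0.943
```

Because r_s depends only on ranks, it captures any monotone association, which suits ordinal questionnaire scores such as sensory profiles better than Pearson's r.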

