
A Simple, Inexpensive, Wearable Glove with Hybrid Resistive‐Pressure Sensors for Computational Sensing, Proprioception, and Task Identification
Author(s) - Hughes Josie, Spielberg Andrew, Chounlakone Mark, Chang Gloria, Matusik Wojciech, Rus Daniela
Publication year - 2020
Publication title - Advanced Intelligent Systems
Language(s) - English
Resource type - Journals
ISSN - 2640-4567
DOI - 10.1002/aisy.202000002
Subject(s) - wearable computer , computer science , wired glove , wearable technology , task (project management) , identification (biology) , resistive touchscreen , human–computer interaction , pressure sensor , artificial intelligence , computer vision , virtual reality , embedded system , engineering , mechanical engineering , botany , systems engineering , biology
Wearable devices have many applications ranging from health analytics to virtual and mixed reality interaction, to industrial training. For wearable devices to be practical, they must be responsive, deformable to fit the wearer, and robust to the user's range of motion. Signals produced by the wearable must also be informative enough to infer the precise physical state or activity of the user. Herein, a fully soft, wearable glove is developed, which is capable of real‐time hand pose reconstruction, environment sensing, and task classification. The design is easy to fabricate using low‐cost, commercial off‐the‐shelf items in a manner that is amenable to automated manufacturing. To realize such capabilities, resistive and fluidic sensing technologies are merged with machine learning neural architectures. The glove is formed from a strain‐sensitive conductive knit, which provides information through a network of resistance measurements. Fluidic sensing is captured via pressure changes in fibrous, sewn‐in flexible tubes, measuring interactions with the environment. The system can reconstruct user hand pose and identify sensory inputs such as holding force, object temperature, conductivity, material stiffness, and user heart rate, all with high accuracy. The ability to identify complex environmentally dependent tasks, including held object identification and handwriting recognition, is demonstrated.
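To make the sensing-to-inference pipeline described above concrete, the following is a minimal illustrative sketch (not the authors' published architecture) of how a neural model could map a vector of knit-resistance readings and tube-pressure readings to both a hand-pose estimate and a task/object class. All layer sizes, input dimensions, and names (e.g., GloveNet, n_resistance, n_pressure) are assumptions for demonstration only.

```python
# Illustrative sketch only: layer sizes and sensor counts are assumed, not taken from the paper.
import torch
import torch.nn as nn

class GloveNet(nn.Module):
    def __init__(self, n_resistance=64, n_pressure=4, n_joints=21, n_classes=10):
        super().__init__()
        # Shared encoder over the concatenated resistive + fluidic sensor vector.
        self.encoder = nn.Sequential(
            nn.Linear(n_resistance + n_pressure, 128),
            nn.ReLU(),
            nn.Linear(128, 64),
            nn.ReLU(),
        )
        # Regression head: hand pose as joint angles (proprioception).
        self.pose_head = nn.Linear(64, n_joints)
        # Classification head: held-object / task identification.
        self.task_head = nn.Linear(64, n_classes)

    def forward(self, resistance, pressure):
        x = torch.cat([resistance, pressure], dim=-1)
        h = self.encoder(x)
        return self.pose_head(h), self.task_head(h)

# Usage with a dummy batch of 8 sensor frames.
model = GloveNet()
r = torch.randn(8, 64)   # knit resistance measurements (assumed count)
p = torch.randn(8, 4)    # tube pressure measurements (assumed count)
pose, task_logits = model(r, p)
print(pose.shape, task_logits.shape)  # torch.Size([8, 21]) torch.Size([8, 10])
```

In practice, the pose head would be trained against ground-truth hand poses and the task head against labeled activities; the shared encoder reflects the paper's theme of merging resistive and fluidic signals in a single learned representation, though the actual network design may differ.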