
Perception‐based 3D tactile rendering from a single image for human skin examinations by dynamic touch
Author(s) -
Kim K.,
Lee S.
Publication year - 2015
Publication title -
Skin Research and Technology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.521
H-Index - 69
eISSN - 1600-0846
pISSN - 0909-752X
DOI - 10.1111/srt.12173
Subject(s) - haptic technology , rendering (computer graphics) , computer vision , computer science , artificial intelligence , perception , tactile perception , haptic perception , computer graphics (images) , psychology , neuroscience
Background/aims - Diagnosis of skin conditions depends on assessing skin surface properties that are conveyed more by tactile cues such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch.
Methods - Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real time. The conversion from a single 2D image to a 3D surface incorporated human perception data collected in a psychophysical experiment that measured visual and haptic sensitivity to changes in 3D skin surfaces. For the second stage, we utilized real skin biomechanical properties reported in prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device.
Results - We evaluated the performance of our system in an identification experiment in which five subjects explored three different skin images. The participants had to identify one of the three skin surfaces using a haptic device (Falcon) only; no visual cue was provided. The results indicate that our system renders tactile feedback discernible enough to distinguish different skin surfaces.
Conclusion - Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real time for skin diagnosis, simulation, or training. Our system can also be used for other applications such as virtual reality and cosmetics.
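The two-stage pipeline described in the Methods (a single image converted to a height field, then rendered with a penalty-style contact force) can be illustrated with a minimal sketch. This is not the authors' implementation: the intensity-to-depth mapping, the `height_scale` perceptual gain, and the `stiffness` constant are all placeholder assumptions standing in for the paper's psychophysically derived mapping and measured skin biomechanics.

```python
import numpy as np

def image_to_height_map(gray, height_scale=0.002, smooth_passes=1):
    """Convert a normalized grayscale skin image (2D array, values in [0, 1])
    to a height map in meters.

    Assumption (not from the paper): pixel intensity is mapped linearly to
    surface height via height_scale; the paper instead uses a perceptual
    mapping calibrated by a psychophysical experiment.
    """
    h = gray * height_scale
    # Light neighborhood averaging to suppress pixel noise before rendering.
    for _ in range(smooth_passes):
        h = (h
             + np.roll(h, 1, axis=0) + np.roll(h, -1, axis=0)
             + np.roll(h, 1, axis=1) + np.roll(h, -1, axis=1)) / 5.0
    return h

def penalty_force(probe_z, surface_z, stiffness=300.0):
    """Spring (penalty) contact force along the surface normal:
    F = stiffness * penetration depth, zero when not in contact.

    The stiffness value here is a generic placeholder; the paper draws on
    measured skin biomechanical properties instead.
    """
    penetration = max(0.0, surface_z - probe_z)
    return stiffness * penetration
```

In a rendering loop, the haptic device's proxy position would be sampled at high rate, `surface_z` looked up from the height map under the proxy, and the resulting force sent back to the device.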