CamForensics
Author(s) -
Animesh Srivastava,
Puneet Jain,
Soteris Demetriou,
Landon P. Cox,
Kyu-Han Kim
Publication year - 2017
Publication title -
Spiral (Imperial College London)
Language(s) - English
Resource type - Conference proceedings
ISBN - 978-1-4503-5459-2
DOI - 10.1145/3131672.3131683
Subject(s) - computer science, augmented reality, mobile apps, mobile device, information sensitivity, camera phone, human–computer interaction, computer vision, artificial intelligence, computer graphics (images), world wide web, computer security
Many mobile apps, including augmented-reality games, bar-code readers, and document scanners, digitize information from the physical world by applying computer-vision algorithms to live camera data. However, because camera permissions for existing mobile operating systems are coarse (i.e., an app may access a camera's entire view or none of it), users are vulnerable to visual privacy leaks. An app violates visual privacy if it extracts information from camera data in unexpected ways. For example, a user might be surprised to find that an augmented-reality makeup app extracts text from the camera's view in addition to detecting faces. This paper presents results from the first large-scale study of visual privacy leaks in the wild. We build CamForensics to identify the kind of information that apps extract from camera data. Our extensive user surveys determine what kind of information users expect an app to extract. Finally, our results show that camera apps frequently defy users' expectations based on the apps' descriptions.
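As one illustration of the coarse permission model the abstract describes, Android's manifest-level camera permission is binary: once granted, the app receives every frame in the camera's view, with no finer-grained scope (e.g., "faces only" or "no text"). A minimal manifest fragment makes this concrete:

```xml
<!-- Sketch of an Android manifest declaration. Granting CAMERA is
     all-or-nothing: the app may read the camera's entire view, so
     users cannot limit what information (faces, text, bar codes)
     the app extracts from the frames it receives. -->
<uses-permission android:name="android.permission.CAMERA" />
```

It is exactly this all-or-nothing grant that leaves users unable to tell whether an app's vision pipeline matches the expectations set by its description.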