Open Access
New Stereovision for Human-Robot Communications
Author(s) -
Junichi Takeno,
Zichuan Xu
Publication year - 2000
Publication title -
Journal of Robotics and Mechatronics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.257
H-Index - 19
eISSN - 1883-8049
pISSN - 0915-3942
DOI - 10.20965/jrm.2000.p0231
Subject(s) - sight , computer science , artificial intelligence , robot , computer vision , human–computer interaction , physics , astronomy
Better communication between robots and humans requires an understanding of human senses and, at the same time, the ability to express those senses. Robot functions must approximate as closely as possible the five human senses of sight, hearing, smell, taste, and touch. Further, the system in which these five senses are integrated must be understood. The expression of such senses by robots should not be confined to merely simulating human behaviors; it must also address unknown and difficult problems related to the manifestation of the inner aspects of human mentality. In this paper, we report on a new artificial vision system that is similar to the human sense of sight, with the goal of realizing functions for understanding the processes of human sight. First, we introduce the new artificial visual sensing system and explain the problems of virtual images and occlusion, which are blind spots in conventional binocular stereovision systems. We then discuss a technique for solving these problems that utilizes the new functions of the artificial vision system we developed, and introduce a prototype of the new system and its configuration.
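The occlusion problem mentioned above is inherent to conventional binocular stereovision: depth is recovered by triangulating the disparity between the two camera images, so a point visible to only one camera has no disparity and hence no recoverable depth. The following is a minimal sketch of that classic triangulation (not the authors' system; the function name and parameters are illustrative), using the standard relation Z = f · B / d for rectified cameras with focal length f (in pixels) and baseline B:

```python
def depth_from_disparity(disparity, focal_length, baseline):
    """Classic rectified-binocular triangulation: Z = f * B / d.

    disparity    -- pixel offset of a point between left and right images
    focal_length -- camera focal length in pixels
    baseline     -- distance between the two camera centers (e.g. meters)

    Returns the depth Z, or None when the point is occluded or unmatched
    (no valid disparity), which is exactly the blind spot of conventional
    binocular stereovision discussed in the abstract.
    """
    if disparity <= 0:
        return None  # occluded in one view: depth cannot be triangulated
    return focal_length * baseline / disparity


# Example: f = 700 px, B = 0.1 m, d = 14 px  ->  Z = 700 * 0.1 / 14 = 5.0 m
z = depth_from_disparity(14, focal_length=700.0, baseline=0.1)
```

Note that larger disparities correspond to nearer points, and depth resolution degrades as disparity approaches zero, which is why distant points and occluded regions are the weak spots of two-camera systems.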
