Open Access
Interactive Perception of Rigid and Non-Rigid Objects
Author(s) - Bryan Willimon, Stan Birchfield, Ian D. Walker
Publication year - 2012
Publication title - International Journal of Advanced Robotic Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.394
H-Index - 46
eISSN - 1729-8814
pISSN - 1729-8806
DOI - 10.5772/53810
Subject(s) - computer science , object (grammar) , artificial intelligence , computer vision , robot , perception , context (archaeology) , flexibility (engineering) , variety (cybernetics) , service robot , human–computer interaction , socks , mathematics , paleontology , computer network , statistics , neuroscience , biology
This paper explores the concept of interactive perception, in which sensing guides manipulation, in the context of extracting and classifying unknown objects within a cluttered environment. In the proposed approach, a pile of objects lies on a flat background, and the goal of the robot is to isolate, interact with, and classify each object so that its properties can be obtained. The algorithm classifies each object using color, shape, and flexibility. The approach works with a variety of objects relevant to service robot applications, including rigid objects such as bottles, cans, and pliers as well as non-rigid objects such as soft toy animals, socks, and shoes. Experiments on a number of different piles of objects demonstrate the ability to efficiently isolate and classify each item through interaction.
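
The abstract describes classifying each isolated object by color, shape, and flexibility, with flexibility inferred from how the object behaves when the robot interacts with it. Below is a minimal, illustrative Python sketch of that idea, not the authors' implementation: the feature choices (a coarse color histogram, a bounding-box compactness measure, a silhouette-change rigidity test) and the `flex_threshold` value are assumptions made for the example.

```python
# Illustrative sketch only: simple color / shape / flexibility cues for an
# already-isolated object, assuming we have its pixel colors and binary masks
# from before and after a robot push. Feature choices and threshold are assumed.
import numpy as np

def color_feature(rgb_pixels: np.ndarray) -> np.ndarray:
    """Coarse 3-bin-per-channel color histogram over the object's pixels (N x 3)."""
    hist, _ = np.histogramdd(rgb_pixels.reshape(-1, 3),
                             bins=(3, 3, 3), range=((0, 256),) * 3)
    return hist.ravel() / max(hist.sum(), 1.0)

def shape_feature(mask: np.ndarray) -> float:
    """Compactness: object area divided by its bounding-box area."""
    ys, xs = np.nonzero(mask)
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    return float(mask.sum()) / box_area

def flexibility_feature(mask_before: np.ndarray, mask_after: np.ndarray) -> float:
    """Fraction of the silhouette that changed after a push; rigid objects mostly
    translate, so the centroid-aligned overlap stays high and this value stays low."""
    def centroid(m):
        ys, xs = np.nonzero(m)
        return int(round(ys.mean())), int(round(xs.mean()))
    dy = centroid(mask_after)[0] - centroid(mask_before)[0]
    dx = centroid(mask_after)[1] - centroid(mask_before)[1]
    # Crude rigid-motion compensation: shift the "before" mask onto the "after" one.
    shifted = np.roll(np.roll(mask_before, dy, axis=0), dx, axis=1)
    overlap = np.logical_and(shifted, mask_after).sum()
    return 1.0 - overlap / max(mask_after.sum(), 1)

def classify(rgb_pixels, mask_before, mask_after, flex_threshold=0.25):
    """Return a small descriptor for one object; flex_threshold is an assumed value."""
    return {
        "color_hist": color_feature(rgb_pixels),
        "compactness": shape_feature(mask_before),
        "rigidity": "non-rigid"
                    if flexibility_feature(mask_before, mask_after) > flex_threshold
                    else "rigid",
    }
```

In the paper's setting, the interaction itself (pulling or pushing an item away from the pile) is what provides the before/after observations such a rigidity test would rely on.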
