Grasp Planning and Visual Servoing for an Outdoors Aerial Dual Manipulator
Author(s) -
Pablo Ramón Soria,
B.C. Arrúe,
Aníbal Ollero
Publication year - 2019
Publication title -
Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.376
H-Index - 45
eISSN - 2096-0026
pISSN - 2095-8099
DOI - 10.1016/j.eng.2019.11.003
Subject(s) - grasp planning, computer vision, artificial intelligence, computer science, visual servoing, pose estimation, Kalman filter, robotics, engineering
This paper describes a system for grasping known objects with unmanned aerial vehicles (UAVs) equipped with dual manipulators and an RGB-D camera. Aerial manipulation remains a very challenging task. This paper covers three principal aspects of the task: object detection and pose estimation, grasp planning, and in-flight grasp execution. First, an artificial neural network (ANN) is used to obtain clues regarding the object’s position. Next, an alignment algorithm is used to obtain the object’s six-dimensional (6D) pose, which is filtered with an extended Kalman filter. A three-dimensional (3D) model of the object is then used to compute a ranked list of good grasps for the aerial manipulator. The results from the detection algorithm—that is, the object’s pose—are used to update the trajectories of the arms toward the object. If the target poses are not reachable due to the UAV’s oscillations, the algorithm switches to the next feasible grasp. This paper introduces the overall methodology and provides experimental results from both simulation and real flights for each module, in addition to a video showing the results.
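The fallback behavior described in the abstract—walking a quality-ranked grasp list and switching to the next feasible candidate when UAV oscillations make the best one unreachable—can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the `Grasp` class, the reach-sphere feasibility test, and all parameter values are assumptions for the example.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Grasp:
    """Hypothetical grasp candidate: a 3D target position and a quality score."""
    position: Tuple[float, float, float]  # (x, y, z) in the arm's base frame
    score: float                          # higher is better

def reachable(grasp: Grasp, reach: float = 0.6) -> bool:
    """Toy feasibility test: the grasp point must lie inside the arm's
    reach sphere (radius in meters, an assumed placeholder value)."""
    x, y, z = grasp.position
    return (x * x + y * y + z * z) ** 0.5 <= reach

def select_grasp(ranked: List[Grasp], reach: float = 0.6) -> Optional[Grasp]:
    """Return the first feasible grasp in quality order, mirroring the
    switch to the next candidate when the best grasp is unreachable."""
    for g in sorted(ranked, key=lambda g: g.score, reverse=True):
        if reachable(g, reach):
            return g
    return None  # no feasible grasp this cycle; retry on the next pose update
```

In a real system the feasibility check would query the manipulator's inverse kinematics against the current, oscillation-perturbed object pose rather than a fixed reach sphere, and the loop would re-run at each pose update from the filter.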