Open Access
A Relative Pose Estimation Method of Non-Cooperative Space Targets
Author(s) - Ke Liu, Ling Wang, Hanhan Liu, Xiang Zhang
Publication year - 2022
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2228/1/012029
Subject(s) - artificial intelligence , computer vision , initialization , pose , computer science , 3d pose estimation , monocular , monocular vision , articulated body pose estimation , scale (ratio) , geography , cartography , programming language
In many space missions, such as fly-around observation and target approach, relative pose estimation of non-cooperative space targets is a key technology. This paper proposes a relative pose estimation method for non-cooperative space targets based on a monocular camera and a laser rangefinder. The monocular camera acquires an image sequence of the target, while the laser rangefinder resolves the scale ambiguity inherent to monocular vision and is used to construct a world coordinate system at metric scale during initialization. During continuous pose estimation, the camera data and the laser rangefinder data are fused in a tightly coupled form to optimize the estimated pose. Non-cooperative space target images generated with Blender are used for simulation. The results show that the proposed method has good real-time performance and can estimate the relative pose of non-cooperative space targets accurately and robustly. Compared with existing methods based on monocular vision alone, the proposed method does not require an initial pose assumption and effectively improves pose estimation accuracy.
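To illustrate the initialization idea described in the abstract, the sketch below shows one way a single laser-rangefinder measurement could fix the unknown scale of a monocular reconstruction. This is only a minimal illustration, not the authors' implementation: the function name, the assumption that the rangefinder is aligned with the camera boresight, and the matching of the measured range to the nearest reconstructed point along that boresight are all hypothetical choices made for the example.

```python
import numpy as np


def recover_metric_scale(points_up_to_scale, boresight_dir, lrf_distance):
    """Recover metric scale of a monocular reconstruction from one
    laser-rangefinder (LRF) range measurement.

    points_up_to_scale : (N, 3) triangulated target points in the camera
                         frame, known only up to an unknown scale factor.
    boresight_dir      : (3,) direction of the LRF beam in the camera frame
                         (assumed aligned with the camera boresight).
    lrf_distance       : measured range in metres along that direction.
    """
    boresight_dir = boresight_dir / np.linalg.norm(boresight_dir)

    # Depth of each reconstructed point along the boresight (up to scale).
    depths = points_up_to_scale @ boresight_dir

    # Assumption: the LRF return comes from the nearest visible surface
    # point along the boresight, so match the measured range to the
    # smallest positive depth.
    visible = depths[depths > 0]
    scale = lrf_distance / visible.min()

    # Apply the metric scale to the whole reconstruction.
    return scale, points_up_to_scale * scale


if __name__ == "__main__":
    # Toy example: nearest true surface point is 20 m away, but the
    # monocular reconstruction comes out with an arbitrary scale of 0.1.
    true_points = np.array([[0.0, 0.0, 20.0],
                            [1.0, 0.5, 21.0],
                            [-1.0, -0.5, 22.0]])
    up_to_scale = 0.1 * true_points
    scale, metric_points = recover_metric_scale(
        up_to_scale, np.array([0.0, 0.0, 1.0]), lrf_distance=20.0)
    print(scale)          # -> 10.0
    print(metric_points)  # reconstruction rescaled to metres
```

In the paper itself the rangefinder measurements are additionally fused with the image data in a tightly coupled optimization during continuous tracking; the snippet above only covers the scale-fixing role during initialization.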
