
Open Access
CineMPC: A Fully Autonomous Drone Cinematography System Incorporating Zoom, Focus, Pose, and Scene Composition
Author(s)
Pablo Pueyo,
Juan Dendarieta,
Eduardo Montijano,
Ana C. Murillo,
Mac Schwager
Publication year: 2024
We present CineMPC, a complete cinematographic system that autonomously controls a drone to film multiple targets, recording user-specified aesthetic objectives. Existing solutions in autonomous cinematography control only the camera extrinsics, namely its position and orientation. In contrast, CineMPC is the first solution that includes the camera intrinsic parameters in the control loop, which are essential tools for controlling cinematographic effects like focus, depth-of-field, and zoom. The system estimates the relative poses between the targets and the camera from an RGB-D image and optimizes a trajectory for the extrinsic and intrinsic camera parameters to film the artistic and technical requirements specified by the user. The drone and the camera are controlled in a nonlinear Model Predictive Control (MPC) loop by re-optimizing the trajectory at each time step in response to current conditions in the scene. The perception system of CineMPC can track the targets' position and orientation despite the camera effects. Experiments in a photorealistic simulation and with a real platform demonstrate the capabilities of the system to achieve a full array of cinematographic effects that are not possible without control of the camera intrinsics. Code for CineMPC is implemented following a modular architecture in ROS and released to the community.
Language(s): English
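The abstract describes a receding-horizon nonlinear MPC that re-optimizes the camera trajectory, including intrinsics such as focal length and focus distance, at every time step. The sketch below illustrates such a loop with scipy; the simplified single-integrator dynamics, cost terms, and variable names are assumptions chosen for illustration and are not taken from the released CineMPC code.

```python
# Minimal receding-horizon MPC sketch, illustrative only; this is NOT the
# released CineMPC implementation. State and cost terms are assumptions:
# state = [px, py, pz, focal_mm, focus_m], control = its rate of change.
import numpy as np
from scipy.optimize import minimize

HORIZON = 10   # prediction steps
DT = 0.2       # step length [s]
N_X = 5        # state dimension (position + focal length + focus distance)

def rollout(x0, u_seq):
    """Integrate simple single-integrator dynamics over the horizon."""
    xs = []
    x = x0.copy()
    for u in u_seq.reshape(HORIZON, N_X):
        x = x + DT * u
        xs.append(x)
    return np.array(xs)

def cost(u_seq, x0, target_pos, d_star, f_star):
    """Penalize deviation from a desired shot distance, zoom, and focus."""
    xs = rollout(x0, u_seq)
    c = 0.0
    for x in xs:
        dist = np.linalg.norm(target_pos - x[:3])
        c += (dist - d_star) ** 2        # keep the desired distance to the target
        c += 0.1 * (x[3] - f_star) ** 2  # track a desired focal length (zoom)
        c += 0.1 * (x[4] - dist) ** 2    # keep the focus distance on the target
    c += 1e-3 * np.sum(u_seq ** 2)       # control effort
    return c

def mpc_step(x0, target_pos, d_star=8.0, f_star=50.0):
    """Re-optimize the horizon and apply only the first control (receding horizon)."""
    u0 = np.zeros(HORIZON * N_X)
    res = minimize(cost, u0, args=(x0, target_pos, d_star, f_star),
                   method="L-BFGS-B")
    return res.x[:N_X]

# One closed-loop iteration: in the real system this repeats at every time
# step, with the target pose re-estimated from RGB-D perception.
state = np.array([0.0, 0.0, 2.0, 35.0, 5.0])
u = mpc_step(state, target_pos=np.array([8.0, 0.0, 1.5]))
print("first control:", u)
```

In this toy setup, re-solving the optimization at each step and applying only the first control is what makes the loop a model predictive controller; CineMPC's contribution is including the intrinsic parameters alongside the drone pose in that optimized state.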
