A PatchMatch‐based Approach for Matte Propagation in Videos
Author(s) - Marcos H. Backes, Manuel M. Oliveira
Publication year - 2019
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.13868
Subject(s) - computer science, artificial intelligence, computer vision, computer graphics (images), video editing
Abstract - Despite considerable advances in natural image matting over the last decades, video matting remains a difficult problem. The main challenges faced by existing methods are the large amount of user input required, and temporal inconsistencies in the mattes of adjacent frames. We present a temporally-coherent matte-propagation method for videos based on PatchMatch and edge-aware filtering. Given an input video and trimaps for a few frames, including the first and last, our approach generates alpha mattes for all frames of the video sequence. We also present a user scribble-based interface for video matting that takes advantage of the efficiency of our method to interactively refine the matte results. We demonstrate the effectiveness of our approach by using it to generate temporally-coherent mattes for several natural video sequences. We perform quantitative comparisons against state-of-the-art sparse-input video matting techniques and show that our method produces significantly better results according to three different metrics. We also perform qualitative comparisons against state-of-the-art dense-input video matting techniques and show that our approach produces results of similar quality while requiring only about 7% of the user input needed by such techniques. These results show that our method is both effective and user-friendly, outperforming state-of-the-art solutions.
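
For readers who want a concrete picture of the propagation step described in the abstract, the sketch below illustrates the general idea only; it is not the authors' implementation. A brute-force window search stands in for PatchMatch, and a gray-scale guided filter (He et al.) stands in for the paper's edge-aware filtering; all names and parameters (PATCH, SEARCH, the filter radius, etc.) are illustrative assumptions.

import numpy as np
from scipy.ndimage import uniform_filter

PATCH = 7      # assumed patch size for the correspondence search
SEARCH = 10    # assumed search radius around each pixel


def best_match(prev_gray, cur_gray, y, x):
    """Find the pixel in the previous frame whose surrounding patch best
    matches the patch around (y, x) in the current frame (a brute-force
    stand-in for a PatchMatch query)."""
    h, w = prev_gray.shape
    r = PATCH // 2
    ref = cur_gray[max(y - r, 0):y + r + 1, max(x - r, 0):x + r + 1].astype(np.float32)
    best, best_cost = (y, x), np.inf
    for dy in range(-SEARCH, SEARCH + 1):
        for dx in range(-SEARCH, SEARCH + 1):
            sy, sx = y + dy, x + dx
            if sy - r < 0 or sx - r < 0 or sy + r >= h or sx + r >= w:
                continue
            cand = prev_gray[sy - r:sy + r + 1, sx - r:sx + r + 1].astype(np.float32)
            if cand.shape != ref.shape:
                continue  # reference patch was clipped at the image border
            cost = float(np.sum((cand - ref) ** 2))
            if cost < best_cost:
                best, best_cost = (sy, sx), cost
    return best


def guided_filter(guide, src, r=8, eps=1e-4):
    """Gray-scale guided filter (He et al.): smooths `src` while following
    the edges of `guide`; used here as the edge-aware refinement step."""
    k = 2 * r + 1
    mean_I = uniform_filter(guide, k)
    mean_p = uniform_filter(src, k)
    cov_Ip = uniform_filter(guide * src, k) - mean_I * mean_p
    var_I = uniform_filter(guide * guide, k) - mean_I * mean_I
    a = cov_Ip / (var_I + eps)
    b = mean_p - a * mean_I
    return uniform_filter(a, k) * guide + uniform_filter(b, k)


def propagate_matte(prev_gray, prev_alpha, cur_gray):
    """Carry the alpha matte from the previous frame to the current one via
    patch correspondences, then refine it with an edge-aware filter guided
    by the current frame. All images are float arrays in [0, 1]."""
    h, w = cur_gray.shape
    alpha = np.zeros((h, w), np.float32)
    for y in range(h):
        for x in range(w):
            sy, sx = best_match(prev_gray, cur_gray, y, x)
            alpha[y, x] = prev_alpha[sy, sx]
    return np.clip(guided_filter(cur_gray.astype(np.float32), alpha), 0.0, 1.0)

In the paper, mattes are propagated from keyframes with user-supplied trimaps toward the remaining frames; the sketch above only shows a single frame-to-frame step of such a pipeline, and the real method is far more efficient than this brute-force search.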
