Real-Time High-Resolution Background Matting

Shanchuan Lin*, Andrey Ryabtsev*, Soumyadip Sengupta, Brian Curless, Steve Seitz, Ira Kemelmacher-Shlizerman
*equal contribution
University of Washington

Current video conferencing tools like Zoom can take an input feed (left) and replace the background, but they often introduce artifacts, as shown in the center result: the close-ups of hair and glasses still show residue of the original background. Leveraging a frame of video without the subject (far left inset), our method produces real-time, high-resolution background matting without those common artifacts. The image on the right is our result, with corresponding close-ups, captured as a screenshot from our Zoom plugin implementation.
[Paper (arXiv)]

Main video: https://www.youtube.com/watch?v=oMfPTeYDF9g
Other video results

Abstract

We introduce a real-time, high-resolution background replacement technique which operates at 30fps in 4K resolution, and 60fps for HD on a modern GPU. Our technique is based on background matting, where an additional frame of the background is captured and used in recovering the alpha matte and the foreground layer. The main challenge is to compute a high-quality alpha matte, preserving strand-level hair details, while processing high-resolution images in real-time. To achieve this goal, we employ two neural networks; a base network computes a low-resolution result which is refined by a second network operating at high-resolution on selective patches. We introduce two large-scale video and image matting datasets: VideoMatte240K and PhotoMatte13K/85. Our approach yields higher quality results compared to the previous state-of-the-art in background matting, while simultaneously yielding a dramatic boost in both speed and resolution.
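The coarse-to-fine design described above (a base network at low resolution, followed by refinement of only the least-confident patches at full resolution) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: `base_net` and `refine_patch` are hypothetical stand-ins for the learned networks, and the patch size, patch count, and downsampling factor are illustrative values, not those used in the paper.

```python
import numpy as np

PATCH = 8        # illustrative refinement patch size (not the paper's value)
NUM_PATCHES = 4  # illustrative number of patches selected for refinement

def base_net(img_lr):
    """Stand-in for the low-resolution base network: returns a coarse
    alpha matte plus a per-pixel error map marking uncertain regions."""
    alpha = img_lr.mean(axis=-1)            # dummy coarse alpha in [0, 1]
    error = 1.0 - 2.0 * np.abs(alpha - 0.5)  # dummy: highest near alpha ~ 0.5
    return alpha, error

def refine_patch(patch_hr):
    """Stand-in for the high-resolution refinement network."""
    return np.clip(patch_hr.mean(axis=-1), 0.0, 1.0)

def matte(img_hr, scale=4):
    """Coarse-to-fine matting: run the base net at 1/scale resolution,
    upsample its alpha, then re-process only the high-error patches at
    full resolution. img_hr is an (H, W, 3) float array; H and W are
    assumed divisible by scale and PATCH for simplicity."""
    h, w, _ = img_hr.shape
    img_lr = img_hr[::scale, ::scale]                     # cheap downsample
    alpha_lr, err_lr = base_net(img_lr)
    alpha = np.kron(alpha_lr, np.ones((scale, scale)))    # nearest upsample

    # average the upsampled error map over PATCH x PATCH tiles
    err_hr = np.kron(err_lr, np.ones((scale, scale)))
    ph, pw = h // PATCH, w // PATCH
    tiles = err_hr.reshape(ph, PATCH, pw, PATCH).mean(axis=(1, 3))

    # refine only the NUM_PATCHES tiles with the highest error
    for idx in np.argsort(tiles.ravel())[::-1][:NUM_PATCHES]:
        i, j = divmod(idx, pw)
        ys, xs = i * PATCH, j * PATCH
        alpha[ys:ys + PATCH, xs:xs + PATCH] = refine_patch(
            img_hr[ys:ys + PATCH, xs:xs + PATCH])
    return alpha
```

The key property this sketch shows is that the expensive high-resolution pass touches only a fixed, small number of patches, which is what makes 4K real-time processing feasible in the paper's design.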

Code

Inference code released on GitHub

Image results

See image results with interactive zoom-in

Dataset

The PhotoMatte85 and VideoMatte240K datasets are coming soon. See our paper for more details.
See Background Matting (v1)

Acknowledgments

The authors thank Aleksander Holynski, Zeynep Toprakbasti, and labmates from the UW GRAIL lab for their support and helpful discussions. This work was supported by NSF/Intel Visual and Experiential Computing Award #1538618, the UW Reality Lab, Facebook, Google, and Futurewei.

Citation

@article{BGMv2,
  title={Real-Time High-Resolution Background Matting},
  author={Lin, Shanchuan and Ryabtsev, Andrey and Sengupta, Soumyadip and Curless, Brian and Seitz, Steve and Kemelmacher-Shlizerman, Ira},
  journal={arXiv},
  pages={arXiv--2012},
  year={2020}
}