Real-Time Video Matting with Multichannel Poisson Equations

A new matting algorithm is developed in this project for processing video sequences online and in real time. The algorithm is based on a set of novel Poisson equations that are derived for handling multichannel color vectors. A simple yet effective approach is also proposed to compute an initial alpha matte in the RGB color space. Real-time processing speed is achieved by optimizing the algorithm for parallel processing on the GPU. To process live video sequences online, a modified background cut algorithm is implemented to separate foreground and background, the result of which guides automatic trimap generation. Quantitative evaluation on still images shows that the alpha mattes extracted using the presented algorithm are much more accurate than those obtained using the global Poisson matting algorithm, and are comparable to those of other state-of-the-art offline image matting techniques.
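To illustrate the Poisson-matting idea underlying this line of work, the sketch below solves the classic single-channel formulation: given a trimap and a smooth estimate of F - B, the alpha matte satisfies a Poisson equation whose guidance field is the image gradient divided by F - B. This is a minimal grayscale sketch using Jacobi iteration, not the multichannel formulation or GPU solver from the paper; the function name and parameters are illustrative.

```python
import numpy as np

def poisson_matte(image, trimap, fb_diff, iters=2000):
    """Solve Laplace(alpha) = div(grad(I) / (F - B)) by Jacobi iteration.

    image:   H x W grayscale array, float in [0, 1]
    trimap:  H x W array, 0 = background, 1 = foreground, 0.5 = unknown
    fb_diff: H x W estimate of F - B (assumed smooth and nonzero)
    """
    gy, gx = np.gradient(image)
    # Guidance field: image gradient normalized by the F - B estimate.
    vy, vx = gy / fb_diff, gx / fb_diff
    div = np.gradient(vy, axis=0) + np.gradient(vx, axis=1)

    alpha = trimap.astype(np.float64).copy()
    unknown = trimap == 0.5
    for _ in range(iters):
        # Discrete Poisson update: mean of 4-neighbours minus divergence.
        nb = (np.roll(alpha, 1, 0) + np.roll(alpha, -1, 0) +
              np.roll(alpha, 1, 1) + np.roll(alpha, -1, 1))
        alpha[unknown] = (nb - div)[unknown] / 4.0
    return np.clip(alpha, 0.0, 1.0)
```

The known trimap pixels act as Dirichlet boundary conditions; only the unknown band is updated, which is also what makes the iteration easy to parallelize per pixel on a GPU.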

Reference

M. Gong, L. Wang, R. Yang, and Y.H. Yang, "Real-time video matting using multichannel Poisson equations," Graphics Interface, Ottawa, ON, May 31 - June 2, 2010.


Near-Real-Time Image Matting with Known Background

In this project, we develop a novel matting algorithm for cases where foreground objects appear in front of a complex (non-smooth) but known background. The algorithm is based on a Poisson equation that takes the gradient of the known background into consideration. Quantitative evaluation against ground truth shows that the matting results obtained using the proposed algorithm are more accurate than those generated by global Poisson matting [9]. The proposed algorithm is also optimized for parallel processing and runs in near real time on programmable graphics hardware.
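One plausible way to see how the known background enters the Poisson equation: from the compositing equation I = aF + (1-a)B, assuming only the foreground is smooth (grad F ~ 0), the alpha gradient becomes (grad I - (1-a) grad B) / (F - B). The background-gradient term is exactly what global Poisson matting drops. The sketch below computes this modified guidance field; it is an assumption-laden reading of the idea, not the paper's exact derivation, and the function name is hypothetical.

```python
import numpy as np

def known_bg_guidance(image, background, alpha, fb_diff):
    """Guidance field for alpha when the background image B is known.

    From I = a*F + (1-a)*B with a smooth foreground (grad F ~ 0):
        grad a ~ (grad I - (1-a) * grad B) / (F - B)
    Since alpha appears on the right-hand side, this field would be
    recomputed between Poisson solves in an iterative scheme.
    """
    iy, ix = np.gradient(image)
    by, bx = np.gradient(background)
    w = 1.0 - alpha
    return (iy - w * by) / fb_diff, (ix - w * bx) / fb_diff
```

When the background is constant its gradient vanishes and the field reduces to the global Poisson matting guidance grad I / (F - B); for a textured background, the correction removes background edges that would otherwise leak into the matte.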

Reference

M. Gong and Y.H. Yang, "Near-real-time image matting with known background," Sixth Canadian Conference on Computer and Robot Vision, Kelowna, BC, May 25-27, 2009.


Frequency-Based Environment Matting

In this project, we have developed a novel method to obtain the environment matte of a scene. Previous methods use different backdrops as calibration patterns and rely on color similarity to search for the matte. In our method, the series of background images displayed sequentially on a screen is interpreted as a set of temporal signals, and the frequency similarity of these signals is used as the search criterion.
The frequencies of these signals are unchanged when the light interacts with foreground objects, and can therefore be used to extract the environment matte. Whereas the color correspondences used in existing approaches are prone to error, frequency correspondences are not. As a result, our approach is robust to noise and can easily handle folds, which current methods cannot. The experimental results are very encouraging.
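The frequency-matching idea can be sketched as follows: suppose each backdrop region blinks sinusoidally at its own frequency over the captured sequence. A refractive or reflective foreground object attenuates the amplitude and adds noise, but leaves the frequency intact, so an FFT peak at each camera pixel identifies the contributing backdrop region. This is a toy illustration under that setup, not the paper's calibration pattern; the names and parameters are assumptions.

```python
import numpy as np

def find_source_region(pixel_signal, region_freqs, fps):
    """Match a camera pixel to a backdrop region by dominant frequency.

    pixel_signal: 1-D intensity of one camera pixel over time
    region_freqs: blink frequency (Hz) assigned to each backdrop region
    fps:          capture frame rate in frames per second
    """
    # Remove the DC offset so the FFT peak reflects the blink frequency.
    spectrum = np.abs(np.fft.rfft(pixel_signal - pixel_signal.mean()))
    freqs = np.fft.rfftfreq(len(pixel_signal), d=1.0 / fps)
    peak = freqs[np.argmax(spectrum)]
    # Nearest assigned frequency wins, tolerating FFT bin quantization.
    return int(np.argmin(np.abs(np.asarray(region_freqs) - peak)))
```

Because only the location of the spectral peak matters, the match survives heavy attenuation and additive noise, which is the robustness argument made above.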

Reference

J. Zhu and Y.H. Yang, "Frequency-based environment matting," Pacific Graphics, Seoul, Korea, Oct. 6-8, 2004.