Laplacian Margin Distribution Boosting for Learning from Sparsely Labeled Data

Abstract

Boosting algorithms have attracted much attention in computer vision and image processing because of their strong performance in a variety of applications. Recent progress in the theory of boosting suggests a close link between good generalization and the margin distribution of the classifier w.r.t. a dataset. In this paper, we propose a novel data-dependent margin distribution learning criterion for boosting, termed Laplacian MDBoost, which exploits the intrinsic geometric structure of the data. A key aspect of our method is that it can seamlessly incorporate unlabeled data through a graph Laplacian regularizer. We derive a dual formulation of the learning problem that can be efficiently solved by column generation. Experiments on various datasets validate the effectiveness of the new graph Laplacian based learning criterion in both supervised and semi-supervised learning settings. We also show that our algorithm outperforms state-of-the-art semi-supervised learning algorithms on a variety of inductive inference tasks, including real-world video segmentation.
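To make the graph Laplacian regularizer mentioned in the abstract concrete, the following is a minimal sketch (not the paper's implementation): it builds a k-NN graph over labeled and unlabeled samples and evaluates the smoothness penalty f^T L f on classifier outputs f. The k-NN construction, the choice of n_neighbors, and the use of the normalized Laplacian are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a graph Laplacian regularizer over labeled + unlabeled data.
# Penalizes classifier outputs that differ across neighboring points.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse.csgraph import laplacian

def laplacian_regularizer(X, f, n_neighbors=10):
    """Return f^T L f for a k-NN graph Laplacian L built on the data X.

    X : (n, d) array of labeled and unlabeled samples.
    f : (n,) array of real-valued classifier outputs on those samples.
    """
    # Symmetrized k-NN adjacency as a proxy for the data manifold
    # (assumed construction; the paper may use a different graph).
    W = kneighbors_graph(X, n_neighbors=n_neighbors,
                         mode='connectivity', include_self=False)
    W = 0.5 * (W + W.T)
    L = laplacian(W, normed=True)   # normalized graph Laplacian
    return float(f @ (L @ f))       # smoothness penalty on f

# Toy usage: two well-separated blobs; a labeling that is constant on each
# blob varies little across graph edges, so the penalty is small.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
f = np.hstack([-np.ones(50), np.ones(50)])
print(laplacian_regularizer(X, f))
```

In a boosting objective, a term like this would be added to the margin distribution criterion so that unlabeled points, which contribute no margin, still constrain the learned classifier to vary smoothly along the data manifold.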

Publication
IEEE International Conference on Digital Image Computing: Techniques and Applications (DICTA)
