S. Merler, B. Caprile, and C. Furlanello

Giving AdaBoost a Parallel Boost

Abstract

AdaBoost is one of the most successful classification methods in use. Unlike other popular ensemble methods (e.g., Bagging), however, AdaBoost is inherently sequential: each boosting round must complete before the next can begin. In many data-intensive, real-world applications this may represent a fatal limitation.
In this paper, a method is presented for parallelizing AdaBoost. The procedure builds upon earlier results concerning the dynamics of AdaBoost weights, and yields approximations to the standard AdaBoost models that can be easily and efficiently distributed over a network of computing nodes.
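To make the sequential dependency concrete, what follows is a minimal sketch of the standard (discrete) AdaBoost loop, not the parallel procedure proposed in the paper. All names here (stump, adaboost) are illustrative assumptions, and the exhaustive threshold stump is merely one possible weak learner; labels are assumed to take values in {-1, +1}.

import numpy as np

def stump(X, y, w):
    # Hypothetical weak learner: exhaustive single-feature threshold stump
    # minimizing the weighted training error under weights w.
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = sign * np.where(X[:, j] <= thr, 1, -1)
                err = np.sum(w[pred != y])
                if best is None or err < best[0]:
                    best = (err, j, thr, sign)
    _, j, thr, sign = best
    return lambda Z: sign * np.where(Z[:, j] <= thr, 1, -1)

def adaboost(X, y, n_rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)              # uniform initial weight distribution
    learners, alphas = [], []
    for _ in range(n_rounds):
        h = stump(X, y, w)               # training must wait for current w
        pred = h(X)
        eps = np.sum(w[pred != y])       # weighted training error
        if eps == 0.0 or eps >= 0.5:     # perfect, or no better than chance
            break
        alpha = 0.5 * np.log((1.0 - eps) / eps)
        # The sequential bottleneck: the weights for round t + 1 cannot be
        # computed until round t has produced its hypothesis and its error.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()                     # renormalize to a distribution
        learners.append(h)
        alphas.append(alpha)
    return learners, alphas

The final ensemble predicts sign(sum_t alpha_t h_t(x)). Because each weight update consumes the previous round's predictions, the rounds cannot naively be run in parallel; circumventing this dependency via the dynamics of the weights is precisely what the paper addresses.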

Submitted.