MIT Press Open Access

The MIT Press has been a leader in open access book publishing for two decades, beginning in 1995 with the publication of William Mitchell's City of Bits, which appeared simultaneously in print and in a dynamic, open web edition. We support a variety of open access funding models for select books, including monographs, trade books, and textbooks.

The MIT Press is a leading publisher of books and journals at the intersection of science, technology, and the arts. MIT Press books and journals are known for their intellectual daring, scholarly standards, and distinctive design.


Summary

An accessible introduction and essential reference for an approach to machine learning that creates highly accurate prediction rules by combining many weak and inaccurate ones.

Boosting is an approach to machine learning based on the idea of creating a highly accurate predictor by combining many weak and inaccurate “rules of thumb.” A remarkably rich theory has evolved around boosting, with connections to a range of topics, including statistics, game theory, convex optimization, and information geometry. Boosting algorithms have also enjoyed practical success in such fields as biology, vision, and speech processing. At various times in its history, boosting has been perceived as mysterious, controversial, even paradoxical.
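The core idea described above, repeatedly reweighting the training examples so that each new weak rule concentrates on the mistakes of the earlier ones, can be sketched in a few lines. Below is a minimal, illustrative AdaBoost with one-dimensional decision stumps; the function names and the toy setup are our own, not the book's presentation.

```python
import numpy as np

def adaboost(x, y, rounds=40):
    """Combine weak threshold rules into a weighted-majority predictor.

    x: 1-D feature array; y: labels in {-1, +1}.
    Returns a list of (alpha, threshold, sign) weak rules.
    """
    n = len(x)
    w = np.full(n, 1.0 / n)           # distribution over training examples
    ensemble = []
    for _ in range(rounds):
        # Weak learner: exhaustively pick the stump with least weighted error.
        best = None
        for t in np.unique(x):
            for s in (1, -1):
                pred = np.where(x >= t, s, -s)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, t, s)
        err, t, s = best
        # Rule weight: more accurate rules get a larger say in the final vote.
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        pred = np.where(x >= t, s, -s)
        # Reweight: increase the weight of misclassified examples.
        w = w * np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, t, s))
    return ensemble

def predict(ensemble, x):
    """Weighted-majority vote of the weak rules."""
    score = sum(a * np.where(x >= t, s, -s) for a, t, s in ensemble)
    return np.sign(score)
```

On a toy interval-labeled dataset (e.g. `y = [-1, -1, 1, 1, -1, -1]`), no single threshold rule is accurate, yet the boosted combination classifies every point correctly, which is the "weak rules into a strong one" phenomenon the book analyzes.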

This book, written by the inventors of the method, brings together, organizes, simplifies, and substantially extends two decades of research on boosting, presenting both theory and applications in a way that is accessible to readers from diverse backgrounds while also providing an authoritative reference for advanced researchers. With its introductory treatment of all material and its inclusion of exercises in every chapter, the book is appropriate for course use as well.

The book begins with a general introduction to machine learning algorithms and their analysis; then explores the core theory of boosting, especially its ability to generalize; examines some of the myriad other theoretical viewpoints that help to explain and understand boosting; provides practical extensions of boosting for more complex learning problems; and finally presents a number of advanced theoretical topics. Numerous applications and practical illustrations are offered throughout.


Authors

Robert E. Schapire

Robert E. Schapire is Principal Researcher at Microsoft Research in New York City. For their work on boosting, Freund and Schapire received both the Gödel Prize in 2003 and the Kanellakis Theory and Practice Award in 2004.

Yoav Freund

Yoav Freund is Professor of Computer Science at the University of California, San Diego.

Reviews

This excellent book is a mind-stretcher that should be read and reread, even by nonspecialists.

For those who wish to work in the area, it is a clear and insightful view of the subject that deserves a place in the canon of machine learning and on the shelves of those who study it.

Giles Hooker

Journal of the American Statistical Association

Endorsements

Robert Schapire and Yoav Freund made a huge impact in machine and statistical learning with their invention of boosting, which has survived the test of time. There have been lively discussions about alternative explanations of why it works so well, and the jury is still out. This well-balanced book from the 'masters' covers boosting from all points of view, and gives easy access to the wealth of research that this field has produced.

Trevor Hastie

Statistics Department, Stanford University

Boosting has provided a platform for thinking about and designing machine learning algorithms for over 20 years. The simple and elegant idea behind boosting is a 'Mirror of Erised' that researchers view from many different perspectives. This book beautifully ties together these views, using the same limpid style found in Robert Schapire and Yoav Freund's original research papers. It's an important resource for machine learning research.

John Lafferty

University of Chicago and Carnegie Mellon University

An outstanding text, which provides an authoritative, self-contained, broadly accessible and very readable treatment of boosting methods, a widely applied family of machine learning algorithms pioneered by the authors. It nicely covers the spectrum from theory through methodology to applications.

Peter Bartlett

University of California, Berkeley

Boosting is an amazing machine learning algorithm of 'intelligence' with much success in practice. It allows a weak learner to adapt to the data at hand and become 'strong'; it seamlessly integrates statistical estimation and computation. In this book, Robert Schapire and Yoav Freund, two inventors of the field, present multiple, fascinating views of boosting to explain why and how it works.