2001 GreedyFunctionApprox

Subject Headings: Gradient Boosted Decision Tree, Partial Dependence Plot.

Notes

Cited By

Quotes

Abstract

Function estimation/approximation is viewed from the perspective of numerical optimization in function space, rather than parameter space. A connection is made between stagewise additive expansions and steepest-descent minimization. A general gradient descent "boosting" paradigm is developed for additive expansions based on any fitting criterion. Specific algorithms are presented for least-squares, least absolute deviation, and Huber-M loss functions for regression, and multiclass logistic likelihood for classification. Special enhancements are derived for the particular case where the individual additive components are regression trees, and tools for interpreting such "TreeBoost" models are presented. Gradient boosting of regression trees produces competitive, highly robust, interpretable procedures for both regression and classification, especially appropriate for mining less than clean data. Connections between this approach and the boosting methods of Freund and Schapire (1996) and Friedman, Hastie, and Tibshirani (1998) are discussed.
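
The gradient descent "boosting" paradigm described above is easiest to see in the least-squares case, where the negative gradient of the loss at each stage is simply the current residual y - F(x). Below is a minimal Python sketch of that stagewise procedure, not the paper's own code: the choice of scikit-learn's DecisionTreeRegressor as the base learner and the learning-rate and depth values are illustrative assumptions.

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def ls_boost(X, y, n_trees=100, learning_rate=0.1, max_depth=3):
    # Illustrative least-squares gradient boosting sketch (not Friedman's
    # reference implementation). F_0 is the constant minimizing squared
    # error, i.e. the mean of y.
    f0 = float(np.mean(y))
    F = np.full(len(y), f0)
    trees = []
    for _ in range(n_trees):
        residuals = y - F                      # negative gradient of L2 loss
        tree = DecisionTreeRegressor(max_depth=max_depth)
        tree.fit(X, residuals)                 # fit base learner to the gradient
        F += learning_rate * tree.predict(X)   # shrunken stagewise additive update
        trees.append(tree)
    return f0, trees

def ls_boost_predict(f0, trees, X, learning_rate=0.1):
    # learning_rate must match the value used during training.
    F = np.full(X.shape[0], f0)
    for tree in trees:
        F += learning_rate * tree.predict(X)
    return F

# Example: recover a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=200)
f0, trees = ls_boost(X, y)
print(ls_boost_predict(f0, trees, X[:5]))

The other loss functions in the paper (least absolute deviation, Huber-M, multiclass logistic) follow the same template, substituting their own negative gradient and adding a line-search step per terminal node of each fitted tree.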


References

Jerome H. Friedman (2001). "Greedy Function Approximation: A Gradient Boosting Machine." The Annals of Statistics, 29(5): 1189-1232. doi:10.1214/aos/1013203451. http://www.salford-systems.com/doc/GreedyFuncApproxSS.pdf