"... The generalized linear model framework is often used in classification problems and the importance and effect of the predictor variables on the response is generally judged by examination of the relevant regression coefficients. This chapter describes classification trees which can also be used for ..."

"... Algorithms for learning classification trees have had successes in artificial intelligence and statistics over many years. This paper outlines how a tree learning algorithm can be derived using Bayesian statistics. This introduces Bayesian techniques for splitting, smoothing, and tree averaging. ..."

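The tree-averaging idea mentioned in the abstract above can be sketched generically: given several candidate trees, each with an (unnormalized) posterior score and a class-probability vector for a query point, predictions are combined by posterior-weighted averaging. A minimal sketch of that generic idea, not the paper's exact algorithm; all names and numbers below are illustrative:

```python
import math

def average_tree_predictions(tree_probs, log_posteriors):
    """Posterior-weighted average of per-tree class-probability vectors.

    tree_probs     : list of [p(class 0), p(class 1), ...], one per tree
    log_posteriors : unnormalized log posterior score, one per tree
    """
    m = max(log_posteriors)                        # stabilize the exponentials
    w = [math.exp(lp - m) for lp in log_posteriors]
    z = sum(w)
    w = [wi / z for wi in w]                       # normalized tree weights
    n_classes = len(tree_probs[0])
    return [sum(wi * p[k] for wi, p in zip(w, tree_probs))
            for k in range(n_classes)]

# Two hypothetical trees with equal posterior weight:
probs = average_tree_predictions([[0.9, 0.1], [0.5, 0.5]], [0.0, 0.0])
# probs is approximately [0.7, 0.3]
```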

"... Classification trees based on exhaustive search algorithms tend to be biased towards selecting variables that afford more splits. As a result, such trees should be interpreted with caution. This article presents an algorithm called QUEST that has negligible bias. Its split selection strategy shares ..."

"... Abstract. We propose a new algorithm for learning isotonic classification trees. It relabels non-monotone leaf nodes by performing the isotonic regression on the collection of leaf nodes. In case two leaf nodes with a common parent have the same class after relabeling, the tree is pruned in the pare ..."

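The relabeling step described in the abstract above rests on isotonic regression over an ordered collection of leaf estimates. A minimal sketch of the standard pool-adjacent-violators algorithm (PAVA) applied to hypothetical leaf probabilities; this illustrates the smoothing idea, not the paper's exact procedure:

```python
def pava(y, w=None):
    """Pool-adjacent-violators: weighted least-squares isotonic
    (non-decreasing) fit to the sequence y."""
    if w is None:
        w = [1.0] * len(y)
    blocks = []  # each block: [mean, total weight, size]
    for yi, wi in zip(y, w):
        blocks.append([float(yi), float(wi), 1])
        # merge adjacent blocks while monotonicity is violated
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, n2 = blocks.pop()
            m1, w1, n1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, n1 + n2])
    fit = []
    for m, _, n in blocks:
        fit.extend([m] * n)
    return fit

# Hypothetical leaf class-1 probabilities in the required order;
# leaves 2 and 3 violate monotonicity and get pooled:
smoothed = pava([0.2, 0.6, 0.4, 0.8])   # -> [0.2, 0.5, 0.5, 0.8]
labels = [1 if p >= 0.5 else 0 for p in smoothed]
```

After smoothing, any two sibling leaves that receive the same label are candidates for pruning in the parent, as the abstract describes.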

"... Abstract—The separation of source coding into two stages, modeling and encoding, is a highly successful approach. We propose meta-modeling as an additional stage. As an application, we use this paradigm to deduce an efficient and optimal algorithm for a novel and powerful model set: the classificati ..."

"... Abstract. For classification problems with ordinal attributes very often the class attribute should increase with each or some of the explanatory attributes. These are called classification problems with monotonicity constraints. Standard classification tree algorithms such as CART or C4.5 are not g ..."

"... The most efficient speciation methods suffer from a quite high complexity, from O(n·c(n)) to O(n^2), where c(n) is a factor that can be proportional to n, the population size. In this paper, a speciation method based on a classification tree is presented, having a complexity of O(n log n). The pop ..."

"... Besides serving as prediction models, classification trees are useful for finding important predictor variables and identifying interesting subgroups in the data. These functions can be compromised by weak split selection algorithms that have variable selection biases or that fail to search beyond l ..."

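The variable selection bias noted in the abstract above has a simple combinatorial source: under exhaustive search, a variable's chance of producing the best-scoring split grows with the number of candidate splits it offers. A back-of-the-envelope count of binary splits per variable (function names are illustrative):

```python
def n_splits_ordered(n_distinct):
    """Candidate binary splits of the form X <= c for an ordered
    variable with n_distinct observed values."""
    return n_distinct - 1

def n_splits_categorical(n_categories):
    """Candidate binary partitions of an unordered categorical
    variable into two non-empty subsets of categories."""
    return 2 ** (n_categories - 1) - 1

# A binary indicator offers 1 split; a 100-valued ordered variable
# offers 99; a 10-category unordered variable offers 511 -- so the
# latter two get far more chances to score well by luck alone.
counts = [n_splits_ordered(2), n_splits_ordered(100), n_splits_categorical(10)]
```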

"... The construction of classification trees is nearly always top-down, locally optimal and data-driven. Such recursive designs are often globally inefficient, for instance in terms of the mean depth necessary to reach a given classification rate. We consider statistical models for which exact global op ..."

by Hyunjoong Kim, Wei-Yin Loh - Journal of the American Statistical Association, 2001

"... Two univariate split methods and one linear combination split method are proposed for the construction of classification trees with multiway splits. Examples are given where the trees are more compact and hence easier to interpret than binary trees. A major strength of the univariate split methods i ..."
