In this paper, we propose gcForest, a decision tree ensemble approach whose
performance is highly competitive with deep neural networks. In contrast to
deep neural networks, which require great effort in hyper-parameter tuning,
gcForest is much easier to train; in fact, it achieves excellent performance
on data from different domains with almost the same hyper-parameter settings.
The training process of gcForest is efficient and scalable. In our experiments,
its training time on a PC is comparable to that of deep neural networks running
with GPU facilities, and this efficiency advantage may become more pronounced
because gcForest naturally lends itself to parallel implementation.
Furthermore, in contrast to deep neural networks, which require large-scale
training data, gcForest can work well even when only small-scale training data
are available. Moreover, as a tree-based approach, gcForest should be more
amenable to theoretical analysis than deep neural networks.
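The ensemble idea behind gcForest can be sketched as a cascade of forest levels, where each level's class-probability vectors are concatenated with the raw features and passed to the next level. The sketch below is a minimal illustration assuming scikit-learn, not the authors' implementation: the `CascadeForest` class, its hyper-parameters, and the per-level forest composition are illustrative choices, and the multi-grained scanning stage of the full method is omitted.

```python
# Minimal cascade-forest sketch in the spirit of gcForest (illustrative only):
# each level holds several forests whose class-probability outputs are
# concatenated with the raw features and fed to the next level.
import numpy as np
from sklearn.ensemble import RandomForestClassifier, ExtraTreesClassifier
from sklearn.model_selection import cross_val_predict

class CascadeForest:
    def __init__(self, n_levels=3, n_estimators=50, random_state=0):
        self.n_levels = n_levels
        self.n_estimators = n_estimators
        self.random_state = random_state
        self.levels = []

    def _new_level(self):
        # One "normal" random forest and one highly randomized forest
        # (ExtraTrees used here as an approximation) per level.
        return [
            RandomForestClassifier(n_estimators=self.n_estimators,
                                   random_state=self.random_state),
            ExtraTreesClassifier(n_estimators=self.n_estimators,
                                 random_state=self.random_state),
        ]

    def fit(self, X, y):
        feats = X
        for _ in range(self.n_levels):
            level = self._new_level()
            probas = []
            for forest in level:
                # Out-of-fold probabilities reduce overfitting when the
                # augmented features are passed to the next level.
                probas.append(cross_val_predict(forest, feats, y, cv=3,
                                                method="predict_proba"))
                forest.fit(feats, y)
            self.levels.append(level)
            # Augmented representation: raw features + class vectors.
            feats = np.hstack([X] + probas)
        return self

    def predict(self, X):
        feats = X
        for level in self.levels:
            probas = [f.predict_proba(feats) for f in level]
            feats = np.hstack([X] + probas)
        # Final prediction: average the last level's probability vectors.
        return np.argmax(np.mean(probas, axis=0), axis=1)
```

Because every level is just a set of independently trained forests, the structure parallelizes naturally, which is one reason tree ensembles of this kind scale well without GPU hardware.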
