18 June 2018

Large-Scale Random Forests

Research paper

In a paper just accepted at the prestigious KDD conference (ACM SIGKDD Conference on Knowledge Discovery and Data Mining), two researchers from the Department of Computer Science present Woody, a simple yet effective framework for efficiently constructing random forests of large trees from hundreds of millions or even billions of training instances using a cheap desktop computer with commodity hardware.

The basic idea is a multi-level construction scheme: top trees are built on small random subsets of the available data, and all training instances are subsequently distributed to the top trees' leaves for further processing.
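The scheme can be illustrated with a minimal, self-contained sketch. Note that this is not the authors' implementation: the function names, the one-dimensional median-split "top tree", and all parameters below are illustrative assumptions chosen to keep the example short; the actual Woody framework handles multi-dimensional data and grows full bottom trees per leaf.

```python
import random
from collections import defaultdict

# Illustrative sketch (not the authors' code) of the multi-level scheme:
# Phase 1 builds a shallow "top tree" on a small random subset,
# Phase 2 routes *all* training instances to the top tree's leaves,
# so that each leaf's chunk is small enough for in-memory processing.

def build_top_tree(points, depth):
    """Recursively split 1-D points at the subset median, up to `depth` levels.
    Returns a nested dict acting as a tiny decision tree."""
    if depth == 0 or len(points) < 2:
        return {"leaf": True}
    threshold = sorted(points)[len(points) // 2]
    left = [p for p in points if p < threshold]
    right = [p for p in points if p >= threshold]
    if not left or not right:
        return {"leaf": True}
    return {"leaf": False, "threshold": threshold,
            "left": build_top_tree(left, depth - 1),
            "right": build_top_tree(right, depth - 1)}

def leaf_id(tree, x, path=""):
    """Route a single instance down the top tree; the path string names its leaf."""
    if tree["leaf"]:
        return path
    if x < tree["threshold"]:
        return leaf_id(tree["left"], x, path + "L")
    return leaf_id(tree["right"], x, path + "R")

random.seed(0)
data = [random.uniform(0, 100) for _ in range(100_000)]  # stand-in for a huge dataset

subset = random.sample(data, 1_000)    # Phase 1: small random subset only
top = build_top_tree(subset, depth=3)  # shallow top tree (at most 8 leaves)

buckets = defaultdict(list)            # Phase 2: distribute ALL instances
for x in data:
    buckets[leaf_id(top, x)].append(x)

# Each bucket can now be processed independently, e.g. by growing a
# fully-grown bottom tree on it without exhausting main memory.
print(len(buckets), max(len(b) for b in buckets.values()))
```

Because the top tree is trained on a subset whose distribution resembles the full data, the resulting buckets are roughly balanced, which is what makes the subsequent per-leaf processing feasible on commodity hardware.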


Category: Statistics, Marketing

The paper will be presented at the 24th SIGKDD Conference on Knowledge Discovery and Data Mining on 19-23 August 2018 in London, UK.

Title

Training Big Random Forests with Little Resources

Authors

Fabian Gieseke and Christian Igel

Abstract

Without access to large compute clusters, building random forests on large datasets is still a challenging problem. This is, in particular, the case if fully-grown trees are desired. We propose a simple yet effective framework that allows one to efficiently construct ensembles of huge trees for hundreds of millions or even billions of training instances using a cheap desktop computer with commodity hardware. The basic idea is to consider a multi-level construction scheme, which builds top trees for small random subsets of the available data and which subsequently distributes all training instances to the top trees' leaves for further processing. While being conceptually simple, the overall efficiency crucially depends on the particular implementation of the different phases. The practical merits of our approach are demonstrated using dense datasets with hundreds of millions of training instances.