ThunderGBM: Fast GBDTs and Random Forests on GPUs

Zeyi Wen, Hanfeng Liu, Jiashuai Shi, Qinbin Li, Bingsheng He, Jian Chen

Research output: Contribution to journal › Article › peer-review

3 Citations (Scopus)


Gradient Boosting Decision Trees (GBDTs) and Random Forests (RFs) are used in many real-world applications and are often a standard recipe for building state-of-the-art solutions to machine learning and data mining problems. However, their training and prediction are computationally very expensive for large and high-dimensional problems. This article presents ThunderGBM, an efficient, open-source software toolkit that exploits high-performance Graphics Processing Units (GPUs) for GBDTs and RFs. ThunderGBM supports classification, regression, and ranking, and uses the same command line options and configuration files as XGBoost, one of the most popular GBDT and RF libraries. It can be used through multiple language interfaces, including C/C++ and Python, and can run on one or more GPUs of a machine. Our experimental results show that ThunderGBM outperforms existing libraries while producing similar models, and can handle high-dimensional problems on which existing GPU-based libraries fail. Documentation, examples, and more details about ThunderGBM are available at
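To make the training loop that such toolkits accelerate concrete, here is a minimal, pure-Python sketch of gradient boosting with squared loss and depth-1 regression stumps. This is an expository toy, not ThunderGBM's implementation: all function names and the tiny dataset below are illustrative assumptions, and a real GBDT library adds histogram-based split finding, deeper trees, regularization, and (in ThunderGBM's case) GPU parallelism.

```python
# Toy gradient boosting: squared loss, depth-1 stumps, one feature.
# Each round fits a stump to the current residuals and adds a scaled
# copy of it to the ensemble (the standard GBDT additive update).

def fit_stump(xs, residuals):
    """Return the one-split regressor minimizing SSE on the residuals."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lmean = sum(left) / len(left)
        rmean = sum(right) / len(right)
        sse = (sum((r - lmean) ** 2 for r in left)
               + sum((r - rmean) ** 2 for r in right))
        if best is None or sse < best[0]:
            best = (sse, t, lmean, rmean)
    _, t, lmean, rmean = best
    return lambda x: lmean if x <= t else rmean

def boost(xs, ys, rounds=20, lr=0.3):
    """Fit an additive ensemble of stumps by steepest-descent boosting."""
    base = sum(ys) / len(ys)          # initial constant prediction
    preds = [base] * len(ys)
    stumps = []
    for _ in range(rounds):
        residuals = [y - p for y, p in zip(ys, preds)]  # -gradient of L2 loss
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        preds = [p + lr * stump(x) for p, x in zip(preds, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

# Illustrative data: two plateaus with small noise.
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.2, 0.9, 1.1, 3.0, 3.2, 2.9, 3.1]
model = boost(xs, ys)
mse = sum((model(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

Production GBDT systems follow the same additive scheme but spend nearly all of their time in split finding over many features, which is the part ThunderGBM maps onto GPU threads.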
Original language: English
Pages (from-to): 1-5
Journal: Journal of Machine Learning Research
Publication status: Published - Apr 2020

