The only thing that you need to know is regression modeling!” I remember thinking to myself, “I got this!”

I knew regression modeling: both linear and logistic regression.

My boss was right.

In my tenure, I exclusively built regression-based statistical models.

I wasn’t alone.

In fact, at that time, regression modeling was the undisputed queen of predictive analytics.

Fast forward fifteen years, the era of regression modeling is over.

The old queen has passed.

Long live the new queen with a funky name: XGBoost, or Extreme Gradient Boosting!

XGBoost is a decision-tree-based ensemble Machine Learning algorithm that uses a gradient boosting framework.

In prediction problems involving unstructured data (images, text, etc.), artificial neural networks tend to outperform all other algorithms or frameworks.

However, when it comes to small-to-medium structured/tabular data, decision-tree-based algorithms are considered best-in-class right now.

Please see the chart below for the evolution of tree-based algorithms over the years.

Evolution of XGBoost Algorithm from Decision Trees

The XGBoost algorithm was developed as a research project at the University of Washington.

Tianqi Chen and Carlos Guestrin presented their paper at the SIGKDD Conference in 2016 and set the Machine Learning world on fire.

Since its introduction, this algorithm has not only been credited with winning numerous Kaggle competitions but also for being the driving force under the hood for several cutting-edge industry applications.

As a result, there is a strong community of data scientists contributing to the XGBoost open source project, with ~350 contributors and ~3,600 commits on GitHub.

The algorithm differentiates itself in several ways.

Decision trees, in their simplest form, are easy-to-visualize and fairly interpretable algorithms, but building intuition for the next generation of tree-based algorithms can be a bit tricky.

See below for a simple analogy to better understand the evolution of tree-based algorithms.

Photo by rawpixel on Unsplash

Imagine that you are a hiring manager interviewing several candidates with excellent qualifications.

Each step of the evolution of tree-based algorithms can be viewed as a version of the interview process.
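Before moving on to the boosted ensembles, it helps to see how interpretable the plain decision tree at the root of this evolution really is. The sketch below (my own illustration, not from the original article; the iris dataset and `max_depth=2` are arbitrary choices) fits a shallow tree with scikit-learn and prints it as human-readable if/else rules.

```python
# Minimal sketch: a single shallow decision tree is easy to inspect.
# Dataset and depth are illustrative choices, not from the article.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

X, y = load_iris(return_X_y=True)
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# export_text renders the fitted tree as nested if/else rules --
# this readability is exactly what single trees are prized for.
rules = export_text(
    tree,
    feature_names=["sepal_len", "sepal_wid", "petal_len", "petal_wid"],
)
print(rules)
```

Each printed branch corresponds to one threshold test, so a domain expert can audit every decision path by eye, something that becomes much harder once hundreds of trees are combined.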

XGBoost and Gradient Boosting Machines (GBMs) are both ensemble tree methods that apply the principle of boosting weak learners (CARTs generally) using the gradient descent architecture.

However, XGBoost improves upon the base GBM framework through systems optimization and algorithmic enhancements.
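The boosting principle described above, sequentially adding weak learners so that each new tree corrects the ensemble's remaining errors, can be sketched with scikit-learn's `GradientBoostingClassifier` standing in for the GBM family (the sample size, depth-1 stumps, and learning rate below are illustrative assumptions, not figures from the article).

```python
# Sketch of boosting: depth-1 stumps (weak learners) are added one at a
# time, each fit to the errors of the ensemble built so far.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20,
                           n_informative=2, n_redundant=2, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

gbm = GradientBoostingClassifier(
    n_estimators=100, max_depth=1,   # depth-1 stumps = weak learners
    learning_rate=0.1, random_state=42,
).fit(X_tr, y_tr)

# staged_predict yields predictions after each boosting round, so we can
# watch test accuracy improve as weak learners accumulate.
staged = [accuracy_score(y_te, p) for p in gbm.staged_predict(X_te)]
print(f"after 1 tree: {staged[0]:.3f}, after 100 trees: {staged[-1]:.3f}")
```

The staged accuracies make the "boosting weak learners" idea concrete: no single stump is a good model, but the sequence of corrections converges to a strong one.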

How XGBoost optimizes standard GBM algorithm

System Optimization:

Algorithmic Enhancements:

We used Scikit-learn’s ‘Make_Classification’ data package to create a random sample of 1 million data points with 20 features (2 informative and 2 redundant).

We tested several algorithms such as Logistic Regression, Random Forest, standard Gradient Boosting, and XGBoost.

XGBoost vs. Other ML Algorithms using SKLearn’s Make_Classification Dataset

As demonstrated in the chart above, the XGBoost model has the best combination of prediction performance and processing time compared to the other algorithms.

Other rigorous benchmarking studies have produced similar results.

No wonder XGBoost is widely used in recent Data Science competitions.

“When in doubt, use XGBoost” — Owen Zhang, Winner of Avito Context Ad Click Prediction competition on Kaggle

When it comes to Machine Learning (or even life, for that matter), there is no free lunch.

As Data Scientists, we must test all possible algorithms for data at hand to identify the champion algorithm.

Besides, picking the right algorithm is not enough.

We must also choose the right configuration of the algorithm for a dataset by tuning the hyper-parameters.
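In practice, that tuning step is usually automated with a cross-validated search. The sketch below runs a small grid search over a gradient boosting model; the grid values, sample size, and use of scikit-learn's `GradientBoostingClassifier` (rather than XGBoost) are illustrative assumptions.

```python
# Hedged sketch of hyper-parameter tuning via cross-validated grid
# search. Grid values are illustrative, not recommendations.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=1000, n_features=20,
                           n_informative=2, n_redundant=2, random_state=0)

grid = GridSearchCV(
    GradientBoostingClassifier(random_state=0),
    param_grid={"n_estimators": [50, 100],
                "max_depth": [1, 3],
                "learning_rate": [0.05, 0.1]},
    cv=3,          # 3-fold cross-validation for each configuration
    n_jobs=-1,     # evaluate configurations in parallel
)
grid.fit(X, y)
print("best configuration:", grid.best_params_)
print("best CV accuracy:  ", round(grid.best_score_, 3))
```

For larger grids, `RandomizedSearchCV` trades exhaustiveness for speed; either way, the "right configuration" the text refers to is chosen by cross-validated score, not by intuition alone.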

Furthermore, there are several other considerations for choosing the winning algorithm such as computational complexity, explainability, and ease of implementation.

This is exactly the point where Machine Learning starts drifting away from science towards art, but honestly, that’s where the magic happens!

Machine Learning is a very active research area, and there are already several viable alternatives to XGBoost.

Microsoft Research recently released the LightGBM framework for gradient boosting, which shows great potential.

CatBoost, developed by Yandex Technology, has been delivering impressive benchmarking results.

It is only a matter of time before we have a better model framework that beats XGBoost in terms of prediction performance, flexibility, explainability, and pragmatism.

However, until a strong challenger comes along, XGBoost will continue to reign over the Machine Learning world!

Original.

Reposted with permission.

Bio: Vishal Morde is a seasoned executive and transformational leader with 15+ years of progressive experience in Data Science, Machine Learning, Artificial Intelligence, Big Data, and Strategic Analytics.
