XGBoost is All You Need, Part 3 – Gradient Boosted Trees

Originally published on XGBlog, January 30, 2025.

This is the third part in the series of blog posts about XGBoost, based on my 2024 GTC presentation. You can find Part 1 here and Part 2 here.

Today we want to talk about gradient boosted trees. Even though XGBoost has an option for purely linear boosting, it is the non-linear version, based on decision trees, that gives this library its predictive power.
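As a quick illustration (not from the original post), here is a minimal sketch of how that choice is exposed in XGBoost's Python API: the booster parameter selects between the default tree booster and the purely linear one. The data is synthetic, made up just for this example.

```python
import numpy as np
import xgboost as xgb

# Synthetic data, invented purely for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + X[:, 1] ** 2 > 1).astype(int)  # a non-linear target

dtrain = xgb.DMatrix(X, label=y)

# The default: gradient boosted trees ("gbtree").
tree_booster = xgb.train(
    {"booster": "gbtree", "objective": "binary:logistic", "max_depth": 3},
    dtrain,
    num_boost_round=50,
)

# The purely linear alternative ("gblinear"), for comparison.
linear_booster = xgb.train(
    {"booster": "gblinear", "objective": "binary:logistic"},
    dtrain,
    num_boost_round=50,
)
```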

Decision trees are perhaps the simplest predictive algorithm to describe. Say you want to know whether it is raining outside, and the only "feature" you have is whether the sky is cloudy. You build an algorithm that predicts no rain when there are no clouds in the sky and rain when it is overcast. Your predictions for the clear sky will be completely accurate, and the predictions for the overcast sky fairly accurate. You can then add other features, such as the density of the cloud cover, wind conditions, temperature, and so on, and build an algorithm over all of those features that still gives you a simple yes-or-no answer. That is essentially what a decision tree is. Decision trees are very easy to understand and implement for simple binary classifications, but with a bit of work and ingenuity they can also work for regression problems.
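To make the rain example concrete, here is a toy sketch using scikit-learn's DecisionTreeClassifier rather than XGBoost itself. The features and labels are invented for illustration; a real model would of course be trained on far more data.

```python
from sklearn.tree import DecisionTreeClassifier

# Invented toy data: [cloudy (0/1), cloud density (0-1), wind speed (km/h)]
X = [
    [0, 0.0, 5],   # clear sky
    [1, 0.9, 20],  # dense overcast, windy
    [1, 0.3, 10],  # light cloud cover
    [1, 0.8, 15],  # heavy cloud cover
    [0, 0.1, 8],   # mostly clear
]
y = [0, 1, 0, 1, 0]  # 1 = rain, 0 = no rain

# A shallow tree splits on these features to reach a yes/no answer.
tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
print(tree.predict([[1, 0.7, 12]]))  # e.g. a cloudy, breezy day
```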

To continue reading this article, click here.
