Machine Learning Times
XGBoost is All You Need, Part 3 – Gradient Boosted Trees

Originally published on XGBlog, January 30, 2025.

This is the third part in a series of blog posts about XGBoost, based on my 2024 GTC presentation. You can find Part 1 here, and Part 2 here.

Today we want to talk about gradient boosted trees. Even though XGBoost has an option for purely linear boosting, it is the non-linear version, based on decision tree algorithms, that gives this library and algorithm its predictive power.
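
For concreteness, here is a minimal sketch of that choice using XGBoost's booster parameter, which selects between the linear booster ("gblinear") and the tree booster ("gbtree"). The synthetic dataset is an assumption of mine, built so the target depends non-linearly on the features:

import numpy as np
import xgboost as xgb

# Synthetic, purely illustrative data: the label depends on a feature
# interaction, which a linear model cannot capture but trees can.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))
y = (X[:, 0] * X[:, 1] > 0).astype(int)

dtrain = xgb.DMatrix(X, label=y)

for booster in ("gblinear", "gbtree"):
    params = {"booster": booster, "objective": "binary:logistic"}
    cv = xgb.cv(params, dtrain, num_boost_round=50, nfold=5, metrics="error")
    # Final-round cross-validated error; expect gbtree to be far lower here.
    print(booster, "cv error:", cv["test-error-mean"].iloc[-1])

On data like this, the tree booster should drive the error toward zero while the linear booster stays near chance, which is the sense in which the trees carry the predictive power.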

Decision trees are perhaps the simplest predictive algorithm to describe. Say you want to know whether it's raining outside, and the only "feature" you have is whether or not it's cloudy. You build an algorithm that predicts no rain when there are no clouds in the sky and rain when it's overcast. Your predictions for a clear sky will be essentially always correct, and your predictions for overcast skies will be fairly accurate. You can then add other features, such as the density of the cloud cover, wind conditions, or temperature, and build an algorithm that combines all of them into a simple yes-or-no answer, as sketched below. That is essentially what a decision tree is. Decision trees are very easy to understand and implement for simple binary classification, but with a bit of work and ingenuity they can also handle regression problems.
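
As an illustration of that multi-feature version, here is a minimal sketch using scikit-learn's DecisionTreeClassifier, a standard implementation of the algorithm. The code, the tiny weather dataset, and its feature names are my own illustrative assumptions, not taken from the original post:

from sklearn.tree import DecisionTreeClassifier, export_text

# Toy weather data: columns are cloud_cover (0-1), wind_speed (km/h),
# temperature (degrees C). Labels: 1 = rain, 0 = no rain.
X = [
    [0.0, 10, 25],
    [0.1,  5, 30],
    [0.9, 20, 18],
    [1.0, 30, 15],
    [0.8, 15, 17],
    [0.2, 12, 28],
]
y = [0, 0, 1, 1, 1, 0]

# Fit a shallow tree and print the learned yes/no decision rules.
tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["cloud_cover", "wind_speed", "temperature"]))

# Overcast, windy, cool day: the tree predicts rain (class 1).
print(tree.predict([[0.95, 25, 16]]))

On this toy data the tree typically learns a single split on cloud cover, which is exactly the one-feature rule from the rain example; the extra features only enter if they improve the splits.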

To continue reading this article, click here.
