#GradientBoosting

Valeriy M., PhD, MBA, CQF (predict_addict@sigmoid.social)
2025-03-02

How does it compare to XGBoost and LightGBM in your work? Drop a comment below!

#MachineLearning #DataScience #GradientBoosting #CatBoost #AI #XGBoost #LightGBM

---


2025-01-15

3. #Gradientboosting is VERY interpretable with the #SHAPley method. They are totally misleading when they claim their deep neural network is more interpretable and boosting is not. They are apparently ignorant of these important advances in interpretability, even though the method is more than 5 years old now.

4. Despite a lot of talk about class imbalance, the churn datasets are not very imbalanced - 10%-20% churn rates. Truly imbalanced data has low single-digit churn rates.

IB Teguh TM (teguhteja)
2024-09-09

Master for ! Learn to build a powerful model using and $TSLA data. Enhance your trading strategy with data-driven insights.

teguhteja.id/gradient-boosting

Erik Jonker (ErikJonker)
2024-03-26
2023-12-07

I ran a quick Gradient Boosted Trees vs Neural Nets check using scikit-learn's dev branch, which makes it more convenient to work with tabular datasets that mix numerical and categorical features (e.g. the Adult Census dataset).

Let's start with the GBRT model. It's now possible to reproduce the SOTA number on this dataset in a few lines of code, in about 2 s (CV included) on my laptop.

1/n

#sklearn #PyData #MachineLearning #TabularData #GradientBoosting #DeepLearning #Python

Screenshot of syntax highlighted snippet from: https://nbviewer.org/github/ogrisel/notebooks/blob/master/sklearn_demos/gbdt_vs_neural_nets_on_tabular_data.ipynb

The GBDT cross-validated accuracy is 0.873 +/- 0.002.
2023-05-17

I'm excited to see #gradientboosting making some news! There is so much #aihype around #llms (and before that it was #deeplearning), but I think that for most #datascientists working in industry, the development of #gradientboosting #machinelearning algorithms (like #xgboost and #catboost) is the real revolution and will have a much longer-lived impact on our work.

nature.com/articles/s41598-022

Marcin Paprzycki (marcinpaprzycki@masto.ai)
2022-12-31

Using the #ensemble of #gradientboosting regression models for FedCSIS 2022 Challenge: “Diversified gradient boosting ensembles for prediction of the cost of forwarding contracts” by D. Ruta, M. Liu, L. Cen, QH Vu. @FedCSIS
2022, ACSIS Vol. 30 p. 431–436; tinyurl.com/4vszh6wt

2022-12-01

#MachineLearning lesson of the day: Working with a #gradientboosting model, I got no traction cross-validating hyperparameters like tree depth and tree count; but the choice of evaluation metric (e.g. SMSE vs. MAE) had a major impact. Have you tried this?

IMHO #DNN get all the press because they do sexy human jobs like seeing and processing language. But in the business world of tabular data, #gradientboosting is where the real revolution is happening! #xgboost #catboost #lightgbm #DataScience
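The observation above can be reproduced as a minimal sketch: search the same hyperparameter grid under two different scoring metrics and compare what each one selects. Synthetic data and a small grid, purely illustrative.

```python
# Same grid, two scoring metrics: the selected hyperparameters can
# differ because each metric penalizes errors differently.
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import GridSearchCV

X, y = make_regression(n_samples=300, n_features=5, noise=10.0, random_state=0)
param_grid = {"max_depth": [2, 3, 5], "n_estimators": [50, 200]}

best = {}
for scoring in ("neg_root_mean_squared_error", "neg_mean_absolute_error"):
    search = GridSearchCV(GradientBoostingRegressor(random_state=0),
                          param_grid, scoring=scoring, cv=3)
    search.fit(X, y)
    best[scoring] = search.best_params_  # winner may differ by metric
```

Squared-error-based metrics weight large residuals more heavily than MAE does, so the "best" depth and tree count can genuinely shift with the metric even when the grid itself seems insensitive.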
