I would like to train on my datasets in scikit-learn but export the final Gradient Boosting Regressor elsewhere so that I can make predictions directly on another platform.

I am aware that we can obtain the individual decision trees used by the regressor by accessing regressor.estimators_[].tree_. What I would like to know is how these decision trees are combined to produce the final regression prediction.
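For example, the individual trees and their exportable node arrays can be reached like this (a minimal sketch on a toy dataset):

from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

# Fit a small model just to illustrate the attribute access
X, y = make_regression(n_samples=100, n_features=4, random_state=0)
regressor = GradientBoostingRegressor(n_estimators=10, random_state=0).fit(X, y)

# estimators_ has shape (n_estimators, 1); each entry is a DecisionTreeRegressor
tree = regressor.estimators_[0, 0].tree_   # low-level structure of the first fitted tree
print(tree.children_left)    # left-child index per node (-1 for leaves)
print(tree.feature)          # split feature per node
print(tree.threshold)        # split threshold per node
print(tree.value)            # value stored at each node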

1 Answer

There are two kinds of estimators, i.e. the initial predictor and the sub-estimators:

init_ : estimator
    The estimator that provides the initial predictions. Set via the init argument or loss.init_estimator.

estimators_ : ndarray of DecisionTreeRegressor of shape (n_estimators, 1)
    The collection of fitted sub-estimators.
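On a fitted model, these attributes can be inspected roughly like this (a minimal sketch; model stands for a fitted GradientBoostingRegressor):

print(type(model.init_))              # the initial predictor (a DummyRegressor by default)
print(model.estimators_.shape)        # (n_estimators, 1) for a regressor
print(type(model.estimators_[0, 0]))  # DecisionTreeRegressor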

The contribution of every estimator after the first (i.e. init) estimator is scaled by the learning rate.

You can reproduce the final prediction as in the code below:

trees = model.estimators_

x = x_test.iloc[10, :].values                      # a sample X to be predicted
y_pred = model.init_.predict(x.reshape(1, -1))     # prediction from the init estimator

for tree in trees:
    pred = tree[0].predict(x.reshape(1, -1))       # prediction from one sub-estimator
    y_pred = y_pred + model.learning_rate * pred   # add it, scaled by the learning rate

y_pred
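As a sanity check, the manually accumulated value should match the library's own prediction (assuming the same model and x as above):

sklearn_pred = model.predict(x.reshape(1, -1))   # prediction computed by scikit-learn itself
print(y_pred, sklearn_pred)                      # the two should agree up to floating-point error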
