@@ -313,9 +313,9 @@ class AdaBoostClassifier(ClassifierMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each classifier by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each classifier at each boosting iteration. A higher
+        learning rate increases the contribution of each classifier. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     algorithm : {'SAMME', 'SAMME.R'}, default='SAMME.R'
         If 'SAMME.R' then use the SAMME.R real boosting algorithm.
@@ -898,9 +898,9 @@ class AdaBoostRegressor(RegressorMixin, BaseWeightBoosting):
         In case of perfect fit, the learning procedure is stopped early.
 
     learning_rate : float, default=1.
-        Learning rate shrinks the contribution of each regressor by
-        ``learning_rate``. There is a trade-off between ``learning_rate`` and
-        ``n_estimators``.
+        Weight applied to each regressor at each boosting iteration. A higher
+        learning rate increases the contribution of each regressor. There is
+        a trade-off between the `learning_rate` and `n_estimators` parameters.
 
     loss : {'linear', 'square', 'exponential'}, default='linear'
         The loss function to use when updating the weights after each
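The trade-off described in the reworded docstring can be observed directly: a smaller `learning_rate` gives each weak learner less weight, so more boosting rounds (`n_estimators`) are typically needed to reach a comparable fit. A minimal sketch, using a synthetic dataset (the sample counts and rate values here are illustrative, not from the patch):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier

# Synthetic binary classification problem for illustration.
X, y = make_classification(n_samples=200, random_state=0)

# A higher learning_rate increases the contribution of each classifier,
# so fewer estimators are usually needed; a lower rate weights each
# classifier less and generally needs more boosting iterations.
for lr in (0.1, 1.0):
    clf = AdaBoostClassifier(n_estimators=50, learning_rate=lr,
                             random_state=0)
    clf.fit(X, y)
    print(f"learning_rate={lr}: train accuracy={clf.score(X, y):.3f}")
```

Because early stopping kicks in on a perfect fit (see the context line above the hunk), the ensemble may also finish with fewer than `n_estimators` fitted learners.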