Description
With MLPClassifier, when warm_start is True or coefs_ are provided, fit does not respect max_iter. The cause is that the maximum-iteration check during fitting is an equality test (==) against self.n_iter_. When warm_start is True or coefs_ are provided, _initialize, the method that resets n_iter_ to 0, is not called, so n_iter_ keeps growing across fit calls and can skip past the equality check. This raises the question of what max_iter is supposed to mean. If it is a total budget across all fits, then with max_iter=1 a fit that terminates by reaching the maximum iterations means every subsequent fit with warm_start=True can never terminate for that reason, which is a bug. The alternative interpretation is that max_iter is the maximum number of iterations per fit call, in which case the implementation is also wrong. The latter interpretation seems more reasonable.
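A minimal sketch of the suspected interaction follows. This is not scikit-learn's actual implementation; ToyEstimator, its counter, and the print call are hypothetical stand-ins for n_iter_, the equality test, and the ConvergenceWarning:

# Simplified stand-in: n_iter_ persists across fit() calls when
# warm_start=True because the reset step is skipped.
class ToyEstimator:
    def __init__(self, max_iter=1, warm_start=False):
        self.max_iter = max_iter
        self.warm_start = warm_start
        self.n_iter_ = 0

    def fit(self):
        if not self.warm_start:
            self.n_iter_ = 0  # the reset that warm_start skips
        for _ in range(self.max_iter):
            self.n_iter_ += 1
        # Equality check: only fires when the running total happens to
        # equal max_iter, so with warm_start it triggers at most once.
        if self.n_iter_ == self.max_iter:
            print("ConvergenceWarning (would be raised here)")

clf = ToyEstimator(max_iter=1, warm_start=True)
for _ in range(3):
    clf.fit()  # "warning" printed only on the first call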
Steps/Code to Reproduce
import numpy as np
from sklearn.neural_network import MLPClassifier

X = np.random.rand(100, 10)
y = np.random.randint(0, 2, (100,))  # binary labels (random_integers is deprecated)

clf = MLPClassifier(max_iter=1, warm_start=True, verbose=True)
for k in range(3):
    clf.fit(X, y)  # each call should run at most max_iter iterations and warn
Expected Results
Iteration 1, loss = 0.72311215
ConvergenceWarning: Stochastic Optimizer: Maximum iterations reached and the optimization hasn't converged yet.
Iteration 2, loss = 0.71843526
ConvergenceWarning: Stochastic Optimizer: Maximum iterations reached and the optimization hasn't converged yet.
Iteration 3, loss = 0.71418678
ConvergenceWarning: Stochastic Optimizer: Maximum iterations reached and the optimization hasn't converged yet.
Actual Results
Iteration 1, loss = 0.72311215
ConvergenceWarning: Stochastic Optimizer: Maximum iterations reached and the optimization hasn't converged yet.
Iteration 2, loss = 0.71843526
Iteration 3, loss = 0.71418678
Versions
Windows-7-6.1.7601-SP1
Python 3.6.0 (v3.6.0:41df79263a11, Dec 23 2016, 08:06:12) [MSC v.1900 64 bit (AMD64)]
NumPy 1.12.0
SciPy 0.18.1
Scikit-Learn 0.18.1