FIX ElasticNet.fit does not modify sample_weight in place #19055
Conversation
@thomasjpfan In the name of the person who introduced it, thanks for the fix :blush:
```python
@pytest.mark.parametrize("check_input", [True, False])
def test_enet_sample_weight_does_not_overwrite_sample_weight(check_input):
    """Check that elastic next does not overwrite sample_weights."""
```
"""Check that elastic next does not overwrite sample_weights.""" | |
"""Check that elastic net does not overwrite sample_weights.""" |
```python
    reg = ElasticNet()
    reg.fit(X, y, sample_weight=sample_weight, check_input=check_input)

    assert_allclose(sample_weight, sample_weight_1_25)
```
Could we even test with `assert_array_equal(sample_weight, sample_weight_1_25)`?
We tend to avoid checking strict equality between floats to avoid failures due to rounding errors, although here it should hold since sample_weight should not be modified at all. I guess it doesn't hurt to be extra cautious.
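To make the distinction concrete, here is a small, hypothetical sketch (the arrays below are made up and are not part of the PR) of how the two assertions behave when the weights are, or are not, bit-for-bit identical:

```python
import numpy as np
from numpy.testing import assert_allclose, assert_array_equal

# The array the caller would pass in, and an untouched copy of it.
original = np.array([1.25, 1.25, 1.25])
untouched = original.copy()

# If fit() never modifies sample_weight, strict equality holds.
assert_array_equal(untouched, original)

# assert_allclose would additionally tolerate tiny floating-point drift,
# e.g. from an in-place rescale followed by an imperfect restore.
perturbed = original * (1 + 1e-12)
assert_allclose(perturbed, original)        # passes (default rtol is 1e-7)
# assert_array_equal(perturbed, original)   # would raise AssertionError
```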
lgtm
@thomasjpfan Why the name change in what's new? You submitted this PR and every commit of it.
@lorentzenchr I misread your comment here: #19055 (review). My name is in https://github.com/scikit-learn/scikit-learn/blob/master/doc/whats_new/_contributors.rst, so I can reference it from there with:
Reference Issues/PRs
Fixes #19044
What does this implement/fix? Explain your changes.
`sample_weight` is no longer modified in place.
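For illustration, a minimal sketch of the fixed behavior; the data and weight values below are invented for this example and are not the PR's test fixture:

```python
import numpy as np
from numpy.testing import assert_array_equal
from sklearn.linear_model import ElasticNet

# Illustrative data; any small regression dataset would do.
rng = np.random.RandomState(0)
X = rng.rand(10, 3)
y = rng.rand(10)
sample_weight = np.full(shape=10, fill_value=1.25)
sample_weight_copy = sample_weight.copy()

reg = ElasticNet()
reg.fit(X, y, sample_weight=sample_weight)

# With this fix, the caller's array is left untouched by fit().
assert_array_equal(sample_weight, sample_weight_copy)
```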