[MRG] FIX top_k_accuracy_score ignoring labels for "multiclass" case #19721

Merged: 11 commits merged into scikit-learn:main on Apr 26, 2021

Conversation

joclement (Contributor)

Reference Issues/PRs

Same changes as #19300, which I accidentally closed and cannot reopen.

What does this implement/fix? Explain your changes.

See #19300 for the changes and @thomasjpfan's review. If wanted, I can copy that content here.
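
For readers who do not want to open #19300, here is a minimal reproduction sketch of the behavior being fixed, assuming the public `top_k_accuracy_score` API (the class labels and scores below are made up for illustration):

```python
import numpy as np
from sklearn.metrics import top_k_accuracy_score

# Hypothetical data: y_true only contains classes 0 and 1, although the
# problem actually has four classes (one score column per class).
y_true = np.array([0, 1, 1, 0])
y_score = np.array([
    [0.4, 0.3, 0.2, 0.1],
    [0.1, 0.4, 0.3, 0.2],
    [0.2, 0.1, 0.4, 0.3],
    [0.3, 0.2, 0.1, 0.4],
])

# Before this fix, the target was inferred as "binary" from y_true alone and
# the call failed; with the fix, the explicit `labels` makes it multiclass.
score = top_k_accuracy_score(y_true, y_score, k=2, labels=[0, 1, 2, 3])
print(score)  # 0.75 here: 3 of the 4 true labels are among the top-2 scores
```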

Any other comments?

I'm sorry for this extra work/confusion.

@kyleabeauchamp (Contributor)

Would love to see this one merged; is anything needed to make sure it crosses the finish line?

@thomasjpfan (Member) left a comment

Thank you for working on this @flyingdutchman23

Review threads:
doc/whats_new/v1.0.rst (outdated, resolved)
sklearn/metrics/tests/test_ranking.py (resolved)
sklearn/metrics/tests/test_ranking.py (outdated, resolved)
@thomasjpfan changed the title from '[MRG] Fix top_k_accuracy_score ignoring labels for "multiclass" case' to 'FIX Fix top_k_accuracy_score ignoring labels for "multiclass" case' on Apr 2, 2021
@thomasjpfan (Member) left a comment

LGTM!

@joclement changed the title from 'FIX Fix top_k_accuracy_score ignoring labels for "multiclass" case' to '[MRG] FIX top_k_accuracy_score ignoring labels for "multiclass" case' on Apr 4, 2021
@kyleabeauchamp (Contributor)

Do you need help resolving the merge conflict? I'm interested in making sure this lands before the next release :).

@thomasjpfan changed the title from '[MRG] FIX top_k_accuracy_score ignoring labels for "multiclass" case' to 'FIX top_k_accuracy_score ignoring labels for "multiclass" case' on Apr 12, 2021
@joclement (Contributor, Author)

joclement commented Apr 13, 2021

> Do you need help resolving the merge conflict? I'm interested in making sure this lands before the next release :).

Thank you for pointing that out. Done.

I further improved a commit description and folded a fixup commit into another commit for a cleaner history.

@joclement changed the title from 'FIX top_k_accuracy_score ignoring labels for "multiclass" case' to '[MRG] FIX top_k_accuracy_score ignoring labels for "multiclass" case' on Apr 20, 2021
@kyleabeauchamp (Contributor)

kyleabeauchamp commented Apr 22, 2021

I guess this bugfix is not slated to land in the pending release?

@jnothman (Member) left a comment

This LGTM. @glemaitre should it go in 0.24.2 as a fix to a new feature??

@glemaitre (Member)

We can add it in the upcoming release. You only need to move the entry to 0.24.rst instead of 1.0.

@glemaitre added this to the 0.24.2 milestone on Apr 25, 2021
@glemaitre added the "To backport" label (PR merged in master that needs a backport to a release branch, defined based on the milestone) on Apr 25, 2021
joclement and others added 10 commits April 26, 2021 08:45
Currently the last two parameter sets of the added test fail, because the
labels are not considered when deciding whether the target is "binary" or
"multiclass". The `labels` parameter is only used in later steps.

If a problem is actually "multiclass" and not all classes are contained in
`y_true`, the function fails because the determined type is "binary". That
decision makes sense if the `labels` parameter is not passed. The problem is
that the function also fails when `labels` is passed, although it would be
possible to determine the type and the number of classes in conjunction with
this parameter. This commit fixes that by checking whether the `labels`
parameter has been set and contains more than two classes when the type has
been determined to be "binary" in the previous step.

This is for the case where `labels` is an `ndarray`.

Co-authored-by: Thomas J. Fan <thomasjpfan@gmail.com>
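
For illustration, a minimal sketch of the kind of check the commit messages above describe; the helper name `_resolve_target_type` is hypothetical, and the actual implementation in scikit-learn performs this check inline together with further validation and error messages:

```python
from sklearn.utils.multiclass import type_of_target

def _resolve_target_type(y_true, labels=None):
    # Hypothetical helper mirroring the check described above: y_true alone
    # may look "binary" when only two classes happen to be observed, so an
    # explicit `labels` with more than two entries promotes it to "multiclass".
    y_type = type_of_target(y_true)
    if y_type == "binary" and labels is not None and len(labels) > 2:
        y_type = "multiclass"
    return y_type
```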
@joclement (Contributor, Author)

@glemaitre thanks for reviewing. The changelog has been moved to v0.24.rst.

@glemaitre merged commit 6927fa2 into scikit-learn:main on Apr 26, 2021
@glemaitre (Member)

Thanks

glemaitre pushed a commit to glemaitre/scikit-learn that referenced this pull request Apr 26, 2021
@joclement deleted the fix-top-k-accuracy branch on April 26, 2021 12:25
glemaitre pushed a commit that referenced this pull request Apr 28, 2021
Labels: module:metrics, To backport