TST add py_loss for tests in _sgd_fast.pyx #18924
Conversation
Thank you @TimotheeMathieu !
Could you please change

```python
for p, y, expected in cases:
```

to

```python
def _test_gradient_common(loss_function, cases):
    [...]
    for p, y, expected_loss, expected_dloss in cases:
        assert_almost_equal(loss_function.py_loss(p, y), expected_loss)
        assert_almost_equal(loss_function.py_dloss(p, y), expected_dloss)
```

and then update the tests that use this helper function in that file to also include the expected value of the loss in the `cases` variable?
To other reviewers: those are normally private objects (no docs), though they are imported in linear_models/__init__.py :/ But in any case this would be consistent with what we did for the gradient with `py_dloss`, and having checks for the loss in addition to the gradient wouldn't hurt.
For scikit-learn-extra it would really help if we didn't have to vendor those loss definitions and could reuse them instead (knowing that there is no backward compatibility guarantee).
Somewhat related to #15123
@TimotheeMathieu the only thing is that we don't yet use black in scikit-learn, so this PR should only change the minimal lines it needs to. You would need to revert the black-related changes in the last commit, as unrelated code style changes make using [...]
Thanks @TimotheeMathieu !
@TimotheeMathieu Thanks for this PR. Only one little nitpick from my side.
Reference Issues/PRs
This PR arose from scikit-learn-contrib/scikit-learn-extra#78.
What does this implement/fix? Explain your changes.
I added a `py_loss` method to the `LossFunction` class in the Cython file `_sgd_fast.pyx`.
`_sgd_fast.pyx` implements a number of loss functions for regression and classification and allows them to be computed quickly. I want to use these loss functions via a simple import, but the current implementation only supports `py_dloss` (the derivative of the loss) and not `py_loss`, which returns the loss value itself.
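As a rough sketch of the idea (written in plain Python rather than Cython; the names mirror `_sgd_fast.pyx` but the actual Cython signatures and numerical safeguards may differ): the fast `loss`/`dloss` implementations stay internal, and thin `py_loss`/`py_dloss` wrappers make them callable from Python for imports and tests.

```python
import math


class LossFunction:
    """Base class; in _sgd_fast.pyx, loss and dloss are fast cdef methods."""

    def loss(self, p, y):
        raise NotImplementedError

    def dloss(self, p, y):
        raise NotImplementedError

    # thin Python-callable wrappers; adding py_loss is the point of this PR
    def py_loss(self, p, y):
        return self.loss(p, y)

    def py_dloss(self, p, y):
        return self.dloss(p, y)


class Log(LossFunction):
    """Logistic loss for binary classification with y in {-1, 1} (sketch)."""

    def loss(self, p, y):
        return math.log(1.0 + math.exp(-p * y))

    def dloss(self, p, y):
        return -y / (1.0 + math.exp(p * y))


log_loss = Log()
print(log_loss.py_loss(0.0, 1.0))   # log(2) ~ 0.6931
print(log_loss.py_dloss(0.0, 1.0))  # -0.5
```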
Any other comments?