Fix check_decision_proba_consistency random failure #19225
Conversation
Apparently it was necessary for the Gradient Boosting classifiers on 32-bit Linux... So the goal is to create deterministic ties rather than close but non-deterministic non-ties at the rounding-error level. I will revert this part of the change.
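For context, here is a minimal sketch of the assertion at the heart of `check_decision_proba_consistency` (the names and details below are illustrative assumptions, not the verbatim scikit-learn code): rounding both outputs turns platform-dependent near-ties into exact ties, so the rank comparison becomes deterministic.

```python
import numpy as np
from scipy.stats import rankdata

# Simplified sketch, not the actual scikit-learn implementation:
# predict_proba and decision_function must rank the test samples
# identically. Rounding makes values that differ only at the
# rounding-error level (e.g. on 32-bit Linux) into exact ties.
def assert_decision_proba_consistent(clf, X_test, decimals=10):
    proba = clf.predict_proba(X_test)[:, 1].round(decimals=decimals)
    score = clf.decision_function(X_test).round(decimals=decimals)
    # Equal ranks means the two outputs order the samples consistently.
    assert np.array_equal(rankdata(proba), rankdata(score))
```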
LGTM
Side note: Rounding to support different platforms may become a pattern: #19221
* FIX more deterministic check_decision_proba_consistency
* Trigger [cd build]
* Re-add rounding
* Trigger [cd build]
* Avoid redundant phrasing in comment [ci skip]
Tentative fix for #19224.
`X_test` was generated from the RNG singleton of `np.random`, and therefore the test was not deterministic. `X_test` is now drawn from the same blobby distribution as the training set, which is less likely to result in samples close to the decision boundary, although I am not sure this has any actual impact. But it seems more natural to do so.

I removed the rounding thingy that I did not understand: how could rounding possibly reduce the likelihood of ties?

Anyway, let's see how the CI likes this, including the `[cd build]`.
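A minimal sketch of the idea (the sample counts and variable names here are assumptions for illustration, not the exact test code): draw the training and test points from the same seeded blob distribution instead of the `np.random` singleton, so repeated runs see the same data.

```python
from sklearn.datasets import make_blobs

# Illustrative sketch: sizes and names are assumptions, not the
# actual scikit-learn test. A fixed random_state replaces the
# non-deterministic np.random singleton.
X, y = make_blobs(n_samples=120, centers=2, random_state=0)

# Train and test samples now come from the same distribution,
# which makes points right on the decision boundary less likely.
X_train, y_train = X[:100], y[:100]
X_test = X[100:]
```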