[WIP] Adding SGDW #22466
Conversation
Adding SGDW. gh-metadata: pytorch pytorch 22466 gh/vincentqb/16/head
Hi @vincentqb! Thank you for your pull request. We require contributors to sign our Contributor License Agreement, and yours needs attention. You currently have a record in our system, but the CLA is no longer valid and will need to be resubmitted. In order for us to review and merge your suggested changes, please sign at https://code.facebook.com/cla. If you are contributing on behalf of someone else (e.g. your employer), the individual CLA may not be sufficient and your employer may need to sign the corporate CLA. Once the CLA is signed, our tooling will perform checks and validations, and the pull request will be tagged as CLA signed. If you have received this in error or have any questions, please contact us at cla@fb.com. Thanks!
Thank you for signing our Contributor License Agreement. We can now accept your code for this (and any) Meta Open Source project. Thanks!
Looks like this PR hasn't been updated in a while, so we're going to go ahead and mark this as stale.
Hi @vincentqb, this PR was marked as stale. Do you (still) have plans to add SGDW in the future?
The weight decay implementation of this code is incorrect. I think it should be:

```python
# Apply weight decay
if weight_decay != 0:
    p.data.mul_(1 - weight_decay)
# Apply momentum
p.data.add_(d_p, alpha=-group['lr'])
```
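For reference, here is a minimal sketch of what a full decoupled-decay step could look like inside an SGD-style loop. The function name `sgdw_step` and its arguments are illustrative, not the PR's actual implementation, and it follows the lr-scaled decay used by `torch.optim.AdamW` (`p *= 1 - lr * weight_decay`); whether the decay factor should also be scaled by the learning rate is one of the details this PR would need to settle:

```python
import torch

def sgdw_step(params, momentum_buffers, lr, momentum, weight_decay):
    # Illustrative SGDW-style step: momentum is accumulated from the raw
    # gradient, and weight decay shrinks the parameter directly instead of
    # being added to the gradient (which would be plain L2 regularization).
    for p in params:
        if p.grad is None:
            continue
        d_p = p.grad.data
        if momentum != 0:
            buf = momentum_buffers.setdefault(p, torch.zeros_like(p.data))
            buf.mul_(momentum).add_(d_p)
            d_p = buf
        if weight_decay != 0:
            # Decoupled decay, scaled by lr as torch.optim.AdamW does.
            p.data.mul_(1 - lr * weight_decay)
        # Gradient / momentum update.
        p.data.add_(d_p, alpha=-lr)
```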
We implement SGDW, following #3790 as proposed in #3740 and #4429. This follows the implementation of AdamW in #21250. We should validate this PR against another existing implementation, as was done in #21250.
This pull request does not include weight schedulers; see #22343.
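Since this PR is still WIP, the final API isn't fixed; assuming the new optimizer mirrors the constructor of `torch.optim.SGD` (a hypothetical signature, not a merged API), usage would look roughly like:

```python
import torch
import torch.nn as nn
import torch.optim as optim

model = nn.Linear(10, 1)
criterion = nn.MSELoss()

# Hypothetical: assumes SGDW keeps the same argument names as torch.optim.SGD.
optimizer = optim.SGDW(model.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-2)

for _ in range(100):
    inputs = torch.randn(32, 10)
    targets = torch.randn(32, 1)
    optimizer.zero_grad()
    loss = criterion(model(inputs), targets)
    loss.backward()
    optimizer.step()
```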
Stack from ghstack:
Differential Revision: D16096615