Conversation

@Laksh1997

Hi - I thought I'd raise this PR, which implements AdaBoundW.

I did so because I was using AdaBound from this repo with weight_decay=0.1 and noticed that the training loss would not go down. Training also failed with Adam but succeeded with AdamW.

The algorithm is very simple; in fact, only one line is added compared to the original version:

        for group, base_lr in zip(self.param_groups, self.base_lrs):
            for p in group['params']:
                if p.grad is None:
                    continue

                # Perform stepweight decay (the single added line: decoupled,
                # AdamW-style weight decay applied directly to the parameter)
                p.mul_(1 - base_lr * group['weight_decay'])

                # ... the rest of the AdaBound update step is unchanged
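
For context, here is a minimal sketch (illustrative only, using plain tensors rather than the optimizer internals) of the difference between the coupled weight decay that Adam/AdaBound apply and the decoupled, AdamW-style decay used above:

    import torch

    p = torch.randn(10)      # a parameter
    grad = torch.randn(10)   # its gradient
    lr, weight_decay = 1e-3, 0.1

    # Coupled decay (Adam / original AdaBound): the decay term is folded into
    # the gradient, so it also passes through the adaptive moment estimates.
    grad_coupled = grad.add(p, alpha=weight_decay)

    # Decoupled decay (AdamW / this PR): the parameter is shrunk directly,
    # independently of the adaptive update.
    p.mul_(1 - lr * weight_decay)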

Note that in the original implementation the weight_decay is not scaled by base_lr; I don't think that is common practice, since the official PyTorch AdamW implementation scales the decay by the learning rate (https://pytorch.org/docs/stable/_modules/torch/optim/adamw.html#AdamW).
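
For completeness, a hypothetical usage sketch (assuming the new AdaBoundW class is exported alongside AdaBound and keeps the same constructor arguments, i.e. lr, final_lr, weight_decay, etc.):

    import torch
    from adabound import AdaBoundW  # assumed import path for the class added in this PR

    model = torch.nn.Linear(4, 1)
    optimizer = AdaBoundW(model.parameters(), lr=1e-3, final_lr=0.1,
                          weight_decay=0.1)

    x, y = torch.randn(8, 4), torch.randn(8, 1)
    loss = torch.nn.functional.mse_loss(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()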




    class AdaBound(Optimizer):
Owner:

Should this be named AdaBoundW?

Author:

Yes, of course - my bad!

Author:

Have changed it to AdaBoundW now.
