Preserve weight_g/weight_v accessors on new weight_norm#102999

@ezyang

Description


🐛 Describe the bug

Parametrizations don't let you control what the original parameters are called; they're always original0, original1, etc. For weight_norm, this new naming is a bit obtuse; the original naming of g/v was better. Not sure if this is actually worth fixing, holler if you think it is.

cc @albanD @mruberry @jbschlosser @walterddr @mikaylagawarecki @lezcano

Versions

main

Metadata

Assignees

No one assigned

    Labels

    module: nn — Related to torch.nn
    module: nn.utils.parametrize
    triaged — This issue has been looked at by a team member, and triaged and prioritized into an appropriate module

    Type

    No type

    Projects

    Status

    To pick up

    Milestone

    No milestone

    Relationships

    None yet

    Development

    No branches or pull requests

    Issue actions
