Preserve weight_g/weight_v accessors on new weight_norm #102999
Description

@ezyang

🐛 Describe the bug

Parametrizations don't let you control what the original parameters are called; they're always original0, original1, etc. For weight_norm, this new naming is a bit obtuse; the original naming of g/v was better. Not sure if this is actually worth fixing, holler if you think it is.
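
For concreteness, a quick comparison of the two APIs (the `Linear` shape is arbitrary; in the parametrized version, `original0` holds g and `original1` holds v):

```python
import torch.nn as nn
from torch.nn.utils import weight_norm as old_weight_norm
from torch.nn.utils.parametrizations import weight_norm as new_weight_norm

# Old API: the decomposed parameters get descriptive accessors.
old = old_weight_norm(nn.Linear(4, 3))
print(old.weight_g.shape)  # magnitude g: torch.Size([3, 1])
print(old.weight_v.shape)  # direction v: torch.Size([3, 4])

# New parametrization API: the same tensors are only reachable
# through the generic slots on the parametrization.
new = new_weight_norm(nn.Linear(4, 3))
print(new.parametrizations.weight.original0.shape)  # g: torch.Size([3, 1])
print(new.parametrizations.weight.original1.shape)  # v: torch.Size([3, 4])
```

One conceivable fix, sketched here as a hypothetical helper (`add_gv_accessors` is not an existing PyTorch function), is to alias the old names to the generic slots via properties:

```python
import torch.nn as nn
from torch.nn.utils.parametrizations import weight_norm as new_weight_norm

def add_gv_accessors(module, name="weight"):
    # Hypothetical sketch: swap in a subclass whose {name}_g/{name}_v
    # properties alias the parametrization's original0/original1.
    cls = module.__class__
    module.__class__ = type(
        f"WeightNormed{cls.__name__}",
        (cls,),
        {
            f"{name}_g": property(
                lambda self: getattr(self.parametrizations, name).original0
            ),
            f"{name}_v": property(
                lambda self: getattr(self.parametrizations, name).original1
            ),
        },
    )
    return module

m = add_gv_accessors(new_weight_norm(nn.Linear(4, 3)))
print(m.weight_g.shape)  # torch.Size([3, 1]), aliases ...original0
```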

cc @albanD @mruberry @jbschlosser @walterddr @mikaylagawarecki @lezcano

Versions

main

Labels

module: nn (Related to torch.nn)
module: nn.utils.parametrize
triaged (This issue has been looked at by a team member, and triaged and prioritized into an appropriate module)
