8000 Rope embeddings with complex number generates compile warning...possible options · Issue #15 · pytorch/torchtitan · GitHub
@lessw2020

Description

Currently, running torch.compile on the default Llama model generates a warning about Inductor being unable to lower complex-number operations:

torch/_inductor/lowering.py:1639: UserWarning: Torchinductor does not support code generation for complex operators. Performance may be worse than eager.

This comes from the current RoPE embedding implementation.
This issue tracks it, since the warning may be disconcerting to users. Possible options:
1 - Investigate suppressing the warning, once we confirm there is really no speed difference.
2 - Look at a non-complex, compile-friendly RoPE implementation along the lines of:
https://github.com/lessw2020/PyTorch_MultiModal/blob/2d0ab30e0f85c5e1e68d3ab62126a91e41188e82/torchmultimodal/modules/layers/position_embedding.py#L174
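For option 1, the warning could be silenced with Python's standard warnings filter; this is a minimal sketch matching the UserWarning text quoted above (and should only be done once eager-speed parity is confirmed):

```python
import warnings

# Hide only Inductor's complex-operator warning; all other warnings
# still surface. The message pattern is the text quoted in this issue.
warnings.filterwarnings(
    "ignore",
    message="Torchinductor does not support code generation for complex operators.*",
    category=UserWarning,
)
```

This would typically run once at startup, before the first compiled forward pass.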

The current plan is to revisit this after the implementation has stabilized, at which point we can compare speed and loss curves.
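For option 2, a real-valued RoPE can be written with only cos/sin and elementwise ops, which Inductor can lower. A minimal sketch (illustrative only, not the torchtitan implementation; function names are ours) that matches the complex formulation numerically:

```python
import torch

def precompute_cos_sin(dim: int, seq_len: int, theta: float = 10000.0):
    # Per-pair rotation frequencies, as in standard RoPE.
    freqs = 1.0 / (theta ** (torch.arange(0, dim, 2, dtype=torch.float32) / dim))
    t = torch.arange(seq_len, dtype=torch.float32)
    angles = torch.outer(t, freqs)  # (seq_len, dim // 2)
    return angles.cos(), angles.sin()

def apply_rope_real(x: torch.Tensor, cos: torch.Tensor, sin: torch.Tensor):
    # x: (batch, seq_len, dim). Rotate adjacent channel pairs by the
    # per-position angle using only real-valued ops, so no complex
    # tensors ever reach the compiler.
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    rot_even = x_even * cos - x_odd * sin
    rot_odd = x_even * sin + x_odd * cos
    # Re-interleave the pairs back into the last dimension.
    return torch.stack((rot_even, rot_odd), dim=-1).flatten(-2)
```

This pairs adjacent channels the same way the complex version does (`reshape(..., -1, 2)` then `view_as_complex`), so it can be checked against the complex path with `torch.allclose` before swapping it in.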

Labels: bug (Something isn't working)
