OLMo and OLMo 2 models do not match original models for low precisions #38117
@2015aroras

Description

System Info

  • transformers version: 4.52.0.dev0
  • Platform: macOS-14.5-arm64-arm-64bit
  • Python version: 3.12.9
  • Huggingface_hub version: 0.31.2
  • Safetensors version: 0.5.2
  • Accelerate version: 1.6.0
  • Accelerate config: not found
  • DeepSpeed version: not installed
  • PyTorch version (GPU?): 2.6.0 (False)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using distributed or parallel set-up in script?: No

Who can help?

@ArthurZucker

Information

  • The official example scripts
  • My own modified scripts

Tasks

  • An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
  • My own task or dataset (give details below)

Reproduction

Steps to reproduce the behavior:

  1. Obtain a copy of an OLMo or OLMo 2 model and its corresponding HF model.
  2. Run both the original and the HF model on the same inputs in a lower precision such as float16 and compare the output logits.

Providing full repro code is complicated because it requires running the models in the original codebase. I intend to fix this bug myself, so hopefully it is fine to omit that here; a sketch of the HF side of the comparison follows below.
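For context, a minimal sketch of the HF side only, assuming an example OLMo 2 checkpoint (the model id and prompt are illustrative; the original-codebase side is omitted):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allenai/OLMo-2-1124-7B"  # example checkpoint; any OLMo / OLMo 2 HF model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float16)
model.eval()

inputs = tokenizer("Language modeling is ", return_tensors="pt")
with torch.no_grad():
    hf_logits = model(**inputs).logits  # shape: (1, seq_len, vocab_size)

# `original_logits` would come from running the same token ids through the
# original OLMo codebase in float16.
```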

Expected behavior

The output logits of the two implementations should match, or at least be close (say, every element within 1e-4). In practice this is not the case for bfloat16 and float16.
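Concretely, the kind of check I have in mind (the 1e-4 tolerance is illustrative; `hf_logits` and `original_logits` are placeholder tensors from the sketch above):

```python
import torch

# hf_logits / original_logits: float16 logits from the two implementations
# on the same inputs.
abs_diff = (hf_logits.float() - original_logits.float()).abs()
print("logits element diff abs mean:", abs_diff.mean().item())

# Should pass, but fails in practice for float16 and bfloat16:
torch.testing.assert_close(hf_logits, original_logits, atol=1e-4, rtol=0)
```

A truncated excerpt of the per-module diff logging from my comparison script: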

...
[2025-05-13 16:54:03] INFO     [__main__:246, rank=0] blocks.11.feed_forward_norm|output, model.layers.11.post_feedforward_layernorm|output element diff abs mean: 0.0077401818707585335
[2025-05-13 16:54:03] INFO     [__main__:246, rank=0] lm_head.norm|input, model.norm|input element diff abs mean: 0.015348772518336773
[2025-05-13 16:54:03] INFO     [__main__:246, rank=0] lm_head.norm|output, model.norm|output element diff abs mean: 0.01983717642724514
[2025-05-13 16:54:03] INFO     [__main__:246, rank=0] lm_head.w_out|input, lm_head|input element diff abs mean: 0.01983717642724514
[2025-05-13 16:54:03] INFO     [__main__:234, rank=0] lm_head.w_out|output, lm_head|output shape mismatch: torch.Size([1, 120, 100352]) torch.Size([1, 120, 100278])
[2025-05-13 16:54:03] INFO     [__main__:246, rank=0] lm_head.w_out|output, lm_head|output element diff abs mean: 0.03660527244210243
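The final shape mismatch (100352 vs. 100278 in the last dimension) looks like embedding/vocabulary padding in the original checkpoint rather than a real divergence; that is an assumption on my part. Under that assumption, the lm_head comparison is restricted to the shared vocabulary slice, roughly:

```python
# Assumption: the extra logit columns in the original model come from
# embedding padding, so only the shared vocabulary slice is compared.
shared_vocab = min(hf_logits.shape[-1], original_logits.shape[-1])
sliced_diff = (hf_logits[..., :shared_vocab].float()
               - original_logits[..., :shared_vocab].float()).abs()
print("lm_head output element diff abs mean:", sliced_diff.mean().item())
```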
