
ImportError: cannot import 'BaseModelOutputWithPastAndCrossAttentions' with transformers >= 4.45 #18

@zdk123

Description


faesm 0.1.1 is incompatible with recent versions of transformers (4.45+). The import of BaseModelOutputWithPastAndCrossAttentions from transformers.models.esm.modeling_esm fails because newer transformers releases no longer re-export this class from the ESM module; the class itself is defined in transformers.modeling_outputs.
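For reference, the class can still be imported from its defining module on 4.45+ (presumably what faesm would need to fall back to):

    # transformers >= 4.45 (and earlier): the output dataclass is importable
    # from its defining module rather than the ESM-specific one.
    from transformers.modeling_outputs import BaseModelOutputWithPastAndCrossAttentions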

Environment:

  • faesm version: 0.1.1
  • transformers version: 4.57.3 (also affects 4.45.0+)
  • Python version: 3.11
  • PyTorch version: 2.1

Error Message

>>> from faesm.esm import FAEsmForMaskedLM

    [Warning] Flash Attention not installed.
    By default we will use Pytorch SDPA attention,
    which is slower than Flash Attention but better than official ESM.
Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/opt/conda/lib/python3.11/site-packages/faesm/esm.py", line 29, in <module>
    from transformers.models.esm.modeling_esm import (
ImportError: cannot import name 'BaseModelOutputWithPastAndCrossAttentions' from 'transformers.models.esm.modeling_esm' (/opt/conda/lib/python3.11/site-packages/transformers/models/esm/modeling_esm.py)

See the transformers package changelog for 4.45: https://github.com/huggingface/transformers/releases/tag/v4.45.0
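Until this is fixed, a guarded import in faesm/esm.py (around line 29) might serve as a workaround. A minimal sketch, assuming this symbol is the only one affected:

    try:
        # transformers < 4.45 re-exported the class from the ESM module
        from transformers.models.esm.modeling_esm import (
            BaseModelOutputWithPastAndCrossAttentions,
        )
    except ImportError:
        # transformers >= 4.45 dropped the re-export; use the defining module
        from transformers.modeling_outputs import (
            BaseModelOutputWithPastAndCrossAttentions,
        )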

Thanks!
