Description
faesm 0.1.1 is incompatible with recent versions of transformers (4.45+). Importing BaseModelOutputWithPastAndCrossAttentions from transformers.models.esm.modeling_esm fails because the class is no longer exposed from that module in newer transformers releases.
Environment:
- faesm version: 0.1.1
- transformers version: 4.57.3 (also affects 4.45.0+)
- Python version: 3.11
- PyTorch version: 2.1
Error Message
>>> from faesm.esm import FAEsmForMaskedLM
[Warning] Flash Attention not installed.
By default we will use Pytorch SDPA attention,
which is slower than Flash Attention but better than official ESM.
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/opt/conda/lib/python3.11/site-packages/faesm/esm.py", line 29, in <module>
from transformers.models.esm.modeling_esm import (
ImportError: cannot import name 'BaseModelOutputWithPastAndCrossAttentions' from 'transformers.models.esm.modeling_esm' (/opt/conda/lib/python3.11/site-packages/transformers/models/esm/modeling_esm.py)
See the transformers package changelog for 4.45: https://github.com/huggingface/transformers/releases/tag/v4.45.0
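As a possible workaround until faesm is updated, the import in faesm/esm.py could fall back to transformers.modeling_outputs, which is where the class is defined in transformers core. A minimal sketch (not the actual faesm fix, just an illustration of the fallback):

# Hypothetical compatibility shim for the import at the top of faesm/esm.py.
try:
    # Older transformers (<4.45) re-exported the class from the ESM modeling file.
    from transformers.models.esm.modeling_esm import (
        BaseModelOutputWithPastAndCrossAttentions,
    )
except ImportError:
    # Newer transformers (4.45+): import from the canonical location instead.
    from transformers.modeling_outputs import (
        BaseModelOutputWithPastAndCrossAttentions,
    )

Pinning transformers below 4.45 (e.g. pip install "transformers<4.45") should also avoid the error, at the cost of staying on an older release.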
Thanks!