
Attention, block and swiglu_ffn modules always raise warnings about xformers #513


Description

@rfezzani

Thank you first of all for this amazing work.

When importing dinov2 using torch.hub.load I always face warnings concerning xformers:

>>> import torch
>>> model = torch.hub.load('facebookresearch/dinov2', "dinov2_vits14", pretrained=False)
Using cache found in /home/rfez/.cache/torch/hub/facebookresearch_dinov2_main
/home/xxxx/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/swiglu_ffn.py:51: UserWarning: xFormers is not available (SwiGLU)
  warnings.warn("xFormers is not available (SwiGLU)")
/home/xxxx/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/attention.py:33: UserWarning: xFormers is not available (Attention)
  warnings.warn("xFormers is not available (Attention)")
/home/xxxx/.cache/torch/hub/facebookresearch_dinov2_main/dinov2/layers/block.py:40: UserWarning: xFormers is not available (Block)
  warnings.warn("xFormers is not available (Block)")

I tried to debug this and found that, even when xformers is installed, a warning is still issued saying that xFormers is not available...!
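For context, the warnings come from an import guard at the top of each of those modules. A minimal sketch of the pattern, inferred from the quoted file paths (the exact imported names are an assumption for illustration):

```python
import warnings

# Sketch of the import guard in dinov2/layers/attention.py
# (imported names approximate; see the actual file for the real code).
try:
    from xformers.ops import memory_efficient_attention  # noqa: F401

    XFORMERS_AVAILABLE = True
except ImportError:
    warnings.warn("xFormers is not available (Attention)")
    XFORMERS_AVAILABLE = False
```

Because the guard runs at import time, the warnings appear as soon as `torch.hub.load` imports the layer modules, regardless of how the model is used afterwards.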

This is a bit annoying, since there is no elegant way to silence these warnings :/

This is related to #151
