
Conversation

@dependabot
Contributor

@dependabot dependabot bot commented on behalf of github Nov 13, 2025

Bumps xformers from 0.0.30 to 0.0.33.post1.

Release notes

Sourced from xformers's releases.

v0.0.33.post1

Fixed wheel upload to PyPI

Support for PyTorch 2.9

Added

  • cutlass FMHA op for Blackwell GPUs
  • Support for the flash-attention package up to 2.8.3
  • Exposed the FA3 deterministic mode
  • FW+BW pass overlap for DeepSeek-like comms/compute overlap

Improved

  • merge_attentions support for irregular head dimension

v0.0.32.post2

Added ROCm 6.4 build

v0.0.32.post1

No release notes provided.

v0.0.31.post1: Fixing wheels for Windows

No release notes provided.

Changelog

Sourced from xformers's changelog.

Changelog

All notable changes to this project will be documented in this file.

The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

[0.0.34] - 2025-??-??

[0.0.33] - 2025-11-12

Pre-built binary wheels are available for PyTorch 2.9.0.

Added

  • cutlass FMHA op for Blackwell GPUs (see the usage sketch below)
  • Support for the flash-attention package up to 2.8.3
  • Exposed the FA3 deterministic mode
  • FW+BW pass overlap for DeepSeek-like comms/compute overlap

Improved

  • merge_attentions support for irregular head dimension
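
For orientation, the fused kernels above all sit behind xformers' dispatching entry point, xformers.ops.memory_efficient_attention, so user code does not change across these releases. The sketch below shows a typical call; the shapes and dtype are illustrative, and it is an assumption (based on how existing ops are dispatched) that the new Blackwell cutlass op is selected automatically when the hardware supports it.

import torch
import xformers.ops as xops

# Illustrative shapes: batch, sequence length, heads, head dim.
B, M, H, K = 2, 1024, 8, 64

# The fast fused kernels require a CUDA GPU and half-precision inputs.
q = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
k = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
v = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)

# The dispatcher picks the best available fused op for the hardware; on
# Blackwell GPUs this would presumably be the new cutlass FMHA op (assumption).
out = xops.memory_efficient_attention(q, k, v)  # shape (B, M, H, K)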

[0.0.32] - 2025-08-13

Pre-built binary wheels are available for PyTorch 2.8.0.

Added

  • Support flash-attention package up to 2.8.2
  • Speed improvements to python -m xformers.profiler.find_slowest

Removed

  • Removed the autograd backward pass for merge_attentions, as it is easy to use incorrectly.
  • Attention biases are no longer torch.Tensor subclasses. The subclassing is no longer necessary for torch.compile to work, and it was adding complexity.

[0.0.31] - 2025-06-25

Pre-built binary wheels are available for PyTorch 2.7.1.

Added

  • xFormers wheels are now Python-version agnostic: the same wheel can be used for Python 3.9, 3.10, ... 3.13
  • Added support for Flash-Attention 3 on Ampere GPUs

Removed

  • pytorch/pytorch#147607
  • Deprecated support for building Flash-Attention 2 as part of xFormers. For Ampere GPUs, Flash-Attention 3 is now used on Windows, and Flash-Attention 2 can still be used through PyTorch on Linux.

Commits

  • 3f91ad6 fairinternal/xformers#1444
  • c915971 fairinternal/xformers#1443
  • aa7bc36 fairinternal/xformers#1442
  • e98c69b fairinternal/xformers#1437
  • a562f16 fairinternal/xformers#1435
  • a64b139 [NVIDIA] Fix build xformers >= cu129 (torch 2.9.0) (#1344)
  • 51aa071 fairinternal/xformers#1428
  • 4c82fc3 fairinternal/xformers#1430
  • 4656807 cutlass_blackwell import not crashing in CPU-only environment (fairinternal/x...
  • 1b3a16e Add BlockDiagonalCausalLocalAttentionPaddedKeysMask to FA3 splitk op (fairint...
  • Additional commits viewable in compare view


Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [xformers](https://github.com/facebookresearch/xformers) from 0.0.30 to 0.0.33.post1.
- [Release notes](https://github.com/facebookresearch/xformers/releases)
- [Changelog](https://github.com/facebookresearch/xformers/blob/main/CHANGELOG.md)
- [Commits](https://github.com/facebookresearch/xformers/compare/v0.0.30...v0.0.33.post1)

---
updated-dependencies:
- dependency-name: xformers
  dependency-version: 0.0.33.post1
  dependency-type: direct:development
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <[email protected]>

@cursor cursor bot left a comment


Comment @cursor review or bugbot run to trigger another review on this PR

[project.optional-dependencies]
stable-fast = [
-    "xformers==0.0.30",
+    "xformers==0.0.33.post1",


Bug: Xformers Requires Matching PyTorch Version

xformers==0.0.33.post1 is built against PyTorch 2.9.0, but the project pins torch==2.7.0. xformers ships compiled C++/CUDA extensions that must match the PyTorch version they were built against, so this mismatch will cause runtime failures when xformers tries to load, with errors pointing at the PyTorch version incompatibility. The matching release for PyTorch 2.7.x is xformers==0.0.31.post1.
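
Running python -m xformers.info is a quick way to confirm this kind of mismatch, since it reports whether the compiled ops loaded successfully. Below is a minimal in-process probe along the same lines; it is best-effort, because depending on the build a mismatch may surface as an import-time warning rather than an exception and only fail when an op is first called.

import torch

try:
    import xformers
    import xformers.ops  # loading the compiled C++/CUDA extension is what can fail
    print(f"torch {torch.__version__} / xformers {xformers.__version__}: imports OK")
except (ImportError, RuntimeError) as exc:
    raise SystemExit(
        f"xformers failed to load against torch {torch.__version__}: {exc}"
    )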


]
full = [
-    "xformers==0.0.30",
+    "xformers==0.0.33.post1",


Bug: Xformers-PyTorch Version Conflict

xformers==0.0.33.post1 is built against PyTorch 2.9.0, but the project pins torch==2.7.0. xformers ships compiled C++/CUDA extensions that must match the PyTorch version they were built against, so this mismatch will cause runtime failures when xformers tries to load, with errors pointing at the PyTorch version incompatibility. The matching release for PyTorch 2.7.x is xformers==0.0.31.post1.


@github-actions

This PR has been inactive for 10 days and is now marked as stale.

@github-actions github-actions bot added the stale label Nov 24, 2025
@github-actions github-actions bot closed this Dec 1, 2025
@dependabot
Contributor Author

dependabot bot commented on behalf of github Dec 1, 2025

OK, I won't notify you again about this release, but will get in touch when a new version is available. If you'd rather skip all updates until the next major or minor version, let me know by commenting @dependabot ignore this major version or @dependabot ignore this minor version. You can also ignore all major, minor, or patch releases for a dependency by adding an ignore condition with the desired update_types to your config file.

If you change your mind, just re-open this PR and I'll resolve any conflicts on it.

@dependabot dependabot bot deleted the dependabot/pip/xformers-0.0.33.post1 branch December 1, 2025 00:09