Add xpu support for Intel Arc #60

Open
Turijon wants to merge 2 commits into 1038lab:main from Turijon:turijon_qwenvl_xpu_support

Conversation

@Turijon commented Dec 7, 2025

Adds support for running on Intel Arc using xpu. Tested on an Intel Arc A770M 16GB with PyTorch 2.8.0+xpu. Not tested whether it breaks CUDA support (but it shouldn't).

@1038lab (Owner) commented Dec 7, 2025

Thanks for the PR! The XPU support is great to have.

One small thing: in the current implementation, xpu becomes mandatory whenever an XPU device is detected:

"device_map": {"": 0} if (device == "cuda" and torch.cuda.is_available())
                 else {"": "xpu:0"} if xpu_availiable()
                 else device

This overrides the user’s selection, so even if someone chooses cpu / mps / auto, the model will still be forced onto XPU as long as PyTorch detects it.

To keep behavior consistent with CUDA and MPS, could you update it so that:

  • XPU is only used when the user explicitly selects device="xpu", and
  • auto can choose XPU as a recommended option, but shouldn’t override explicit choices.

Something like:

if device == "cuda" and torch.cuda.is_available():
    device_map = {"": 0}
elif device == "xpu" and xpu_availiable():
    device_map = {"": "xpu:0"}
else:
    device_map = device

This way XPU works correctly without changing behavior for existing CUDA/CPU/MPS users.
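
Extending that snippet, the full resolution rule (explicit choices always win; auto may merely prefer XPU) could be sketched as follows. This is a hypothetical illustration, not the PR's actual code: `resolve_device_map` is an invented helper, and the `cuda_ok` / `xpu_ok` flags stand in for `torch.cuda.is_available()` and `torch.xpu.is_available()` so the rule is visible on its own:

```python
def resolve_device_map(device: str, cuda_ok: bool, xpu_ok: bool):
    """Map a user-selected device string to a transformers-style device_map.

    cuda_ok / xpu_ok stand in for torch.cuda.is_available() and
    torch.xpu.is_available(). Explicit selections are never overridden;
    "auto" may recommend XPU only when no CUDA device is present.
    """
    if device == "cuda" and cuda_ok:
        return {"": 0}               # explicit CUDA request, device available
    if device == "xpu" and xpu_ok:
        return {"": "xpu:0"}         # explicit XPU request, device available
    if device == "auto" and xpu_ok and not cuda_ok:
        return {"": "xpu:0"}         # auto may prefer XPU, without overriding anyone
    return device                    # cpu / mps / unavailable backends pass through
```

With this shape, `resolve_device_map("cpu", True, True)` still returns `"cpu"` even when an XPU is detected, which is exactly the behavior the review asks for.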

Let me know if you need help adjusting it — I can prepare a patch if needed!

@Turijon (Author) commented Dec 7, 2025

Good catch. Updated the PR.

@WhitePr commented Feb 19, 2026

Will this pull request still be merged? I modified the code according to the commit file, and it works fine on my B580.

@Turijon (Author) commented Feb 19, 2026

I hope so. I don't have merge access; the maintainers have to do that.

