
fix: DeepSpeed device_placement ValueError #2570

Merged

bghira merged 1 commit into bghira:main from hjinnkim:bugfix/deepspeed-device-placement on Feb 4, 2026

Conversation

hjinnkim (Contributor) commented Feb 4, 2026

accelerator.prepare() fails under DeepSpeed because SimpleTuner passes a custom device_placement argument.

Problem

When using DeepSpeed ZeRO Stage 2/3, training fails with:

ValueError: You can't customize device placements with DeepSpeed or Megatron-LM.

Root Cause

Accelerate raises this error when device_placement is passed with DeepSpeed:
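The guard in Accelerator.prepare() looks roughly like this (paraphrased from the accelerate source; exact wording varies by version):

```python
# Inside accelerate's Accelerator.prepare(), paraphrased:
def prepare(self, *args, device_placement=None):
    if device_placement is None:
        device_placement = [None for _ in args]
    elif self.distributed_type in (DistributedType.DEEPSPEED, DistributedType.MEGATRON_LM):
        # Any non-None device_placement trips this error under DeepSpeed.
        raise ValueError(
            "You can't customize device placements with DeepSpeed or Megatron-LM."
        )
    ...
```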

SimpleTuner passes device_placement unconditionally:
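Roughly this pattern (variable names are illustrative, not the exact SimpleTuner source):

```python
# device_placement is forwarded unconditionally, even when the
# DeepSpeed plugin is active, which trips the guard above.
results = accelerator.prepare(
    *models_and_optimizers,
    device_placement=[True] * len(models_and_optimizers),
)
```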

Solution

Skip the device_placement parameter when using DeepSpeed, since DeepSpeed handles device placement internally.
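A minimal sketch of the change, assuming an accelerator object and illustrative variable names (not the exact patch):

```python
from accelerate.utils import DistributedType

# Only forward device_placement when DeepSpeed is not in use;
# DeepSpeed moves parameters to the right devices itself.
if accelerator.distributed_type == DistributedType.DEEPSPEED:
    results = accelerator.prepare(*models_and_optimizers)
else:
    results = accelerator.prepare(
        *models_and_optimizers,
        device_placement=device_placement,
    )
```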

Test

  • DeepSpeed ZeRO Stage 2, 8 GPUs - training starts without error

@bghira bghira merged commit 00b7e7b into bghira:main Feb 4, 2026
2 checks passed
@hjinnkim hjinnkim deleted the bugfix/deepspeed-device-placement branch February 4, 2026 15:49
