Wan2.2 Lightning LoRA Loading Fails #12147

@ghunkins

Describe the bug

Loading the Wan2.2 LightX2V Lightning LoRAs fails with a `ValueError` during state-dict conversion.

Reproduction

```python
import torch
from diffusers import WanImageToVideoPipeline, UniPCMultistepScheduler
from huggingface_hub import hf_hub_download
from diffusers.loaders.lora_conversion_utils import _convert_non_diffusers_wan_lora_to_diffusers
import safetensors.torch

# Load the pipeline
pipe = WanImageToVideoPipeline.from_pretrained(
    "Wan-AI/Wan2.2-I2V-A14B-Diffusers",
    torch_dtype=torch.bfloat16,
)
pipe.scheduler = UniPCMultistepScheduler.from_config(pipe.scheduler.config, flow_shift=8.0)

# Download the LoRAs
high_noise_lora_path = hf_hub_download(
    repo_id="lightx2v/Wan2.2-Lightning",
    filename="Wan2.2-I2V-A14B-4steps-lora-rank64-Seko-V1/high_noise_model.safetensors",
)
low_noise_lora_path = hf_hub_download(
    repo_id="lightx2v/Wan2.2-Lightning",
    filename="Wan2.2-I2V-A14B-4steps-lora-rank64-Seko-V1/low_noise_model.safetensors",
)

# Convert the non-diffusers LoRA state dict to the diffusers layout
def load_wan_lora(path: str):
    return _convert_non_diffusers_wan_lora_to_diffusers(
        safetensors.torch.load_file(path)
    )

# Load the converted LoRAs into the two transformers
pipe.transformer.load_lora_adapter(load_wan_lora(high_noise_lora_path), adapter_name="high_noise")
pipe.transformer.set_adapters(["high_noise"], weights=[1.0])

pipe.transformer_2.load_lora_adapter(load_wan_lora(low_noise_lora_path), adapter_name="low_noise")
pipe.transformer_2.set_adapters(["low_noise"], weights=[1.0])
```
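For anyone hitting this before a fix lands: the traceback suggests the converter consumes the LoRA up/down weights but leaves the per-layer `.alpha` scale tensors unprocessed. A possible workaround, sketched below, is to fold each alpha into the corresponding up-projection weight (standard LoRA applies `scale = alpha / rank`) and drop the `.alpha` keys before conversion. The `lora_down.weight` / `lora_up.weight` key names are an assumption about this checkpoint's layout that I have not verified.

```python
def fold_alpha_keys(state_dict):
    """Fold per-layer LoRA `.alpha` scales into the up weights, then drop them.

    Sketch of a workaround: the converter raises because `.alpha` keys are
    left unconsumed. Standard LoRA scales the update by alpha / rank, so we
    bake that factor into the up-projection weight. The
    `lora_down.weight` / `lora_up.weight` key names are assumed, not verified.
    """
    for alpha_key in [k for k in state_dict if k.endswith(".alpha")]:
        base = alpha_key[: -len(".alpha")]
        alpha = state_dict.pop(alpha_key)
        # safetensors yields a 0-dim tensor; plain floats are also accepted
        alpha = float(alpha.item()) if hasattr(alpha, "item") else float(alpha)
        down = state_dict.get(base + ".lora_down.weight")
        up_key = base + ".lora_up.weight"
        if down is None or up_key not in state_dict:
            continue  # unexpected layout: alpha is dropped and its scale lost
        rank = down.shape[0]  # LoRA rank = output rows of the down projection
        state_dict[up_key] = state_dict[up_key] * (alpha / rank)
    return state_dict
```

With this, `load_wan_lora` would wrap the loaded file as `_convert_non_diffusers_wan_lora_to_diffusers(fold_alpha_keys(safetensors.torch.load_file(path)))`. Treat the key naming and the alpha/rank convention as assumptions to check against the actual checkpoint.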

Logs

```shell
ValueError                                Traceback (most recent call last)
/tmp/ipython-input-85712217.py in <cell line: 0>()
      9     )
     10 
---> 11 pipe.transformer.load_lora_adapter(load_wan_lora(high_noise_lora_path), adapter_name="high_noise")
     12 pipe.transformer.set_adapters(["high_noise"], weights=[1.0])
     13 

1 frames
/usr/local/lib/python3.11/dist-packages/diffusers/loaders/lora_conversion_utils.py in _convert_non_diffusers_wan_lora_to_diffusers(state_dict)
   1995             )
   1996         else:
-> 1997             raise ValueError(f"`state_dict` should be empty at this point but has {original_state_dict.keys()=}")
   1998 
   1999     for key in list(converted_state_dict.keys()):

ValueError: `state_dict` should be empty at this point but has original_state_dict.keys()=dict_keys(['blocks.0.cross_attn.k.alpha', 'blocks.0.cross_attn.o.alpha', 'blocks.0.cross_attn.q.alpha', 'blocks.0.cross_attn.v.alpha', 'blocks.0.ffn.0.alpha', 'blocks.0.ffn.2.alpha', 'blocks.0.self_attn.k.alpha', 'blocks.0.self_attn.o.alpha', 'blocks.0.self_attn.q.alpha', 'blocks.0.self_attn.v.alpha', 'blocks.1.cross_attn.k.alpha', 'blocks.1.cross_attn.o.alpha', 'blocks.1.cross_attn.q.alpha', 'blocks.1.cross_attn.v.alpha', 'blocks.1.ffn.0.alpha', 'blocks.1.ffn.2.alpha', 'blocks.1.self_attn.k.alpha', 'blocks.1.self_attn.o.alpha', 'blocks.1.self_attn.q.alpha', 'blocks.1.self_attn.v.alpha', 'blocks.10.cross_attn.k.alpha', 'blocks.10.cross_attn.o.alpha', 'blocks.10.cross_attn.q.alpha', 'blocks.10.cross_attn.v.alpha', 'blocks.10.ffn.0.alpha', 'blocks.10.ffn.2.alpha', 'blocks.10.self_attn.k.alpha', 'blocks.10.self_attn.o.alpha', 'blocks.10.self_attn.q.alpha', 'blocks.10.self_attn.v.alpha', 'blocks.11.cross_attn.k.alpha', 'blocks.11.cross_attn.o.alpha', 'blocks.11.cross_attn.q.alpha', 'blocks.11.cross_attn.v.alpha', 'blocks.11.ffn.0.alpha', 'blocks.11.ffn.2.alpha', 'blocks.11.self_attn.k.alpha', 'blocks.11.self_attn.o.alpha', 'blocks.11.self_attn.q.alpha', 'blocks.11.self_attn.v.alpha', 'blocks.12.cross_attn.k.alpha', 'blocks.12.cross_attn.o.alpha', 'blocks.12.cross_attn.q.alpha', 'blocks.12.cross_attn.v.alpha', 'blocks.12.ffn.0.alpha', 'blocks.12.ffn.2.alpha', 'blocks.12.self_att...
```

System Info

  • 🤗 Diffusers version: 0.35.0.dev0
  • Platform: Linux-6.1.123+-x86_64-with-glibc2.35
  • Running on Google Colab?: Yes
  • Python version: 3.11.13
  • PyTorch version (GPU?): 2.6.0+cu124 (True)
  • Flax version (CPU?/GPU?/TPU?): 0.10.6 (gpu)
  • Jax version: 0.5.3
  • JaxLib version: 0.5.3
  • Huggingface_hub version: 0.34.4
  • Transformers version: 4.55.0
  • Accelerate version: 1.10.0
  • PEFT version: 0.17.0
  • Bitsandbytes version: 0.47.0
  • Safetensors version: 0.6.2
  • xFormers version: not installed
  • Accelerator: NVIDIA A100-SXM4-40GB, 40960 MiB
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

Who can help?

@sayakpaul Thank you for any help!! πŸ™

Labels

bug: Something isn't working