
Commit 03c3f69

sayakpaul, DN6, and stevhliu authored
[docs] diffusers gguf checkpoints (#12092)
* feat: support loading diffusers format gguf checkpoints.
* update
* update
* qwen
* up
* Apply suggestions from code review
  Co-authored-by: Steven Liu <[email protected]>
  Co-authored-by: Dhruv Nair <[email protected]>
* up
---------
Co-authored-by: DN6 <[email protected]>
Co-authored-by: Steven Liu <[email protected]>
1 parent f20aba3 commit 03c3f69

File tree

1 file changed: +41 −0 lines changed


docs/source/en/quantization/gguf.md

Lines changed: 41 additions & 0 deletions
@@ -77,3 +77,44 @@ Once installed, set `DIFFUSERS_GGUF_CUDA_KERNELS=true` to use optimized kernels
- Q5_K
- Q6_K

## Convert to GGUF

Use the Space below to convert a Diffusers checkpoint into the GGUF format for inference:

<iframe
  src="https://diffusers-internal-dev-diffusers-to-gguf.hf.space"
  frameborder="0"
  width="850"
  height="450"
></iframe>

Then load the converted checkpoint and run inference:

```py
import torch

from diffusers import FluxPipeline, FluxTransformer2DModel, GGUFQuantizationConfig

# Diffusers-format GGUF checkpoint hosted on the Hub
ckpt_path = (
    "https://huggingface.co/sayakpaul/different-lora-from-civitai/blob/main/flux_dev_diffusers-q4_0.gguf"
)
transformer = FluxTransformer2DModel.from_single_file(
    ckpt_path,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    # the model config (and its subfolder) must be provided for Diffusers-format GGUF checkpoints
    config="black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
)
pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    transformer=transformer,
    torch_dtype=torch.bfloat16,
)
pipe.enable_model_cpu_offload()

prompt = "A cat holding a sign that says hello world"
image = pipe(prompt, generator=torch.manual_seed(0)).images[0]
image.save("flux-gguf.png")
```

When using Diffusers format GGUF checkpoints, you must provide the model `config` path. If the model config resides in a `subfolder`, that needs to be specified too.
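
The checkpoint can also be loaded from a local file instead of a URL. The snippet below is a minimal sketch (not part of the original docs): it assumes the same Hub repository and filename used in the example above, downloads the GGUF file with `hf_hub_download`, and passes the local path to `from_single_file` together with the required `config` and `subfolder` arguments.

```py
import torch
from huggingface_hub import hf_hub_download

from diffusers import FluxTransformer2DModel, GGUFQuantizationConfig

# Download the Diffusers-format GGUF checkpoint to the local cache first.
local_ckpt_path = hf_hub_download(
    repo_id="sayakpaul/different-lora-from-civitai",
    filename="flux_dev_diffusers-q4_0.gguf",
)

# `config` (and `subfolder`, when the config lives in one) are still required
# so the transformer configuration can be resolved.
transformer = FluxTransformer2DModel.from_single_file(
    local_ckpt_path,
    quantization_config=GGUFQuantizationConfig(compute_dtype=torch.bfloat16),
    config="black-forest-labs/FLUX.1-dev",
    subfolder="transformer",
    torch_dtype=torch.bfloat16,
)
```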
