---
library_name: diffusers
tags:
- quantization
---

# Usage with Diffusers

To use this quantized FLUX.1 [dev] checkpoint, you need to install the 🧨 diffusers and bitsandbytes libraries:

```shell
pip install -U diffusers
pip install -U bitsandbytes
```

After installing the required libraries, you can run the following script:

```python
import torch
from diffusers import FluxPipeline

pipe = FluxPipeline.from_pretrained(
    "diffusers/FLUX.1-dev-bnb-4bit",
    torch_dtype=torch.bfloat16
)
pipe.to("cuda")

prompt = "Baroque style, a lavish palace interior with ornate gilded ceilings, intricate tapestries, and dramatic lighting over a grand staircase."

pipe_kwargs = {
    "prompt": prompt,
    "height": 1024,
    "width": 1024,
    "guidance_scale": 3.5,
    "num_inference_steps": 50,
    "max_sequence_length": 512,
}

image = pipe(
    **pipe_kwargs, generator=torch.manual_seed(0),
).images[0]

image.save("flux.png")
```
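Quantizing the transformer and the T5 text encoder to 4-bit weights roughly quarters their weight memory relative to bf16. As a rough back-of-envelope sketch (the parameter counts below are approximations, and NF4's per-block scales add a small overhead that is ignored here):

```python
def approx_weight_gib(num_params: float, bits_per_param: float) -> float:
    """Approximate weight memory in GiB, ignoring quantization metadata."""
    return num_params * bits_per_param / 8 / 1024**3

# Approximate parameter counts (assumptions, not exact figures):
flux_transformer = 12e9   # FLUX.1 [dev] transformer, ~12B params
t5_encoder = 4.7e9        # T5-XXL text encoder (text_encoder_2), ~4.7B params

for name, n in [("transformer", flux_transformer), ("text_encoder_2", t5_encoder)]:
    print(f"{name}: bf16 ~{approx_weight_gib(n, 16):.1f} GiB, "
          f"nf4 ~{approx_weight_gib(n, 4):.1f} GiB")
```

In practice, this is the difference between not fitting and fitting comfortably on a single consumer GPU, at the cost of some quality loss from quantization.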


# How to generate this quantized checkpoint?

This checkpoint was created from the "black-forest-labs/FLUX.1-dev" checkpoint with the following script:

```python
import torch
from diffusers import FluxPipeline
from diffusers import BitsAndBytesConfig as DiffusersBitsAndBytesConfig
from diffusers.quantizers import PipelineQuantizationConfig
from transformers import BitsAndBytesConfig as TransformersBitsAndBytesConfig

pipeline_quant_config = PipelineQuantizationConfig(
    quant_mapping={
        "transformer": DiffusersBitsAndBytesConfig(
            load_in_4bit=True,
            bnb_4bit_quant_type="nf4",
            bnb_4bit_compute_dtype=torch.bfloat16,
        ),
        "text_encoder_2": TransformersBitsAndBytesConfig(
            load_in_4bit=True,
            bnb_4bit_quant_type="nf4",
            bnb_4bit_compute_dtype=torch.bfloat16,
        ),
    }
)

pipe = FluxPipeline.from_pretrained(
    "black-forest-labs/FLUX.1-dev",
    quantization_config=pipeline_quant_config,
    torch_dtype=torch.bfloat16
)

pipe.save_pretrained("FLUX.1-dev-bnb-4bit")
```
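For intuition about what `bnb_4bit_quant_type="nf4"` does: each weight is stored as one of 16 levels after scaling by a per-block absolute maximum (real NF4 places those levels at quantiles of a normal distribution). The toy sketch below illustrates the round-to-nearest-level idea only; it is not the bitsandbytes implementation, and it uses a uniform grid rather than the actual NF4 codebook.

```python
# Toy sketch of block-wise 4-bit quantization (illustrative only; real NF4
# uses levels at quantiles of a normal distribution, not a uniform grid).
levels = [i / 7.5 - 1.0 for i in range(16)]  # 16 levels spanning [-1, 1]

def quantize_block(weights):
    """Quantize one block: scale by absmax, round each value to the nearest level."""
    absmax = max(abs(w) for w in weights) or 1.0
    codes = [min(range(16), key=lambda i: abs(w / absmax - levels[i])) for w in weights]
    return codes, absmax  # 4-bit codes plus one fp scale per block

def dequantize_block(codes, absmax):
    """Recover approximate weights from 4-bit codes and the block scale."""
    return [levels[c] * absmax for c in codes]

codes, scale = quantize_block([0.4, -1.2, 0.05, 0.9])
print(dequantize_block(codes, scale))
```

Storing 4-bit codes plus one scale per block is what yields the ~4x weight-memory reduction over 16-bit formats, while `bnb_4bit_compute_dtype=torch.bfloat16` means the weights are dequantized to bf16 on the fly for each matmul.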