dtype of flux1-dev-fp8-e5m2.safetensors is float8_e4m3fn? (1) · #39 opened 7 days ago by pbouda-signature
Hyper-FLUX.1-dev-8steps-lora.safetensors · #37 opened about 2 months ago by LHJ0
FP8 for flux depth and flux canny · #36 opened 2 months ago by samurai-zero

How to use the flux-fp8 in Python? · #35 opened 3 months ago by KevinJMChen
Is your dev-fp8-e4m3fn version consistent with the official FP8 version released by ComfyUI? (3) · #33 opened 4 months ago by HUG-NAN
Which model is better for low VRAM/RAM? · #32 opened 5 months ago by younyokel

Noisy output when used in the Flux Inpainting Pipeline · #31 opened 5 months ago by VikramSingh178

CUDA out of memory for flux1-dev-fp8.safetensors (3) · #30 opened 5 months ago by pritam-tam
Silent exit when loading via FluxTransformer2DModel.from_single_file and no GPU use · #29 opened 5 months ago by nivgo
Inquiry on latest update for flux1-dev-fp8 (4) · #27 opened 5 months ago by torealise

FP8 inference (1) · #26 opened 6 months ago by Melody32768
Wrong model · #25 opened 6 months ago by sunhaha123
Update README.md · #24 opened 6 months ago by WBD8
"model weight dtype torch.float8_e4m3fn, manual cast: torch.bfloat16" with ROCm 6.0 (4) · #23 opened 6 months ago by 12letter
Quite slow to load the FP8 model (11) · #21 opened 7 months ago by gpt3eth
RuntimeError: "addmm_cuda" not implemented for 'Float8_e4m3fn' (1) · #20 opened 7 months ago by gradient-diffusion
How to load into VRAM? (2) · #19 opened 7 months ago by MicahV
What settings to use for flux1-dev-fp8? (2) · #18 opened 7 months ago by fullsoftwares
'float8_e4m3fn' attribute error (6) · #17 opened 7 months ago by Magenta6
Loading flux-fp8 with diffusers (1) · #16 opened 7 months ago by 8au
FP8 checkpoint version size mismatch? (2) · #15 opened 7 months ago by Thireus
Can this model be used on Apple Silicon? (25) · #14 opened 7 months ago by jsmidt
How to use FP8 models with the original flux repo? · #13 opened 7 months ago by rolux
Quantization method? (9) · #7 opened 7 months ago by vyralsurfer
ComfyUI workflow (1) · #6 opened 7 months ago by Jebari
Can you make an FP8 version of schnell as well, please? (3) · #5 opened 7 months ago by MonsterMMORPG

Diffusers? (19) · #4 opened 7 months ago by tintwotin
Minimum VRAM requirements? (3) · #3 opened 7 months ago by joachimsallstrom

FP16 (1) · #2 opened 7 months ago by bsbsbsbs112321
Metadata lost from model (4) · #1 opened 7 months ago by mcmonkey