GGUF-quantized models of Flux-Krea for use in ComfyUI
How to use

- Put the GGUF model in `ComfyUI/models/diffusion_models`.
- `git clone` the city96 `ComfyUI-GGUF` custom node into the `custom_nodes` folder of your ComfyUI install.
- `cd` into `ComfyUI-GGUF` inside `custom_nodes` and run `pip install -r requirements.txt`.
- Load the model with the Unet Loader (GGUF) node in any Flux workflow.
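The install steps above can be sketched as a short shell session; the ComfyUI path and the `city96/ComfyUI-GGUF` repository URL are assumptions you should adjust for your setup.

```shell
# Assumes your ComfyUI checkout lives at ./ComfyUI -- adjust as needed.
cd ComfyUI/custom_nodes

# Clone the city96 GGUF loader custom node (assumed repository URL).
git clone https://github.com/city96/ComfyUI-GGUF

# Install its Python dependencies into the same environment ComfyUI uses.
cd ComfyUI-GGUF
pip install -r requirements.txt
```

After restarting ComfyUI, the Unet Loader (GGUF) node should appear and can point at any `.gguf` file placed in `ComfyUI/models/diffusion_models`.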
Warning: Q4, and especially Q2, versions may show noticeable quality loss compared to higher-bit quantizations.
Quantized versions are available at 2-bit, 4-bit, 6-bit, and 8-bit precision.