Source: https://huggingface.co/city96/flux.1-lite-8B-alpha-gguf/tree/main (by city96)
This is a direct GGUF conversion of Freepik/flux.1-lite-8B-alpha.
The model files can be used with the ComfyUI-GGUF custom node.
Place model files in ComfyUI/models/unet - see the GitHub readme for further install instructions.
Please refer to this chart for a basic overview of quantization types.
☕ Buy me a coffee: https://ko-fi.com/ralfingerai
🍺 Join my discord: https://discord.gg/g5Pb8qNUuP
Details
Downloads: 594
Platform: SeaArt
Platform Status: Available
Created: 10/26/2024
Updated: 10/26/2024
Deleted: -
Files
Available On (10 platforms)
SeaArt
Flux.1 Lite 8B Alpha GGUF - BF16 (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q4 (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q4.KS (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q3.KS (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q5.1 (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q6.K (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q8 (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q5 (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q5.KS (SeaArt)
Flux.1 Lite 8B Alpha GGUF - Q4.1 (SeaArt)