# BigLove Klein Collection
## Available Files
| File | Format | Size | Use Case |
|---|---|---|---|
| bigLove_klein2_Bf16.safetensors | BF16 | ~18 GB | Full precision, best quality |
| bigLove_klein2_bf16_pruned.safetensors | BF16 (pruned) | ~18 GB | Pruned weights, slightly faster |
| bigLove_klein2_fp8_pruned.safetensors | FP8 (pruned) | ~9 GB | Good balance of quality & VRAM |
| bigLove_klein2_nf4.safetensors | NF4 | ~5 GB | Low VRAM, fast inference |
| bigLove_klein2.gguf | GGUF | varies | For GGUF-compatible loaders |
| bigLove_klein1_fp8.safetensors | FP8 | ~9 GB | First version, FP8 quantized |
## Usage

### ComfyUI

Place the desired model file in your `ComfyUI/models/diffusion_models/` (or `unet`) folder and select it in the appropriate loader node.
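One way to fetch a single variant straight into that folder is the `huggingface-cli download` command from `huggingface_hub`; the destination path below is an assumption, so adjust it to your ComfyUI install:

```shell
# Download one checkpoint from the Hub into ComfyUI's diffusion_models
# folder (path assumed; adjust to your install).
huggingface-cli download Granddyser/BigLoveKlein-Collection \
  bigLove_klein2_fp8_pruned.safetensors \
  --local-dir ComfyUI/models/diffusion_models
```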
### Diffusers

First install the required packages with `pip install -U diffusers transformers accelerate`, then:

```python
import torch
from diffusers import FluxPipeline

# Load the FP8 variant from the Hub in bfloat16
pipe = FluxPipeline.from_pretrained(
    "Granddyser/biglove-klein2-fp8",
    torch_dtype=torch.bfloat16,
)
pipe.to("cuda")

# Few-step, zero-guidance settings as in the original example
image = pipe(
    prompt="your prompt here",
    num_inference_steps=4,
    guidance_scale=0.0,
).images[0]
image.save("output.png")
```
## Acknowledgments
Special thanks to SubtleShader for the motivation.
## License

FLUX.2-klein-base-9B is licensed by Black Forest Labs Inc. under the FLUX.2-klein-base-9B Non-Commercial License. Copyright Black Forest Labs Inc.