# icon-generator
This project is a standard PEFT LoRA derived from `black-forest-labs/FLUX.1-dev`, designed for generating minimalist icons.
## Quick Start
This icon generator is a standard PEFT LoRA for `black-forest-labs/FLUX.1-dev`. Load it on top of the base model (see Installation below) and prompt it for minimalist icons.
## Features
- Derived from the `black-forest-labs/FLUX.1-dev` base model.
- Supports text-to-image generation for minimalist icons.
- Documents the validation and training settings used to produce the adapter.
## Installation
Installation amounts to loading the base model and then attaching the LoRA adapter. The following Python code loads both:
```python
import torch
from diffusers import DiffusionPipeline

model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'noahyoungs/icon-generator'

# Load the base model, then attach the LoRA adapter on top of it.
pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
pipeline.load_lora_weights(adapter_id)
```
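If the full pipeline does not fit in GPU memory, diffusers' model CPU offloading can be used instead of moving everything to the device at once. This is standard diffusers API rather than anything specific to this adapter; a minimal sketch:

```python
# Optional: keep only the active submodule on the GPU to lower peak VRAM.
# Use this in place of pipeline.to('cuda'); requires the accelerate package.
pipeline.enable_model_cpu_offload()
```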
## Usage Examples
### Basic Usage
```python
import torch
from diffusers import DiffusionPipeline

model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'noahyoungs/icon-generator'

pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
pipeline.load_lora_weights(adapter_id)

# Pick the best available device once instead of repeating the expression inline.
device = 'cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu'
pipeline.to(device)

prompt = "Minimalist icon, arrow up"

# These arguments match the validation settings documented below.
image = pipeline(
    prompt=prompt,
    num_inference_steps=20,
    generator=torch.Generator(device=device).manual_seed(42),
    width=1024,
    height=1024,
    guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
### Advanced Usage
```python
import torch
from diffusers import DiffusionPipeline
from optimum.quanto import quantize, freeze, qint8

model_id = 'black-forest-labs/FLUX.1-dev'
adapter_id = 'noahyoungs/icon-generator'

pipeline = DiffusionPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
pipeline.load_lora_weights(adapter_id)

# Quantize the transformer weights to int8 to reduce memory usage, then
# freeze them so they are not dequantized again.
quantize(pipeline.transformer, weights=qint8)
freeze(pipeline.transformer)

device = 'cuda' if torch.cuda.is_available() else 'mps' if torch.backends.mps.is_available() else 'cpu'
pipeline.to(device)

prompt = "Minimalist icon, arrow up"

image = pipeline(
    prompt=prompt,
    num_inference_steps=20,
    generator=torch.Generator(device=device).manual_seed(42),
    width=1024,
    height=1024,
    guidance_scale=3.0,
).images[0]
image.save("output.png", format="PNG")
```
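The same pipeline can also produce a set of icons in one run by looping over prompts. Continuing from the example above, here is a short sketch; the prompt strings are illustrative examples, not documented training captions:

```python
# Generate several icons with identical settings; prompts are examples only.
prompts = [
    "Minimalist icon, arrow up",
    "Minimalist icon, settings gear",
    "Minimalist icon, home",
]
for i, p in enumerate(prompts):
    image = pipeline(
        prompt=p,
        num_inference_steps=20,
        generator=torch.Generator(device=device).manual_seed(42),
        width=1024,
        height=1024,
        guidance_scale=3.0,
    ).images[0]
    image.save(f"icon_{i}.png")
```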
## Documentation
### Validation settings
| Property | Details |
| --- | --- |
| CFG | 3.0 |
| CFG Rescale | 0.0 |
| Steps | 20 |
| Sampler | FlowMatchEulerDiscreteScheduler |
| Seed | 42 |
| Resolution | 1024x1024 |
| Skip-layer guidance | None |
Note: The validation settings are not necessarily the same as the training settings.
### Training settings
| Property | Details |
| --- | --- |
| Training epochs | 0 |
| Training steps | 1000 |
| Learning rate | 8e-05 |
| Learning rate schedule | polynomial |
| Warmup steps | 100 |
| Max grad norm | 1.0 |
| Effective batch size | 1 |
| Micro-batch size | 1 |
| Gradient accumulation steps | 1 |
| Number of GPUs | 1 |
| Gradient checkpointing | True |
| Prediction type | flow-matching (extra parameters=['shift=3', 'flux_guidance_mode=constant', 'flux_guidance_value=1.0', 'flow_matching_loss=compatible', 'flux_lora_target=all']) |
| Optimizer | adamw_bf16 |
| Trainable parameter precision | Pure BF16 |
| Caption dropout probability | 5.0% |
| LoRA Rank | 16 |
| LoRA Alpha | None |
| LoRA Dropout | 0.1 |
| LoRA initialisation style | default |
### Datasets

#### tabler-icons-1024
| Property | Details |
| --- | --- |
| Repeats | 10 |
| Total number of images | 4739 |
| Total number of aspect buckets | 1 |
| Resolution | 1.048576 megapixels |
| Cropped | False |
| Crop style | None |
| Crop aspect | None |
| Used for regularisation data | No |
## Technical Details
The icon generator is based on the `black-forest-labs/FLUX.1-dev` base model and is trained with the PEFT LoRA technique, using the settings documented above (learning rate, batch size, and LoRA rank/dropout). The text encoder of the base model is not trained and can be reused as-is for inference.
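Because the adapter is a standard PEFT LoRA, it can optionally be fused into the transformer weights for inference, removing the per-adapter overhead at each step. This uses diffusers' generic LoRA API and is not specific to this project; a minimal sketch, continuing from the usage examples above:

```python
# Fuse the loaded LoRA into the base weights for slightly faster inference;
# unfuse_lora() restores the original weights if the adapter is no longer wanted.
pipeline.fuse_lora()
image = pipeline("Minimalist icon, arrow up", num_inference_steps=20).images[0]
pipeline.unfuse_lora()
```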
## License
This project is released under the `other` license tag; consult the model repository for the exact terms.