We aimed for an anime-style flux model.
Description
I tried converting it with forge.
Comments (7)
It's just a merge of a LoRA and the base model. Why are you deceiving people by labeling it "checkpoint trained" when that doesn't correspond to reality?
Are you an administrator of Civitai? Sorry, could you please show me the page where the definition is written?
@Aikimi I'm just an ordinary user who doesn't like deception. Why do you label it "Trained" when in fact it's a merge with a LoRA?
```json
{
  "modelspec.resolution": "1024x1024",
  "modelspec.sai_model_spec": "1.0.0",
  "modelspec.merged_from": "Flux.1-dev, animeflux_lora_vN",
  "modelspec.architecture": "flux-1-dev",
  "modelspec.implementation": "https://github.com/black-forest-labs/flux",
  "format": "pt",
  "modelspec.date": "2024-08-17T05:14:00",
  "modelspec.title": "outputbf16"
}
```
@Timmek In my understanding, 'merge' refers to something created by mixing existing uploaded items.
I don't have any strong preferences, so I'm okay with changing it if needed...
The nomenclature of "Merged" vs "Trained" is ambiguous. It would probably have been better to call them "Original" vs "Remixed". Some of the SDXL/Flux models labeled as "Trained" are in fact made by merging a number of original LoRAs into a base.
Besides, technically, training a LoRA is not all that different from training a fine-tuned model.
Since the LoRA merged into Flux-Dev is in fact a trained original LoRA by Aikimi, I really don't see a problem here.
You might consider only "full fine-tuning of a checkpoint" to be "checkpoint trained". However, if you train a LoRA and then merge it into a checkpoint, you could also call that "checkpoint trained".
This is because training LoRAs and directly fine-tuning checkpoints are essentially the same in that they both require time and money. They are fundamentally different from "checkpoint merged", which only involves combining several existing checkpoints.
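For context, the operation everyone is debating is simple arithmetic: a LoRA is a trained low-rank update (an up-projection times a down-projection), and "merging" just bakes that update into the base checkpoint's weights. A minimal sketch of the per-layer math, assuming illustrative names and shapes (the real Flux/forge merge iterates over safetensors tensors, but the update rule is the same):

```python
import numpy as np

def merge_lora_weight(base_w, lora_down, lora_up, alpha, rank):
    """Bake a LoRA update into a base weight: W' = W + (alpha/rank) * (B @ A).

    lora_down is the "A" matrix (rank x in_features),
    lora_up is the "B" matrix (out_features x rank).
    """
    scale = alpha / rank
    return base_w + scale * (lora_up @ lora_down)

# Toy example with hypothetical dimensions (real Flux layers are far larger).
rng = np.random.default_rng(0)
rank = 4
base_w = rng.standard_normal((8, 8))
lora_down = rng.standard_normal((rank, 8))
lora_up = rng.standard_normal((8, rank))

merged = merge_lora_weight(base_w, lora_down, lora_up, alpha=4.0, rank=rank)
print(merged.shape)  # (8, 8) -- same shape as the base weight
```

The merged checkpoint is therefore indistinguishable in format from a fully fine-tuned one; only the metadata (like the `modelspec.merged_from` field quoted above) records how it was produced.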
@Elysia_Saikou Really not sure what people be gettin so hot and bothered about. This is one of the only flux models small enough for a person with an 8gb GPU. So, I'm happy it exists so I can give flux a shot.