CivArchive
    T5-XXL/Clip-L Flux Prompt Optimizer and Lora Autoloader by EnragedAntelope - v2.5
    NSFW

    This workflow takes your prompt, enhances it, and optimizes it for the T5-XXL and CLIP-L text encoders that Flux uses.

    It will also automatically extract the trigger word from a LoRA and include it in both of the split prompts.

    Why did I make this?

    People keep asking if they should split their Flux prompts between encoders, and they ask HOW. Now you have a VERY easy way!

    The lora auto-trigger is a fun bonus; I really find it helpful.

    Have fun creating!

    Description

    • Major refactor of the LLM setup to improve coherence between T5XXL and CLIP-L outputs, as well as reduce total API calls and token usage! This was made possible as Plush Nodes now supports additional parameter passing. Be sure to update your nodes!

    • Added helpful notes for several free cloud LLM providers, which show available parameters to modify as well as include documentation references.

    • Changed latent size selection to use Flux Resolution Calculator by ControlAltAI (by request).

    FAQ

    Comments (10)

    EnragedAntelope
    Author
    Oct 18, 2024 (3 reactions)
    CivitAI

    Ollama users:

    If you want to unload a model from VRAM as soon as it finishes processing, set a system environment variable: OLLAMA_KEEP_ALIVE=0.

    Restart your PC after setting that, so the system variable takes effect before Ollama starts.
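    As a sketch of how that tip translates to the command line: the Windows `setx` form below matches the permanent setting the restart advice refers to, while the Linux/macOS `export` form (an assumption, for non-Windows users) only affects the shell it runs in, so Ollama must be launched from that same shell.

```shell
# Linux/macOS: export for the current session, then launch Ollama from
# this same shell -- no reboot needed.
export OLLAMA_KEEP_ALIVE=0

# Windows equivalent (persists for the user account; restart afterwards
# so Ollama picks it up, as described above):
#   setx OLLAMA_KEEP_ALIVE 0

echo "$OLLAMA_KEEP_ALIVE"
```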

    Additionally, here are settings that work with Ollama/Plush. I may add these to notes inside the workflow if I need to update it again.

    docs at https://github.com/ollama/ollama/blob/main/docs/openai.md

    http://127.0.0.1:11434/v1

    response_format: {"type": "json_object"}

    frequency_penalty: 0.2

    presence_penalty: 0

    temperature: 1.0

    top_p: 0.9
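    For reference, here is a sketch of how those settings map onto an OpenAI-compatible request body sent to a local Ollama server. The model tag and prompt text are placeholders (assumptions, not part of the workflow); only the base URL and the five parameters above come from the settings listed.

```python
import json

# Base URL from the settings above (Ollama's OpenAI-compatible endpoint).
OLLAMA_BASE_URL = "http://127.0.0.1:11434/v1"

# Request body mirroring the Ollama/Plush settings listed above.
payload = {
    "model": "llama3.1",  # placeholder: use whatever model tag you run locally
    "messages": [
        {"role": "system", "content": "Split the prompt for T5-XXL and CLIP-L."},
        {"role": "user", "content": "a red fox in the snow"},
    ],
    "response_format": {"type": "json_object"},  # force JSON-only output
    "frequency_penalty": 0.2,
    "presence_penalty": 0,
    "temperature": 1.0,
    "top_p": 0.9,
}

print(json.dumps(payload, indent=2))
```

    The `response_format` line is what keeps the LLM's reply machine-parseable, which is why it matters more here than the sampling parameters.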

    EnragedAntelope
    Author
    Dec 8, 2024

    Quick update to this tip: this functionality is now incorporated in Plush and set to "unload after run" for maximum memory efficiency as of v2.7 of this workflow.

    beitris
    Oct 19, 2024
    CivitAI

    Does this work OK with the de-distilled model? I've been having trouble getting a workflow without gridding artifacts.

    EnragedAntelope
    Author
    Oct 19, 2024

    It should work with any Flux model, though you may have to swap out the checkpoint loader for GGUF etc.

    Grid artifacts are usually caused by generating at too high a resolution in Flux, where things start to break down. My trick is to generate small, do a latent upscale and refine, and then, to go even larger, do a final upscale without Flux (i.e., just using an upscaler). Check https://civitai.com/models/643719/enragedantelopes-flux-upscale-eliminate-artifactingstreaking to see that in action.

    an303042
    Dec 8, 2024 (2 reactions)
    CivitAI

    Hey, I'd appreciate it if you could take a look and maybe update this workflow. I recently had to do a fresh Comfy install, and I think the Plush nodes have updated; I haven't been able to get it back to 100% functionality with local Ollama. 🙏

    EnragedAntelope
    Author
    Dec 8, 2024 (3 reactions)

    I'll see what I can do. Plush updates frequently (which is a GREAT thing!), and the node fields and default values shift and change. I agree that this frequently breaks older workflows, though usually not in any significant way (you can usually right-click and recreate the node to populate the new fields with usable, if not optimal, defaults). Let me think about whether there are other changes to make to this workflow at the same time. Thanks for bringing it to my attention!

    EnragedAntelope
    Author
    Dec 8, 2024 (2 reactions)

    FYI, I've completed an update; you should be all set. I tested successfully with multiple providers and Ollama.

    an303042
    Dec 8, 2024 (1 reaction)

    @EnragedAntelope Amazing! Thank you 🤩

    Pat_175
    Mar 3, 2026

    Fairly new at this and was wondering where to put your workflow; I don't get it. Thanks for your help. Getting better each day. 69 soon and loving it!

    EnragedAntelope
    Author
    Mar 3, 2026

    @Pat_175 Hi, just FYI: Flux1 Dev is now considered an outdated model, so this may not be worth pursuing anymore.
    That said, if you have ComfyUI set up, you should be able to just drag the .json file into your ComfyUI window, and it will show the workflow.
    BUT Comfy has changed a lot in the past year, so I'm not sure how well this works anymore; I haven't used Flux1 at all in a long time. And if you're just getting started, you might also have some frustration getting Plush Nodes going (you have to set environment variables with your API keys before launching Comfy).
    Not trying to scare you off, just trying to set expectations! It may be a good learning experience. Best of luck!

    Workflows
    Flux.1 D

    Details

    Downloads
    664
    Platform
    CivitAI
    Platform Status
    Available
    Created
    10/17/2024
    Updated
    5/12/2026
    Deleted
    -

    Files

    t5XXLClipLFluxPrompt_v25.zip

    Mirrors

    HuggingFace (1 mirror)
    CivitAI (1 mirror)