CivArchive
    T5-XXL/Clip-L Flux Prompt Optimizer and Lora Autoloader by EnragedAntelope - 2.71
    NSFW

    This will take your prompt, enhance it, and optimize it for the T5-XXL and CLIP-L text encoders that Flux uses.

    It will also automatically extract the trigger word from a LoRA and include it in both of the separated prompts.

    Why did I make this?

    People keep asking if they should split their Flux prompts between encoders, and they ask HOW. Now you have a VERY easy way!

    The lora auto-trigger is a fun bonus; I really find it helpful.

    Have fun creating!

    Description

    Very minor changes, primarily updated to work with the latest version of Plush Nodes, plus some small cleanups (for example, removed SambaNova information, as they no longer offer a free tier).

    FAQ

    Comments (9)

    strong134 · Mar 13, 2025
    CivitAI

    Where can I get the llama-3.3-70b-versatile ?

    EnragedAntelope
    Author
    Mar 14, 2025 · 1 reaction

    I think there's a note within the workflow. You have several options. Easiest/fastest: go to Groq, make an account, and get a free API key. Set an environment variable with that API key before launching Comfy, and then the Plush Nodes "Advanced Prompt Enhancer" will work with Groq, which includes the LLM you listed.

    To set the environment variable, run this (on Windows) before launching Comfy:
    set GROQ_API_KEY=gskXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
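    On Linux or macOS, the equivalent would be (the key value is just a placeholder; use your own key):

```shell
# Set the Groq API key for the current shell session (placeholder value),
# then launch ComfyUI from the same session so it inherits the variable.
export GROQ_API_KEY=gskXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX
```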

    GemneyeII · Apr 5, 2025
    CivitAI

    I am trying to use Groq. I got an API key and exported that variable as explained in the documentation. In the Add Parameters node I cut and paste the values for Groq.

    I am getting these errors:

    ➤ Begin Log for: Advanced Prompt Enhancer, Node #258:

    ✦ INFO: Server returned response code: 404

    ✦ ERROR: Unable to create LLM client object using URL. Unable to communicate with LLM: Client.__init__() got an unexpected keyword argument 'proxies'

    ✦ WARNING: Open Source LLM server is not running. Aborting request.

    Unable to process request. Make sure the local Open Source Server is running, and you've provided a valid URL. If you're using a remote service (e.g.: ChaTGPT, Groq) make sure your key is valid, and a model is selected

    Am I doing something wrong?

    EnragedAntelope
    Author
    Apr 5, 2025 · 1 reaction

    Hi - I did not make the Plush nodes (Advanced Prompt Enhancer), but I see in the issues here https://github.com/glibsonoran/Plush-for-ComfyUI/issues/172 that this error was resolved by a user downgrading httpx. I'd post there if you continue having issues.
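    For context on that fix: the `proxies` keyword error is a known incompatibility between newer httpx releases (httpx 0.28 removed the `proxies` argument from `Client.__init__`) and older client libraries that still pass it. Based on the linked issue thread, the usual workaround is pinning httpx back (the exact version bound here is an assumption):

```shell
# Downgrade httpx in the Python environment ComfyUI uses
# (0.28 removed the 'proxies' keyword that older OpenAI-style clients pass).
pip install "httpx<0.28"
```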

    2480278 · Apr 13, 2025 · 1 reaction

    Google released a new API that created some incompatibilities with Plush. There are a couple of script files in the latest upload that should help with this: https://github.com/glibsonoran/Plush-for-ComfyUI/issues/215

    TONY_OFM · Apr 16, 2025

    did you manage to fix it?

    cobra838337 · Aug 10, 2025
    CivitAI

    Hi, where can I find these nodes?

    concat

    Text Multiline

    Seed String

    Checkpoint Loader Simple Mikey

    FluxResolutionNode

    Save Images No Display

    Textbox

    Workflows
    Flux.1 D

    Details

    Downloads
    2,682
    Platform
    CivitAI
    Platform Status
    Available
    Created
    3/13/2025
    Updated
    5/12/2026
    Deleted
    -

    Files

    t5XXLClipLFluxPrompt_271.zip

    Mirrors

    CivitAI (1 mirror)