Cached Flux Prompt Encoding

This is a custom block designed to cache input prompts for the Flux model. Encoded prompts are stored in a safetensors file, using a hash of the prompt string as the key. Prompts already present in the cache are loaded directly from the file, while new prompts are encoded and added to the cache.
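
Conceptually, the caching logic looks something like the minimal sketch below. This is an illustrative outline only, not the block's actual implementation: the cache file name, the SHA-256 hash, and the single-tensor return value are assumptions, and the real block handles the multiple tensors produced by Flux's text encoders (e.g. prompt_embeds and pooled_prompt_embeds).

import hashlib
import os

from safetensors.torch import load_file, save_file

CACHE_FILE = "prompt_cache.safetensors"  # illustrative file name, not the block's actual cache path

def cached_encode(prompt, encode_fn):
    # Key the cache on a hash of the prompt string (SHA-256 here is an assumption).
    key = hashlib.sha256(prompt.encode("utf-8")).hexdigest()
    cache = load_file(CACHE_FILE) if os.path.exists(CACHE_FILE) else {}
    if key in cache:
        # Cache hit: reuse the stored embedding instead of re-running the text encoders.
        return cache[key]
    # Cache miss: encode the prompt, add it to the cache, and rewrite the file.
    embeds = encode_fn(prompt)
    cache[key] = embeds
    save_file(cache, CACHE_FILE)
    return embeds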

How to use

import torch
from diffusers.modular_pipelines import ModularPipelineBlocks, SequentialPipelineBlocks
from diffusers.modular_pipelines.flux.modular_blocks import TEXT2IMAGE_BLOCKS

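# Download the custom block from the Hub; trust_remote_code is required to run its code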
prompt_encoder_block = ModularPipelineBlocks.from_pretrained(
    "diffusers/flux-cached-prompt-encoder-custom-block",
    trust_remote_code=True
)

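# Insert the cached prompt encoder at the start of the standard text-to-image block sequence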
blocks = TEXT2IMAGE_BLOCKS.copy().insert("text_encoder", prompt_encoder_block, 0)
blocks = SequentialPipelineBlocks.from_blocks_dict(blocks)

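# Initialize the pipeline from the modular FLUX.1-dev repo and load its components in bfloat16 on the GPU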
repo_id = "diffusers/modular-FLUX.1-dev"
pipe = blocks.init_pipeline(repo_id)
pipe.load_components(torch_dtype=torch.bfloat16, device_map="cuda")

prompt = "A cat holding a sign that reads 'Hello World'"

output = pipe(
    prompt=prompt,
    num_inference_steps=35,
    guidance_scale=3.5,
    output_type="pil",
)
image = output.values['image']
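
On the first run with a given prompt, the text encoders are executed and the resulting embeddings are written to the cache file. Subsequent runs with the same prompt load the stored embeddings directly instead of re-encoding it.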