
@sayakpaul
Last active September 30, 2024 06:25
This document lists resources that show how to run Black Forest Labs' Flux with Diffusers under limited resources.

Running Flux under limited resources with Diffusers

Flux: https://blackforestlabs.ai/announcing-black-forest-labs/

The first resource even allows you to run the pipeline within 16 GB of GPU VRAM.

Additionally, you can use the various memory optimization tricks that are discussed in the following resources:
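As a sketch of the kind of optimization those resources cover: Diffusers' model CPU offloading keeps weights in system RAM and moves each sub-model to the GPU only while it runs. This is a minimal sketch, assuming the `diffusers` `FluxPipeline` API and the gated `black-forest-labs/FLUX.1-dev` checkpoint; it needs `torch`, `diffusers`, and a CUDA GPU at run time:

```python
def load_flux_low_vram(model_id="black-forest-labs/FLUX.1-dev"):
    """Load Flux so it fits within roughly 16 GB of VRAM.

    Sketch only: assumes `torch`, `diffusers`, and a CUDA GPU are
    available, and that you have access to the gated checkpoint.
    """
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_pretrained(model_id, torch_dtype=torch.bfloat16)
    # Keep weights on the CPU; move each sub-model (text encoders,
    # transformer, VAE) to the GPU only while it is in use.
    pipe.enable_model_cpu_offload()
    return pipe


# Usage (not run here; downloads ~30 GB of weights):
# pipe = load_flux_low_vram()
# image = pipe("a forest at dawn", num_inference_steps=50).images[0]
# image.save("flux.png")
```

The trade-off is speed: each denoising step pays for transferring the transformer to the GPU, so this is for fitting the model, not for maximizing throughput.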

Enjoy 🤗

@maxin9966

Testing with 16 GB of graphics memory, generating images takes a significant amount of time because the model is reloaded each time. Could you specify how much graphics memory would be needed for the models (flux-dev-fp8 and t5xxl-fp8) to remain permanently in GPU memory?
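A rough back-of-envelope for the question above: weight memory is just parameter count times bytes per parameter. The counts used below (~12B for the Flux transformer, ~4.7B for the T5-XXL encoder) are approximate assumptions, not official figures, and the totals exclude activations, the CLIP encoder, and the VAE:

```python
def weight_gb(n_params: float, bits: int) -> float:
    """Gigabytes (1e9 bytes) needed to hold weights at a given precision."""
    return n_params * bits / 8 / 1e9

# Approximate parameter counts (assumptions, not official figures).
flux_transformer = 12e9
t5_xxl = 4.7e9

print(weight_gb(flux_transformer, 8))  # fp8 transformer: 12.0 GB
print(weight_gb(t5_xxl, 8))            # fp8 T5-XXL: 4.7 GB
print(weight_gb(flux_transformer, 8) + weight_gb(t5_xxl, 8))  # total: 16.7 GB
```

So even at fp8, keeping both models resident needs roughly 17 GB for weights alone, before activations, which is why offloading is typically used on 16 GB cards.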

@nachoal

nachoal commented Sep 5, 2024

Any tips on how to speed up inference time?
