@qpwo
Last active January 12, 2025 12:36
fastest huggingface parallel download ever
pip install 'huggingface_hub[hf_transfer]'   # quote the extras so zsh doesn't expand the brackets
export HF_HUB_ENABLE_HF_TRANSFER=1
model_name=meta-llama/Llama-3.1-405B
localdir=$(realpath ~/hff/405b)
# Split the shards into four disjoint groups by the parity of the last two digits
# of the shard index, then download each group in its own background process,
# staggered by 6 seconds so the processes don't all hit the hub at once.
huggingface-cli download --max-workers=8 --include="model-???[02468][02468]-of-?????.safetensors" --local-dir="$localdir" "$model_name" & sleep 6
huggingface-cli download --max-workers=8 --include="model-???[02468][13579]-of-?????.safetensors" --local-dir="$localdir" "$model_name" & sleep 6
huggingface-cli download --max-workers=8 --include="model-???[13579][02468]-of-?????.safetensors" --local-dir="$localdir" "$model_name" & sleep 6
huggingface-cli download --max-workers=8 --include="model-???[13579][13579]-of-?????.safetensors" --local-dir="$localdir" "$model_name" &
wait  # block until all four downloads finish
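The four `--include` globs work because the even/odd bracket classes on the last two digits of the shard index partition every suffix 00–99 into exactly one group, so each shard file is downloaded by exactly one process and none is downloaded twice. A quick self-contained sanity check of that partition (no download involved, just the glob patterns applied via `case`):

```shell
# Verify the four patterns cover each two-digit suffix 00-99 exactly once.
for i in $(seq -w 0 99); do
  n=0
  case $i in [02468][02468]) n=$((n+1));; esac
  case $i in [02468][13579]) n=$((n+1));; esac
  case $i in [13579][02468]) n=$((n+1));; esac
  case $i in [13579][13579]) n=$((n+1));; esac
  [ "$n" -eq 1 ] || echo "MISS $i"
done
echo "all 100 suffixes covered exactly once"
```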