@brandonbryant12
Last active July 24, 2025 01:03
# Dockerfile
ARG PY_BASE=fcr.fmr.com/python:3.11-slim
FROM ${PY_BASE} AS runtime
ENV HOME=/home/litellm
RUN useradd -m -s /bin/bash litellm
# Switch to the unprivileged user before creating the venv so that
# pip (run as litellm below) can write into it.
USER litellm
RUN python -m venv ${HOME}/venv
# Put the venv first on PATH; PYTHONHTTPSVERIFY=0 disables TLS verification
# (paired with --trusted-host for the internal mirror).
ENV PATH="${HOME}/venv/bin:${PATH}" PYTHONHTTPSVERIFY=0
WORKDIR /app
ARG LITELLM_VERSION=1.74.3
ARG EXTRAS="proxy,prometheus,prisma,langfuse"
# Cache pip downloads across builds (--no-cache-dir would defeat the cache
# mount); uid/gid match the litellm user, the first account created on a
# slim image and therefore 1000.
RUN --mount=type=cache,target=${HOME}/.cache/pip,uid=1000,gid=1000 \
    pip install --trusted-host pypi.org --trusted-host files.pythonhosted.org \
    "litellm[${EXTRAS}]==${LITELLM_VERSION}"
COPY --chown=litellm config/ ./config/
EXPOSE 4000
ENTRYPOINT ["litellm"]
CMD ["--config","/app/config/litellm.yaml","--port","4000"]
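A container healthcheck could be appended to the Dockerfile; this is a sketch that assumes LiteLLM's unauthenticated `/health/liveliness` liveness endpoint (present in recent proxy versions; verify against 1.74.3):

```dockerfile
# Hypothetical addition: mark the container unhealthy when the proxy stops
# answering its liveness probe. Uses only the Python stdlib, so no extra deps.
HEALTHCHECK --interval=30s --timeout=5s --start-period=30s --retries=3 \
  CMD python -c "import urllib.request; urllib.request.urlopen('http://127.0.0.1:4000/health/liveliness')" || exit 1
```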
# docker-compose.yml
version: "3.9"
services:
  gateway:
    image: litellm-proxy:local
    build:
      context: .
      args:
        PY_BASE: ${PY_BASE:-fcr.fmr.com/python:3.11-slim}
    restart: unless-stopped
    ports:
      - "4000:4000"
    volumes:
      - ./config/litellm.yaml:/app/config/litellm.yaml:ro
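The config's `api_base` points at `host.docker.internal`, which Docker Desktop resolves automatically but a plain Linux engine does not. A hedged addition to the `gateway` service covers that case:

```yaml
# Map host.docker.internal to the host gateway on Linux engines
# (Docker Desktop already provides this alias, so it is harmless there).
services:
  gateway:
    extra_hosts:
      - "host.docker.internal:host-gateway"
```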
#!/usr/bin/env bash
set -euo pipefail
export PY_BASE="fcr.fmr.com/python:3.11-slim"
echo "Building and launching LiteLLM gateway..."
docker compose up --build --remove-orphans --detach
echo "Gateway live at http://localhost:4000"
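The compose file resolves the base image with a `${PY_BASE:-default}` fallback, and this script satisfies it by exporting `PY_BASE` first. The same expansion can be exercised in plain bash (the override host below is a hypothetical mirror, used only for illustration):

```shell
# ${VAR:-default} expands to the default only when VAR is unset or empty,
# which is how the compose file picks a base image when PY_BASE isn't exported.
unset PY_BASE
echo "${PY_BASE:-fcr.fmr.com/python:3.11-slim}"   # prints the fallback default

export PY_BASE="mirror.example/python:3.12-slim"  # hypothetical override
echo "${PY_BASE:-fcr.fmr.com/python:3.11-slim}"   # prints the override
```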
# config/litellm.yaml
proxy_server:
  host: 0.0.0.0
  port: 4000
general_settings:
  master_key: sk-admin
  enable_jwt_auth: false
model_list:
  - model_name: llama3
    litellm_params:
      model: vllm/meta-llama-3-8b-instruct
      api_base: http://host.docker.internal:8000/v1
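Clients address the gateway by the `model_name` from `model_list` and authenticate with the `master_key`. A minimal sketch of the request body, with the live call left commented out since it assumes the stack is already running:

```shell
# OpenAI-compatible chat request body; "llama3" matches model_name in the
# config above, and the Bearer token is the master_key from general_settings.
payload='{"model":"llama3","messages":[{"role":"user","content":"ping"}]}'
echo "$payload"
# With the gateway running:
#   curl -s http://localhost:4000/v1/chat/completions \
#     -H "Authorization: Bearer sk-admin" \
#     -H "Content-Type: application/json" \
#     -d "$payload"
```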