I'm a Rust newbie, and one of the things I've found frustrating is that the default docker build experience is extremely slow. As it downloads crates, then compiles dependencies, and finally my app, I often get distracted, start doing something else, then come back several minutes later having forgotten what I was doing.
Recently, I had the idea to make it a little better by combining multistage builds with some of the amazing features from BuildKit. Specifically, cache mounts, which let a build container cache directories for compilers & package managers. Here's a quick annotated before & after from a real app I encountered.
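If cache mounts are new to you, here's the basic shape in isolation - it's the same line we'll end up using in the improved Dockerfile below. BuildKit keeps the mounted directory around between builds and shares it across them, but it never gets baked into an image layer.

# The mounted registry directory persists across builds, but is not part of the resulting image
RUN --mount=type=cache,target=/usr/local/cargo/registry cargo build --release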
This is a standard enough multistage Dockerfile. Nothing seemingly terrible or great here - just a normal build stage, and a smaller runtime stage.
FROM rust:1.55 AS build
WORKDIR /app
# The app has 2 parts: an application named "api", and a lib named "game"
# Copy the workspace manifest and the sources
# (the root Cargo.toml is needed so the binary lands in /app/target, not /app/api/target)
COPY Cargo.toml Cargo.lock ./
COPY ./api ./api
COPY ./game ./game
# Build the app
WORKDIR /app/api
RUN cargo build --release
# Use a slim image with just our app to publish
FROM debian:buster-slim AS app
COPY --from=build /app/target/release/my-app /
CMD ["/my-app"]
This corresponds to the following build times:
# Let's pre-pull the bases so we don't unnecessarily penalize the first build
docker pull rust:1.55
docker pull debian:buster-slim
# First build from scratch
time docker build .
real 5m43.506s
user 0m1.239s
sys 0m0.872s
# Change a file in api/src, and build again
time docker build .
real 5m44.731s
user 0m1.199s
sys 0m0.938s
Wow, almost 6 minutes. Yes, in day-to-day development I'm probably running cargo build outside of Docker, so the real effect isn't this drastic, but this is an eternity for my short attention span. This is our baseline - let's see if we can improve it.
Here we're going to keep multistage builds, but we'll make a few changes:
- Split layers so that we cache compiled dependencies. Turns out this is harder in Rust than in other languages (there's a short contrast sketch just after this list).
- Use BuildKit + cache mounts. This will save us some download time when we have to rebuild dependencies.
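For contrast, here's a rough sketch of the equivalent dependency layer in an ecosystem that can install from the manifest alone - Node, purely as an illustration (this app has nothing to do with Node):

# Contrast sketch (Node): the manifest alone is enough to install dependencies,
# so this layer stays cached until package.json / package-lock.json change
COPY package.json package-lock.json ./
RUN npm ci
# Sources are copied afterwards, so editing them doesn't invalidate the layer above
COPY ./src ./src

Cargo has no built-in "install only the dependencies" step, which is why the Dockerfile below fakes one by generating dummy crates with cargo new and dropping the real Cargo.toml files on top.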
# syntax=docker/dockerfile:1.3-labs
# The above line is so we can use heredocs in Dockerfiles. No more && and \!
# https://www.docker.com/blog/introduction-to-heredocs-in-dockerfiles/
FROM rust:1.55 AS build
# Capture dependencies
COPY Cargo.toml Cargo.lock /app/
# We create a dummy lib, then overwrite its generated Cargo.toml with our real one
RUN cargo new --lib /app/game
COPY game/Cargo.toml /app/game/
# We do the same for our app
RUN cargo new /app/api
COPY api/Cargo.toml /app/api/
# This step compiles only our dependencies and saves them in a layer. This is where the biggest time savings come from
# Note the use of --mount=type=cache. On subsequent runs, we'll have the crates already downloaded
WORKDIR /app/api
RUN --mount=type=cache,target=/usr/local/cargo/registry cargo build --release
# Copy our sources
COPY ./api /app/api
COPY ./game /app/game
# A bit of magic here!
# * We're mounting that cache again to use during the build, otherwise it's not present and we'll have to download those again - bad!
# * EOF syntax is neat but not without its drawbacks. We need to `set -e`, otherwise the build will happily continue past a failing command
# * Rust here is a bit fiddly, so we'll touch the files (even though we copied over them) to force a new build
RUN --mount=type=cache,target=/usr/local/cargo/registry <<EOF
set -e
# update timestamps to force a new build
touch /app/game/src/lib.rs /app/api/src/main.rs
cargo build --release
EOF
CMD ["/app/target/release/my-app"]
# Again, our final image is the same - a slim base and just our app
FROM debian:buster-slim AS app
COPY --from=build /app/target/release/my-app /my-app
CMD ["/my-app"]
And the big test - did it help at all? Let's see:
# We have rust / debian pulled from before
# We need to use BuildKit for these features, so let's turn that on
export DOCKER_BUILDKIT=1
# Build from scratch!
time docker build .
real 5m51.538s
user 0m1.209s
sys 0m0.933s
# The big moment - change a file in src and rebuild
time docker build .
real 0m36.053s
user 0m0.148s
sys 0m0.145s
Great success! Container build times dropped from 5m44s to 0m36s!
Note: the cache mount target /usr/local/cargo/registry depends on the CARGO_HOME env var, which the official rust image sets to /usr/local/cargo.
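If you're building on a different base image, a quick way to confirm where the registry will actually live (and therefore what the cache mount target should be) is to print the variable from the image itself:

# Check CARGO_HOME in the build image so the cache mount target matches
docker run --rm rust:1.55 sh -c 'echo "$CARGO_HOME"'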