Dockerfile

FROM python:3.6-slim-stretch

ADD requirements.txt /tmp/requirements.txt

RUN apt-get update && \
    apt-get install -y \
        build-essential \
        make \
        gcc \
        locales \
        libgdal20 libgdal-dev && \
    python -m pip install numpy cython --no-binary numpy,cython && \
    python -m pip install \
        "rasterio>=1.0a12" fiona shapely \
        --pre --no-binary rasterio,fiona,shapely && \
    python -m pip install -r /tmp/requirements.txt && \
    python -m pip uninstall -y cython && \
    rm -r /root/.cache/pip && \
    apt-get remove -y --purge libgdal-dev make gcc build-essential && \
    apt-get autoremove -y && \
    rm -rf /var/lib/apt/lists/*

RUN dpkg-reconfigure locales && \
    locale-gen C.UTF-8 && \
    /usr/sbin/update-locale LANG=C.UTF-8

ENV LC_ALL C.UTF-8
CMD ["python"]
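The --no-binary rasterio,fiona,shapely flags force pip to build those packages from source against the system libgdal installed above, rather than pulling pre-built wheels with a bundled GDAL; the -dev headers and compilers are then purged in the same RUN to keep the image small. A quick sanity check of the result (a sketch, not part of the gist; the slimpy:latest tag comes from the Makefile below):

    # Build the image, then confirm rasterio and fiona report the GDAL they were compiled against.
    docker build -t slimpy:latest .
    docker run --rm slimpy:latest python -c "import rasterio, fiona; print(rasterio.__gdal_version__, fiona.__gdal_version__)"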
Makefile

default: image test notebook

.PHONY: default image test shell notebook

image:
	docker build --tag slimpy:latest .

test:
	docker run -it --rm slimpy:latest python -c "import rasterio, fiona, shapely, numpy, sys; print('Python:', sys.version); print('Rasterio:', rasterio.__version__); print('Fiona:', fiona.__version__); print('Shapely:', shapely.__version__); print('Numpy:', numpy.__version__)"

shell: image
	docker run -it --rm slimpy:latest /bin/bash

notebook:
	docker run -it --rm \
		-p 0.0.0.0:8888:8888 \
		--volume $(shell pwd)/notebooks/:/notebooks \
		slimpy:latest /bin/bash -c "cd /notebooks && jupyter notebook --ip=0.0.0.0 --allow-root"
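Typical usage, given the targets above:

    make image      # build the slimpy:latest image from the Dockerfile
    make test       # print the Python, Rasterio, Fiona, Shapely and Numpy versions inside the container
    make notebook   # serve Jupyter from ./notebooks on port 8888 (open the tokenized URL it prints)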
requirements.txt

rasterio>=1.0a12
fiona | |
shapely | |
pandas | |
geopandas | |
pyproj | |
mapbox | |
rasterstats | |
mapboxgl | |
jupyter | |
mercantile | |
scipy | |
scikit-learn | |
seaborn | |
statsmodels | |
ggplot
Hi, does it work on Heroku? Regards
Thanks for this.
@Biaggio74 you could install each package as a separate Docker RUN command, with the packages most likely to need updating toward the end. What I'm going for in this image is a single RUN command, optimized for minimal container size.
For applications that need frequent Python package updates, you could automate the build step, e.g. have Travis CI build and push the container on each commit. Ideally, if you were using this in production, you would pin versions explicitly in requirements.txt, so that "upgrading" is just a matter of pushing a change to that file. A sketch of the separate-RUN alternative follows below.
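For illustration, a minimal sketch of that layer-per-step alternative (a hypothetical Dockerfile fragment, not the one in this gist, which deliberately keeps everything in a single RUN so the build tools can be purged in the same layer):

    # Hypothetical variant: separate RUN instructions so Docker's layer cache can
    # reuse the earlier, rarely-changing layers when later steps change.
    FROM python:3.6-slim-stretch
    RUN apt-get update && apt-get install -y build-essential gcc locales libgdal20 libgdal-dev
    RUN python -m pip install numpy cython --no-binary numpy,cython
    RUN python -m pip install "rasterio>=1.0a12" fiona shapely --pre --no-binary rasterio,fiona,shapely
    # The packages most likely to change come last, so only these layers get rebuilt.
    COPY requirements.txt /tmp/requirements.txt
    RUN python -m pip install -r /tmp/requirements.txt

The trade-off is size: compilers and -dev packages removed in a later layer would still be baked into the earlier ones, so the resulting image is considerably larger than the single-RUN version.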
Thank you! This was exactly what I needed. I made a fork that uses docker-compose in place of the Makefile, here: https://gist.github.com/johnniehard/90a7f4fc1b0701360f67ba77b9b50c7a
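For reference, a docker-compose service along those lines might look like the following (a hypothetical sketch mirroring the Makefile's notebook target, not the contents of the linked fork):

    # docker-compose.yml (hypothetical)
    version: "3"
    services:
      notebook:
        build: .
        image: slimpy:latest
        ports:
          - "8888:8888"
        volumes:
          - ./notebooks:/notebooks
        command: /bin/bash -c "cd /notebooks && jupyter notebook --ip=0.0.0.0 --allow-root"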
@perrygeo Is there a way to install new dependencies without rebuilding the whole Docker image? For instance, when a new package is added at the end of requirements.txt and nothing else has changed. Or should this be handled in the Dockerfile, so that the build sees which layers are unchanged and skips reinstalling them?
(When working in a Jupyter notebook, a new package often turns out to be needed, and the full rebuild takes quite a while each time.)
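One possible workaround (not from the original thread): install the package ad hoc inside the running container while experimenting, then fold it back into requirements.txt for the next rebuild. Note that because requirements.txt is ADDed before the single big RUN, any change to it invalidates that layer and triggers a full reinstall; and because gcc and the -dev headers are purged from the final image, only pure-Python packages or those that ship wheels will install cleanly inside the running container.

    # Find the running container and install into it (changes are lost when the container is removed):
    docker ps
    docker exec -it <container-id> python -m pip install <new-package>

    # For a durable change, add the package to requirements.txt and rebuild:
    make image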