Created December 20, 2019 23:57
Dockerfile for using Tensorflow Serving Docker images on AWS Sagemaker
# Use the official TensorFlow Serving image from Docker Hub as the base image
FROM tensorflow/serving

# Install NGINX, used to reverse proxy predictions from SageMaker to TF Serving
RUN apt-get update && apt-get install -y --no-install-recommends nginx git

# Copy our model folder into the container
COPY <<local model directory>> /<<model directory inside Docker image>>

# Copy the NGINX configuration into the container
COPY nginx.conf /etc/nginx/nginx.conf

# Start NGINX and TF Serving pointing to our model
ENTRYPOINT service nginx start | tensorflow_model_server \
  --model_base_path=/<<model directory inside Docker image>> \
  --model_name=<<name your model here>> \
  --rest_api_port=8501
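The `nginx.conf` copied above is not shown in the gist. A minimal sketch of what it could look like, assuming TF Serving's REST API on port 8501 and a placeholder model name `my_model`: SageMaker requires the container to answer health checks on `/ping` and inference requests on `/invocations`, both on port 8080, so NGINX listens there and forwards `/invocations` to TF Serving's predict endpoint.

```nginx
events { worker_connections 1024; }

http {
  server {
    # SageMaker containers must listen on port 8080.
    listen 8080 deferred;

    # SageMaker health checks hit /ping and expect HTTP 200.
    location /ping {
      return 200 '{"status": "ok"}';
    }

    # SageMaker sends inference requests to /invocations;
    # proxy them to TF Serving's REST predict endpoint.
    # "my_model" is a placeholder for the --model_name used above.
    location /invocations {
      proxy_pass http://localhost:8501/v1/models/my_model:predict;
    }
  }
}
```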
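For reference, TF Serving's REST API expects a JSON body of the form `{"instances": [...]}`; SageMaker forwards the request body it receives on `/invocations` unchanged, so clients should send this format. A small sketch of building such a payload (`build_predict_request` is an illustrative helper, not part of the gist):

```python
import json

def build_predict_request(inputs):
    """Serialize a batch of inputs into a TF Serving REST predict payload."""
    # TF Serving's row format: one entry in "instances" per input example.
    return json.dumps({"instances": inputs})

# A single example with three features; this string would be POSTed to
# the SageMaker endpoint (which NGINX proxies to TF Serving).
payload = build_predict_request([[1.0, 2.0, 3.0]])
```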