- (Optional) Grab the image URI using the SageMaker Python SDK:
In [1]: import sagemaker

In [2]: sagemaker.image_uris.retrieve("pytorch", "us-east-1", version="1.10", image_scope="inference", instance_type="ml.p3.2xlarge")
Out[2]: '763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.10-gpu-py38'
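The same retrieve() call works for other image scopes and instance types. As a sketch, the values below (training scope, a CPU instance type) are assumptions for illustration; the exact tag in the returned URI depends on the installed SDK version:

import sagemaker

# Same lookup, but for the training image on a CPU instance type (assumed example
# values); the exact tag in the returned URI depends on your SageMaker SDK version.
print(sagemaker.image_uris.retrieve(
    "pytorch",
    "us-east-1",
    version="1.10",
    image_scope="training",
    instance_type="ml.c5.xlarge",
))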
- Docker login (the login is made against the registry URI, i.e. the portion of the image URI before the first slash):
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 763104351884.dkr.ecr.us-east-1.amazonaws.com
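Rather than copying the registry out by hand, a small sketch like the following (reusing the image URI from the previous step) derives it by splitting on the first slash:

# Derive the registry (the part before the first slash) from a full ECR image URI,
# so it can be passed to `docker login`.
image_uri = "763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.10-gpu-py38"
registry = image_uri.split("/", 1)[0]
print(registry)  # 763104351884.dkr.ecr.us-east-1.amazonaws.com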
- Docker pull:
docker pull 763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.10-gpu-py38
- Docker run (interactive shell inside the container):
docker run -it --rm 763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.10-gpu-py38 bash
- In the command below, we're also mounting the local src/ directory at /opt/ml inside the container, so any changes to src/ are automatically reflected inside the container:
docker run -d -it \
--name my_container_name \
--rm \
--volume $(pwd)/src:/opt/ml \
--workdir /opt/ml \
763104351884.dkr.ecr.us-east-1.amazonaws.com/pytorch-inference:1.10-gpu-py38
- Execute a script from the mounted src/ directory inside the running container:
docker exec -it my_container_name python3 run.py
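The contents of run.py aren't shown above; as a minimal, hypothetical example, a script like this placed in src/ would confirm that the container's PyTorch build sees the GPU:

# Hypothetical src/run.py: verify the container's PyTorch build and GPU visibility.
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))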
- Cleanup (stop and remove all containers, then remove all images and prune volumes):
docker stop $(docker ps -qa)
docker rm $(docker ps -qa)
docker rmi $(docker images -qa)
docker system prune --volumes --all --force