I have a Docker image that is 1.46GB on my local machine, but when it is pushed to AWS ECR (either from my local machine or via CircleCI deployment) it is only 537.05MB. I'm pretty new to Docker and to AWS, so any help figuring out why this might be would be appreciated!
I have a feeling that it has not fully uploaded to ECR for whatever reason, as I am trying to use this image for a Batch job, and the same command that works locally does not work in the job definition. The command is simply python app.py, but I have also tried the absolute path python /usr/local/src/app/app.py, both of which result in [Errno 2] No such file or directory.
The commands used in my Makefile deployment are as follows:
docker build --force-rm=true -t $(EXTRACTOR_IMAGE_NAME) ./extractor
docker tag $(EXTRACTOR_IMAGE_NAME) $(EXTRACTOR_ECR_IMAGE_NAME)
$(shell aws ecr get-login --no-include-email)
docker push ${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com/$(EXTRACTOR_ECR_REPO)
Edit 1: I think this might be to do with the size of the base image, which is python:2.7 in this case. The base image is 914MB, and adding my 537.05MB ECR image gives 1451.05MB, i.e. roughly the 1.46GB I see locally. Still not sure what the issue is with the Batch command, though...
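For reference, this is roughly how I compared the sizes, written with the same variable names as in the Makefile above (substitute the actual image and repository names):

# Local (uncompressed) size and per-layer breakdown
docker images $(EXTRACTOR_IMAGE_NAME)
docker history $(EXTRACTOR_IMAGE_NAME)

# Size of the image as reported by ECR
aws ecr describe-images --repository-name $(EXTRACTOR_ECR_REPO) \
    --query 'imageDetails[].imageSizeInBytes'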
Edit 2: I've been mounting my code into the container with a volume, which is why it has been working locally. I forgot to copy the code into the image at build time, which I assume is the only reason it is not working in Batch!
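For anyone who hits the same thing, here is a minimal sketch of the corrected Dockerfile; the WORKDIR path matches the one above, while requirements.txt and the app.py entry point are assumptions about my layout:

FROM python:2.7

WORKDIR /usr/local/src/app

# Install dependencies first so this layer caches independently of code changes
# (requirements.txt is an assumption about how dependencies are managed)
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the application code into the image; this is the step that was missing,
# since locally the code was supplied by the volume mount instead
COPY . .

CMD ["python", "app.py"]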