Issues downloading very large docker images
Update: I now think caching is the solution.

I am trying to download an extremely large Docker image over a slow and unstable connection. Today I managed to download 12GB at great financial cost and after far too much time, and the download still failed after about 10GB (rain knocked out the connection). When the download fails with the normal command (below), I have to start over from scratch. Does anyone know a better way to do this with current software? I am using Mac OSX Sierra and the latest version of Docker.

Command: docker run --rm -it kaggle/rstats
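One mitigation, sketched below under the assumption that the image name from the question is the one being pulled: separate the pull from the run and retry it. Docker caches each layer that finishes downloading, so re-running a failed pull only restarts the layer that was in flight, not the layers already completed.

```shell
# Retry the pull until every layer reports "Pull complete"; finished layers
# are cached, so each retry only redoes the partially-downloaded layer.
until docker pull kaggle/rstats; do
    echo "pull failed, retrying in 30s..." >&2
    sleep 30
done

# Once the image is fully local, run it as before:
docker run --rm -it kaggle/rstats
```

This does not make any single layer resumable mid-download, but it stops a drop at 10GB from throwing away every completed layer.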

Durman asked 8/8, 2017 at 21:58 · Comments (3)
Did you see the Docker daemon preferences? You can adjust the maximum number of concurrent downloads for each pull. See docs.docker.com/engine/reference/commandline/dockerd and search for --max-concurrent-downloads. – Typecast
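A sketch of that suggestion, assuming a Linux-style config path; on Docker for Mac the same JSON goes into Preferences > Daemon > Advanced. Counter-intuitively, on an unstable link it can help to *lower* the setting from its default of 3 to 1, so layers download one at a time and a connection drop costs at most one partially-downloaded layer.

```shell
# Write the daemon config (path is the Linux default; adjust for your setup):
cat <<'EOF' | sudo tee /etc/docker/daemon.json
{
  "max-concurrent-downloads": 1
}
EOF
# Restart the Docker daemon afterwards so the setting takes effect.
```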
One possible solution I can think of: use a server to save the image to a tar file (docs.docker.com/engine/reference/commandline/save/…), then use a download manager on your system to fetch the tar and import the image. – Blissful
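That comment's workflow can be sketched as follows; example.com and the archive name are placeholders, and the assumption is that you control a remote server with a fast, stable connection.

```shell
# On the remote server: pull once, export the image, record a checksum.
docker pull kaggle/rstats
docker save kaggle/rstats | gzip > rstats.tar.gz
sha256sum rstats.tar.gz > rstats.tar.gz.sha256
# (serve both files over HTTP however you like)

# On the Mac: use any resume-capable client; curl -C - continues a
# partial file, so you can just re-run this after every connection drop.
curl -C - -O https://example.com/rstats.tar.gz
curl -O https://example.com/rstats.tar.gz.sha256

# Verify the transfer completed intact, then import the image locally:
shasum -a 256 -c rstats.tar.gz.sha256
docker load < rstats.tar.gz
```

The key advantage is that the resumability problem moves from `docker pull` (which restarts an interrupted layer) to an ordinary HTTP download, which curl or any download manager can resume byte-for-byte.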
Thanks, I didn't think of that. Let me find a way of doing that. – Durman
