I've had a lot of trouble fetching repositories over a high-latency satellite connection. I had no success with the various config options that are commonly suggested, which seem to be repeated without any real explanation of why they ought to work. My bandwidth is a couple of Mb/s and is sufficient to download large files (hundreds of MB) in a reasonable amount of time, but some other instability in the connection seems to cause fetch to fail.
One solution that I haven't seen mentioned here is to try SSH instead of HTTPS (in combination with other suggestions like shallow cloning). This has been a lot more successful for me in cases where HTTPS would reliably fail. I imagine most people reading this are trying to clone from GitHub, so try setting up an SSH key and using:
git clone --depth=1 --no-tags git@github.com:organisation/repo.git
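If you haven't used SSH with GitHub before, a minimal key setup looks something like this (the key type and email comment are just common defaults, and the public key gets pasted at https://github.com/settings/keys):
ssh-keygen -t ed25519 -C "you@example.com"   # placeholder comment; accept the default file path or pick your own
cat ~/.ssh/id_ed25519.pub                    # copy this into GitHub > Settings > SSH and GPG keys
ssh -T git@github.com                        # optional: verify the key is accepted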
A backup solution is to clone the repository somewhere else, ideally your own server; but since many folks don't have access to one, I've found Google Colab is very serviceable. This also works if SSH is blocked on your network:
!git clone --depth=1 --no-tags https://github.com/some/repo.git
!tar -czf repo.tar.gz repo
and then download the tarball via the file explorer in the browser. You could also copy it to Google Drive, use scp/rsync, or even cloud storage if you have the means. Running git fetch --unshallow in the extracted repository also generally seems to work.
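For example, assuming the tarball produced by the Colab commands above and the same placeholder repository:
tar -xzf repo.tar.gz
cd repo
git remote set-url origin git@github.com:organisation/repo.git   # repoint origin if you want SSH rather than the HTTPS URL Colab cloned from
git fetch --unshallow    # fetch the rest of the history over the flaky link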
Is --depth=1 a solution? – Cleotildeclepe
…kernel/git/torvalds/linux.git. And a resumable git clone is being discussed (March 2016). See https://mcmap.net/q/111139/-if-a-git-fetch-is-cancelled-half-way-will-it-resume. – Bump
Wouldn't git init, setting a remote, and then doing fetch until it succeeds do the trick? I don't think fetch discards successfully downloaded objects if the connection fails. – Flippant
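A sketch of that last idea, with the same placeholder repository (note that whether an interrupted fetch actually keeps partial progress depends on your git version and transport):
git init repo && cd repo
git remote add origin git@github.com:organisation/repo.git
until git fetch origin; do sleep 10; done    # keep retrying until a fetch completes
git checkout -b master origin/master         # adjust to the repo's default branch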