I am trying to upload many thousands of files to Google Cloud Storage, with the following command:
gsutil -m cp *.json gs://mybucket/mydir
But I get this error:
-bash: Argument list too long
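As I understand it, this error comes from the kernel's limit on the total size of arguments passed to a new process (ARG_MAX), not from gsutil itself; on Linux the limit can be checked with:
getconf ARG_MAX   # prints the maximum argument-list size in bytes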
What is the best way to handle this? I can obviously write a bash script to iterate over different numbers:
gsutil -m cp 92*.json gs://mybucket/mydir
gsutil -m cp 93*.json gs://mybucket/mydir
gsutil -m cp ...*.json gs://mybucket/mydir
But the problem is that I don't know in advance what my filenames are going to be, so writing that command isn't trivial.
Is there either a way to handle this with gsutil natively (I don't think so, from the documentation), or a way to handle this in bash where I can list, say, 10,000 files at a time and then pipe them to the gsutil command?
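To illustrate what I have in mind, something like one of these (rough, untested sketches using the same paths as above):

# If gsutil does its own wildcard expansion, quoting the pattern keeps the shell out of it entirely:
gsutil -m cp "*.json" gs://mybucket/mydir

# Or, in bash, generate the list with find and stream it to gsutil on stdin (assuming cp can read a file list from stdin via a flag like -I):
find . -maxdepth 1 -name '*.json' -print | gsutil -m cp -I gs://mybucket/mydir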
Regarding gs://my-bucket1/*: the shell will still treat that string as a pattern to match, and although it will almost certainly fail to match anything, it is possible to set a shell option to treat non-matching patterns as an error rather than as a literal string. – Hadfield
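In bash, the option referred to is presumably failglob; a quick sketch of the behavior described:

shopt -s failglob
gsutil -m cp gs://my-bucket1/* .      # the shell reports "no match" and never runs gsutil
gsutil -m cp "gs://my-bucket1/*" .    # quoted, the wildcard reaches gsutil untouched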