I used the following command:
aws s3 ls s3://mybucket/mydir --recursive > bigfile
The resulting file was too large (9.5 MB) to conveniently work with, since I need to eyeball the information I'm looking for.
All I really need is the information three levels down. Is it possible to adjust this command so that it only recurses down N levels instead of all the way down every directory? I don't see anything like -maxdepth for the S3 CLI ls command.
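As far as I know, aws s3 ls has no depth flag, but the recursive listing can be post-filtered by counting the '/' separators in each key. A minimal sketch, run here on hypothetical sample lines in the same date/time/size/key layout that the recursive listing produces (the bucket keys are made up for illustration):

```shell
# Stand-in for the output of: aws s3 ls s3://mybucket/mydir --recursive
listing='2023-01-01 12:00:00     1024 mydir/a/b/file1.txt
2023-01-01 12:00:01     2048 mydir/a/b/c/d/file2.txt
2023-01-01 12:00:02      512 mydir/x/file3.txt'

# Keep only keys at most three levels deep (<= 3 slashes in the key field).
# gsub returns the number of replacements, i.e. the slash count.
printf '%s\n' "$listing" | awk '{ n = gsub(/\//, "/", $4); if (n <= 3) print $4 }'
```

In the real pipeline you would replace the printf with the aws s3 ls command and adjust the slash threshold to the depth you need.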
Update: Here is the command I ended up using to get the info I needed, though I'm not satisfied with it. It still gave me 77,000 results when I only wanted the 40 or so unique values, but it was short enough to pull into Excel and whittle down with text-to-columns and remove-duplicates.
aws s3 ls s3://mybucket/mydir --human-readable --summarize --recursive | egrep '_keytext_' | tr -s ' ' | cut -d' ' -f5 > smallerfile
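The Excel remove-duplicates step can probably be folded into the pipeline itself with sort -u, which would collapse the ~77,000 matches down to the unique values directly. A sketch on made-up sample lines (the _keytext_ pattern is carried over from the command above; in practice the printf would be the aws/egrep/tr/cut pipeline):

```shell
# Sample values standing in for the fifth column extracted above.
printf '%s\n' foo_keytext_a foo_keytext_a bar_keytext_b \
  | grep '_keytext_' | sort -u
# prints: bar_keytext_b, then foo_keytext_a (duplicates collapsed)
```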