How do I list all the files in a specific S3 "directory" using Fog?
I know that S3 doesn't store files in folders but I need a way to limit the returned files to specific "folder" instead of retrieving the entire list in the bucket.
Use the prefix option on the directories.get method. Example:
def get_files(path, options)
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: options[:key],
    aws_secret_access_key: options[:secret]
  )
  connection.directories.get(options[:bucket], prefix: path).files.map do |file|
    file.key
  end
end
.map will not return ALL files, only a single page as returned by the AWS API. Calling .each on the files collection lets Fog manage pagination and memory consumption, as there could be a lot of files. – Langan
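Langan's pagination point can be illustrated without Fog or AWS credentials. The sketch below is a hypothetical stand-in, not Fog's actual internals: a collection that eagerly loads only the first "page" (so .map sees just that page) but whose .each walks every page, which is the behavior the comment describes.

```ruby
# Hypothetical sketch of a paginated collection. In Fog, .map operates on
# the page already loaded, while .each transparently fetches further pages.
class PagedFiles
  include Enumerable

  def initialize(pages)
    @pages  = pages       # stand-in for successive S3 API responses
    @loaded = pages.first # only the first page is fetched eagerly
  end

  # .map delegates to the eagerly loaded page only (a single API page).
  def map(&block)
    @loaded.map(&block)
  end

  # .each walks every page, simulating one API round-trip per page.
  def each
    @pages.each do |page|
      page.each { |key| yield key }
    end
  end
end

pages = [%w[logs/a.txt logs/b.txt], %w[logs/c.txt], %w[logs/d.txt]]
files = PagedFiles.new(pages)

first_page_keys = files.map { |k| k } # only the first page
all_keys = []
files.each { |k| all_keys << k }      # every page
```

Here `first_page_keys` contains two entries while `all_keys` contains all four, which is why .each is the safer choice for large buckets.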
Make sure to exclude the leading slash from the prefix path - including one returns a seemingly empty directory. – Putrescible
prefix is actually a suffix, at least structurally speaking. If the path to your nested "directory" is 'foo/bar', where 'foo' is the bucket, then your method call would be: .get('foo', prefix: 'bar'). – Steinman
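Steinman's point means a full path like 'foo/bar/baz' has to be split into a bucket name and a key prefix before calling directories.get. A small helper can do that; split_bucket_and_prefix is a hypothetical name, not part of Fog, and it also strips the leading slash that Putrescible warns about.

```ruby
# Hypothetical helper (not part of Fog): splits an S3-style path such as
# 'foo/bar/baz' into [bucket, prefix], matching the call shape
# connection.directories.get(bucket, prefix: prefix).
def split_bucket_and_prefix(path)
  # Drop any leading slash (a leading slash in the prefix yields an
  # apparently empty directory), then split on the first '/' only.
  bucket, prefix = path.sub(%r{\A/}, '').split('/', 2)
  [bucket, prefix.to_s]
end

split_bucket_and_prefix('foo/bar/baz') # => ['foo', 'bar/baz']
split_bucket_and_prefix('foo')         # => ['foo', '']
```

An empty prefix simply lists the whole bucket, so the helper degrades gracefully for bucket-only paths.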