How to list all files in an S3 folder using Fog in Ruby
How do I list all the files in a specific S3 "directory" using Fog?

I know that S3 doesn't store files in folders, but I need a way to limit the returned files to a specific "folder" instead of retrieving the entire list of objects in the bucket.

Evita asked 11/4, 2013 at 18:18
Pass the prefix option to directories.get. Example:

def get_files(path, options)
  # Connect to S3
  connection = Fog::Storage.new(
    provider: 'AWS',
    aws_access_key_id: options[:key],
    aws_secret_access_key: options[:secret]
  )
  # Fetch the bucket, limiting the listing to keys that start
  # with `path`, then return those keys
  connection.directories.get(options[:bucket], prefix: path).files.map do |file|
    file.key
  end
end
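
A hypothetical call, assuming a bucket named my-bucket with objects under an uploads/ prefix (the credentials are placeholders):

get_files('uploads/', key: 'AKIA...', secret: '...', bucket: 'my-bucket')
# => ["uploads/report.pdf", "uploads/logo.png", ...]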
Evita answered 11/4, 2013 at 18:18. Comments (6):
It's probably worth noting that prefix is actually a suffix, at least structurally speaking: it is the part of the path after the bucket name. If the path to your nested "folder" is 'foo/bar', then your method call would be: .get('foo', prefix: 'bar'). – Steinman
Steinman's note is a bit confusing if you don't think of the bucket name as being part of the path. – Tobin
There is an edge case here: calling .map will not return ALL the files, only a single page as returned by the AWS API. Calling .each on the files instead lets Fog manage pagination and memory consumption, as there could be a lot of files. – Langan
@Langan good point. For a very large folder, using .each and passing a block with whatever you want to do with each file would be the best pattern; see the sketch after these comments. – Evita
Hi, I know this is old, but does anyone know how to control the sort order? I am using UUIDs for my file names and realizing the list comes back alphabetical, not in upload order :( – Superordinate
Make sure to omit the leading / from the prefix path; including it returns a seemingly empty directory. – Putrescible
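
Following up on the pagination point above, a minimal sketch of the .each pattern (bucket name and prefix are placeholders; credentials come from the environment). Fog fetches further pages from the S3 API as the iteration advances, so this handles "folders" of any size:

connection = Fog::Storage.new(
  provider: 'AWS',
  aws_access_key_id: ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY']
)

# .each pages through the listing behind the scenes, unlike .map/.all,
# which only see the first page returned by S3
connection.directories.get('my-bucket', prefix: 'uploads/').files.each do |file|
  puts file.key
end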
