Mount s3fs as docker volume
So I just want to add my S3 bucket from Amazon to my Docker swarm. I've seen many "possible" solutions on the internet, but I haven't been able to make any of them add the content of my bucket as a volume.

The last thing I tried was the command stated here (Is s3fs not able to mount inside docker container?):

docker run --rm -t -i --privileged -e AWS_ACCESS_KEY_ID=XXXX -e AWS_SECRET_ACCESS_KEY=XXXX -e AWS_STORAGE_BUCKET_NAME=XXXX docker.io/panubo/s3fs bash

It works pretty well, but if I exit bash the container stops and I can't do anything with it. Is it possible to make this container stay running and add it as a volume?

Or would it be better to just mount the bucket on my Docker host and then add it as a local volume?
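
Roughly what I have in mind for that host-mount variant, as a sketch only (the bucket name, mount point, and credentials file below are placeholders, and s3fs would need to be installed on the host):

# credentials file for s3fs (placeholder values)
echo "AWS_ACCESS_KEY_ID:AWS_SECRET_ACCESS_KEY" > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs

# mount the bucket on the host (placeholder bucket and path);
# allow_other lets container processes read the mount and may need
# user_allow_other enabled in /etc/fuse.conf
mkdir -p /mnt/mybucket
s3fs mybucket /mnt/mybucket -o passwd_file=${HOME}/.passwd-s3fs -o allow_other

# bind-mount the host directory into a container
docker run --rm -it -v /mnt/mybucket:/data alpine ls /data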

Carrara answered 20/3, 2018 at 8:56 Comment(0)

I've made it!

The configuration looks like this:

docker-compose.yml

volumes:
  s3data:
    driver: local

services:
  s3vol:
    image: elementar/s3-volume
    command: /data s3://{BUCKET NAME}
    environment:
      - BACKUP_INTERVAL={INTERVAL IN MINUTES (e.g. 2m)}
      - AWS_ACCESS_KEY_ID={KEY}
      - AWS_SECRET_ACCESS_KEY={SECRET}
    volumes:
      - s3data:/data

After inserting this into the docker-compose file, you can use the S3 storage as a volume in any other service defined under services:, like this:

docker-compose.yml

linux:
  image: {IMAGE}
  volumes:
    - s3data:/data
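
As a quick smoke test (a sketch, assuming the compose file above and the Docker Compose CLI), you can bring the stack up and check that the bucket contents have been synced into the shared volume:

# start both services in the background
docker-compose up -d

# the files pulled down from the bucket should appear here
docker-compose exec linux ls /data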

Hope this helps some of you in the future!

Cheers.

Carrara answered 20/3, 2018 at 9:37 Comment(4)
Awesome, s3vol is great! But I think I have one problem: s3vol downloads the files from S3 and copies them to the server's disk, and my AWS S3 bucket is over 10 terabytes. I also want to use s3vol on multiple servers. Does each server then need a mounted disk larger than 10 terabytes? I'm worried about the cost and time for a bucket that large.Sternson
elementar/s3-volume is built to "run short lived processes that work with and persist data to and from S3". It is especially well suited to synchronising and restoring backups between S3 and local disk. A transparent S3 proxy for Docker volumes does indeed also sound promising.Edmonton
Does this work if the bucket is not hosted on AWS?Bushtit
Do you think it's possible to run multiple commands in the "command" property? I need to fetch from two S3 buckets and copy them into two separate directories.Autopilot
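
One way that simply follows the pattern from the answer above would be to declare one s3-volume service per bucket, each syncing into its own named volume; this is only a sketch, and the service, volume, and bucket names are placeholders:

volumes:
  s3data-a:
  s3data-b:

services:
  s3vol-a:
    image: elementar/s3-volume
    command: /data s3://{BUCKET A}
    environment:
      - AWS_ACCESS_KEY_ID={KEY}
      - AWS_SECRET_ACCESS_KEY={SECRET}
    volumes:
      - s3data-a:/data
  s3vol-b:
    image: elementar/s3-volume
    command: /data s3://{BUCKET B}
    environment:
      - AWS_ACCESS_KEY_ID={KEY}
      - AWS_SECRET_ACCESS_KEY={SECRET}
    volumes:
      - s3data-b:/data

Other services can then mount s3data-a and s3data-b at the two target directories.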
