How can I transfer data from an EBS volume to an S3 bucket? [closed]

I have about 400 GB of data on an Amazon EBS volume, and I need this data in an S3 bucket for use with Hadoop on Amazon EMR.

How can I move or copy the data from the EBS volume to the S3 bucket? (Both the S3 bucket and the EBS volume are in the same AWS region.)

Thanks

Ethelstan answered 10/7, 2014 at 9:45

The AWS Command Line Interface is now the recommended tool for all things AWS:

The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.

On top of this unified approach to all AWS APIs, it also adds a set of simple file commands for efficient file transfers to and from Amazon S3, with characteristics similar to well-known Unix commands. For the task at hand:

  • cp - Copies a local file or S3 object to another location locally or in S3.
  • sync - Syncs directories and S3 prefixes.
  • ...

So cp would be sufficient for your use case, but be sure to check out sync as well; it is particularly powerful for many common scenarios (and subsumes cp, depending on the arguments).
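
For example, a minimal sketch of both commands, assuming the EBS volume is mounted at /mnt/ebs and a bucket named your-bucket (both names are hypothetical):

aws s3 cp /mnt/ebs/data.csv s3://your-bucket/input/data.csv
aws s3 sync /mnt/ebs/ s3://your-bucket/input/

sync compares source and destination, so re-running it after an interruption only transfers the missing or changed files.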

Unquote answered 10/7, 2014 at 12:5

First, get your credentials from AWS via IAM:

Go to AWS / Services / IAM, select Users, and create an administrator user. Then download your credentials.csv and run in a shell:

aws configure

Enter your Access key ID and Secret access key.

Then sync files and folders from the EBS volume (mounted on your EC2 instance) to S3:

aws s3 sync /ebs-directory/ s3://your-bucket
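
To preview what would be transferred before committing, and to verify the upload afterwards, you can use the CLI's --dryrun flag and the ls command (same placeholder bucket name as above):

aws s3 sync /ebs-directory/ s3://your-bucket --dryrun
aws s3 ls s3://your-bucket --recursive --human-readable --summarize
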
Vascular answered 10/5, 2018 at 14:39

Create an IAM user with the AmazonEC2FullAccess and AdministratorAccess permissions, then download the access key ID and secret access key, like this (the values below are examples):

User - ec2tos3-data-transfer 
Access key ID - AKIAWTUBRTSRTHMZI4
Secret access key - 8BDArkKhkt6k7fnt9n4552mFl+PGNyOKx8

Copy data from EBS to S3. First install and configure the AWS CLI:

sudo apt install awscli
aws configure
AWS Access Key ID [None]: AKIAWTUBRTSRTHMZI4
AWS Secret Access Key [None]: 8BDArkKhkt6k7fnt9n4552mFl+PGNyOKx8
Default region name [None]: us-east-2       # EC2 instance region
Default output format [None]: json

Sync data from EBS to S3:

aws s3 sync uploads/ s3://bucketname

Or, to make the uploaded objects publicly readable:

aws s3 sync uploads/ s3://bucketname --acl public-read

Copy data from EBS to S3 (cp needs --recursive to copy a directory):

aws s3 cp uploads/ s3://bucketname --recursive

Or, to make the objects publicly readable:

aws s3 cp uploads/ s3://bucketname --recursive --acl public-read
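
If only a subset of the files is needed for EMR, both sync and cp accept --exclude and --include filters; a sketch with a hypothetical *.csv pattern:

aws s3 sync uploads/ s3://bucketname --exclude "*" --include "*.csv"
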
Faithfaithful answered 10/10, 2019 at 13:4
