I have about 400 GB of data on an Amazon EBS volume and I need this data in an S3 bucket for Hadoop EMR usage.
How can I move/copy data from an EBS volume to an S3 bucket (both the S3 bucket and the EBS volume are in the same AWS region)?
Thanks
The AWS Command Line Interface is now the recommended choice for all things AWS:
The AWS Command Line Interface (CLI) is a unified tool to manage your AWS services. With just one tool to download and configure, you can control multiple AWS services from the command line and automate them through scripts.
On top of this unified approach to all AWS APIs, it also adds a set of simple file commands for efficient file transfers to and from Amazon S3, with characteristics similar to the well-known Unix commands.
For the task at hand, cp would be sufficient for your use case, but be sure to check out sync as well; it is particularly powerful for many frequently encountered scenarios and, depending on the arguments, effectively subsumes cp.
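For example (a minimal sketch: the mount point /mnt/ebs-data and the bucket my-emr-input are placeholder names, and the guard simply skips the commands where the CLI is not installed):

```shell
# Placeholders -- adjust to your own mount point and bucket.
SRC=/mnt/ebs-data
DEST=s3://my-emr-input/data/

if command -v aws >/dev/null 2>&1; then
    # One-shot recursive copy of the whole directory tree
    aws s3 cp "$SRC" "$DEST" --recursive

    # Incremental: uploads only new or changed files, so it can be
    # re-run safely if a large transfer is interrupted
    aws s3 sync "$SRC" "$DEST"
fi
```

Because sync skips files that already exist in the bucket with the same size, it is usually the safer choice for a 400 GB transfer that might be interrupted.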
First, get your credentials from AWS via IAM:
Go to AWS / Services / IAM, select Users, and create an administrator user. Then download its credentials.csv and run in a shell:
aws configure
Enter your Access key ID and Secret access key.
Then copy files and folders from the EBS volume on your EC2 instance to S3:
aws s3 sync /ebs-directory/ s3://your-bucket
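After the sync completes, a quick sanity check is to compare the bucket's object count and total size against the source directory (a sketch; the bucket name and directory are the placeholders from the command above, and the guard skips the check where the CLI or the directory is missing):

```shell
# Placeholder bucket name -- replace with your own.
BUCKET=s3://your-bucket

if command -v aws >/dev/null 2>&1 && [ -d /ebs-directory ]; then
    # Total object count and size in the bucket
    aws s3 ls "$BUCKET" --recursive --summarize | tail -n 2
    # Size of the source directory, for comparison
    du -sh /ebs-directory/
fi
```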
Create an IAM user with the AmazonEC2FullAccess and AdministratorAccess permissions, then download its Access key ID and Secret access key. For example:
User - ec2tos3-data-transfer
Access key ID - AKIAWTUBRTSRTHMZI4
Secret access key - 8BDArkKhkt6k7fnt9n4552mFl+PGNyOKx8
# Copy data from EBS to S3
sudo apt install awscli
aws configure
AWS Access Key ID [None]: AKIAWTUBRTSRTHMZI4
AWS Secret Access Key [None]: 8BDArkKhkt6k7fnt9n4552mFl+PGNyOKx8
Default region name [None]: us-east-2   # EC2 instance region
Default output format [None]: json
Sync data from EBS to S3:
aws s3 sync uploads/ s3://bucketname
Or, to make the objects publicly readable:
aws s3 sync uploads/ s3://bucketname --acl public-read
Copy data from EBS to S3 (note that cp needs --recursive to copy a directory):
aws s3 cp uploads/ s3://bucketname --recursive
Or, to make the objects publicly readable:
aws s3 cp uploads/ s3://bucketname --recursive --acl public-read
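For a transfer in the hundreds of gigabytes, raising the CLI's S3 concurrency can speed things up noticeably. This sketch uses the documented aws configure set mechanism; the value 20 is an arbitrary example, and the bucket/directory names are the placeholders from above:

```shell
# Placeholder bucket name -- replace with your own.
BUCKET=s3://bucketname

if command -v aws >/dev/null 2>&1; then
    # Raise the number of parallel S3 requests (the CLI default is 10)
    aws configure set default.s3.max_concurrent_requests 20

    # Re-run the sync; it resumes, uploading only missing or changed files
    aws s3 sync uploads/ "$BUCKET"
fi
```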