amazon-data-pipeline Questions
3
Solved
When I navigate to the AWS Data Pipeline console it shows this banner:
Please note that Data Pipeline service is in maintenance mode and we are not planning to expand the service to new regions. We pla...
Strident asked 13/12, 2022 at 9:22
2
I'd like to define some parameters in the console of AWS DataPipeline, but am not able to do so. The parameters are going to be called in a SqlActivity, so when I try to refer to them in the in-lin...
Thymic asked 8/6, 2015 at 16:33
4
Automatic AWS DynamoDB to S3 export failing with "role/DataPipelineDefaultRole is invalid"
Precisely following the step-by-step instructions on this page I am trying to export contents of one of my DynamoDB tables to an S3 bucket. I create a pipeline exactly as instructed but it fails to...
Kellene asked 6/3, 2015 at 20:21
3
I have been using the UNLOAD statement in Redshift for a while now; it makes it easier to dump the file to S3 and then allow people to analyse it.
The time has come to try to automate it. We have Amazon...
Glendon asked 14/4, 2016 at 7:47
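The UNLOAD automation asked about above usually boils down to generating the UNLOAD statement that a scheduled SqlActivity would run. A minimal sketch, assuming hypothetical table, bucket, and IAM role names (none of these come from the question):

```python
# Sketch: build the UNLOAD statement a Data Pipeline SqlActivity could run.
# Table, S3 prefix, and IAM role below are placeholders, not the asker's values.
def build_unload_sql(table, s3_prefix, iam_role):
    """Return a Redshift UNLOAD statement that dumps a query result to S3."""
    return (
        f"UNLOAD ('SELECT * FROM {table}') "
        f"TO '{s3_prefix}' "
        f"IAM_ROLE '{iam_role}' "
        "DELIMITER ',' ALLOWOVERWRITE PARALLEL OFF;"
    )

sql = build_unload_sql(
    "public.events",
    "s3://example-bucket/exports/events_",
    "arn:aws:iam::123456789012:role/RedshiftUnloadRole",
)
```

Note the single quotes inside the outer UNLOAD string must be escaped if the inner query itself contains string literals.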
4
Solved
I wanted to use AWS Data Pipeline to pipe data from a Postgres RDS to AWS S3. Does anybody know how this is done?
More precisely, I wanted to export a Postgres Table to AWS S3 using data Pipeline....
Cusec asked 6/10, 2016 at 14:51
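The Postgres-RDS-to-S3 export described above is typically expressed as a CopyActivity between a SqlDataNode and an S3DataNode. A sketch of the pipeline objects, built locally as JSON; every connection string, credential, and path here is a hypothetical placeholder:

```python
# Sketch of a Data Pipeline definition copying a Postgres table to S3,
# as the object list the put-pipeline-definition API expects.
# Hosts, credentials, tables, and paths are placeholders.
import json

pipeline_objects = [
    {"id": "SourceTable", "type": "SqlDataNode",
     "connectionString": "jdbc:postgresql://example-host:5432/mydb",
     "username": "dbuser", "*password": "REPLACE_ME",
     "table": "public.customers",
     "selectQuery": "SELECT * FROM public.customers"},
    {"id": "DestFile", "type": "S3DataNode",
     "filePath": "s3://example-bucket/exports/customers.csv"},
    {"id": "CopyTableToS3", "type": "CopyActivity",
     "input": {"ref": "SourceTable"}, "output": {"ref": "DestFile"}},
]
definition = json.dumps(pipeline_objects, indent=2)
```

A real pipeline also needs a schedule and an EC2 resource for the activity to run on, omitted here for brevity.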
2
I'm using the Load S3 data into RDS MySql table template in AWS Data Pipeline to import CSVs from an S3 bucket into our RDS MySQL.
However I (as IAM user with full-admin rights) run into a warning ...
Varico asked 1/2, 2019 at 9:27
1
I'm trying to run a simple AWS Data Pipeline for my POC. My case is the following: get data from a CSV stored on S3, perform a simple Hive query on it, and put the results back to S3.
I've created...
Zilvia asked 25/2, 2017 at 2:00
5
I have setup ETL pipeline in AWS as follows
input_rawdata -> S3 -> Lambda -> trigger Spark ETL script (via AWS Glue) -> output (S3, Parquet files)
My question is: let's assume the above is the initial l...
Reubenreuchlin asked 6/9, 2017 at 4:23
2
I'm using AWS DataPipeline to run an aws-cli command that creates an EMR Cluster, but I'm getting the following error when the command runs:
user ... is not authorized to perform: elasticmapreduce:...
Windstorm asked 17/6, 2016 at 17:26
2
Solved
I used to use the Data Pipeline template called Export DynamoDB table to S3 to export a DynamoDB table to a file. I recently updated all of my DynamoDB tables to on-demand provisioning and the temp...
Aerology asked 13/2, 2019 at 9:35
1
My goal is to copy a table in a PostgreSQL database running on AWS RDS to a .csv file on Amazon S3. For this I use AWS Data Pipeline and found the following tutorial; however, when I follow all step...
Paddy asked 17/7, 2018 at 9:06
5
Solved
I am looking to copy data within databases on Amazon Redshift. Before this, I was copying data from a Redshift database to a PostgreSQL database hosted on an EC2 instance for analytical purposes. I had ruby ...
Tho asked 1/6, 2015 at 12:50
1
Solved
I am trying to submit a Spark job to an AWS EMR cluster using the AWS console. But it fails with:
Cannot load main class from JAR. The job runs successfully when I specify the main class as --class in Argum...
Keegan asked 23/1, 2018 at 17:40
4
I am trying to transfer CSV data from an S3 bucket to DynamoDB using AWS Data Pipeline. Following is my pipeline script, which is not working properly:
CSV file structure
Name, Designation,Company
A,TL,C...
Quamash asked 3/8, 2013 at 16:44
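For an import like the one above, each CSV row has to become a DynamoDB item whose values carry an explicit type tag. A sketch of that conversion, run locally so it needs no AWS access; the sample company values (`C1`, `C2`) are hypothetical, since the question's data is truncated:

```python
# Sketch: turn rows of a CSV like the one shown above into DynamoDB-style items.
# Attribute names follow the question's header (Name, Designation, Company);
# the actual put_item calls (boto3) are omitted so this runs without AWS.
import csv
import io

csv_text = "Name,Designation,Company\nA,TL,C1\nB,Dev,C2\n"  # hypothetical sample rows

def rows_to_items(text):
    reader = csv.DictReader(io.StringIO(text))
    # DynamoDB's low-level format wraps each value with its type, 'S' for string.
    return [{k: {"S": v} for k, v in row.items()} for row in reader]

items = rows_to_items(csv_text)
```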
0
I would like to upgrade my AWS data pipeline definition to EMR 4.x or 5.x, so I can take advantage of Hive's latest features (version 2.0+), such as CURRENT_DATE and CURRENT_TIMESTAMP, etc.
The c...
Brooklynese asked 17/12, 2017 at 18:17
1
Solved
I'm trying to export an existing AWS Data Pipeline task to Terraform infrastructure somehow.
According to this issue, there is no direct support for Data Pipelines, but it still seems achievable ...
Haddington asked 18/7, 2017 at 10:48
1
I am creating a data pipeline to export a DynamoDB table to an S3 bucket. I used the standard template for this in the Data Pipeline console. I have verified that the runsOn field is set to the name of ...
Master asked 8/5, 2014 at 7:21
2
Solved
Basically I want to pg_dump my RDS database to S3 using AWS Data Pipeline.
I am not 100% sure if this is possible. I got up to the stage where the SqlDataNode wants a selectQuery, at which point I a...
Awn asked 15/5, 2017 at 23:30
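One way around the SqlDataNode/selectQuery stage mentioned above is to skip the SQL data nodes entirely and run pg_dump from a ShellCommandActivity, streaming straight to S3. A sketch of the command that activity could run; host, database, user, and bucket names are placeholders:

```python
# Sketch: a ShellCommandActivity can run pg_dump on its EC2 resource and
# pipe the dump to S3 ("aws s3 cp -" reads from stdin). All names below
# are placeholders, not the asker's actual resources.
def pg_dump_command(host, db, user, s3_uri):
    """Shell command that dumps a Postgres database and streams it to S3."""
    return (
        f"pg_dump -h {host} -U {user} -d {db} -F c "
        f"| aws s3 cp - {s3_uri}"
    )

cmd = pg_dump_command("example-host.rds.amazonaws.com", "mydb", "dbuser",
                      "s3://example-bucket/backups/mydb.dump")
```

The password would come from a .pgpass file or the PGPASSWORD environment variable on the activity's EC2 resource.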
1
Solved
It is possible to dump DynamoDB via Data Pipeline and also to import data into DynamoDB. The import goes well, but the data is always appended to the data that already exists in DynamoDB.
For now I found wo...
Carroty asked 17/2, 2017 at 16:04
1
Solved
During the ETL we do the following operations:
begin transaction;
drop table if exists target_tmp;
create table target_tmp like target;
insert into target_tmp select * from source_a inner jo...
Marj asked 17/2, 2017 at 12:15
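The truncated transaction above looks like the common "rebuild into a temp table, then swap" ETL pattern. A sketch of how such a transaction typically continues; the join condition and the final swap steps are a generic completion, not the asker's exact SQL:

```python
# Sketch of the usual rebuild-and-swap pattern the snippet appears to follow.
# Table names come from the snippet; the join condition and closing steps
# are a hypothetical completion of the truncated SQL.
ETL_SQL = """
begin transaction;
drop table if exists target_tmp;
create table target_tmp like target;
insert into target_tmp
  select * from source_a inner join source_b on source_a.id = source_b.id;
-- swap the rebuilt table into place inside the same transaction
drop table target;
alter table target_tmp rename to target;
end transaction;
"""
```

Keeping the drop and rename inside the transaction is what makes the swap appear atomic to readers.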
3
I am new to AWS Data Pipeline. I created a successful data pipeline to pull all the content from RDS to an S3 bucket. Everything works; I see my .csv file in the S3 bucket. But I am storing Spanish names in...
Rucksack asked 12/1, 2017 at 21:19
1
I am trying to copy a bunch of CSV files from S3 to Redshift using the RedshiftCopyActivity and a data pipeline.
This works fine as long as the CSV structure matches the table structure. In my case...
Landtag asked 4/12, 2014 at 14:04
3
Solved
When trying to use a Script Argument in the sqlActivity:
{
  "id" : "ActivityId_3zboU",
  "schedule" : { "ref" : "DefaultSchedule" },
  "scriptUri" : "s3://location_of_script/unload.sql",
  "name" : "...
Selle asked 15/12, 2014 at 9:49
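For a SqlActivity like the one above, script arguments are supplied as a list of scriptArgument entries that bind to `?` placeholders in the SQL rather than being textually substituted. A sketch of the shape, with placeholder values and paths (only the `ActivityId_3zboU` id and the unload.sql name come from the snippet):

```python
# Sketch: scriptArgument entries on a SqlActivity fill `?` placeholders in
# the referenced script, in order. Values and the bucket path below are
# hypothetical placeholders.
activity = {
    "id": "ActivityId_3zboU",
    "type": "SqlActivity",
    "schedule": {"ref": "DefaultSchedule"},
    "scriptUri": "s3://example-bucket/unload.sql",
    # Each entry binds to the next `?` in unload.sql.
    "scriptArgument": ["2014-12-15", "s3://example-bucket/out/"],
}
```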
3
Solved
I'm using the AWS Data Pipeline service to pipe data from an RDS MySQL database to S3 and then on to Redshift, which works nicely.
However, I also have data living in an RDS Postgres instance which I wo...
Twotime asked 6/11, 2014 at 14:21
1
Solved
I am attempting to move a file from one S3 location to another, using an activity in an AWS data pipeline.
The command I am using is:
(aws s3 mv s3://foobar/Tagger/out//*/lastImage.txt s3://foobar/T...
Noelyn asked 25/7, 2015 at 18:06
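The likely snag in the command above is that the `aws s3` commands do not expand `*` themselves; wildcard selection is done with `--recursive` plus `--exclude`/`--include` filters instead. A sketch of the equivalent command; the destination prefix is a placeholder, since the question's is truncated:

```python
# Sketch: aws s3 mv with CLI-side filters instead of a shell glob.
# Quoting the patterns keeps the shell from expanding them first.
# The destination prefix below is a hypothetical placeholder.
def mv_matching(src_prefix, dest_prefix, pattern):
    return (
        f"aws s3 mv {src_prefix} {dest_prefix} --recursive "
        f"--exclude '*' --include '{pattern}'"
    )

cmd = mv_matching("s3://foobar/Tagger/out/", "s3://foobar/Tagger/archive/",
                  "*/lastImage.txt")
```

`--exclude '*'` first drops everything, then `--include` re-admits only the keys matching the pattern, evaluated relative to the source prefix.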
© 2022 - 2024 — McMap. All rights reserved.