amazon-kinesis-firehose Questions
2
Solved
I have been using AWS for the last six months and developed an application that sends batch requests to Firehose. It was working fine until today, but when I redeployed it on my local system it throws java.lang.Cl...
Antecedent asked 22/4, 2016 at 14:18
5
Solved
Before sending the data I apply JSON.stringify to it, and it looks like this:
{"data": [{"key1": value1, "key2": value2}, {"key1": value1, "key2": value2}]}
But once it passes through AWS...
Feminine asked 12/1, 2018 at 12:38
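A minimal sketch of how a payload with the same shape could be serialized once and sent with boto3 (the stream name and values here are hypothetical, not taken from the question):

import json
import boto3

firehose = boto3.client("firehose")

# Hypothetical payload with the same shape as in the question.
payload = {"data": [{"key1": 1, "key2": 2}, {"key1": 3, "key2": 4}]}

# Firehose delivers the record bytes verbatim, so serialize exactly once;
# stringifying an already-stringified body is a common cause of mangled
# JSON on the other side.
firehose.put_record(
    DeliveryStreamName="my-delivery-stream",  # hypothetical name
    Record={"Data": json.dumps(payload).encode("utf-8")},
)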
14
I am writing records to a Kinesis Firehose stream that are eventually written to an S3 file by Amazon Kinesis Firehose.
My record object looks like
ItemPurchase {
String personId,
String itemId
}
...
Cacophonous asked 26/12, 2015 at 3:48
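Since Firehose concatenates record payloads as-is, a common fix is to append a newline to each serialized record before putting it; a rough boto3 sketch, with the stream name and field values as placeholders:

import json
import boto3

firehose = boto3.client("firehose")

def put_item_purchase(person_id: str, item_id: str) -> None:
    record = {"personId": person_id, "itemId": item_id}
    # The trailing newline keeps consecutive records in the delivered
    # S3 object line-delimited instead of glued together.
    firehose.put_record(
        DeliveryStreamName="item-purchase-stream",  # hypothetical
        Record={"Data": (json.dumps(record) + "\n").encode("utf-8")},
    )

put_item_purchase("person-1", "item-42")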
5
I used to be able to send a record to Firehose without any problem, like this:
aws firehose put-record --delivery-stream-name my-stream --record='Data="{\"foor\":\"bar\"...
Cubital asked 8/7, 2020 at 21:19
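One way to sidestep the shell quoting and escaping in --record is to build the payload programmatically; a hedged boto3 sketch under the same assumptions (stream name and payload are placeholders):

import json
import boto3

firehose = boto3.client("firehose")

# json.dumps produces the JSON string, so there is no need to hand-escape
# the quotes the way the CLI --record argument does.
firehose.put_record(
    DeliveryStreamName="my-stream",
    Record={"Data": (json.dumps({"foo": "bar"}) + "\n").encode("utf-8")},
)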
11
Solved
Let's say that I have a machine that I want to be able to write to a certain log file stored in an S3 bucket.
So the machine needs write access to that bucket, but I don't want it t...
Crabber asked 21/1, 2017 at 20:4
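A sketch of what a write-only policy scoped to one prefix might look like, attached as an inline role policy with boto3 (the role, bucket, and prefix names are assumptions):

import json
import boto3

iam = boto3.client("iam")

# Allow writing objects under logs/ in one bucket and nothing else:
# no GetObject, ListBucket, or DeleteObject.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "s3:PutObject",
        "Resource": "arn:aws:s3:::my-log-bucket/logs/*",  # hypothetical bucket/prefix
    }],
}

iam.put_role_policy(
    RoleName="machine-log-writer",   # hypothetical role assumed by the machine
    PolicyName="write-only-logs",
    PolicyDocument=json.dumps(policy),
)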
2
Solved
I have an API Gateway endpoint and I am sending POST requests to it. The integration type for the API Gateway is a Lambda function. I want the Lambda function to listen to the POST data c...
Antonyantonym asked 11/7, 2016 at 9:58
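A minimal Lambda handler sketch that forwards the POST body to a Firehose stream, assuming a Lambda proxy integration (the stream name is a placeholder):

import json
import boto3

firehose = boto3.client("firehose")

def handler(event, context):
    # With a proxy integration the POST payload arrives as a string
    # in event["body"].
    body = event.get("body") or "{}"
    firehose.put_record(
        DeliveryStreamName="post-data-stream",  # hypothetical
        Record={"Data": (body + "\n").encode("utf-8")},
    )
    return {"statusCode": 200, "body": json.dumps({"ok": True})}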
2
Solved
I am currently using Athena along with Kinesis Firehose and a Glue Crawler. Kinesis Firehose is saving JSON records to single-line files, as below:
{"name": "Jone Doe"}{"name": "Jane Doe"}{"name": "Jack Doe"}
...
Filial asked 7/6, 2020 at 16:4
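One common workaround is a Firehose transformation Lambda that appends a newline to every record so the delivered files contain one JSON object per line, which Athena's JSON SerDe expects; a hedged sketch:

import base64

def handler(event, context):
    output = []
    for record in event["records"]:
        payload = base64.b64decode(record["data"]).decode("utf-8")
        # Make sure each JSON object ends with a newline.
        if not payload.endswith("\n"):
            payload += "\n"
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": base64.b64encode(payload.encode("utf-8")).decode("utf-8"),
        })
    return {"records": output}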
4
Solved
I'm trying to set up Lambda transformations with a Firehose delivery stream. I have an IAM role defined for the Firehose which includes the following policy document:
{
"Statement": {
"Action": ...
Mobocracy asked 8/3, 2018 at 19:8
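Beyond the S3 permissions, a Firehose role used with a Lambda transformation also needs permission to invoke that function; a sketch of the extra statement added with boto3 (role name and function ARN are placeholders):

import json
import boto3

iam = boto3.client("iam")

# Firehose calls the transformation Lambda with the delivery stream's role,
# so that role needs these Lambda permissions on the function.
statement = {
    "Effect": "Allow",
    "Action": ["lambda:InvokeFunction", "lambda:GetFunctionConfiguration"],
    "Resource": "arn:aws:lambda:us-east-1:123456789012:function:my-transform:*",  # hypothetical
}

iam.put_role_policy(
    RoleName="firehose-delivery-role",   # hypothetical
    PolicyName="invoke-transform-lambda",
    PolicyDocument=json.dumps({"Version": "2012-10-17", "Statement": [statement]}),
)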
1
I am using Kinesis Analytics to read in JSON from Kinesis Firehose. I am successfully filtering out some of the records and writing a subset of the JSON properties to another Firehose.
I wanted to...
Perfectionism asked 6/10, 2017 at 12:34
6
Solved
Firehose->S3 uses the current date as a prefix when creating keys in S3, so the data is partitioned by the time the record is written. My Firehose stream contains events which have a specific event...
Skep asked 9/2, 2017 at 6:22
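One workaround is an S3-triggered Lambda that re-writes each delivered object under a prefix derived from the timestamp inside the events themselves; a rough sketch assuming newline-delimited JSON with a hypothetical ISO-8601 event_time field:

import json
from collections import defaultdict

import boto3

s3 = boto3.client("s3")

def handler(event, context):
    for notification in event["Records"]:
        bucket = notification["s3"]["bucket"]["name"]
        key = notification["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")

        # Group records by the date inside the event itself rather than
        # the date Firehose happened to write the object.
        by_date = defaultdict(list)
        for line in filter(None, body.splitlines()):
            record = json.loads(line)
            by_date[record["event_time"][:10]].append(line)  # assumed field

        for event_date, lines in by_date.items():
            s3.put_object(
                Bucket=bucket,
                Key=f"partitioned/event_date={event_date}/{key.rsplit('/', 1)[-1]}",
                Body=("\n".join(lines) + "\n").encode("utf-8"),
            )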
4
I have an AWS Kinesis Firehose stream putting data into S3 with the following config:
S3 buffer size (MB)* 2
S3 buffer interval (sec)* 60
Everything works fine. The only problem is that Firehose c...
Hinz asked 28/4, 2016 at 17:9
3
Solved
I am trying to set up a sync between AWS Aurora and Redshift. What is the best way to achieve this sync?
Possible ways to sync could be:
Query the table to find changes (since I am only doi...
Bullnose asked 16/6, 2017 at 21:53
2
Solved
I have a column in Athena with the Map type. I have defined the schema in Glue as Map.
I have defined a Firehose stream that refers to the Glue schema and converts the data to Parquet format. However, I a...
Cottingham asked 9/7, 2019 at 10:52
6
Solved
I have a Firehose stream that is intended to ingest millions of events from different sources and of different event-types. The stream should deliver all data to one S3 bucket as a store of raw\una...
Singlehanded asked 12/7, 2018 at 20:27
4
I have a Kinesis Firehose configuration in Terraform which reads JSON data from a Kinesis stream, converts it to Parquet using Glue, and writes it to S3.
There is something wrong with the data format con...
Enclose asked 25/6, 2021 at 4:36
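For comparison, this is roughly the API-level shape of the conversion settings that the Terraform block maps to (all names and ARNs are placeholders); one frequent gotcha is that record-format conversion requires an S3 buffer size of at least 64 MB:

import boto3

firehose = boto3.client("firehose")

firehose.create_delivery_stream(
    DeliveryStreamName="json-to-parquet",  # hypothetical
    DeliveryStreamType="KinesisStreamAsSource",
    KinesisStreamSourceConfiguration={
        "KinesisStreamARN": "arn:aws:kinesis:us-east-1:123456789012:stream/source",
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
    },
    ExtendedS3DestinationConfiguration={
        "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
        "BucketARN": "arn:aws:s3:::parquet-output-bucket",
        # Parquet conversion needs a buffer of at least 64 MB.
        "BufferingHints": {"SizeInMBs": 64, "IntervalInSeconds": 300},
        "DataFormatConversionConfiguration": {
            "Enabled": True,
            "InputFormatConfiguration": {"Deserializer": {"OpenXJsonSerDe": {}}},
            "OutputFormatConfiguration": {"Serializer": {"ParquetSerDe": {}}},
            "SchemaConfiguration": {
                "RoleARN": "arn:aws:iam::123456789012:role/firehose-role",
                "DatabaseName": "my_glue_db",   # hypothetical Glue database
                "TableName": "my_glue_table",   # hypothetical Glue table
                "Region": "us-east-1",
            },
        },
    },
)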
5
Solved
I am trying to have a Kinesis Firehose push data into a Redshift table.
The Firehose stream is working and putting data into S3.
But nothing arrives in the destination table in Redshift.
In the metrics...
Dilettante asked 10/12, 2015 at 16:46
2
Is there a way to manually set an ElasticSearch document id when inserting via AWS Kinesis Firehose?
I have an AWS Kinesis Firehose Stream set up to feed data into an AWS ElasticSearch cluster, and I can successfully insert documents by sending them to the Firehose Stream, which loads them into El...
Gutshall asked 10/5, 2016 at 18:1
3
Solved
This is somewhat of a shallow-level question; however, I am perplexed by this trio of services.
I understand that KPL produces fast data and KCL consumes fast data produced by Kinesis. However, what ...
Incertitude asked 2/6, 2020 at 22:23
1
Solved
I'm trying to create a Kinesis Firehose using Terraform, with dynamic partitioning based on two partition queries against the JSON I'm receiving. My processing configuration looks like this:
processing_con...
Celt asked 6/5, 2022 at 14:58
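At the API level, the equivalent of that processing_configuration is a MetadataExtraction processor whose jq query pulls the two partition keys, plus a prefix that references them; a hedged sketch (the key names and the query are assumptions):

# Processor block to drop into a Firehose delivery stream configuration.
processing_configuration = {
    "Enabled": True,
    "Processors": [{
        "Type": "MetadataExtraction",
        "Parameters": [
            {
                "ParameterName": "MetadataExtractionQuery",
                "ParameterValue": "{customer_id: .customer_id, event_type: .event_type}",
            },
            {"ParameterName": "JsonParsingEngine", "ParameterValue": "JQ-1.6"},
        ],
    }],
}

# The S3 prefix then references both extracted keys.
prefix = (
    "data/customer_id=!{partitionKeyFromQuery:customer_id}/"
    "event_type=!{partitionKeyFromQuery:event_type}/"
)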
3
We primarily do bulk transfers of incoming click-stream data through the Kinesis Firehose service. Our system is a multi-tenant SaaS platform. The incoming click-stream data is stored in S3 through Fireho...
Benniebenning asked 18/10, 2017 at 5:48
3
I'm executing a Flink job with these tools.
I think both can do exactly the same thing with the proper configuration. Does Kinesis Data Analytics do something that EMR cannot do, or vice versa?
Amazon Ki...
Brosine asked 17/5, 2019 at 12:26
2
Solved
Can we have multiple destinations from a single Kinesis Firehose? I saw this picture.
From this, it looks like it is possible to add S3, Redshift, and Elasticsearch from a single Firehose. I exactly w...
Aubine asked 16/6, 2017 at 18:19
3
Solved
I've read a lot of similar questions about adding newline characters to Firehose, but they're all about adding the newline character at the source. The problem is that I don't have access to the ...
Ravenravening asked 8/5, 2019 at 15:18
1
Solved
My Firehose reads EventBridge events that look something like:
{
"detail": {
"key1": "some value",
"key2": "some value",
"Timestamp&q...
Presto asked 15/2, 2022 at 10:33
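If the goal is to partition on the Timestamp inside detail, one option is a transformation Lambda that surfaces it as a dynamic-partitioning key via the record metadata; a hedged sketch that assumes an ISO-8601 Timestamp and an S3 prefix using !{partitionKeyFromLambda:event_date}:

import base64
import json

def handler(event, context):
    output = []
    for record in event["records"]:
        payload = json.loads(base64.b64decode(record["data"]))
        # Assumes detail.Timestamp is ISO-8601; keep just the date part.
        event_date = payload["detail"]["Timestamp"][:10]
        output.append({
            "recordId": record["recordId"],
            "result": "Ok",
            "data": record["data"],  # pass the payload through unchanged
            "metadata": {"partitionKeys": {"event_date": event_date}},
        })
    return {"records": output}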
1
Solved
I'm using a Kinesis delivery stream to send a stream from EventBridge to an S3 bucket, but I can't seem to find which class has the option to configure dynamic partitioning.
This is my code for the deliver...
Proustite asked 3/9, 2021 at 2:57