apache-kafka-connect Questions
3
I am trying to interpret an Avro record stored by Debezium in Kafka, using Python
{
"name": "id",
"type": {
"type": "bytes",
"scale": 0,
"precision": 64,
"connect.version": 1,
"connect.para...
Loleta asked 23/10, 2017 at 8:1
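For context on the schema fragment above: a "bytes" field carrying scale/precision parameters is usually Kafka Connect's Decimal logical type, i.e. a big-endian two's-complement unscaled integer plus a scale. A minimal Python sketch of decoding it, assuming that encoding (the sample bytes are illustrative, not from the question):

```python
from decimal import Decimal

def decode_connect_decimal(raw_bytes: bytes, scale: int) -> Decimal:
    """Decode a Kafka Connect Decimal: a big-endian two's-complement
    unscaled integer stored as bytes, shifted right by `scale` digits."""
    unscaled = int.from_bytes(raw_bytes, byteorder="big", signed=True)
    return Decimal(unscaled).scaleb(-scale)

# With scale 0 (as in the schema above), the bytes are the integer itself.
print(decode_connect_decimal(b"\x01\x00", 0))  # -> 256
```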
3
Solved
I'm using the Debezium (0.7.5) MySQL connector and I'm trying to understand the best approach if I want to update this configuration with the option table.whitelist.
Let's say I create a c...
Holle asked 28/11, 2018 at 1:26
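Not part of the question itself, but the usual mechanism for changing a running connector's options is to PUT the full updated configuration to the Connect REST API. A sketch assuming a worker at localhost:8083 and a connector named mysql-connector (both hypothetical), with illustrative table names:

```python
import json
import requests  # assumes the `requests` package is installed

connect_url = "http://localhost:8083"  # hypothetical Connect worker
connector = "mysql-connector"          # hypothetical connector name

# Fetch the current configuration, extend the whitelist, and PUT the whole
# config back (PUT replaces the connector's configuration).
config = requests.get(f"{connect_url}/connectors/{connector}/config").json()
config["table.whitelist"] = "inventory.customers,inventory.orders"
resp = requests.put(
    f"{connect_url}/connectors/{connector}/config",
    headers={"Content-Type": "application/json"},
    data=json.dumps(config),
)
resp.raise_for_status()
```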
1
Solved
We're using MirrorMaker2 to replicate some topics from one kerberized Kafka cluster to another Kafka cluster (strictly unidirectional). We don't control the source Kafka cluster and we're given onl...
Antonomasia asked 24/11, 2020 at 19:3
4
I am trying to use the Kafka Connect JDBC Source Connector with the following properties in BULK mode.
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
timestamp.column.name=timestamp
con...
Kurth asked 4/4, 2019 at 17:6
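For reference, a bulk-mode JDBC source configuration typically looks like the sketch below (registered through the Connect REST API); timestamp.column.name is only consulted in the timestamp-based modes, so in bulk mode the whole table is re-read on every poll. The connector name, connection details, and topic prefix are illustrative:

```python
import json
import requests  # assumes the `requests` package is installed

# Illustrative bulk-mode JDBC source config; connection details are placeholders.
jdbc_source = {
    "name": "jdbc-bulk-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:postgresql://db-host:5432/mydb",
        "connection.user": "user",
        "connection.password": "secret",
        "mode": "bulk",                # re-reads the whole table on each poll
        "poll.interval.ms": "60000",   # how often the bulk query runs
        "table.whitelist": "my_table",
        "topic.prefix": "bulk-",
    },
}

requests.post(
    "http://localhost:8083/connectors",  # hypothetical Connect worker
    headers={"Content-Type": "application/json"},
    data=json.dumps(jdbc_source),
).raise_for_status()
```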
3
Solved
I'm trying to figure out whether it's possible to transform JSON values that are stored as strings into actual JSON structures using Kafka Connect.
I tried looking for such a transformation but co...
Verso asked 24/9, 2017 at 10:58
2
I am using these installation instructions for getting the Confluent Hub client: https://docs.confluent.io/current/connect/managing/confluent-hub/client.html
But, when I get to the line to install the ...
Cagliari asked 9/7, 2019 at 18:8
2
I am trying to convert all timestamp fields to a string type with the format yyyy-MM-dd HH:mm:ss.
To transform multiple fields, I have to create a transform for each one individually.
...
"transf...
Showily asked 1/4, 2019 at 17:46
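As the question notes, the built-in TimestampConverter SMT handles one field per transform, so converting several fields generally means chaining one named transform per field. An illustrative config fragment (the field names are made up):

```python
# Illustrative connector config fragment: one TimestampConverter per field,
# chained through the `transforms` list (field names are hypothetical).
timestamp_transforms = {
    "transforms": "tsCreated,tsUpdated",
    "transforms.tsCreated.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.tsCreated.field": "created_at",
    "transforms.tsCreated.target.type": "string",
    "transforms.tsCreated.format": "yyyy-MM-dd HH:mm:ss",
    "transforms.tsUpdated.type": "org.apache.kafka.connect.transforms.TimestampConverter$Value",
    "transforms.tsUpdated.field": "updated_at",
    "transforms.tsUpdated.target.type": "string",
    "transforms.tsUpdated.format": "yyyy-MM-dd HH:mm:ss",
}
```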
3
Solved
How do I use Kafka Connect adapters with Amazon MSK?
As per the AWS documentation, it supports Kafka Connect, but it is not documented how to set up adapters and use them.
Krakow asked 5/12, 2019 at 12:5
3
I am using the Confluent Community edition for a simple setup consisting of a REST client calling the Kafka REST Proxy and then pushing that data into an Oracle database using the provided JDBC sink c...
Illjudged asked 15/3, 2019 at 4:24
2
Solved
I'm currently working with the Kafka Connect S3 Sink Connector 3.3.1 to copy Kafka messages over to S3 and I have OutOfMemory errors when processing late data.
I know it looks like a long question...
Twinscrew asked 21/6, 2018 at 14:30
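Without the full question, only a general note: with a time-based partitioner the S3 sink typically keeps an in-memory buffer of roughly s3.part.size bytes per open output partition, so late-arriving data that opens many partitions at once can exhaust the heap. The settings usually involved are sketched below (the values are illustrative, not recommendations):

```python
# Illustrative S3 sink settings that affect memory use when late data arrives.
s3_sink_memory_settings = {
    "s3.part.size": "5242880",               # per-open-partition upload buffer (bytes)
    "flush.size": "10000",                   # records per file before an upload is triggered
    "rotate.schedule.interval.ms": "600000", # force-close open files on a wall-clock schedule
    "partitioner.class": "io.confluent.connect.storage.partitioner.TimeBasedPartitioner",
    "timestamp.extractor": "Record",         # which timestamp drives partitioning
}
```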
1
How do I retrieve the incoming headers of a Kafka message in Kafka Connect so that I can store them as additional data fields with the MongoDB Sink Connector in MongoDB?
I have a Kafka topic "PROJECT_EXAMP...
Partridgeberry asked 11/10, 2020 at 22:59
3
Solved
I am trying to connect to Kafka 3.0 with SSL but I am facing an issue with loading the SSL keystore.
I have tried many possible values, but nothing has helped.
I have tried changing the locations, changing the value of t...
Birkett asked 7/11, 2019 at 11:42
2
I am new to Kafka and I want to see if I can sync MongoDB data with another system using Kafka.
My set up:
I am running an AWS MSK cluster and I have created an EC2 instance with a Kafka client manuall...
Milamilady asked 17/10, 2020 at 14:17
3
Solved
I get an error when running kafka-mongodb-source-connect
I was trying to run connect-standalone with connect-avro-standalone.properties and MongoSourceConnector.properties so that Connect writes dat...
Abscind asked 3/1, 2020 at 0:56
1
Solved
I'm trying to run a local kafka-connect cluster using docker-compose.
I need to connect to a remote database, and I'm also using a remote Kafka and Schema Registry.
I have enabled access to these re...
Oleoresin asked 1/7, 2021 at 21:34
2
I'm using the Kafka Connect HDFS connector.
When I try to run my connector, I get the following exception:
ERROR Failed creating a WAL Writer: Failed to create file[/path/log] for [DFSClient_NONMAPREDUC...
Cheston asked 14/8, 2018 at 13:6
3
Log compacted topics are not supposed to keep duplicates against the same key. But in our case, when a new value with the same key is sent, the previous one isn't deleted. What could be the issue?
...
Mihe asked 10/4, 2020 at 13:2
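Without seeing the topic configuration, a common explanation is that compaction only runs on closed (rolled) log segments and only once the cleanable (dirty) ratio is exceeded, so recent duplicates in the active segment survive for a while. The topic settings usually involved, as a sketch (values are illustrative, not recommendations):

```python
# Topic settings that control when log compaction actually removes old values.
compaction_settings = {
    "cleanup.policy": "compact",
    "segment.ms": "600000",              # roll segments sooner; only rolled segments are compacted
    "min.cleanable.dirty.ratio": "0.1",  # compact once 10% of the log is "dirty"
    "min.compaction.lag.ms": "0",        # minimum record age before it may be compacted
    "delete.retention.ms": "86400000",   # how long tombstones survive after compaction
}
```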
1
Solved
We've been using Kafka Connect for a while on a project, currently entirely using only the Confluent Kafka Connect JDBC connector. I'm struggling to understand the role of 'tasks' in Kafka Connect,...
Gert asked 26/4, 2021 at 10:36
3
Solved
My Kafka sink connector reads from multiple topics (configured with 10 tasks) and processes upwards of 300 records from all topics. Based on the information held in each record, the connector may p...
Crochet asked 1/5, 2019 at 9:29
1
Solved
I have been trying to get data from PostgreSQL to Kafka topics using the following command: /bin connect-standalone.properties config/connect-standalone.properties postgres.sproperties, but am fac...
Arrowworm asked 7/4, 2021 at 11:46
1
I configured Debezium with Postgres as in the code below; when I started my Spring Boot application I got an error: Creation of replication slot failed
@Bean
public io.debezium.config.Configuration con...
Diaphoretic asked 2/4, 2021 at 12:12
2
I would like to know the compressed size of a message in Kafka.
I use Kafka 1.1.0 and Java kafka-connect 1.1.0 to send messages from my producer to a topic.
If the message is too large for my pr...
Canine asked 9/5, 2018 at 10:54
6
Solved
Are there any alerting options for scenarios where a Kafka Connect connector or a connector task fails or experiences errors?
We have Kafka Connect running and it runs well, but we've had errors that...
Worm asked 10/8, 2017 at 20:19
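Beyond what the question mentions, one lightweight option is to poll the Connect REST status endpoint and raise an alert whenever a connector or task reports FAILED. A sketch assuming a worker at localhost:8083 (hypothetical); the alerting channel itself is left as a stub:

```python
import requests  # assumes the `requests` package is installed

CONNECT_URL = "http://localhost:8083"  # hypothetical Connect worker


def find_failures():
    """Return (connector, component, trace) tuples for anything in FAILED state."""
    failures = []
    for name in requests.get(f"{CONNECT_URL}/connectors").json():
        status = requests.get(f"{CONNECT_URL}/connectors/{name}/status").json()
        if status["connector"]["state"] == "FAILED":
            failures.append((name, "connector", status["connector"].get("trace", "")))
        for task in status.get("tasks", []):
            if task["state"] == "FAILED":
                failures.append((name, f"task-{task['id']}", task.get("trace", "")))
    return failures


if __name__ == "__main__":
    for connector, component, trace in find_failures():
        # Replace this print with a real alerting channel (email, pager, etc.).
        print(f"ALERT: {connector}/{component} failed: {trace[:200]}")
```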
4
The REST API for Kafka Connect is not secured or authenticated.
Since it's not authenticated, the configuration for a connector or its tasks is easily accessible by anyone. Since these configurations ...
Convolution asked 22/7, 2017 at 4:11
2
I want to read only 5000 records in a batch through the JDBC sink, for which I've set batch.size in the JDBC sink config file:
name=jdbc-sink
connector.class=io.confluent.connect.jdbc.JdbcSinkCon...
Cant asked 25/10, 2019 at 4:52
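For context: batch.size in the JDBC sink only caps how many records go into one insert batch, while the number of records handed to the sink per put() is bounded by the worker consumer's max.poll.records, so the two are often tuned together. An illustrative fragment (assuming a Connect version that allows per-connector consumer overrides):

```python
# Illustrative JDBC sink fragment: batch.size caps the insert batch, while the
# consumer's max.poll.records caps how many records arrive per put() call.
jdbc_sink_batching = {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "batch.size": "5000",
    # Assumption: the worker allows overrides, e.g.
    # connector.client.config.override.policy=All in the worker config.
    "consumer.override.max.poll.records": "5000",
}
```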