kafka-avro-console-producer Quick Start fails
I'm using kafka-avro-console-producer from confluent-3.0.0, and an error occurs when I execute the following:

./bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test1234 --property value.schema='{"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}'
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/kafka-serde-tools/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/confluent-common/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/Users/tonydao/dev/bin/confluent-3.0.0/share/java/schema-registry/slf4j-log4j12-1.7.6.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
{"f1":"value1"}
{"f1":"value2"}

org.apache.kafka.common.errors.SerializationException: Error deserializing json  to Avro of schema {"type":"record","name":"myrecord","fields":[{"name":"f1","type":"string"}]}
Caused by: java.io.EOFException
    at org.apache.avro.io.JsonDecoder.advance(JsonDecoder.java:138)
    at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:219)
    at org.apache.avro.io.JsonDecoder.readString(JsonDecoder.java:214)
    at org.apache.avro.io.ResolvingDecoder.readString(ResolvingDecoder.java:201)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:363)
    at org.apache.avro.generic.GenericDatumReader.readString(GenericDatumReader.java:355)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:157)
    at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:193)
    at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:183)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:151)
    at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:142)
    at io.confluent.kafka.formatter.AvroMessageReader.jsonToAvro(AvroMessageReader.java:189)
    at io.confluent.kafka.formatter.AvroMessageReader.readMessage(AvroMessageReader.java:157)
    at kafka.tools.ConsoleProducer$.main(ConsoleProducer.scala:55)
    at kafka.tools.ConsoleProducer.main(ConsoleProducer.scala)
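The EOFException above comes from the Avro JsonDecoder running out of input: a blank line (for example, an accidental extra Enter) reaches the JSON-to-Avro converter, which finds nothing to read. A minimal sketch of that failure mode using only Python's stdlib json module rather than Avro itself (the function name parse_record is hypothetical, not part of the Confluent tooling):

```python
import json

def parse_record(line):
    """Parse one console-producer input line as JSON.

    A blank line, like the one an accidental extra Enter produces,
    is not a valid JSON document and fails immediately.
    """
    return json.loads(line)

print(parse_record('{"f1":"value1"}'))   # {'f1': 'value1'}

try:
    parse_record('')                     # what a stray empty line sends
except json.JSONDecodeError as exc:
    print('empty line rejected:', exc)
```

The real producer fails the same way, just deeper in the Avro decoding stack.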
Edulcorate answered 21/9, 2016 at 20:13 Comment(2)
Did you ever solve this? I am having the exact same problem. I have all of the services running. I type {"f1":"value1"}<Enter> and then receive the exception. I also have the console consumer running, and it never gets the message. This is my first time trying Kafka, and I have never downloaded another version. I have tried multiple values and tried creating a new topic, all with the same exception. (Chicory)
It was a while back, so I don't remember now. But instead of pressing <Enter> after you type your message, try Ctrl+C instead. (Edulcorate)
  1. Make sure that you are running all the required services (ZooKeeper, the Kafka server, and Schema Registry) from the Confluent Kafka package only.
  2. You might have used some other version of Kafka earlier on the same server and might need to clean the logs directory (/tmp/kafka is the default one).
  3. Make sure you are not hitting Enter without providing data, as an empty line is treated as null and results in an exception.
  4. Try with a completely new topic.
Behr answered 6/10, 2016 at 14:47 Comment(0)
This occurs when you enter a null value in the producer message; it looks like it can't convert null from JSON to Avro. Just enter the JSON, press Enter once, then Ctrl+D when complete.

I noticed my Avro producer didn't show a '>' prompt to indicate it was accepting messages, so I was pressing Enter to get some response from the script when I hit this.
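The press-Enter-then-Ctrl+D advice can be made concrete: a safer input loop reads lines until EOF (which is what Ctrl+D signals on a terminal) and skips blank lines instead of handing them to the decoder. A small Python sketch under that assumption (read_records is a hypothetical helper, not part of the Confluent tooling):

```python
import io
import json

def read_records(stream):
    """Read JSON records line by line until EOF (Ctrl+D on a terminal),
    skipping blank lines instead of decoding them as null."""
    records = []
    for line in stream:
        line = line.strip()
        if not line:          # extra Enter -> blank line -> ignore it
            continue
        records.append(json.loads(line))
    return records

# Simulated terminal session: two records with a stray blank line between.
session = io.StringIO('{"f1":"value1"}\n\n{"f1":"value2"}\n')
print(read_records(session))  # [{'f1': 'value1'}, {'f1': 'value2'}]
```

The console producer does not skip blank lines this way, which is why the stray Enter triggers the exception instead.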

Pronty answered 10/5, 2019 at 18:41 Comment(0)
First, when you run this command, add escape characters to the double quotes as follows and press Enter once:

./bin/kafka-avro-console-producer --broker-list localhost:9092 --topic test1234 --property value.schema='{\"type\":\"record\",\"name\":\"myrecord\",\"fields\":[{\"name\":\"f1\",\"type\":\"string\"}]}'

Once you have run this and pressed Enter once, type in a JSON object without escape characters on the double quotes, as follows:

{"f1":"value1"}

After each JSON object, press Enter only once; if you press it twice, the blank line is taken as null for the next JSON object, which is why you got that error. There is no acknowledgement after entering a JSON object, but it will already have been sent to Kafka. Wait a minute or two and check in Kafka's Control Center; the messages should be there in the given topic. Press Ctrl+C to exit the producer console.
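On the quoting point: in POSIX shells such as bash, single quotes already pass the inner double quotes through literally, so the backslash escapes shown above are typically only needed in environments with different quoting rules. Python's shlex module follows POSIX quoting and can illustrate what the shell actually hands to the producer (a sketch of shell behavior, not of the producer's own parsing):

```python
import shlex

# A single-quoted schema argument as typed in a POSIX shell: the inner
# double quotes survive unescaped once the shell strips the single quotes.
arg = "--property value.schema='{\"type\":\"record\",\"name\":\"myrecord\"}'"
print(shlex.split(arg))
# ['--property', 'value.schema={"type":"record","name":"myrecord"}']
```

So if the unescaped form from the original question fails in your shell, the quoting rules of that shell are worth checking before reaching for backslashes.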

Rattler answered 1/12, 2020 at 6:28 Comment(0)
