avro Questions

3

I have two questions: Is it possible to use the same reader to parse records that were written with two schemas that are compatible, e.g. Schema V2 only has an additional optional field compared...
Fossick asked 11/3, 2013 at 23:20
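A minimal sketch of how this is usually done, assuming writerSchema is the V1 schema the bytes were produced with and readerSchema is the V2 schema that adds the optional field (both hypothetical variables): GenericDatumReader takes both schemas and performs schema resolution, filling the new field from its "default".

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.Decoder;
    import org.apache.avro.io.DecoderFactory;

    public class EvolvingReader {
        // Reads bytes written with writerSchema into a record shaped by readerSchema.
        // The extra V2 field must declare a "default" so resolution can fill it in.
        static GenericRecord read(byte[] bytes, Schema writerSchema, Schema readerSchema) throws Exception {
            GenericDatumReader<GenericRecord> reader =
                    new GenericDatumReader<>(writerSchema, readerSchema);
            Decoder decoder = DecoderFactory.get().binaryDecoder(bytes, null);
            return reader.read(null, decoder);
        }
    }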

2

Solved

My KafkaProducer is able to use KafkaAvroSerializer to serialize objects to my topic. However, KafkaConsumer.poll() returns a deserialized GenericRecord instead of my serialized class. MyKafkaProduc...
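When the value class is generated from the schema (a SpecificRecord), the usual fix is to tell Confluent's KafkaAvroDeserializer to use the specific reader; otherwise it falls back to GenericRecord. A sketch of the consumer properties, with placeholder broker, group id and registry URL:

    import java.util.Properties;

    public class ConsumerProps {
        static Properties specificReaderProps() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
            props.put("group.id", "my-consumer-group");         // placeholder group id
            props.put("key.deserializer",
                    "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer",
                    "io.confluent.kafka.serializers.KafkaAvroDeserializer");
            props.put("schema.registry.url", "http://localhost:8081"); // placeholder registry
            // Without this flag the deserializer returns GenericRecord; with it,
            // it instantiates the generated SpecificRecord class instead.
            props.put("specific.avro.reader", "true");
            return props;
        }
    }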

1

I am new to Python and I was trying to write some simple code for converting a text file to Avro. I am getting an error that the module was not found. I can clearly see in the schema.py file that the pars...
Iceman asked 31/12, 2016 at 5:22

1

I'm using Avro to serialize objects and then add them to Kafka messages that will be consumed and deserialized by clients. I've tried several different approaches for serialization but none of them...
Helga asked 2/2, 2017 at 16:48
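One common pattern (a sketch, not necessarily what the asker tried) is to serialize each object to a byte[] with Avro's binary encoder and send that through a ByteArraySerializer-based producer; the consumer then decodes with the same schema.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.EncoderFactory;

    public class AvroBytes {
        // Serializes a GenericRecord into the raw Avro binary encoding
        // (no schema-registry framing), suitable as a byte[] Kafka message value.
        static byte[] toBytes(GenericRecord record, Schema schema) throws IOException {
            ByteArrayOutputStream out = new ByteArrayOutputStream();
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(record, encoder);
            encoder.flush();
            return out.toByteArray();
        }
    }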

4

Solved

Is there a way to use a schema to convert Avro messages from Kafka into a DataFrame with Spark? The schema file for user records: { "fields": [ { "name": "firstName", "type": "string" }, { "name": ...
Dolhenty asked 20/8, 2016 at 1:30
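Newer Spark releases ship a from_avro function in the spark-avro module; a rough sketch, assuming Spark 3.x with spark-avro on the classpath and values that are plain Avro binary (from_avro does not strip the extra prefix that Confluent's schema-registry serializers add). Broker address and topic name are placeholders.

    import static org.apache.spark.sql.avro.functions.from_avro;
    import static org.apache.spark.sql.functions.col;

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class KafkaAvroToDataFrame {
        static Dataset<Row> users(SparkSession spark, String userSchemaJson) {
            Dataset<Row> raw = spark.read()
                    .format("kafka")
                    .option("kafka.bootstrap.servers", "localhost:9092") // placeholder broker
                    .option("subscribe", "users")                        // placeholder topic
                    .load();
            // Decode the Avro-encoded value column with the JSON schema string,
            // then flatten the struct into top-level DataFrame columns.
            return raw.select(from_avro(col("value"), userSchemaJson).as("user"))
                      .select("user.*");
        }
    }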

3

I am trying to Avro binary encode my JSON string. Below is my JSON string, and I have created a simple method which will do the conversion, but I am not sure whether the way I am doing it is correct or ...
Tumid asked 24/2, 2014 at 2:9
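A common way to do this conversion (a sketch, assuming the JSON matches the schema exactly) is to read the string through Avro's JsonDecoder and then re-encode the resulting record with a binary encoder:

    import java.io.ByteArrayOutputStream;
    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericDatumWriter;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryEncoder;
    import org.apache.avro.io.Decoder;
    import org.apache.avro.io.DecoderFactory;
    import org.apache.avro.io.EncoderFactory;

    public class JsonToAvro {
        // Parses a JSON string against the schema, then re-encodes it as Avro binary.
        static byte[] convert(String json, Schema schema) throws Exception {
            Decoder jsonDecoder = DecoderFactory.get().jsonDecoder(schema, json);
            GenericRecord record =
                    new GenericDatumReader<GenericRecord>(schema).read(null, jsonDecoder);

            ByteArrayOutputStream out = new ByteArrayOutputStream();
            BinaryEncoder binaryEncoder = EncoderFactory.get().binaryEncoder(out, null);
            new GenericDatumWriter<GenericRecord>(schema).write(record, binaryEncoder);
            binaryEncoder.flush();
            return out.toByteArray();
        }
    }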

2

I am working with Avro and I have a GenericRecord. I want to extract clientId, deviceName and holder from it. In the Avro schema, clientId is an Integer, deviceName is a String and holder is a Map. client...
Vennieveno asked 24/11, 2016 at 18:18
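A sketch of the extraction, assuming the field names match the schema; note that Avro frequently returns strings as org.apache.avro.util.Utf8, so toString() is the safe way to get a java.lang.String.

    import java.util.Map;
    import org.apache.avro.generic.GenericRecord;

    public class ExtractFields {
        // Pulls the three fields out of a decoded GenericRecord.
        static void extract(GenericRecord record) {
            Integer clientId = (Integer) record.get("clientId");
            // Strings often come back as Utf8, so normalize via toString().
            String deviceName = record.get("deviceName").toString();
            Map<?, ?> holder = (Map<?, ?>) record.get("holder");
            System.out.println(clientId + " " + deviceName + " " + holder);
        }
    }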

3

Solved

I have this Avro schema { "namespace": "xx.xxxx.xxxxx.xxxxx", "type": "record", "name": "MyPayLoad", "fields": [ {"name": "filed1", "type": "string"}, {"name": "filed2", "type": "long"}, {"...
Informative asked 12/1, 2016 at 0:26

2

I am using Confluent's JDBC connector to send data into Kafka in the Avro format. I need to store this schema in the schema registry, but I'm not sure what format it accepts. I've read the document...
Drumfire asked 14/9, 2016 at 18:3

1

Solved

I am trying to use Kafka Streams to convert a topic with String/JSON messages into another topic with Avro messages. Stream main method: streamsConfiguration.put(StreamsConfig.KEY_SERDE_CLASS_CONFIG, Serde...
Vulgar asked 28/10, 2016 at 11:4
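A rough sketch of such a topology with the newer Streams API, assuming Confluent's GenericAvroSerde (from the kafka-streams-avro-serde artifact) is on the classpath; jsonToRecord is a hypothetical helper that maps each JSON string onto a GenericRecord for the target schema, and the topic names and registry URL are placeholders.

    import java.util.Collections;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.kstream.Consumed;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.Produced;
    import io.confluent.kafka.streams.serdes.avro.GenericAvroSerde;

    public class JsonToAvroTopology {
        static void build(StreamsBuilder builder) {
            GenericAvroSerde avroSerde = new GenericAvroSerde();
            // false = configure as a value serde; the URL is a placeholder
            avroSerde.configure(
                    Collections.singletonMap("schema.registry.url", "http://localhost:8081"),
                    false);

            KStream<String, String> json =
                    builder.stream("json-topic", Consumed.with(Serdes.String(), Serdes.String()));
            // Convert each JSON string to a GenericRecord, then write Avro to the output topic.
            KStream<String, GenericRecord> avro = json.mapValues(JsonToAvroTopology::jsonToRecord);
            avro.to("avro-topic", Produced.with(Serdes.String(), avroSerde));
        }

        // Hypothetical converter: map the JSON fields onto the target Avro schema here.
        static GenericRecord jsonToRecord(String json) {
            throw new UnsupportedOperationException("JSON-to-GenericRecord mapping goes here");
        }
    }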

2

Solved

I am creating an Avro RDD with the following code. def convert2Avro(data : String ,schema : Schema) : AvroKey[GenericRecord] = { var wrapper = new AvroKey[GenericRecord]() var record = new GenericData...
Corabelle asked 9/2, 2015 at 15:11

1

I am using avro-maven-plugin 1.8.1 to generate Java code from a schema, and all the fields are public and deprecated, like this: public class data_elements extends org.apache.avro.specific.SpecificRe...
Possie asked 21/9, 2016 at 9:49

2

Solved

I'm relatively new to Spark and I'm trying to group data by multiple keys at the same time. I have some data that I map so it ends up looking like this: ((K1,K2,K3),(V1,V2)) My goal is to group ...
Hopehopeful asked 27/9, 2016 at 16:27

1

I have gone through the Avro C documentation and I see that I can only get Avro output to a file. How do I get the serialized output into a buffer so that I can send it over a TCP socket? Any help is much a...
Pasqualepasqueflower asked 6/8, 2015 at 10:50

1

Solved

I am new to Hadoop and programming, and I am a little confused about Avro schema evolution. I will explain what I understand about Avro so far. Avro is a serialization tool that stores binary data...
Gareth asked 25/8, 2016 at 1:45

1

Solved

I am trying to deserialize/read an Avro file, but the Avro data file doesn't have the new field. Even though the new field is declared as null in the schema, and is therefore expected to be optional, it still giv...
Uprush asked 4/8, 2016 at 18:57
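The usual causes are that no reader schema is being supplied, or that the new field lacks a usable "default". A sketch of reading a data file with an explicit reader schema (file and variable names are placeholders):

    import java.io.File;
    import org.apache.avro.Schema;
    import org.apache.avro.file.DataFileReader;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;

    public class ReadWithReaderSchema {
        static void read(File avroFile, Schema readerSchema) throws Exception {
            // DataFileReader picks up the writer schema from the file header;
            // the datum reader supplies the newer reader schema, and the missing
            // field is filled from its "default" during schema resolution.
            GenericDatumReader<GenericRecord> datumReader =
                    new GenericDatumReader<>(readerSchema);
            try (DataFileReader<GenericRecord> fileReader =
                         new DataFileReader<>(avroFile, datumReader)) {
                for (GenericRecord record : fileReader) {
                    System.out.println(record);
                }
            }
        }
    }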

1

Solved

There are a lot of questions and answers on Stack Overflow on this subject, but none that help. I have a schema with an optional value: { "type" : "record", "name" : "UserSessionEvent", "namespa...
Spiritualist asked 8/8, 2016 at 8:27
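A very common cause of errors with optional fields is the union ordering: per the Avro spec, the default value must correspond to the first branch of the union, so an optional field should list "null" first with "default": null. A sketch of the shape, parsed from Java; the field names are placeholders, only the record name is taken from the question.

    import org.apache.avro.Schema;

    public class OptionalFieldSchema {
        // "null" listed first plus "default": null makes the field truly optional;
        // ["string", "null"] with a null default is the classic mistake.
        static final Schema SCHEMA = new Schema.Parser().parse(
                "{"
              + "  \"type\": \"record\","
              + "  \"name\": \"UserSessionEvent\","
              + "  \"fields\": ["
              + "    {\"name\": \"sessionId\", \"type\": \"string\"},"
              + "    {\"name\": \"referrer\", \"type\": [\"null\", \"string\"], \"default\": null}"
              + "  ]"
              + "}");
    }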

2

Solved

I'm trying to get Python to parse Avro schemas such as the following... from avro import schema mySchema = """ { "name": "person", "type": "record", "fields": [ {"name": "firstname", "type": ...
Emileemilee asked 1/8, 2012 at 17:16

2

Solved

I haven't seen a solution to my particular problem so far; at least, nothing is working, and it's driving me pretty crazy. This particular combination doesn't seem to turn up much on Google. My error occur...
Disconcerted asked 4/4, 2015 at 15:37

1

Solved

We are using Kafka for storing messages and pushing an extremely large number of messages (>30k per minute). I am not sure if it's relevant, but the code that is the producer of the Kafka message is...

5

Solved

I'm trying to use Avro for messages being read from/written to Kafka. Does anyone have an example of using the Avro binary encoder to encode/decode data that will be put on a message queue? I need...
Odine asked 28/11, 2011 at 15:40
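The encoding half is sketched under an earlier entry above; the matching decode step, assuming the consumer has access to the same writer schema, looks roughly like this:

    import org.apache.avro.Schema;
    import org.apache.avro.generic.GenericDatumReader;
    import org.apache.avro.generic.GenericRecord;
    import org.apache.avro.io.BinaryDecoder;
    import org.apache.avro.io.DecoderFactory;

    public class AvroFromBytes {
        // Decodes the raw Avro binary encoding produced by a GenericDatumWriter.
        static GenericRecord fromBytes(byte[] payload, Schema schema) throws Exception {
            BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(payload, null);
            return new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        }
    }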

1

Solved

I am new to Avro, so please excuse me if this is a simple question. I have a use case where I am using an Avro schema for record calls. Let's say I have the Avro schema { "name": "abc", "namepsace": "x...
Gina asked 17/5, 2016 at 14:31

6

I am using Apache Avro. My schema has a map type: {"name": "MyData", "type" : {"type": "map", "values":{ "type": "record", "name": "Person", "fields":[ {"name": "name", "type": "string"}, ...
Express asked 1/11, 2013 at 14:33
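One thing that often surprises people with Avro map fields (and may or may not be the issue here) is that, after decoding, the keys come back as org.apache.avro.util.Utf8 rather than String, so a plain String lookup misses. A sketch, assuming a decoded GenericRecord with the MyData field from the schema above:

    import java.util.Map;
    import org.apache.avro.generic.GenericRecord;

    public class ReadMapField {
        static void printPeople(GenericRecord record) {
            // Keys arrive as CharSequence (usually Utf8); normalize with toString().
            @SuppressWarnings("unchecked")
            Map<CharSequence, GenericRecord> people =
                    (Map<CharSequence, GenericRecord>) record.get("MyData");
            for (Map.Entry<CharSequence, GenericRecord> entry : people.entrySet()) {
                String key = entry.getKey().toString();
                String name = entry.getValue().get("name").toString();
                System.out.println(key + " -> " + name);
            }
        }
    }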

1

Solved

I'm new to Avro schemas. I am trying to publish/consume my Java objects using Kafka. I have Java bean classes which contain fields with LocalDateTime and byte[]. How can I define both in an Avro schema pr...
Cohesive asked 14/4, 2016 at 5:14
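Avro has no direct LocalDateTime type; a common approach (sketched here, with placeholder record and field names) is a long carrying the timestamp-millis logical type for the time, and the built-in bytes type for the byte[], converting in application code.

    import org.apache.avro.Schema;

    public class BeanSchema {
        // timestamp-millis is an Avro 1.8+ logical type over long;
        // bytes maps to byte[]/ByteBuffer on the Java side.
        static final Schema SCHEMA = new Schema.Parser().parse(
                "{"
              + "  \"type\": \"record\","
              + "  \"name\": \"MyBean\","
              + "  \"fields\": ["
              + "    {\"name\": \"createdAt\", \"type\": {\"type\": \"long\", \"logicalType\": \"timestamp-millis\"}},"
              + "    {\"name\": \"payload\", \"type\": \"bytes\"}"
              + "  ]"
              + "}");
    }

In application code the LocalDateTime can be converted to and from epoch milliseconds (via an Instant and a zone) before writing, and the byte[] typically travels as a java.nio.ByteBuffer.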

6

Solved

All of these provide binary serialization, RPC frameworks and an IDL. I'm interested in the key differences between them and their characteristics (performance, ease of use, programming language support). If ...
Philomel asked 8/1, 2011 at 11:20
