JsonMappingException when serializing an Avro-generated object to JSON

I used avro-tools to generate Java classes from .avsc files, using:

java.exe -jar avro-tools-1.7.7.jar compile -string schema myfile.avsc 

Then I tried to serialize such objects to JSON with ObjectMapper, but I always get a JsonMappingException saying "not an enum" or "not a union". In my test I create the generated object using its builder or its constructor. I get these exceptions for objects of different classes...

Sample Code:

ObjectMapper serializer = new ObjectMapper(); // com.fasterxml.jackson.databind
serializer.registerModule(new JtsModule()); // com.bedatadriven.jackson.datatype.jts
...
return serializer.writeValueAsBytes(avroConvertedObject); // => JsonMappingException

I also tried many configurations via serializer.configure(...), but it still failed. Versions: Java 1.8, jackson-datatype-jts 2.3, jackson-core 2.6.5, jackson-databind 2.6.5, jackson-annotations 2.6.5.

Any suggestions? Thanks!

Excise answered 6/9, 2016 at 13:4 Comment(5)
Did you find any solution to this problem? I am also facing the same issue.Yahoo
I am also experiencing this problem.Escent
Yes, I solved it. The key is to use one library end-to-end. I found a way to use avro-tools for the JSON serialization as well. Sorry I can't give you sample code because I'm on vacation these days and have no access to my work sources... HTHExcise
I solved this in a different way. I copied the velocity templates from the Avro distribution into my project and added @com.fasterxml.jackson.annotation.JsonIgnore on all SCHEMA$ properties and getter methods.Fredette
@Excise is your sample code from a controller?Clapper

If the SCHEMA$ field really is the cause (we don't see the full error message), you can switch it off. I use a mixin to do it, like this:

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.SerializationFeature;
import org.apache.avro.Schema;
import org.junit.Test;

import java.io.File;
import java.io.IOException;

public class AvroGenerTests
{
  abstract class IgnoreSchemaProperty
  {
    // You have to use the correct package for JsonIgnore,
    // fasterxml or codehaus
    @JsonIgnore abstract void getSchema();
  }

  @Test
  public void writeJson() throws IOException {
    BookAvro b = BookAvro.newBuilder()
      .setTitle("Wilk stepowy")
      .setAuthor("Herman Hesse")
      .build();
    ObjectMapper om = new ObjectMapper();
    om.enable(SerializationFeature.INDENT_OUTPUT);
    om.addMixIn(BookAvro.class, IgnoreSchemaProperty.class);
    om.writeValue(new File("plik_z_gen.json"), b);
  }
}
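
With the mixin registered, plik_z_gen.json should end up containing roughly the following (the exact field order depends on the schema; this assumes title and author are its only fields):

{
  "title" : "Wilk stepowy",
  "author" : "Herman Hesse"
}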
Mama answered 5/12, 2017 at 19:6 Comment(4)
This worked for me, thanks for the solution. For anyone looking to ignore the same method across a hierarchy of classes, the following worked for me: om.addMixIn(Object.class, IgnoreSchemaProperty.class)Swallow
I tried something similar; however, I am getting com.fasterxml.jackson.databind.JsonMappingException: Infinite recursion (StackOverflowError)Untouched
My Avro schema has many logical types, and I am getting this error even after following this solution: Caused by: org.apache.avro.AvroRuntimeException: Not an enum: {"type":"int","logicalType":"date"}Derange
Hi, I know I might be a bit late to this, but I ran into the "Infinite Recursion" issue after following the exact instructions in the answer. The only change I had to make was to add another @JsonIgnore for SpecificData in the mixin. Add the following line to the IgnoreSchemaProperty mixin: @JsonIgnore SpecificData getSpecificData()Inimical
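
Pulling those comments together, a variant of the mixin that also avoids the infinite recursion might look like this (a sketch; getSpecificData() only exists on classes generated by newer Avro releases):

import com.fasterxml.jackson.annotation.JsonIgnore;
import org.apache.avro.specific.SpecificData;

abstract class IgnoreSchemaProperty
{
  @JsonIgnore abstract void getSchema();
  // Newer Avro-generated classes also expose getSpecificData(); without this
  // Jackson can recurse indefinitely while serializing it.
  @JsonIgnore abstract SpecificData getSpecificData();
}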

As of 2022, the Avro-generated classes expose more accessors, so the mixin needs to ignore all of them:

abstract class IgnoreSchemaPropertyConfig {
   // You have to use the correct package for JsonIgnore,
   // fasterxml or codehaus
   @JsonIgnore
   abstract void getClassSchema();

   @JsonIgnore
   abstract void getSpecificData();

   @JsonIgnore
   abstract void get();

   @JsonIgnore
   abstract void getSchema();
}


public class AvroGenerateJSON
{
  // Serialize an Avro-generated object with the mixin registered
  public String convertToJson(BookAvro book) throws IOException {
    ObjectMapper om = new ObjectMapper();
    om.addMixIn(BookAvro.class, IgnoreSchemaPropertyConfig.class);
    return om.writeValueAsString(book);
  }
}
Decimeter answered 20/9, 2022 at 16:38 Comment(3)
Thanks Jose, I had some exceptions related to {"type":"int","logicalType":"date"}; after adding all of these @JsonIgnore annotations I am now able to proceed.Pictor
I'm confused: what are you supposed to do with this abstract class?Comstock
Hi, I added the code extracted from https://mcmap.net/q/722338/-jsonmappingexception-when-serializing-avro-generated-object-to-jsonDecimeter

My requirements got changed on me and I was told I needed to convert Avro objects straight to JSON without preserving any of the metadata. My other answer here, which gives a convertToJsonString method, converts the entire Avro object to JSON so that, using a decoder, you can re-create the original Avro object. That isn't what my management wanted anymore, so I was back to the old drawing board.

As a Hail Mary I tried Gson, and it does exactly what I now needed. It's very simple:

 Gson gson = new Gson(); // com.google.gson.Gson
 String theJsonString = gson.toJson(object_ur_converting);

And you're done.
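
For completeness, a self-contained sketch reusing the BookAvro example from the mixin answer above (the class name GsonAvroExample is just a placeholder):

import com.google.gson.Gson;
import com.google.gson.GsonBuilder;

public class GsonAvroExample {
    public static void main(String[] args) {
        BookAvro b = BookAvro.newBuilder()
            .setTitle("Wilk stepowy")
            .setAuthor("Herman Hesse")
            .build();
        // Gson reflects over instance fields rather than getters, so it never
        // touches getSchema() and no mixin or @JsonIgnore is needed.
        Gson gson = new GsonBuilder().setPrettyPrinting().create();
        System.out.println(gson.toJson(b));
    }
}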

Tetchy answered 14/2, 2021 at 21:40 Comment(1)
The top rated solution was not working for me, I suppose because I had logical types in the Avro schema and was getting the exception mentioned at the end. This solution works perfectly. Caused by: org.apache.avro.AvroRuntimeException: Not an enum: {"type":"int","logicalType":"date"}Derange

Agree with Shivansh's answer. To add: there might be instances where we need to use the Avro-generated POJO in other classes. Under the hood, Spring uses the Jackson library to handle this, so we need to override the global Jackson config by adding a configuration class:

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.apache.avro.specific.SpecificRecordBase;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class JacksonConfiguration {

    // Mixin that hides the Avro-generated getSchema() from Jackson
    abstract static class IgnoreSchemaProperty {
        @JsonIgnore
        abstract void getSchema();
    }

    @Bean
    public ObjectMapper objectMapper() {
        ObjectMapper om = new ObjectMapper();
        // Register the mixin once for every Avro-generated class
        om.addMixIn(SpecificRecordBase.class, IgnoreSchemaProperty.class);
        return om;
    }
}

The mixin is registered on SpecificRecordBase so that the schema field is ignored for all Avro-generated classes. This way we can serialize/deserialize our Avro classes anywhere in the application without hitting the issue.
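
For illustration, a minimal sketch of how the bean might then be used (BookJsonService is a hypothetical class, reusing the BookAvro type from the earlier answers):

import com.fasterxml.jackson.core.JsonProcessingException;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.stereotype.Service;

@Service
public class BookJsonService {

    private final ObjectMapper objectMapper; // the mixin-aware mapper from JacksonConfiguration

    public BookJsonService(ObjectMapper objectMapper) {
        this.objectMapper = objectMapper;
    }

    public String toJson(BookAvro book) throws JsonProcessingException {
        return objectMapper.writeValueAsString(book); // the schema field is skipped
    }
}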

Permenter answered 4/12, 2021 at 2:2 Comment(1)
Didn't work + too many basic syntax errors.Anticline

After finding the code example at https://www.programcreek.com/java-api-examples/?api=org.apache.avro.io.JsonEncoder I wrote a method that should convert any given Avro object (they all extend GenericRecord) to a JSON string. Code:

import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.io.JsonEncoder;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.OutputStream;

// ... Class header etc. ...

public static <T extends GenericRecord> String convertToJsonString(T event) throws IOException {

    String jsonstring = "";

    try {
        DatumWriter<T> writer = new GenericDatumWriter<T>(event.getSchema());
        OutputStream out = new ByteArrayOutputStream();
        JsonEncoder encoder = EncoderFactory.get().jsonEncoder(event.getSchema(), out);
        writer.write(event, encoder);
        encoder.flush();
        jsonstring = out.toString();
    } catch (IOException e) {
        log.error("IOException occurred.", e);
        throw e;
    }

    return jsonstring;
}
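
Going the other way, a decoder can rebuild the Avro object from that JSON. A minimal sketch, assuming a specific record class such as BookAvro generated from the same schema (convertFromJsonString is a hypothetical helper, not part of the Avro API):

import org.apache.avro.Schema;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.JsonDecoder;
import org.apache.avro.specific.SpecificDatumReader;

// ... inside the same class ...

public static <T extends GenericRecord> T convertFromJsonString(String json, Schema schema, Class<T> type) throws IOException {
    // The decoder parses the Avro-flavoured JSON produced by convertToJsonString
    JsonDecoder decoder = DecoderFactory.get().jsonDecoder(schema, json);
    DatumReader<T> reader = new SpecificDatumReader<>(type);
    return reader.read(null, decoder);
}

// Example call: BookAvro b = convertFromJsonString(json, BookAvro.getClassSchema(), BookAvro.class);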
Tetchy answered 9/2, 2021 at 17:47 Comment(0)

The previous post answers the question correctly; I am just adding on to it. Instead of writing the JSON to a file, I converted it to a string before sending it as the body of a POST request.

public class AvroGenerateJSON
{
  abstract class IgnoreSchemaProperty
  {
    // You have to use the correct package for JsonIgnore,
    // fasterxml or codehaus
    @JsonIgnore abstract void getSchema();
  }

  public String convertToJson() throws IOException {
    BookAvro b = BookAvro.newBuilder()
      .setTitle("Wilk stepowy")
      .setAuthor("Herman Hesse")
      .build();
    ObjectMapper om = new ObjectMapper();
    om.enable(SerializationFeature.INDENT_OUTPUT);
    om.addMixIn(BookAvro.class, IgnoreSchemaProperty.class);
    String jsonString = om.writeValueAsString(b);
    return jsonString;
  }
}
Chronopher answered 9/9, 2020 at 11:51 Comment(0)

If you are experiencing this issue when trying to use structured logging via logstash-logback-encoder, you need a JsonFactoryDecorator implementation that adds the mixin to ignore the problematic Avro-generated getters.

package the.package.path;

import com.fasterxml.jackson.annotation.JsonIgnore;
import com.fasterxml.jackson.core.JsonFactory;
import com.fasterxml.jackson.databind.ObjectMapper;
import net.logstash.logback.decorate.JsonFactoryDecorator;
import org.apache.avro.specific.SpecificRecordBase;


public class AvroJsonFactoryDecorator implements JsonFactoryDecorator {
    abstract class JacksonIgnoreAvroProperties {

        @JsonIgnore
        public abstract org.apache.avro.Schema getClassSchema();

        @JsonIgnore
        public abstract org.apache.avro.specific.SpecificData getSpecificData();

        @JsonIgnore
        public abstract java.lang.Object get(int field$);

        @JsonIgnore
        public abstract org.apache.avro.Schema getSchema();
    }

    @Override
    public JsonFactory decorate(JsonFactory factory) {
        ObjectMapper objectMapper = (ObjectMapper) factory.getCodec();
        objectMapper.addMixIn(SpecificRecordBase.class, JacksonIgnoreAvroProperties.class);
        return factory;
    }
}

And then include the following in your logback config:

<appender name="console" class="ch.qos.logback.core.ConsoleAppender">
            <encoder class="net.logstash.logback.encoder.LogstashEncoder">
                <jsonFactoryDecorator class="the.package.path.AvroJsonFactoryDecorator"/>
            </encoder>
        </appender>
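
Once that is wired up, logging an Avro object as a structured argument should serialize cleanly. A brief sketch (AvroLoggingExample and BookAvro are placeholders for your own classes):

import net.logstash.logback.argument.StructuredArguments;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class AvroLoggingExample {
    private static final Logger log = LoggerFactory.getLogger(AvroLoggingExample.class);

    public static void main(String[] args) {
        BookAvro book = BookAvro.newBuilder()
                .setTitle("Wilk stepowy")
                .setAuthor("Herman Hesse")
                .build();
        // The LogstashEncoder serializes the whole object into the log event's JSON,
        // and the decorator's mixin keeps getSchema()/getSpecificData() out of it.
        log.info("book created {}", StructuredArguments.kv("book", book));
    }
}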
Nonaggression answered 16/5, 2023 at 17:25 Comment(0)
