Put a BigDecimal into an Avro Decimal
As a user, the requirement is clear: I have a record with an Avro logical type Decimal(20,3) and want to put a value into it, so I expect to be able to write one of

genericrecord.put("Quantity", 4.0);
genericrecord.put("Quantity", BigDecimal.valueOf(4.0));

From an Avro implementation point of view, the decimal logical type is backed by either a fixed or a bytes type, so the serializer needs one of those two and I need to convert the value to a byte array.
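For context, the schema of such a field is just a bytes (or fixed) schema annotated with the decimal logical type. A minimal sketch of building it programmatically (the record and field names are only examples matching the code below):

import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;

// decimal(20, 3) is a logical type annotating an underlying bytes schema
Schema quantitySchema = LogicalTypes.decimal(20, 3)
        .addToSchema(Schema.create(Schema.Type.BYTES));

// example record schema with a single "Quantity" field
Schema recordSchema = SchemaBuilder.record("Item").fields()
        .name("Quantity").type(quantitySchema).noDefault()
        .endRecord();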

The easiest way I found is to use the Conversions.DecimalConversion class.

// DecimalConversion is org.apache.avro.Conversions.DecimalConversion,
// Decimal is org.apache.avro.LogicalTypes.Decimal
DecimalConversion DECIMAL_CONVERTER = new DecimalConversion();
Schema s = item.getSchema().getField("Quantity").schema();
Decimal l = (Decimal) s.getLogicalType();
BigDecimal d = BigDecimal.valueOf(4.0).setScale(l.getScale()); // scale must match the logical type's scale
ByteBuffer value = DECIMAL_CONVERTER.toBytes(d, null, l); // the schema argument is not used for bytes-backed decimals
item.put("Quantity", value);

In other words: find the LogicalType of the Quantity field, convert the double value 4.0 to a BigDecimal, match the scale of the BigDecimal to the LogicalType, convert it to a ByteBuffer, and set the bytes as the field value.

That's a bit excessive for a simple item.put("Quantity", 4.0) requirement.

As said, this is all clear from a library implementer's point of view, but from a user's perspective it is not very handy.
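For now, the only workaround I see is to hide the boilerplate behind a small helper. putDecimal below is a hypothetical name, just a sketch wrapping the steps above (the rounding mode is my own choice):

import java.math.BigDecimal;
import java.math.RoundingMode;
import java.nio.ByteBuffer;
import org.apache.avro.Conversions;
import org.apache.avro.LogicalTypes;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;

// hypothetical convenience wrapper around the DecimalConversion steps shown above
static void putDecimal(GenericRecord record, String fieldName, double value) {
    Schema fieldSchema = record.getSchema().getField(fieldName).schema();
    LogicalTypes.Decimal decimalType = (LogicalTypes.Decimal) fieldSchema.getLogicalType();
    // the rounding mode is an assumption; setScale without one throws if rounding is required
    BigDecimal scaled = BigDecimal.valueOf(value).setScale(decimalType.getScale(), RoundingMode.HALF_UP);
    ByteBuffer bytes = new Conversions.DecimalConversion().toBytes(scaled, fieldSchema, decimalType);
    record.put(fieldName, bytes);
}

With that, the call site becomes putDecimal(item, "Quantity", 4.0); but of course every project would have to carry such a helper around.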

Isn't there a better option? I found something about pluggable converters in the LogicalType classes. Any hint on how to use those? Am I missing something completely?
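From what I can tell so far, those converters are registered on the GenericData instance that the DatumWriter uses, roughly like this; an untested sketch of what I mean, not a confirmed answer (recordSchema is the record schema from the sketch further up):

import java.math.BigDecimal;
import org.apache.avro.Conversions;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumWriter;

// register the decimal conversion so BigDecimal values get converted on write
GenericData genericData = new GenericData();
genericData.addLogicalTypeConversion(new Conversions.DecimalConversion());

// then a plain BigDecimal can be put into the record...
GenericRecord item = new GenericData.Record(recordSchema);
item.put("Quantity", BigDecimal.valueOf(4.0).setScale(3)); // scale still has to match the logical type

// ...and the writer should apply the conversion during serialization
DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(recordSchema, genericData);

If that is the intended usage, it would at least remove the per-field conversion code, but it is still not the plain item.put("Quantity", 4.0) I was hoping for.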

Thanks in advance

PS: Avro 1.9.0

Artis answered 30/7, 2019 at 9:32

Comments (2):

Hi, were you able to find a solution for this? Could you share what you ended up doing? Thanks. – Jeffryjeffy

I have created a helper library for this and more shortcomings. It is available via Maven Central as well. See this test case for an example: github.com/rtdi/KafkaAvro/blob/master/src/test/java/io/rtdi/… – Artis
