Avro serialization exception – java.time.Instant cannot be converted to java.lang.Long
I want to send a Kafka message whose payload extends the SpecificRecordBase class; the class is generated by the Avro Maven plugin.
One field of my schema uses the timestamp-millis logical type, which maps to java.time.Instant in the generated class.
The fields are defined as follows:
{"name": "processingTime", "type": {
    "type": "long",
    "logicalType": "timestamp-millis"
  }
},
When I create an instance of this class and set the processing time:
setProcessingTime(RandomDate.randomInstant())
everything compiles fine, but when I run the program and try to send the record to Kafka, I get the following error:
org.apache.kafka.common.errors.SerializationException: Can't convert value of class poc.avroGenerated.AvroMeasurement to class poc.avroSerde.AvroSerializer specified in value.serializer
Caused by: java.lang.ClassCastException: class java.time.Instant cannot be cast to class java.lang.Long (java.time.Instant and java.lang.Long are in module java.base of loader 'bootstrap')
Here is my custom serializer class:
@Override
public byte[] serialize(String topic, T data) {
    byte[] result = null;
    try {
        ByteArrayOutputStream byteArrayOutputStream = new ByteArrayOutputStream();
        BinaryEncoder binaryEncoder = EncoderFactory.get().binaryEncoder(byteArrayOutputStream, null);
        DatumWriter<GenericRecord> datumWriter = new GenericDatumWriter<>(data.getSchema());
        datumWriter.write(data, binaryEncoder);
        binaryEncoder.flush();
        byteArrayOutputStream.close();
        result = byteArrayOutputStream.toByteArray();
    } catch (IOException e) {
        LOGGER.error(e);
    }
    return result;
}
Solution
Use SpecificDatumWriter instead of GenericDatumWriter. With that one change, your custom serializer looks good!
This is often a point of confusion. In the Java implementation, "generic" data does not account for any customizations built into a particular generated record, including logical-type conversions, so the GenericDatumWriter hands the java.time.Instant value straight to the encoder, which expects a long and throws the ClassCastException you saw.
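A minimal sketch of the corrected serializer, assuming Avro 1.9+ (where SpecificDatumWriter looks up each field's registered logical-type conversion on the SpecificRecordBase instance). The class name and error-handling style here are illustrative; the only essential change from the code in the question is the writer type:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.EncoderFactory;
import org.apache.avro.specific.SpecificDatumWriter;
import org.apache.avro.specific.SpecificRecordBase;

public class AvroSerializer<T extends SpecificRecordBase> {

    public byte[] serialize(String topic, T data) {
        if (data == null) {
            return null; // Kafka convention: a null payload serializes to null
        }
        try (ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
            // SpecificDatumWriter consults the conversions registered by the
            // generated class (e.g. TimestampMillisConversion), so the
            // java.time.Instant field is converted to a long before encoding.
            DatumWriter<T> writer = new SpecificDatumWriter<>(data.getSchema());
            writer.write(data, encoder);
            encoder.flush();
            return out.toByteArray();
        } catch (IOException e) {
            // Rethrowing rather than logging-and-returning-null is a design
            // choice; adapt this to your own error-handling strategy.
            throw new IllegalStateException("Avro serialization failed", e);
        }
    }
}
```

Because SpecificDatumWriter resolves conversions per field of the specific record, it works for any logical type the generated class registers, not just timestamp-millis.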