I'm using Kafka with Confluent Schema Registry. I defined a schema and use Confluent's KafkaAvroSerializer on the producer side. Everything works fine.
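For context, this is roughly the producer setup I mean (a minimal sketch; the registry URL is a placeholder for my environment):

```properties
# Producer properties for the working, schema-aware path
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
# Assumed local registry endpoint; substitute your own
schema.registry.url=http://localhost:8081
```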
On the other hand, if a producer publishes an event without adhering to the schema, it still gets published without any problem.
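To illustrate what I mean: any producer that skips the Avro serializer entirely can still write arbitrary bytes to the same topic (hypothetical config, just to show the bypass):

```properties
# A producer configured like this never talks to the Schema Registry,
# so nothing stops it from publishing data that violates the schema
key.serializer=org.apache.kafka.common.serialization.StringSerializer
value.serializer=org.apache.kafka.common.serialization.ByteArraySerializer
```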
I understand that Kafka brokers only see serialized bytes and don't inspect the payload, so this is working as designed.
Is there a better way to enforce stronger schema validation, so that the topic isn't polluted with bad data?