A number of APIs/microservices provide access to critical resources, including Kafka topics. Messages sent to these APIs are validated against an OpenAPI specification that defines the API contract. Once a microservice has validated a message, it publishes the message to a Kafka topic, at which point the message is validated again, this time against the schema held in Kafka's Schema Registry.
The problem is that messages are validated against two separate definitions (the OpenAPI spec and the Schema Registry schema), and keeping the two definitions in sync is a challenge.
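To make the challenge concrete, the kind of drift check this setup invites looks roughly like the sketch below. The file name `openapi.yaml`, the schema name `OrderEvent`, the subject `orders-value`, and the registry URL are all placeholders, and it assumes the registered schema is a JSON Schema:

```python
import json

import requests
import yaml

# Pull the message schema out of the OpenAPI spec (placeholder file/names).
with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)
openapi_schema = spec["components"]["schemas"]["OrderEvent"]

# Fetch the latest schema registered for the topic's value subject.
resp = requests.get("http://localhost:8081/subjects/orders-value/versions/latest")
resp.raise_for_status()
registry_schema = json.loads(resp.json()["schema"])

# Naive structural comparison; any drift between the two definitions fails.
if openapi_schema != registry_schema:
    raise SystemExit("OpenAPI spec and Schema Registry schema have drifted")
print("Definitions are in sync")
```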
With this in mind, I have a few questions:
- Is there a way to convert OpenAPI specs to the Kafka Schema Registry format, and vice versa? (A sketch of one direction follows this list.)
- Is there a way to have Kafka validate messages against an OpenAPI spec instead of the registry? (Probably not a great solution, since native Kafka capabilities should be used.)
- Is there a way to have an API/microservice validate its messages against a Kafka schema instead of an OpenAPI spec? (Again, probably not a good approach, since OpenAPI specs are the standard way to define messages for APIs.)

Lastly, which of the above makes the most sense? Are there any better alternatives?
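On the first question, one direction at least looks mechanically straightforward: OpenAPI 3.1 aligned its schema dialect with standard JSON Schema (3.0 schemas may need minor translation), and Confluent's Schema Registry supports JSON Schema natively, so a component schema can be extracted and registered directly. A rough sketch, reusing the same placeholder names as above (any `$ref` references inside the schema would need to be resolved or inlined first), though this still leaves the two definitions to be kept in sync by hand:

```python
import json

import requests
import yaml

# Extract the component schema from the OpenAPI spec (placeholder names).
with open("openapi.yaml") as f:
    spec = yaml.safe_load(f)
schema = spec["components"]["schemas"]["OrderEvent"]

# Register it as a JSON Schema under the subject for the topic's value.
resp = requests.post(
    "http://localhost:8081/subjects/orders-value/versions",
    headers={"Content-Type": "application/vnd.schemaregistry.v1+json"},
    json={"schemaType": "JSON", "schema": json.dumps(schema)},
)
resp.raise_for_status()
print("Registered schema id:", resp.json()["id"])
```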
Comments:

- Use `swagger-codegen`, for example, to create models, then pass those around Kafka topics. However, you might also want to look at AsyncAPI. – Suprasegmental
- Confluent has a `confluent.value.schema.validation` field, but I think that is very specifically tied to their Schema Registry, which, as mentioned, does offer extensions for custom schema types. – Suprasegmental
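On the `confluent.value.schema.validation` property mentioned in the comments: it is a topic-level setting on Confluent Server rather than Apache Kafka, and it makes the broker check that each produced record carries a schema ID known to Schema Registry (the broker itself must also have `confluent.schema.registry.url` configured). A rough sketch of enabling it at topic creation with the confluent-kafka Python client, placeholder names throughout:

```python
from confluent_kafka.admin import AdminClient, NewTopic

# Placeholder broker address and topic name.
admin = AdminClient({"bootstrap.servers": "localhost:9092"})

# With this Confluent Server property enabled, the broker verifies that
# each record's value carries a schema ID registered in Schema Registry.
topic = NewTopic(
    "orders",
    num_partitions=3,
    replication_factor=1,
    config={"confluent.value.schema.validation": "true"},
)

# create_topics() returns a dict of topic name -> future; block on each.
for name, future in admin.create_topics([topic]).items():
    future.result()  # raises if creation failed
    print(f"created topic {name} with broker-side schema validation")
```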