Avro support #2468
closes spring-cloud#2402 Signed-off-by: Emanuel Trandafir <emanueltrandafir1993@gmail.com>
relates to spring-cloud#2402 Signed-off-by: Emanuel Trandafir <emanueltrandafir1993@gmail.com>
relates to spring-cloud#2404 Signed-off-by: Emanuel Trandafir <emanueltrandafir1993@gmail.com>
```java
	return body != null && body.getClientValue() instanceof FromFileProperty;
}

private boolean isAvroContract(YamlContract contract) {
```
Can we make this section kafka / avro agnostic? If we just set asBytes() it should work, right? Another option is to verify the contentType, in which case if it's not JSON but explicitly something else then we just pass through? Wdyt?
I also sense that we could have some extension points here. Like checking through SPI (we already do it in other parts of SCC) some interfaces that verify whether this contract has a special way of treating the payload. We could have some AvroContractPayloadProcessor that would activate when the metadata has avro and would provide a function for how to convert the payload. That way this core logic stays clean of Avro, but we can inject the behavior. There would have to be some priority / ordering like we do in other SPIs here in SCC.
Wdyt?
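The SPI idea above could look roughly like this. A minimal, stdlib-only sketch: the `ContractPayloadProcessor` interface, its method names, and the priority scheme are all hypothetical illustrations, not existing SCC APIs.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;
import java.util.function.Function;

// Hypothetical SPI: a processor declares whether it applies to a contract's
// metadata and, if so, how to convert the payload before sending.
interface ContractPayloadProcessor {
	boolean applies(Map<String, Object> contractMetadata);
	Object convertPayload(Object body);
	default int order() { return 0; } // lower value = higher priority
}

// Would activate only when the contract metadata carries an "avro" entry.
class AvroContractPayloadProcessor implements ContractPayloadProcessor {
	public boolean applies(Map<String, Object> metadata) {
		return metadata.containsKey("avro");
	}
	public Object convertPayload(Object body) {
		// a real implementation would build a GenericRecord; here we just tag it
		return "avro:" + body;
	}
	public int order() { return 10; }
}

public class SpiDemo {
	// Pick the highest-priority processor that applies; fall back to identity
	// so the core logic stays free of any Avro-specific branching.
	static Function<Object, Object> resolve(List<ContractPayloadProcessor> processors,
			Map<String, Object> metadata) {
		return processors.stream()
				.filter(p -> p.applies(metadata))
				.min(Comparator.comparingInt(ContractPayloadProcessor::order))
				.<Function<Object, Object>>map(p -> p::convertPayload)
				.orElse(Function.identity());
	}

	public static void main(String[] args) {
		List<ContractPayloadProcessor> processors = List.of(new AvroContractPayloadProcessor());
		System.out.println(resolve(processors, Map.of("avro", Map.of())).apply("payload"));
		System.out.println(resolve(processors, Map.of()).apply("payload"));
	}
}
```

With this shape, an Avro processor could be discovered via `ServiceLoader` or Spring factories like other SCC SPIs, and non-Avro contracts would simply fall through to the default behavior.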
```yaml
outputMessage:
  sentTo: book.returned
  headers:
    X-Correlation-Id: abc-123-def
```
What headers are being sent in case of avro messages? Is there any content type?
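For context, the PR description notes that the sender copies contract output headers onto the Kafka record as UTF-8 bytes (Kafka record headers carry `byte[]` values). A stdlib-only sketch of that conversion, where `toKafkaHeaders` is a hypothetical helper, not the actual SCC code:

```java
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class HeaderCopyDemo {
	// Convert contract header values to the byte[] form that Kafka record
	// headers expect, preserving insertion order.
	static Map<String, byte[]> toKafkaHeaders(Map<String, Object> contractHeaders) {
		Map<String, byte[]> out = new LinkedHashMap<>();
		contractHeaders.forEach((k, v) ->
				out.put(k, String.valueOf(v).getBytes(StandardCharsets.UTF_8)));
		return out;
	}

	public static void main(String[] args) {
		Map<String, byte[]> h = toKafkaHeaders(Map.of("X-Correlation-Id", "abc-123-def"));
		System.out.println(new String(h.get("X-Correlation-Id"), StandardCharsets.UTF_8));
	}
}
```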
```groovy
private File saveTmpContract(String contractYaml) {
	File contractDir = File.createTempDir()
	new File(contractDir, "book_returned.yml").text = contractYaml
```
```java
@Bean
@ConditionalOnMissingBean(name = "avroKafkaTemplate")
KafkaTemplate<String, Object> avroKafkaTemplate(@Value("${spring.kafka.bootstrap-servers}") String bootstrapServers,
```
Why can't we reuse the users production template?
```java
}
```
```java
@JsonIgnoreProperties({ "schema", "specificData", "classSchema", "conversion" })
interface IgnoreAvroMixin {
```
Can you provide a javadoc explaining why we need this?
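The rationale behind the mixin can be shown without Jackson: a bean-style serializer discovers properties via getters, and Avro-generated classes expose bookkeeping getters (`getSchema()`, `getSpecificData()`, ...) that are not part of the payload and can fail serialization. A stdlib-only illustration, where `BookReturned` is a made-up stand-in for a generated class, not the real one:

```java
import java.lang.reflect.Method;
import java.util.Set;
import java.util.TreeSet;

public class MixinRationaleDemo {
	// Mirrors the mixin's ignore list: Avro bookkeeping properties that a
	// bean serializer would otherwise pick up on a generated class.
	static final Set<String> AVRO_INTERNALS =
			Set.of("schema", "specificData", "classSchema", "conversion");

	// Stand-in for an Avro-generated class: one payload getter plus
	// Avro-internal getters that are not meaningfully serializable.
	static class BookReturned {
		public String getBookName() { return "foo"; }
		public Object getSchema() { throw new UnsupportedOperationException(); }
		public Object getSpecificData() { throw new UnsupportedOperationException(); }
	}

	// Discover getter-backed properties, filtering out the Avro internals.
	static Set<String> serializableProperties(Class<?> type) {
		Set<String> props = new TreeSet<>();
		for (Method m : type.getMethods()) {
			String name = m.getName();
			if (name.startsWith("get") && name.length() > 3 && m.getParameterCount() == 0
					&& m.getDeclaringClass() != Object.class) {
				String prop = Character.toLowerCase(name.charAt(3)) + name.substring(4);
				if (!AVRO_INTERNALS.contains(prop)) {
					props.add(prop);
				}
			}
		}
		return props;
	}

	public static void main(String[] args) {
		System.out.println(serializableProperties(BookReturned.class)); // [bookName]
	}
}
```

The mixin achieves the same filtering declaratively, without SCC having to compile against Avro types.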
- **Bug fix (ContractVerifierObjectMapper fails for Avro-generated objects #2402):** when the intercepted message is an Avro-generated object, `ContractVerifierObjectMapper` would fail to serialize it to JSON due to Avro-specific fields (`schema`, `specificData`, `classSchema`, `conversion`). Fixed by configuring the `JsonMapper` to ignore those fields via a mixin when Avro is on the classpath.
- **Avro support for contract-based messaging:** introduced `KafkaAvroMessageVerifierSender`, which builds an Avro `GenericRecord` from the contract body and sends it via `KafkaTemplate`, backed by `KafkaAvroContractVerifierConfiguration` (auto-configuration) and `AvroMetadata` (schema config in contract metadata).
- **Header propagation:** `KafkaAvroMessageVerifierSender` now wraps the payload in a `ProducerRecord` and copies contract output headers (e.g. `X-Correlation-Id`) as UTF-8 bytes onto it, so they are actually sent to Kafka.
- **Bug fix (StubRunnerExecutor fails for Avro objects #2404):** `StubRunnerExecutor.sendMessage()` was unconditionally calling `JsonOutput.toJson()` on the message body before passing it to the `MessageVerifierSender`. This broke `KafkaAvroMessageVerifierSender`, which expects a `Map`, not a JSON string. Fixed by adding an `isAvroContract()` check that inspects the raw contract metadata and skips JSON serialization for Avro contracts, passing the body as a `Map` directly.
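The `isAvroContract()` pass-through described above can be sketched as follows. This is a simplified stand-in: the `"avro"` metadata key and the `toJson` helper are illustrative assumptions, not the actual SCC implementation.

```java
import java.util.Map;

public class AvroContractCheckDemo {
	// Inspect the raw contract metadata for an Avro entry; Avro contracts
	// must skip JSON serialization of the body.
	static boolean isAvroContract(Map<String, Object> metadata) {
		return metadata != null && metadata.containsKey("avro");
	}

	static Object prepareBody(Object body, Map<String, Object> metadata) {
		if (isAvroContract(metadata)) {
			return body; // pass the Map through; the sender builds the GenericRecord
		}
		return toJson(body); // stand-in for JsonOutput.toJson(...)
	}

	static String toJson(Object body) {
		return "{\"json\":\"" + body + "\"}";
	}

	public static void main(String[] args) {
		System.out.println(prepareBody(Map.of("bookName", "foo"), Map.of("avro", Map.of())));
		System.out.println(prepareBody("foo", Map.of()));
	}
}
```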
TODO: if this gets accepted, we also need to add the documentation and update the samples repo.
Related issues