[BUG] - Issue while deserializing the message by using Schema Registry with Forward compatibility - Schema Registry #22278
Comments
@srnagar Any priority and timeline on this issue?
Any updates on this issue?
Any updates on this issue?
@conniey Could you please take a look at this issue?
@conniey Could you please share any updates on this issue? We are looking to implement the Schema Registry feature in our applications.
Hey. Thanks for reporting this. I'm looking into it.
When can I expect azure-data-schemaregistry-apacheavro 1.0.0-beta.9 to be released as a Maven dependency? It looks like the above bug fix is in that release.
@KrupalVanukuri It's been published to Maven Central.
Thanks @conniey. But now I am confused by the MessageWithMetadata model class. I am using Spring Kafka to push data to Event Hubs. Previously I sent byte[] to Event Hubs. Do I now need to send a MessageWithMetadata object to Event Hubs? Is this config correct?
I think if I want to use azure-schema-registry-for-kafka, then this fix needs to be implemented in azure-data-schemaregistry-avro/src/main/java/com/azure/data/schemaregistry/avro/AvroSchemaRegistryUtils.java.
Any updates on this? When can the fix be applied to azure-data-schemaregistry-avro/src/main/java/com/azure/data/schemaregistry/avro/AvroSchemaRegistryUtils.java?
Could you explain a bit why that class needs to be updated? It is intended as a utility class that shouldn't be exposed for public consumption (the class no longer exists in our repository). The forward compatibility issue was resolved here: #26592
I am using the Kafka libraries to publish and consume data, so I want to use a value serializer class in the Kafka configuration. When I started my POC I used the https://github.com/Azure/azure-schema-registry-for-kafka library, which in turn uses azure-data-schemaregistry-avro as a dependency, and I found the issue in that dependency's src/main/java/com/azure/data/schemaregistry/avro/AvroSchemaRegistryUtils.java class. The fix is now included in azure-data-schemaregistry-apacheavro, but that package does not have any Serializer class I can use in the Kafka configuration; all I see is an Encoder class. That's where I got confused. Do you have any samples I can refer to for Schema Registry with the Kafka libraries (including the forward compatibility fix)?
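The encoder-vs-serializer gap described here can in principle be bridged with a thin adapter. The sketch below is purely hypothetical: `EncoderBackedSerializer` and the `Function<T, byte[]>` parameter are illustrative names standing in for whatever encode method the Azure SDK exposes; only Kafka's `org.apache.kafka.common.serialization.Serializer` interface is real.

```java
import java.util.function.Function;
import org.apache.kafka.common.serialization.Serializer;

// Hypothetical sketch: adapt any byte-producing encoder (e.g. the Azure
// Apache Avro encoder, not shown here) to Kafka's Serializer interface.
public class EncoderBackedSerializer<T> implements Serializer<T> {
    private final Function<T, byte[]> encodeFn;

    public EncoderBackedSerializer(Function<T, byte[]> encodeFn) {
        this.encodeFn = encodeFn;
    }

    @Override
    public byte[] serialize(String topic, T data) {
        // Kafka convention: null payloads pass through as null (tombstones).
        return data == null ? null : encodeFn.apply(data);
    }
}
```

The adapter would then be referenced via the `value.serializer` producer config, with the encoder injected at construction time.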
I'll take a look at that repo and get back to you.
@conniey Do you have any updates on this, please?
@conniey Any updates on this, please? Is the fix integrated with azure-schema-registry-for-kafka?
Describe the bug
I am working with the Event Hubs Schema Registry with forward compatibility. When I have the same Avro schema (say V1) on both the producer and consumer applications, everything works fine. But whenever I evolve the schema by adding new field(s) to the producer schema (say V2) while the consumer still uses the old schema (V1), I get an exception in the consumer application because the newly added field is not found in the consumer's Avro classes. I believe this is because the DatumReader is created from the writerSchema, not from the readerSchema.
Note: the Avro classes are of SpecificRecord type, and SpecificRecord relies on the order/position of fields.
Exception or Stack Trace
To Reproduce
Steps:
1. Created a schema group with forward compatibility in the portal.
2. Defined schema v1 in both the producer and consumer applications and generated classes using avro-maven-plugin.
3. Produced a message; the schema was auto-registered in the Schema Registry.
4. The consumer received the message successfully.
5. Added a new field called middleName to the schema (v2) and regenerated classes in the producer application only.
6. Made no changes in the consumer application, so the consumer still has schema v1 and its corresponding generated classes.
7. Produced a message with schema v2; schema v2 was auto-registered in the Schema Registry.
8. The consumer application now throws an error because the newly added field is not found in its corresponding Avro class. The Avro classes are of SpecificRecord type, and SpecificRecord relies on the order/position of fields.
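The steps above can be isolated with plain Apache Avro (no registry involved). This sketch uses GenericRecord rather than the reporter's generated SpecificRecord classes, and demonstrates the two-argument DatumReader constructor that resolves writer schema v2 against reader schema v1 so the unknown middleName field is simply skipped:

```java
import java.io.ByteArrayOutputStream;
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DatumReader;
import org.apache.avro.io.DatumWriter;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class ForwardCompatDemo {
    static GenericRecord roundTrip() throws Exception {
        // v1: the consumer's (reader) schema
        Schema v1 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Person\",\"fields\":["
            + "{\"name\":\"firstName\",\"type\":\"string\"}]}");
        // v2: the producer's (writer) schema, with the new middleName field
        Schema v2 = new Schema.Parser().parse(
            "{\"type\":\"record\",\"name\":\"Person\",\"fields\":["
            + "{\"name\":\"firstName\",\"type\":\"string\"},"
            + "{\"name\":\"middleName\",\"type\":\"string\"}]}");

        // Producer side: write a record with v2.
        GenericRecord rec = new GenericData.Record(v2);
        rec.put("firstName", "Ada");
        rec.put("middleName", "K");
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder enc = EncoderFactory.get().binaryEncoder(out, null);
        DatumWriter<GenericRecord> writer = new GenericDatumWriter<>(v2);
        writer.write(rec, enc);
        enc.flush();

        // Consumer side: the reader is built with BOTH the writer schema and
        // the reader schema, so Avro's schema resolution drops middleName
        // instead of failing. The reported bug is the one-argument form,
        // new GenericDatumReader<>(v2), which ignores the consumer's schema.
        DatumReader<GenericRecord> reader = new GenericDatumReader<>(v2, v1);
        return reader.read(null,
            DecoderFactory.get().binaryDecoder(out.toByteArray(), null));
    }

    public static void main(String[] args) throws Exception {
        System.out.println(roundTrip().get("firstName"));
    }
}
```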
Code Snippet
https://github.com/Azure/azure-sdk-for-java/blob/master/sdk/schemaregistry/azure-data-schemaregistry-avro/src/main/java/com/azure/data/schemaregistry/avro/AvroSchemaRegistryUtils.java
Method:
In the above method, the DatumReader is created from the writerSchema only, so the consumer application is forced to decode against the producer's schema rather than resolving it against its own reader schema.
Expected behavior
Sample (not a full solution):
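The reporter's snippet was not captured in this export. As a minimal sketch of the suggested direction (the class and method names here are illustrative, not the SDK's actual code), the DatumReader would be constructed from both schemas:

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.DatumReader;

public class ReaderFix {
    // Instead of new GenericDatumReader<>(writerSchema) (writer schema only),
    // pass the consumer's reader schema too, so Avro resolves v2 data against
    // the v1 classes rather than failing on the unknown field. The same
    // two-argument constructor exists on SpecificDatumReader.
    static DatumReader<GenericRecord> createReader(Schema writerSchema,
                                                   Schema readerSchema) {
        return new GenericDatumReader<>(writerSchema, readerSchema);
    }
}
```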
Screenshots
If applicable, add screenshots to help explain your problem.
Setup (please complete the following information):
Additional context
Reference link I used:
https://github.com/Azure/azure-schema-registry-for-kafka/tree/master/java/avro/samples
Please refer to this link for the sample config and schemas I used.
Azure/azure-schema-registry-for-kafka#15
Information Checklist
Kindly make sure that you have added all of the above information and checked off the required fields; otherwise we will treat the issue as an incomplete report.