
KafkaPubSub avro decimal - only unscaled value is sent to underlying service #3721


Closed
GregerTomas opened this issue Mar 18, 2025 · 7 comments · May be fixed by #3712
Labels: kind/bug (Something isn't working), stale

Comments

@GregerTomas

What version of Dapr?

v1.14.4, v1.15.3

Expected Behavior

When using avro decimal property

{
 "type": "bytes",
 "logicalType": "decimal",
 "precision": 4,
 "scale": 2
}

Dapr should probably send a string representation of the decimal to the service.

Actual Behavior

Only the bytes of the unscaled value are sent to the service. Services must apply the precision/scale on their own.
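As reported above, subscribers currently receive only the unscaled two's-complement bytes and must apply the schema's scale themselves. A minimal Go sketch of what such a workaround looks like on the consumer side (`decodeDecimal` is a hypothetical helper, not part of Dapr):

```go
package main

import (
	"fmt"
	"math/big"
)

// decodeDecimal reconstructs a decimal value from Avro's unscaled
// big-endian two's-complement bytes plus the schema's scale.
func decodeDecimal(unscaled []byte, scale int) *big.Rat {
	i := new(big.Int).SetBytes(unscaled)
	// SetBytes treats the input as unsigned; correct for two's complement.
	if len(unscaled) > 0 && unscaled[0]&0x80 != 0 {
		i.Sub(i, new(big.Int).Lsh(big.NewInt(1), uint(len(unscaled)*8)))
	}
	// divide the unscaled integer by 10^scale
	denom := new(big.Int).Exp(big.NewInt(10), big.NewInt(int64(scale)), nil)
	return new(big.Rat).SetFrac(i, denom)
}

func main() {
	// 0x04D2 is the unscaled value 1234; with scale 2 this is 12.34
	fmt.Println(decodeDecimal([]byte{0x04, 0xD2}, 2).FloatString(2))
}
```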

Steps to Reproduce the Problem

Set up a Kafka PubSub topic subscriber that uses an Avro schema with any decimal property. Publish a message (using any tool). In the underlying service, inspect the incoming request (from Dapr).

@GregerTomas GregerTomas added the kind/bug Something isn't working label Mar 18, 2025
@passuied
Contributor

Overall, the Avro support in Dapr doesn't handle logicalType; only the type is considered. This is the same as for date/datetime: the type is int/long, the logicalType is not used in the conversion to JSON, and the values are sent as the corresponding JSON int/long.
So it's not a bug, but the way it works today.

@GregerTomas
Author

Thank you for the reply. Can I at least change it to a feature request? Do you agree?

@passuied
Contributor

passuied commented Mar 22, 2025

Well, it comes down to the Avro library implementation. We use goavro. The more prevalent hamba/avro is not a good option because it doesn't support Avro/standard JSON conversion; it only accepts Go native types. To make it all work with Dapr, we have to go through JSON as the lowest common denominator.

In the current implementation we convert from standard JSON to Avro, as opposed to Avro JSON, which has higher type accuracy.
Maybe we could introduce an option to use Avro JSON, which would most likely offer higher accuracy of types (to be tested).

I could create a branch for you to test locally

@passuied
Contributor

Actually this is an issue for the component-contrib repo, not the dapr one. Please close this one and create one on the appropriate repo.

@GregerTomas
Author

Thank you for moving the issue to the proper repository. I haven't had time recently and now I see it is already done.


github-actions bot commented May 1, 2025

This issue has been automatically marked as stale because it has not had activity in the last 30 days. It will be closed in the next 7 days unless it is tagged (pinned, good first issue, help wanted or triaged/resolved) or other activity occurs. Thank you for your contributions.

@github-actions github-actions bot added the stale label May 1, 2025

github-actions bot commented May 8, 2025

This issue has been automatically closed because it has not had activity in the last 37 days. If this issue is still valid, please ping a maintainer and ask them to label it as pinned, good first issue, help wanted or triaged/resolved. Thank you for your contributions.

@github-actions github-actions bot closed this as not planned (Won't fix, can't repro, duplicate, stale) May 8, 2025