
[BUG] EvalMode.TRY is not handled properly for Spark 3.4.0+, at least for some arithmetic operators #11884

Open
revans2 opened this issue Dec 17, 2024 · 0 comments
Labels
? - Needs Triage, bug

Comments

revans2 (Collaborator) commented Dec 17, 2024

Describe the bug
This is a result of us looking more closely at try_multiply.

try_multiply was added in Spark 3.3.0 (https://issues.apache.org/jira/browse/SPARK-38164), but was then changed in Spark 3.4.0 (https://issues.apache.org/jira/browse/SPARK-40222) so that the numeric try_* functions no longer catch exceptions thrown by their child expressions.
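
For illustration only, a minimal spark-shell sketch (not from the issue) of what that change means, assuming a session bound to `spark` and ANSI mode enabled so the child CAST can fail:

```scala
// Sketch only: illustrates the SPARK-40222 change, assuming ANSI mode is on.
spark.conf.set("spark.sql.ansi.enabled", "true")

// The child CAST throws on the invalid input when ANSI is enabled.
// Spark 3.3.x: try_multiply was planned as TryEval(Multiply(...)), so the cast
//              error was swallowed as well and the query returned NULL.
// Spark 3.4.0+: only the multiply itself runs with EvalMode.TRY, so the error
//               from the child cast propagates and the query fails.
spark.sql("SELECT try_multiply(2, CAST('not a number' AS INT))").show()
```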

Most multiply operations do not throw exceptions, but some can (for example, decimal multiplies in certain overflow corner cases). The plugin currently treats ANSI and TRY as if they were the same and fails on error. That is not correct, and we will do the wrong thing for try_multiply. We need to look at all of the other try_* operations that are not translated using TryEval and rethink how they are handled. At a minimum we need to fall back to the CPU unless we have tests that verify we do the right thing.
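
As a rough sketch of why treating TRY like ANSI is wrong (assuming a Spark 3.4.0+ spark-shell session; the exact error text varies by version), a decimal multiply that overflows should produce NULL under TRY but fail the query under ANSI:

```scala
// Sketch only: decimal overflow inside the multiply itself.
val big = "CAST('99999999999999999999' AS DECIMAL(38,0))"  // ~1e20; squaring it overflows DECIMAL(38,0)

// EvalMode.TRY: the overflow in the multiply is converted to NULL.
spark.sql(s"SELECT try_multiply($big, $big)").show()   // row contains NULL

// EvalMode.ANSI: the same overflow fails the query with an ArithmeticException.
spark.conf.set("spark.sql.ansi.enabled", "true")
spark.sql(s"SELECT $big * $big").show()                 // throws
```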

revans2 added the ? - Needs Triage and bug labels on Dec 17, 2024