[Bug] Index errors when using split_part #469
Labels
pkg:dbt-spark
Issue affects dbt-spark
triage:product
In Product's queue
type:bug
Something isn't working as documented
Is this a new bug in dbt-spark?
Current Behavior
When called with a part index that is out of bounds and ANSI mode enabled, the split_part macro raises an exception instead of returning an empty value.
Expected Behavior
Per the tests in BaseSplitPart in the adapter tests, the expectation is that this macro can be invoked with part indexes greater than the number of parts generated without throwing an exception; specifically, this row in the seed:
We can accommodate this behavior by using get rather than indexing the array, but get is only available in Spark 3.4.0 or later.
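To illustrate the semantics at issue, here is a minimal Python sketch (not the macro itself) of the behavior the adapter tests expect: a 1-based part index that is out of range should yield an empty result rather than an error, which is what Spark's get gives us (it returns NULL for out-of-bounds indexes, while direct array indexing raises under ANSI mode):

```python
def split_part(text: str, delimiter: str, part_number: int) -> str:
    """Emulate the expected split_part semantics: 1-based part index,
    returning an empty string instead of raising when the index is
    out of range. This mirrors Spark's get(), which returns NULL for
    out-of-bounds indexes, unlike array indexing under ANSI mode."""
    parts = text.split(delimiter)
    if 1 <= part_number <= len(parts):
        return parts[part_number - 1]
    # Out-of-bounds index: no exception, matching the BaseSplitPart expectation
    return ""

print(split_part("a|b|c", "|", 2))  # b
print(split_part("a|b|c", "|", 4))  # empty string, no IndexError
```

The function name and return value here are illustrative; the actual fix would change the generated Spark SQL to call get (or guard the index) rather than index the split array directly.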
Steps To Reproduce
Relevant log output
No response
Environment
This issue has been around for a while, but I'm only hitting it now due to new defaults in a Databricks environment I was asked to test against.
Additional Context
No response