
[BUG] DeltaTableStreamWriter batch_function does not work without merge statement(?) #176

Open
IMC07 opened this issue Mar 20, 2025 · 0 comments
IMC07 commented Mar 20, 2025

Describe the bug

See the screenshots: in the first I write the merge statement in Spark SQL inside the batch_function; in the second I do not, and instead pass the merge conditions via output_mode_params. The second variant does not appear to write any data to the target table, but it also does not raise an error.
I checked that all other variables are the same: the DeltaTableReader is exactly the same, and I reset the checkpoints (/volumes) and the my_target_table's before testing.
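For reference, a minimal plain-PySpark/Delta sketch of what the first (working) variant effectively does: the batch_function runs an explicit MERGE per micro-batch via foreachBatch. Table names, column names, the join key, and the checkpoint path below are hypothetical, and this bypasses koheesio's DeltaTableStreamWriter entirely; it only illustrates the merge logic I expect output_mode_params to reproduce.

```python
from delta.tables import DeltaTable

def merge_batch(micro_batch_df, batch_id):
    # Hypothetical target table and join key, for illustration only.
    # `spark` is the active SparkSession (available as a global in Databricks notebooks).
    target = DeltaTable.forName(spark, "catalog.schema.my_target_table1")
    (
        target.alias("t")
        .merge(micro_batch_df.alias("s"), "t.id = s.id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )

# `source_stream_df` stands in for the streaming DataFrame produced by the
# (identical) DeltaTableReader in both screenshots.
(
    source_stream_df.writeStream
    .foreachBatch(merge_batch)
    .option("checkpointLocation", "/Volumes/my_catalog/my_schema/checkpoints/my_target_table1")
    .trigger(availableNow=True)
    .start()
)
```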

Steps to Reproduce

See screenshots.

Expected behavior

I expect the same result in my_target_table2 as I have in my_target_table1.
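A quick way to compare the two targets after both streams finish (table names again hypothetical); in the failing case my_target_table2 stays empty while my_target_table1 contains the merged rows:

```python
t1 = spark.table("catalog.schema.my_target_table1")
t2 = spark.table("catalog.schema.my_target_table2")
print(t1.count(), t2.count())    # t2 reports 0 rows in the failing case
print(t1.exceptAll(t2).count())  # rows present in t1 but missing from t2
```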

Screenshots

[Screenshot 1: batch_function containing an explicit MERGE statement written in Spark SQL]
[Screenshot 2: merge conditions passed via output_mode_params instead; no data is written]

Environment

Databricks 15.4 LTS (includes Apache Spark 3.5.0, Scala 2.12)

Additional context

Applicable Stack Trace

@IMC07 IMC07 added the bug Something isn't working label Mar 20, 2025
@dannymeijer dannymeijer added this to the 0.11 milestone Mar 24, 2025