[BUG] MERGE operation fails due to AnalysisException when target table contains CHAR(N) columns #3868

Open
clee704 opened this issue Nov 12, 2024 · 0 comments · May be fixed by #3869
Labels: bug


Bug

Which Delta project/connector is this regarding?

  • Spark
  • Standalone
  • Flink
  • Kernel
  • Other (fill in here)

Describe the problem

Steps to reproduce

create table target (id char(10), value char(10)) using delta;
create table source (id char(10), value char(10)) using parquet;
merge into target
using source on target.id = source.id
when matched then update set target.value = source.value;

Observed results

org.apache.spark.sql.delta.DeltaAnalysisException: [DELTA_MERGE_RESOLVED_ATTRIBUTE_MISSING_FROM_INPUT] Resolved attribute(s) id#878,id#880,value#887,id#892,value#893 missing from id#872,value#873,id#874,value#875 in operator !DeltaMergeInto (id#878 = id#880), [Update [actions: [`value` = value#887]]], [Insert [actions: [`id` = id#892, `value` = value#893]]], false, StructType(StructField(id,StringType,true),StructField(value,StringType,true)); line 2 pos 0
at org.apache.spark.sql.delta.ResolveDeltaMergeInto$.resolveReferencesAndSchema(ResolveDeltaMergeInto.scala:335)
at org.apache.spark.sql.delta.DeltaAnalysis$$anonfun$apply$1.applyOrElse(DeltaAnalysis.scala:547)
at org.apache.spark.sql.delta.DeltaAnalysis$$anonfun$apply$1.applyOrElse(DeltaAnalysis.scala:81)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper.$anonfun$resolveOperatorsDownWithPruning$2(AnalysisHelper.scala:170)
...

Expected results

MERGE should be performed.
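
A possible workaround (not verified): since the failure seems specific to CHAR(N) columns in the Delta target, declaring the target columns as plain STRING may sidestep it while keeping the same MERGE statement. The tables below are the same hypothetical target/source tables as in the repro.

-- same tables and MERGE as above, but the Delta target uses STRING instead of CHAR(10)
create table target (id string, value string) using delta;
create table source (id char(10), value char(10)) using parquet;
merge into target
using source on target.id = source.id
when matched then update set target.value = source.value;

This only avoids the repro above; a proper fix appears to be in progress in #3869.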

Environment information

  • Delta Lake version: latest (860438f)
  • Spark version: 3.5.3
  • Scala version: 2.12.18

Willingness to contribute

The Delta Lake Community encourages bug fix contributions. Would you or another member of your organization be willing to contribute a fix for this bug to the Delta Lake code base?

  • Yes. I can contribute a fix for this bug independently.
  • Yes. I would be willing to contribute a fix for this bug with guidance from the Delta Lake community.
  • No. I cannot contribute a bug fix at this time.