
[BUG] Iceberg related jobs failed due to Spark version incompatibility #189

Closed
wjxiz1992 opened this issue Jun 14, 2024 · 0 comments · Fixed by #190

wjxiz1992 commented Jun 14, 2024

Describe the bug

The following error is thrown when running on Spark 3.2+ with our Iceberg-related templates.

java.lang.IncompatibleClassChangeError: class org.apache.spark.sql.catalyst.plans.logical.DynamicFileFilterWithCardinalityCheck has interface org.apache.spark.sql.catalyst.plans.logical.BinaryNode as super class
  at java.lang.ClassLoader.defineClass1(Native Method)
  at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
  at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
  at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
  at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
  at java.security.AccessController.doPrivileged(Native Method)
  at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
  at org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions.$anonfun$apply$8(IcebergSparkSessionExtensions.scala:50)
  at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildOptimizerRules$1(SparkSessionExtensions.scala:201)
  at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
  at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
  at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
  at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
  at scala.collection.TraversableLike.map(TraversableLike.scala:286)
  at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
  at scala.collection.AbstractTraversable.map(Traversable.scala:108)
  at org.apache.spark.sql.SparkSessionExtensions.buildOptimizerRules(SparkSessionExtensions.scala:201)

The reason is that we have dropped support for Spark <= 3.1.x, so these templates no longer run on Spark 3.1.x, yet they still pull in the Iceberg runtime built for Spark 3.1. We therefore need to update the Iceberg package to the Spark 3.2 runtime.

Steps/Code to reproduce bug
Launch spark-shell on Spark 3.2 or later, using the Iceberg runtime built for Spark 3.1:

./spark-shell --packages org.apache.iceberg:iceberg-spark-runtime-3.1_2.12:0.13.2 --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

spark.sql("drop table if exists abc")

Expected behavior
No error thrown.

Environment details (please complete the following information)
See the commands above.

Solution
Change the Iceberg package to the Spark 3.2 runtime.
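As a minimal sketch of the fix, the launch command below swaps in the Iceberg runtime built against Spark 3.2. The exact coordinate (org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.2) is an assumption based on the 0.13.2 version used in the reproduction above; the actual template change may pin a different Iceberg version.

./spark-shell --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.2 --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions

spark.sql("drop table if exists abc")  // should now complete without the IncompatibleClassChangeError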

@wjxiz1992 added the "bug" and "? - Needs Triage" labels on Jun 14, 2024
@wjxiz1992 self-assigned this on Jun 14, 2024
@sameerz removed the "? - Needs Triage" label on Jun 29, 2024