Describe the bug
The following error is thrown when running with Spark 3.2+ using our Iceberg-related templates.
java.lang.IncompatibleClassChangeError: class org.apache.spark.sql.catalyst.plans.logical.DynamicFileFilterWithCardinalityCheck has interface org.apache.spark.sql.catalyst.plans.logical.BinaryNode as super class
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions.$anonfun$apply$8(IcebergSparkSessionExtensions.scala:50)
at org.apache.spark.sql.SparkSessionExtensions.$anonfun$buildOptimizerRules$1(SparkSessionExtensions.scala:201)
at scala.collection.TraversableLike.$anonfun$map$1(TraversableLike.scala:286)
at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62)
at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49)
at scala.collection.TraversableLike.map(TraversableLike.scala:286)
at scala.collection.TraversableLike.map$(TraversableLike.scala:279)
at scala.collection.AbstractTraversable.map(Traversable.scala:108)
at org.apache.spark.sql.SparkSessionExtensions.buildOptimizerRules(SparkSessionExtensions.scala:201)
The reason is that we dropped support for Spark <= 3.1.x, so these templates no longer run on Spark 3.1.x, yet they still pull in the Iceberg runtime built for Spark 3.1. We therefore need to update the Iceberg package to the Spark 3.2 runtime.
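For reference, Iceberg publishes a separate Spark runtime artifact per Spark minor version, and the runtime must match the Spark version on the classpath. The 0.13.2 coordinates below mirror the release used in the reproduce command and are listed as an illustration, not a pinned requirement:
org.apache.iceberg:iceberg-spark-runtime-3.1_2.12:0.13.2   (built against Spark 3.1; triggers the error above on Spark 3.2)
org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.2   (built against Spark 3.2)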
Steps/Code to reproduce bug
Launch spark-shell with Spark 3.2.x or later:
./spark-shell --packages org.apache.iceberg:iceberg-spark-runtime-3.1_2.12:0.13.2 --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql("drop table if exists abc")
Expected behavior
No error thrown.
Environment details (please complete the following information)
See commands above.
Solution
Change the Iceberg package to the Spark 3.2 runtime so it matches the Spark version in use.
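A minimal sketch of the corrected launch command, assuming iceberg-spark-runtime-3.2_2.12:0.13.2 is the intended replacement artifact (the extension class stays the same):
./spark-shell --packages org.apache.iceberg:iceberg-spark-runtime-3.2_2.12:0.13.2 --conf spark.sql.extensions=org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions
spark.sql("drop table if exists abc")   // should now complete without the IncompatibleClassChangeError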