This repository was archived by the owner on Oct 12, 2023. It is now read-only.

Spark 3 support #80

Closed · tkasu opened this issue Jun 22, 2020 · 2 comments

tkasu commented Jun 22, 2020

Now that Apache Spark 3.0.0 has been released, what is the timetable for supporting Spark 3 and Scala 2.12?

lotsahelp commented Jun 26, 2020

I'm getting the following exception when running on Databricks 7.0:
```
ERROR Uncaught throwable from user code: java.lang.NoClassDefFoundError: scala/Product$class
  at com.microsoft.azure.sqldb.spark.config.SqlDBConfigBuilder.<init>(SqlDBConfigBuilder.scala:31)
  at com.microsoft.azure.sqldb.spark.config.Config$.apply(Config.scala:254)
  at com.microsoft.azure.sqldb.spark.config.Config$.apply(Config.scala:235)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command-802367102285758:15)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw$$iw$$iw$$iw$$iw.<init>(command-802367102285758:67)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw$$iw$$iw$$iw.<init>(command-802367102285758:69)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw$$iw$$iw.<init>(command-802367102285758:71)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw$$iw.<init>(command-802367102285758:73)
  at line0419139fc8114231985e78f2bf75c46d25.$read$$iw.<init>(command-802367102285758:75)
  at line0419139fc8114231985e78f2bf75c46d25.$read.<init>(command-802367102285758:77)
  at line0419139fc8114231985e78f2bf75c46d25.$read$.<init>(command-802367102285758:81)
  at line0419139fc8114231985e78f2bf75c46d25.$read$.<clinit>(command-802367102285758)
  at line0419139fc8114231985e78f2bf75c46d25.$eval$.$print$lzycompute(<notebook>:7)
  at line0419139fc8114231985e78f2bf75c46d25.$eval$.$print(<notebook>:6)
  at line0419139fc8114231985e78f2bf75c46d25.$eval.$print(<notebook>)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
  at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:498)
  at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:745)
  at scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1021)
  at scala.tools.nsc.interpreter.IMain.$anonfun$interpret$1(IMain.scala:574)
  at scala.reflect.internal.util.ScalaClassLoader.asContext(ScalaClassLoader.scala:41)
  at scala.reflect.internal.util.ScalaClassLoader.asContext$(ScalaClassLoader.scala:37)
  at scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:41)
  at scala.tools.nsc.interpreter.IMain.loadAndRunReq$1(IMain.scala:573)
  at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:600)
  at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:570)
  at com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)
  at com.databricks.backend.daemon.driver.ScalaDriverLocal.$anonfun$repl$1(ScalaDriverLocal.scala:202)
  at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23)
  at com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:714)
  at com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:667)
  at com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:202)
  at com.databricks.backend.daemon.driver.DriverLocal.$anonfun$execute$10(DriverLocal.scala:396)
  at com.databricks.logging.UsageLogging.$anonfun$withAttributionContext$1(UsageLogging.scala:238)
  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
  at com.databricks.logging.UsageLogging.withAttributionContext(UsageLogging.scala:233)
  at com.databricks.logging.UsageLogging.withAttributionContext$(UsageLogging.scala:230)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:49)
  at com.databricks.logging.UsageLogging.withAttributionTags(UsageLogging.scala:275)
  at com.databricks.logging.UsageLogging.withAttributionTags$(UsageLogging.scala:268)
  at com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:49)
  at com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:373)
  at com.databricks.backend.daemon.driver.DriverWrapper.$anonfun$tryExecutingCommand$1(DriverWrapper.scala:653)
  at scala.util.Try$.apply(Try.scala:213)
  at com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:645)
  at com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:486)
  at com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:598)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:391)
  at com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)
  at com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)
  at java.lang.Thread.run(Thread.java:748)
```
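For context, the failing call is just the standard read path from this connector's README; below is a minimal sketch, assuming a Databricks notebook `spark` session, with server, database, table, and credentials as placeholders. `scala/Product$class` only exists in Scala 2.11 bytecode, so this error is the usual symptom of a Scala 2.11 build running on Databricks 7.0, which is Spark 3.0 on Scala 2.12.

```scala
import com.microsoft.azure.sqldb.spark.config.Config
import com.microsoft.azure.sqldb.spark.connect._

// Placeholder connection values; Config.apply is the top frame in the
// trace above and fails before any connection is even attempted.
val config = Config(Map(
  "url"          -> "myserver.database.windows.net",
  "databaseName" -> "MyDatabase",
  "dbTable"      -> "dbo.MyTable",
  "user"         -> "username",
  "password"     -> "*********"
))

val df = spark.read.sqlDB(config)
df.show()
```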

arvindshmicrosoft (Contributor) commented

Thank you for your questions and ideas. There are no plans to support Spark 3.0.0 in this connector. Consider evaluating the Apache Spark Connector for SQL Server and Azure SQL, which is its newer replacement; the request for Spark 3.0.0 support is already being tracked in that repository.

I am closing this issue as there are no plans to address this request in this connector.
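For anyone migrating, here is a minimal sketch of the equivalent read using the newer connector, assuming its `com.microsoft.sqlserver.jdbc.spark` data source format is available on the cluster (connection values are placeholders):

```scala
// Placeholder connection values; requires the spark-mssql-connector
// package (the newer Apache Spark Connector for SQL Server and Azure SQL)
// to be installed on the cluster.
val df = spark.read
  .format("com.microsoft.sqlserver.jdbc.spark")
  .option("url", "jdbc:sqlserver://myserver.database.windows.net;databaseName=MyDatabase")
  .option("dbtable", "dbo.MyTable")
  .option("user", "username")
  .option("password", "*********")
  .load()

df.show()
```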
