+ exec /usr/bin/tini -s -- /stackable/spark/bin/spark-submit --conf spark.driver.bindAddress=192.168.145.89 --conf spark.executorEnv.SPARK_DRIVER_POD_IP=192.168.145.89 --deploy-mode client --properties-file /opt/spark/conf/spark.properties --class org.gbif.occurrence.download.spark.GbifOccurrenceDownloads local:///stackable/spark/jobs/occurrence-download.jar 0000068-250225202704447 Occurrence /stackable/spark/jobs/download.properties QUERY
OpenJDK 64-Bit Server VM warning: Option UseConcMarkSweepGC was deprecated in version 9.0 and will likely be removed in a future release.
:: loading settings :: url = jar:file:/stackable/spark-3.5.1-bin-hadoop3/jars/ivy-2.5.1.jar!/org/apache/ivy/core/settings/ivysettings.xml
Ivy Default Cache set to: /tmp/cache
The jars for the packages stored in: /tmp/jars
org.apache.iceberg#iceberg-spark-runtime-3.5_2.12 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-2beb2f6c-4839-4cc6-9557-815b674ea671;1.0
confs: [default]
You probably access the destination server through a proxy server that is not well configured.
:: resolution report :: resolve 35824ms :: artifacts dl 0ms
:: modules in use:
---------------------------------------------------------------------
| | modules || artifacts |
| conf | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
| default | 1 | 0 | 0 | 0 || 0 | 0 |
---------------------------------------------------------------------
:: problems summary ::
:::: WARNINGS
Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.pom
Host repo1.maven.org not found. url=https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.jar
Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.pom
Host repos.spark-packages.org not found. url=https://repos.spark-packages.org/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.jar
module not found: org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0
==== local-m2-cache: tried
file:/stackable/.m2/repository/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.pom
-- artifact org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0!iceberg-spark-runtime-3.5_2.12.jar:
file:/stackable/.m2/repository/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.jar
==== local-ivy-cache: tried
/tmp/local/org.apache.iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/ivys/ivy.xml
-- artifact org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0!iceberg-spark-runtime-3.5_2.12.jar:
/tmp/local/org.apache.iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/jars/iceberg-spark-runtime-3.5_2.12.jar
==== central: tried
https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.pom
-- artifact org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0!iceberg-spark-runtime-3.5_2.12.jar:
https://repo1.maven.org/maven2/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.jar
==== spark-packages: tried
https://repos.spark-packages.org/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.pom
-- artifact org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0!iceberg-spark-runtime-3.5_2.12.jar:
https://repos.spark-packages.org/org/apache/iceberg/iceberg-spark-runtime-3.5_2.12/1.6.0/iceberg-spark-runtime-3.5_2.12-1.6.0.jar
::::::::::::::::::::::::::::::::::::::::::::::
:: UNRESOLVED DEPENDENCIES ::
::::::::::::::::::::::::::::::::::::::::::::::
:: org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0: not found
::::::::::::::::::::::::::::::::::::::::::::::
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.iceberg#iceberg-spark-runtime-3.5_2.12;1.6.0: not found]
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1608)
at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:185)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:334)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:964)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:194)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:217)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:91)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1120)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1129)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
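The Ivy warnings above ("Host repo1.maven.org not found", together with the proxy hint) show that the driver pod could not reach Maven Central or spark-packages.org, so the Iceberg coordinate (presumably configured via spark.jars.packages in /opt/spark/conf/spark.properties or the SparkApplication deps) could not be resolved and spark-submit aborted before the application started. Two common ways to avoid this failure are to ship the jar with the image so no download is needed at submit time, or to point resolution at a reachable internal mirror. A minimal sketch, not taken from this report, with a hypothetical jar path and mirror URL:

# Option 1: skip remote resolution entirely by shipping the Iceberg runtime jar
# in the image or on a mounted volume (jar path is hypothetical)
spark-submit \
  --deploy-mode client \
  --jars local:///stackable/spark/jobs/iceberg-spark-runtime-3.5_2.12-1.6.0.jar \
  --class org.gbif.occurrence.download.spark.GbifOccurrenceDownloads \
  local:///stackable/spark/jobs/occurrence-download.jar

# Option 2: keep package resolution, but resolve from an internal mirror the pod can reach
# (repository URL is hypothetical)
spark-submit \
  --packages org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.6.0 \
  --repositories https://nexus.example.org/repository/maven-public \
  --class org.gbif.occurrence.download.spark.GbifOccurrenceDownloads \
  local:///stackable/spark/jobs/occurrence-download.jar

Either approach removes the dependency on repo1.maven.org at submit time; which one fits best depends on how the Stackable SparkApplication image is built.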
[2025-02-25, 21:14:12 UTC] {extended_stackable_spark_sensor.py:196} INFO - Cleaning completed pods after dwnld-query-0000068-250225202704447
[2025-02-25, 21:14:12 UTC] {rest.py:231} DEBUG - response body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Success","details":{"name":"dwnld-query-0000068-250225202704447","group":"spark.stackable.tech","kind":"sparkapplications","uid":"9e720de3-a108-4b52-a226-9a33ebcfa905"}}
[2025-02-25, 21:14:12 UTC] {extended_stackable_spark_sensor.py:204} INFO - Finish cleaning
[2025-02-25, 21:14:13 UTC] {taskinstance.py:2698} ERROR - Task failed with exception
Traceback (most recent call last):
File "/stackable/app/lib64/python3.9/site-packages/airflow/models/taskinstance.py", line 433, in _execute_task
result = execute_callable(context=context, **execute_callable_kwargs)
File "/stackable/app/lib64/python3.9/site-packages/airflow/sensors/base.py", line 265, in execute
raise e
File "/stackable/app/lib64/python3.9/site-packages/airflow/sensors/base.py", line 247, in execute
poke_return = self.poke(context)
File "/stackable/app/git/current/dags/gbif_modules/sensors/extended_stackable_spark_sensor.py", line 149, in poke
raise AirflowException(f"SparkApplication failed with state: {application_state}")
Download 0000068-250225202704447
Airflow task log: https://airflow.gbif.org/log?execution_date=2025-02-25T21%3A11%3A48.141995%2B00%3A00&task_id=download_query_monitor&dag_id=gbif_occurrence_download_dag&map_index=-1
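Since repo1.maven.org and repos.spark-packages.org are publicly reachable, the "Host ... not found" messages most likely indicate a DNS or egress/proxy problem inside the cluster rather than an outage of the repositories themselves. A quick check from the namespace where the driver ran (pod name and image are illustrative, not from this report):

kubectl run net-check --rm -it --restart=Never --image=curlimages/curl -- \
  curl -sI https://repo1.maven.org/maven2/

If this also fails, the fix belongs in the cluster's DNS, egress, or proxy configuration rather than in the download DAG itself.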