[Addon kubevela#603] Add Apache Spark as an experimental addon
Signed-off-by: yanghua <[email protected]>
yanghua committed Feb 27, 2023
1 parent 91dcf66 commit 7ba8b76
Showing 1 changed file with 29 additions and 18 deletions.
47 changes: 29 additions & 18 deletions experimental/addons/spark-kubernetes-operator/README.md
@@ -32,25 +32,36 @@ vela ls -A | grep spark
* Second, show the component type `spark-cluster` so we know how to use it in an application. As a Spark user, you can choose which parameters to set for your Spark cluster.

```
-vela show spark-cluster
+vela show spark-application
-# Specification
+# Specification
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
-| NAME                | DESCRIPTION                                                                   | TYPE   | REQUIRED | DEFAULT |
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
-| name                | Specify the spark application name.                                           | string | true     |         |
-| namespace           | Specify the namespace for spark application to install.                       | string | true     |         |
-| type                | Specify the application language type, e.g. "Scala", "Python", "Java" or "R". | string | true     |         |
-| pythonVersion       | Specify the python version.                                                   | string | true     |         |
-| mode                | Specify the deploy mode, e.go "cluster", "client" or "in-cluster-client".     | string | true     |         |
-| image               | Specify the container image for the driver, executor, and init-container.     | string | true     |         |
-| imagePullPolicy     | Specify the image pull policy for the driver, executor, and init-container.   | string | true     |         |
-| mainClass           | Specify the fully-qualified main class of the Spark application.              | string | true     |         |
-| mainApplicationFile | Specify the path to a bundled JAR, Python, or R file of the application.      | string | true     |         |
-| sparkVersion        | Specify the version of Spark the application uses.                            | string | true     |         |
-| driverCores         | Specify the number of CPU cores to request for the driver pod.                | int    | true     |         |
-| executorCores       | Specify the number of CPU cores to request for the executor pod.              | int    | true     |         |
-+---------------------+-------------------------------------------------------------------------------+--------+----------+---------+
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
+| NAME                | DESCRIPTION                                                                                          | TYPE              | REQUIRED | DEFAULT |
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
+| name                | Specify the Spark application name.                                                                  | string            | true     |         |
+| namespace           | Specify the namespace in which to install the Spark application.                                     | string            | true     |         |
+| type                | Specify the application language type, e.g. "Scala", "Python", "Java" or "R".                        | string            | true     |         |
+| pythonVersion       | Specify the Python version.                                                                          | string            | false    |         |
+| mode                | Specify the deploy mode, e.g. "cluster", "client" or "in-cluster-client".                            | string            | true     |         |
+| image               | Specify the container image for the driver, executor, and init-container.                            | string            | true     |         |
+| imagePullPolicy     | Specify the image pull policy for the driver, executor, and init-container.                          | string            | true     |         |
+| mainClass           | Specify the fully-qualified main class of the Spark application.                                     | string            | true     |         |
+| mainApplicationFile | Specify the path to a bundled JAR, Python, or R file of the application.                             | string            | true     |         |
+| sparkVersion        | Specify the version of Spark the application uses.                                                   | string            | true     |         |
+| driverCores         | Specify the number of CPU cores to request for the driver pod.                                       | int               | true     |         |
+| executorCores       | Specify the number of CPU cores to request for the executor pod.                                     | int               | true     |         |
+| arguments           | Specify a list of arguments to be passed to the application.                                         | []string          | false    |         |
+| sparkConf           | Specify Spark configuration properties, as they would be passed with the "--conf" option in          | map[string]string | false    |         |
+|                     | spark-submit.                                                                                        |                   |          |         |
+| hadoopConf          | Specify Hadoop configuration properties, as they would be passed with the "--conf" option in         | map[string]string | false    |         |
+|                     | spark-submit. The SparkApplication controller automatically adds the prefix "spark.hadoop." to       |                   |          |         |
+|                     | Hadoop configuration properties.                                                                     |                   |          |         |
+| sparkConfigMap      | Specify the name of the ConfigMap containing Spark configuration files such as log4j.properties.     | string            | false    |         |
+|                     | The controller will add the environment variable SPARK_CONF_DIR to the path where the ConfigMap is   |                   |          |         |
+|                     | mounted.                                                                                             |                   |          |         |
+| hadoopConfigMap     | Specify the name of the ConfigMap containing Hadoop configuration files such as core-site.xml.       | string            | false    |         |
+|                     | The controller will add the environment variable HADOOP_CONF_DIR to the path where the ConfigMap is  |                   |          |         |
+|                     | mounted.                                                                                             |                   |          |         |
++---------------------+------------------------------------------------------------------------------------------------------+-------------------+----------+---------+
```
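
The parameters above map directly onto the `properties` of a `spark-application` component in a KubeVela `Application`. A minimal sketch, assuming the addon is enabled; the image, main class, and application file path are illustrative examples, not values taken from this addon's docs:

```yaml
# Hypothetical example: run the Spark Pi sample via the spark-application
# component type. Adjust image/class/file to your own Spark job.
apiVersion: core.oam.dev/v1beta1
kind: Application
metadata:
  name: spark-app-v1
  namespace: spark-cluster
spec:
  components:
    - name: my-spark-application-component
      type: spark-application
      properties:
        name: my-spark-app
        namespace: spark-cluster
        type: Scala
        mode: cluster
        image: "gcr.io/spark-operator/spark:v3.1.1"   # illustrative image
        imagePullPolicy: Always
        mainClass: org.apache.spark.examples.SparkPi
        mainApplicationFile: "local:///opt/spark/examples/jars/spark-examples_2.12-3.1.1.jar"
        sparkVersion: "3.1.1"
        driverCores: 1
        executorCores: 1
```

Deploy it with `vela up -f <file>` and check progress with `vela status spark-app-v1 -n spark-cluster`.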

