Commit d5ac803

fix docs warnings (apache#2320)
1 parent 633b33e commit d5ac803


11 files changed (+21, -22 lines)


docs/source/contributor-guide/development.md

Lines changed: 1 addition & 1 deletion
@@ -30,7 +30,7 @@ under the License.
 
 ## Development Setup
 
-1. Make sure `JAVA_HOME` is set and point to JDK using [support matrix](/docs/source/user-guide/latest/installation.md)
+1. Make sure `JAVA_HOME` is set and point to JDK using [support matrix](../user-guide/latest/installation.md)
 2. Install Rust toolchain. The easiest way is to use
    [rustup](https://rustup.rs).

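For readers landing on this change from the rendered docs, the setup step above boils down to something like the following sketch. The JDK path shown here is an assumption for illustration only; the actual version requirement comes from the support matrix linked in the diff.

```shell
# Sketch of the development setup referenced above (paths/versions are assumptions).
export JAVA_HOME=/path/to/jdk            # use a JDK version from the support matrix
export PATH="$JAVA_HOME/bin:$PATH"

# Install the Rust toolchain with rustup (https://rustup.rs)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```
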
docs/source/contributor-guide/plugin_overview.md

Lines changed: 1 addition & 1 deletion
@@ -147,4 +147,4 @@ accessing Arrow data structures from multiple languages.
 
 The following diagram shows an example of the end-to-end flow for a query stage.
 
-![Diagram of Comet Data Flow](../../_static/images/comet-dataflow.svg)
+![Diagram of Comet Data Flow](/_static/images/comet-dataflow.svg)

docs/source/contributor-guide/spark-sql-tests.md

Lines changed: 3 additions & 3 deletions
@@ -54,7 +54,7 @@ git apply ../datafusion-comet/dev/diffs/3.4.3.diff
 
 ## 3. Run Spark SQL Tests
 
-#### Use the following commands to run the Spark SQL test suite locally.
+### Use the following commands to run the Spark SQL test suite locally.
 
 ```shell
 ENABLE_COMET=true build/sbt catalyst/test
@@ -65,7 +65,7 @@ ENABLE_COMET=true build/sbt "hive/testOnly * -- -l org.apache.spark.tags.Extende
 ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.ExtendedHiveTest"
 ENABLE_COMET=true build/sbt "hive/testOnly * -- -n org.apache.spark.tags.SlowHiveTest"
 ```
-#### Steps to run individual test suites through SBT
+### Steps to run individual test suites through SBT
 1. Open SBT with Comet enabled
 ```shell
 ENABLE_COMET=true sbt -J-Xmx4096m -Dspark.test.includeSlowTests=true
@@ -74,7 +74,7 @@ ENABLE_COMET=true sbt -J-Xmx4096m -Dspark.test.includeSlowTests=true
 ```shell
 sql/testOnly org.apache.spark.sql.DynamicPartitionPruningV1SuiteAEOn -- -z "SPARK-35568"
 ```
-#### Steps to run individual test suites in IntelliJ IDE
+### Steps to run individual test suites in IntelliJ IDE
 1. Add below configuration in VM Options for your test case (apache-spark repository)
 ```shell
 -Dspark.comet.enabled=true -Dspark.comet.debug.enabled=true -Dspark.plugins=org.apache.spark.CometPlugin -DXmx4096m -Dspark.executor.heartbeatInterval=20000 -Dspark.network.timeout=10000 --add-exports=java.base/sun.nio.ch=ALL-UNNAMED --add-opens=java.base/java.nio=ALL-UNNAMED

docs/source/gluten_comparison.md

Lines changed: 1 addition & 1 deletion
@@ -64,7 +64,7 @@ Comet relies on the full Spark SQL test suite (consisting of more than 24,000 te
 integration tests to ensure compatibility with Spark. Features that are known to have compatibility differences with
 Spark are disabled by default, but users can opt in. See the [Comet Compatibility Guide] for more information.
 
-[Comet Compatibility Guide]: compatibility.md
+[Comet Compatibility Guide]: user-guide/latest/compatibility.md
 
 Gluten also aims to provide compatibility with Spark, and includes a subset of the Spark SQL tests in its own test
 suite. See the [Gluten Compatibility Guide] for more information.

docs/source/index.rst

Lines changed: 1 addition & 2 deletions
@@ -43,7 +43,7 @@ as a native runtime to achieve improvement in terms of query efficiency and quer
    Comet Overview <overview>
    Comparison with Gluten <gluten_comparison>
 
-.. _toc.user-guide-links:
+.. _toc.user-guide-links-versioned:
 .. toctree::
    :maxdepth: 2
    :caption: User Guides:
@@ -52,7 +52,6 @@ as a native runtime to achieve improvement in terms of query efficiency and quer
    Comet 0.10.0-SNAPSHOT <user-guide/latest/index>
    Comet 0.9.x <user-guide/0.9/index>
    Comet 0.8.x <user-guide/0.8/index>
-   Comet 0.7.x <user-guide/0.7/index>
 
 .. _toc.contributor-guide-links:
 .. toctree::

docs/source/overview.md

Lines changed: 1 addition & 1 deletion
@@ -62,4 +62,4 @@ hardware.
 
 Refer to the [Comet Installation Guide] to get started.
 
-[Comet Installation Guide]: installation.md
+[Comet Installation Guide]: user-guide/latest/installation.md

docs/source/user-guide/0.8/index.rst

Lines changed: 2 additions & 2 deletions
@@ -15,14 +15,14 @@
 .. specific language governing permissions and limitations
 .. under the License.
 
-.. image:: _static/images/DataFusionComet-Logo-Light.png
+.. image:: /_static/images/DataFusionComet-Logo-Light.png
    :alt: DataFusion Comet Logo
 
 =======================
 Comet 0.8.x User Guide
 =======================
 
-.. _toc.user-guide-links:
+.. _toc.user-guide-links-08:
 .. toctree::
    :maxdepth: 1
    :caption: Comet 0.8.x User Guide

docs/source/user-guide/0.9/index.rst

Lines changed: 2 additions & 2 deletions
@@ -15,14 +15,14 @@
 .. specific language governing permissions and limitations
 .. under the License.
 
-.. image:: _static/images/DataFusionComet-Logo-Light.png
+.. image:: /_static/images/DataFusionComet-Logo-Light.png
    :alt: DataFusion Comet Logo
 
 =======================
 Comet 0.9.x User Guide
 =======================
 
-.. _toc.user-guide-links:
+.. _toc.user-guide-links-09:
 .. toctree::
    :maxdepth: 1
    :caption: Comet 0.9.x User Guide

docs/source/user-guide/latest/datasources.md

Lines changed: 4 additions & 4 deletions
@@ -122,24 +122,24 @@ Input [3]: [id#0, first_name#1, personal_info#4]
 
 Verify the native scan type should be `CometNativeScan`.
 
-More on [HDFS Reader](../../../native/hdfs/README.md)
+More on [HDFS Reader](https://github.com/apache/datafusion-comet/blob/main/native/hdfs/README.md)
 
 ### Local HDFS development
 
 - Configure local machine network. Add hostname to `/etc/hosts`
-  ```commandline
+  ```shell
   127.0.0.1 localhost namenode datanode1 datanode2 datanode3
   ::1 localhost namenode datanode1 datanode2 datanode3
   ```
 
 - Start local HDFS cluster, 3 datanodes, namenode url is `namenode:9000`
-  ```commandline
+  ```shell
   docker compose -f kube/local/hdfs-docker-compose.yml up
   ```
 
 - Check the local namenode is up and running on `http://localhost:9870/dfshealth.html#tab-overview`
 - Build a project with HDFS support
-  ```commandline
+  ```shell
   JAVA_HOME="/opt/homebrew/opt/openjdk@11" make release PROFILES="-Pspark-3.5" COMET_FEATURES=hdfs RUSTFLAGS="-L /opt/homebrew/opt/openjdk@11/libexec/openjdk.jdk/Contents/Home/lib/server"
   ```

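As a follow-up to the local HDFS steps above, a smoke test could look like the sketch below. The Comet jar path and the Parquet path are placeholders, not values from this commit; the two `spark.*` settings are the ones that appear elsewhere in these docs.

```shell
# Hypothetical smoke test against the docker compose HDFS cluster described above.
# Replace the jar path and the hdfs:// path with real locations.
$SPARK_HOME/bin/spark-shell \
  --jars /path/to/comet-spark.jar \
  --conf spark.plugins=org.apache.spark.CometPlugin \
  --conf spark.comet.enabled=true

# Then, inside the shell:
#   spark.read.parquet("hdfs://namenode:9000/tmp/example.parquet").show()
```
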
docs/source/user-guide/latest/iceberg.md

Lines changed: 1 addition & 1 deletion
@@ -143,7 +143,7 @@ scala> spark.sql(s"SELECT * from t1").explain()
 ```
 
 ## Known issues
-- We temporarily disable Comet when there are delete files in Iceberg scan, see Iceberg [1.8.1 diff](../../../dev/diffs/iceberg/1.8.1.diff) and this [PR](https://github.com/apache/iceberg/pull/13793)
+- We temporarily disable Comet when there are delete files in Iceberg scan, see Iceberg [1.8.1 diff](https://github.com/apache/datafusion-comet/blob/main/dev/diffs/iceberg/1.8.1.diff) and this [PR](https://github.com/apache/iceberg/pull/13793)
 - Iceberg scan w/ delete files lead to [runtime exceptions](https://github.com/apache/datafusion-comet/issues/2117) and [incorrect results](https://github.com/apache/datafusion-comet/issues/2118)
 - Enabling `CometShuffleManager` leads to [runtime exceptions](https://github.com/apache/datafusion-comet/issues/2086)
 - Spark Runtime Filtering isn't [working](https://github.com/apache/datafusion-comet/issues/2116)

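Since the commit message is about fixing docs warnings (broken relative links and duplicated RST labels), a natural way to verify the result is to rebuild the Sphinx docs with warnings promoted to errors. The invocation below is only a sketch: the directory layout is inferred from the paths in this diff, and the package list is an assumption rather than the project's documented build command.

```shell
# Sketch only: rebuild the docs and fail on any remaining warning.
pip install sphinx myst-parser   # assumed prerequisites for a mixed Markdown/RST tree
sphinx-build -W -b html docs/source docs/_build/html
```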