
Commit be8bfed

[FLINK-37812][doc] Fix broken links to Apache Avro in doc and code
1 parent 16e5fbd commit be8bfed

File tree: 7 files changed (+25 lines, -24 lines)

docs/content.zh/docs/connectors/datastream/formats/parquet.md

Lines changed: 9 additions & 9 deletions

@@ -198,14 +198,14 @@ ds = env.from_source(source, WatermarkStrategy.no_watermarks(), "file-source")
 
 Flink supports three ways to read Parquet files and create Avro records (PyFlink only supports Generic record):
 
-- [Generic record](https://avro.apache.org/docs/1.10.0/api/java/index.html)
-- [Specific record](https://avro.apache.org/docs/1.10.0/api/java/index.html)
-- [Reflect record](https://avro.apache.org/docs/1.10.0/api/java/org/apache/avro/reflect/package-summary.html)
+- [Generic record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/generic/package-summary.html)
+- [Specific record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/specific/package-summary.html)
+- [Reflect record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/reflect/package-summary.html)
 
 ### Generic record
 
-Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/1.10.0/spec.html).
-This example uses an Avro schema similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html):
+Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/++version++/specification/).
+This example uses an Avro schema similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/++version++/getting-started-java/):
 
 ```json lines
 {"namespace": "example.avro",
@@ -219,11 +219,11 @@ Flink supports three ways to read Parquet files and create Avro records (PyFlink only supports Generic record):
 }
 ```
 This schema defines a user record with three fields: name, favoriteNumber, and favoriteColor. You can
-find more details in the [record specification](https://avro.apache.org/docs/1.10.0/spec.html#schema_record) on how to define an Avro schema.
+find more details in the [record specification](https://avro.apache.org/docs/++version++/specification/#schema-record) on how to define an Avro schema.
 
 In this example, you will create a DataStream containing Parquet records as Avro Generic records.
 Flink will parse the Avro schema from the JSON string. There are many other ways to parse a schema, e.g. from a java.io.File or a java.io.InputStream.
-Please refer to [Avro Schema](https://avro.apache.org/docs/1.10.0/api/java/org/apache/avro/Schema.html) for more details.
+Please refer to [Avro Schema](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/Schema.html) for more details.
 Then you can create an `AvroParquetRecordFormat` for Avro Generic records via `AvroParquetReaders`.
 
 {{< tabs "GenericRecord" >}}
@@ -286,7 +286,7 @@ stream = env.from_source(source, WatermarkStrategy.no_watermarks(), "file-source
 Based on the previously defined schema, you can generate classes by leveraging Avro code generation.
 Once the classes have been generated, there is no need to use the schema directly in your programs.
 You can either use `avro-tools.jar` to generate code manually, or use the Avro Maven plugin to perform code generation on any .avsc files present in the configured source directory.
-Please refer to [Avro Getting Started](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html) for more information.
+Please refer to [Avro Getting Started](https://avro.apache.org/docs/++version++/getting-started-java/) for more information.
 
 This example uses the sample schema {{< gh_link file="flink-formats/flink-parquet/src/test/resources/avro/testdata.avsc" name="testdata.avsc" >}}:
 
@@ -335,7 +335,7 @@ final DataStream<GenericRecord> stream =
 
 Beyond Avro Generic and Specific records, which require a predefined Avro schema, Flink also supports creating a DataStream from Parquet files based on existing Java POJO classes.
 In this case, Avro will use Java reflection to generate schemas and protocols for these POJO classes.
-Please refer to the [Avro reflect](https://avro.apache.org/docs/1.10.0/api/java/index.html) documentation for more details about the mapping from Java types to Avro schemas.
+Please refer to the [Avro reflect](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/reflect/package-summary.html) documentation for more details about the mapping from Java types to Avro schemas.
 
 This example uses a simple Java POJO class {{< gh_link file="flink-formats/flink-parquet/src/test/java/org/apache/flink/formats/parquet/avro/Datum.java" name="Datum" >}}:
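
The hunks above touch the documentation for reading Parquet files as Avro Generic, Specific, and Reflect records. As a companion to the Generic record part, here is a minimal Java sketch of the pattern that documentation describes: parsing an Avro schema from a JSON string and building a `FileSource` with `AvroParquetReaders.forGenericRecord`. The schema follows the `example.avro` User record from the docs; the input path is a placeholder.

```java
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.AvroParquetReaders;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class GenericRecordFromParquet {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Parse the Avro schema from a JSON string; a java.io.File or InputStream works as well.
        Schema schema = new Schema.Parser().parse(
                "{\"namespace\": \"example.avro\","
                        + " \"type\": \"record\","
                        + " \"name\": \"User\","
                        + " \"fields\": ["
                        + "   {\"name\": \"name\", \"type\": \"string\"},"
                        + "   {\"name\": \"favoriteNumber\", \"type\": [\"int\", \"null\"]},"
                        + "   {\"name\": \"favoriteColor\", \"type\": [\"string\", \"null\"]}"
                        + " ]}");

        // Read Parquet files as Avro GenericRecords through the AvroParquetRecordFormat.
        FileSource<GenericRecord> source =
                FileSource.forRecordStreamFormat(
                                AvroParquetReaders.forGenericRecord(schema),
                                new Path("/tmp/parquet-input")) // placeholder directory
                        .build();

        DataStream<GenericRecord> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "file-source");

        stream.print();
        env.execute("Parquet to Avro GenericRecord");
    }
}
```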

docs/content.zh/docs/connectors/table/formats/avro-confluent.md

Lines changed: 1 addition & 1 deletion

@@ -291,4 +291,4 @@ Format Options
 
 In addition to the types listed there, Flink also supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from the Flink type.
 
-You can refer to the [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to the [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.

docs/content.zh/docs/connectors/table/formats/avro.md

Lines changed: 1 addition & 1 deletion

@@ -204,4 +204,4 @@ Format Options
 
 In addition to the types listed above, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from the Flink type.
 
-You can refer to the [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to the [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.

docs/content/docs/connectors/datastream/formats/parquet.md

Lines changed: 9 additions & 9 deletions

@@ -196,14 +196,14 @@ ds = env.from_source(source, WatermarkStrategy.no_watermarks(), "file-source")
 
 Flink supports producing three types of Avro records by reading Parquet files (Only Generic record is supported in PyFlink):
 
-- [Generic record](https://avro.apache.org/docs/1.10.0/api/java/index.html)
-- [Specific record](https://avro.apache.org/docs/1.10.0/api/java/index.html)
-- [Reflect record](https://avro.apache.org/docs/1.10.0/api/java/org/apache/avro/reflect/package-summary.html)
+- [Generic record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/generic/package-summary.html)
+- [Specific record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/specific/package-summary.html)
+- [Reflect record](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/reflect/package-summary.html)
 
 ### Generic record
 
-Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/1.10.0/spec.html).
-This example uses an Avro schema example similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html):
+Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/++version++/specification/).
+This example uses an Avro schema example similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/++version++/getting-started-java/):
 
 ```json lines
 {"namespace": "example.avro",
@@ -217,10 +217,10 @@ This example uses an Avro schema example similar to the one described in the [of
 }
 ```
 
-This schema defines a record representing a user with three fields: name, favoriteNumber, and favoriteColor. You can find more details at [record specification](https://avro.apache.org/docs/1.10.0/spec.html#schema_record) for how to define an Avro schema.
+This schema defines a record representing a user with three fields: name, favoriteNumber, and favoriteColor. You can find more details at [record specification](https://avro.apache.org/docs/++version++/specification/#schema-record) for how to define an Avro schema.
 
 In the following example, you will create a DataStream containing Parquet records as Avro Generic records.
-It will parse the Avro schema based on the JSON string. There are many other ways to parse a schema, e.g. from java.io.File or java.io.InputStream. Please refer to [Avro Schema](https://avro.apache.org/docs/1.10.0/api/java/org/apache/avro/Schema.html) for details.
+It will parse the Avro schema based on the JSON string. There are many other ways to parse a schema, e.g. from java.io.File or java.io.InputStream. Please refer to [Avro Schema](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/Schema.html) for details.
 After that, you will create an `AvroParquetRecordFormat` via `AvroParquetReaders` for Avro Generic records.
 
 {{< tabs "GenericRecord" >}}
@@ -284,7 +284,7 @@ Based on the previously defined schema, you can generate classes by leveraging A
 Once the classes have been generated, there is no need to use the schema directly in your programs.
 You can either use `avro-tools.jar` to generate code manually or you could use the Avro Maven plugin to perform
 code generation on any .avsc files present in the configured source directory. Please refer to
-[Avro Getting Started](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html) for more information.
+[Avro Getting Started](https://avro.apache.org/docs/++version++/getting-started-java/) for more information.
 
 The following example uses the example schema {{< gh_link file="flink-formats/flink-parquet/src/test/resources/avro/testdata.avsc" name="testdata.avsc" >}}:
 
@@ -334,7 +334,7 @@ final DataStream<GenericRecord> stream =
 Beyond Avro Generic and Specific record that requires a predefined Avro schema,
 Flink also supports creating a DataStream from Parquet files based on existing Java POJO classes.
 In this case, Avro will use Java reflection to generate schemas and protocols for these POJO classes.
-Java types are mapped to Avro schemas, please refer to the [Avro reflect](https://avro.apache.org/docs/1.10.0/api/java/index.html) documentation for more details.
+Java types are mapped to Avro schemas, please refer to the [Avro reflect](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/reflect/package-summary.html) documentation for more details.
 
 This example uses a simple Java POJO class {{< gh_link file="flink-formats/flink-parquet/src/test/java/org/apache/flink/formats/parquet/avro/Datum.java" name="Datum" >}}:
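
To complement the Reflect record paragraph in the hunk above, here is a minimal sketch of reading Parquet files through Avro reflection against an existing POJO. The nested `Datum`-style class below is hypothetical (the real test class lives at the linked path); the input path is a placeholder.

```java
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.connector.file.src.FileSource;
import org.apache.flink.core.fs.Path;
import org.apache.flink.formats.parquet.avro.AvroParquetReaders;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ReflectRecordFromParquet {

    /** Illustrative POJO; Avro derives its schema from the fields via Java reflection. */
    public static class Datum {
        public String a;
        public int b;

        public Datum() {} // Avro reflection requires a no-arg constructor
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // No hand-written Avro schema: it is generated from the POJO class by reflection.
        FileSource<Datum> source =
                FileSource.forRecordStreamFormat(
                                AvroParquetReaders.forReflectRecord(Datum.class),
                                new Path("/tmp/parquet-input")) // placeholder directory
                        .build();

        DataStream<Datum> stream =
                env.fromSource(source, WatermarkStrategy.noWatermarks(), "file-source");

        stream.print();
        env.execute("Parquet to Avro reflect records");
    }
}
```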

docs/content/docs/connectors/table/formats/avro-confluent.md

Lines changed: 1 addition & 1 deletion

@@ -298,4 +298,4 @@ See the [Apache Avro Format]({{< ref "docs/connectors/table/formats/avro" >}}#da
 
 In addition to the types listed there, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from Flink type.
 
-You can refer to [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.
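
The hunk above includes the note that Flink maps nullable types to Avro `union(something, null)`. A small sketch, using plain Avro APIs rather than Flink internals, of what such a union schema looks like when built programmatically; the class name is made up for the example:

```java
import java.util.Arrays;

import org.apache.avro.Schema;

public class NullableUnionSchema {
    public static void main(String[] args) {
        // A nullable STRING on the Flink side corresponds to an Avro union of string and null.
        Schema nullableString =
                Schema.createUnion(
                        Arrays.asList(
                                Schema.create(Schema.Type.STRING),
                                Schema.create(Schema.Type.NULL)));

        // Prints the union as JSON: ["string","null"]
        System.out.println(nullableString);
    }
}
```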

docs/content/docs/connectors/table/formats/avro.md

Lines changed: 2 additions & 2 deletions

@@ -86,7 +86,7 @@ Format Options
 <td>yes</td>
 <td>binary</td>
 <td>String</td>
-<td>Serialization encoding to use. The valid enumerations are: <code>binary</code>, <code>json</code>. <a href="https://avro.apache.org/docs/current/specification/#encodings">(reference)</a><br>
+<td>Serialization encoding to use. The valid enumerations are: <code>binary</code>, <code>json</code>. <a href="https://avro.apache.org/docs/++version++/specification/#encodings">(reference)</a><br>
 Most applications will use the binary encoding, as it results in smaller and more efficient messages, reducing the usage of disk and network resources, and improving performance for high throughput data. <br>
 JSON encoding results in human-readable messages which can be useful during development and debugging, and is useful for compatibility when interacting with systems that cannot process binary encoded data.</td>
 </tr>
@@ -218,4 +218,4 @@ So the following table lists the type mapping from Flink type to Avro type.
 
 In addition to the types listed above, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from Flink type.
 
-You can refer to [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.
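
The first hunk above documents the serialization encoding option with its `binary` and `json` values. As an illustration of the trade-off it describes, here is a minimal sketch using plain Avro (outside Flink) that writes the same record with both encoders; the schema and record are invented for the example:

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class BinaryVsJsonEncoding {
    public static void main(String[] args) throws IOException {
        Schema schema =
                SchemaBuilder.record("User").fields()
                        .requiredString("name")
                        .requiredInt("favoriteNumber")
                        .endRecord();

        GenericRecord user = new GenericData.Record(schema);
        user.put("name", "Alyssa");
        user.put("favoriteNumber", 256);

        GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);

        // Binary encoding: compact and efficient, not human-readable.
        ByteArrayOutputStream binaryOut = new ByteArrayOutputStream();
        Encoder binaryEncoder = EncoderFactory.get().binaryEncoder(binaryOut, null);
        writer.write(user, binaryEncoder);
        binaryEncoder.flush();

        // JSON encoding: larger, but readable during development and debugging.
        ByteArrayOutputStream jsonOut = new ByteArrayOutputStream();
        Encoder jsonEncoder = EncoderFactory.get().jsonEncoder(schema, jsonOut);
        writer.write(user, jsonEncoder);
        jsonEncoder.flush();

        System.out.println("binary size: " + binaryOut.size() + " bytes");
        System.out.println("json payload: " + jsonOut.toString("UTF-8"));
    }
}
```

The binary payload is typically a fraction of the size of the JSON one, which is why `binary` is the documented default.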

flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSerializerSnapshot.java

Lines changed: 2 additions & 1 deletion

@@ -166,7 +166,8 @@ public TypeSerializer<T> restoreSerializer() {
 *
 * <p>Checks whenever a new version of a schema (reader) can read values serialized with the old
 * schema (writer). If the schemas are compatible according to {@code Avro} schema resolution
-* rules (@see <a href="https://avro.apache.org/docs/current/spec.html#Schema+Resolution">Schema
+* rules (@see <a
+* href="https://avro.apache.org/docs/++version++/specification/#schema-resolution">Schema
 * Resolution</a>).
 */
 @VisibleForTesting
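
For context on the rule this Javadoc points to, here is a small standalone sketch using Avro's own `SchemaCompatibility` utility, which applies the schema resolution rules from the linked specification section. The two schemas are invented for illustration; this is not the snapshot class's actual code, just the kind of reader/writer check it describes.

```java
import org.apache.avro.Schema;
import org.apache.avro.SchemaBuilder;
import org.apache.avro.SchemaCompatibility;
import org.apache.avro.SchemaCompatibility.SchemaCompatibilityType;

public class SchemaResolutionCheck {
    public static void main(String[] args) {
        // Writer schema: the shape the existing data was serialized with.
        Schema writer =
                SchemaBuilder.record("User").fields()
                        .requiredString("name")
                        .endRecord();

        // Reader schema: adds a field with a default value, which schema resolution permits.
        Schema reader =
                SchemaBuilder.record("User").fields()
                        .requiredString("name")
                        .name("favoriteNumber").type().intType().intDefault(0)
                        .endRecord();

        SchemaCompatibilityType result =
                SchemaCompatibility.checkReaderWriterCompatibility(reader, writer).getType();

        // Prints COMPATIBLE: the reader can resolve data written with the writer schema.
        System.out.println(result);
    }
}
```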
