-Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/1.10.0/spec.html).
-This example uses an Avro schema example similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html):
+Avro schemas are defined using JSON. You can get more information about Avro schemas and types from the [Avro specification](https://avro.apache.org/docs/++version++/specification/).
+This example uses an Avro schema example similar to the one described in the [official Avro tutorial](https://avro.apache.org/docs/++version++/getting-started-java/):
 
 ```json lines
 {"namespace": "example.avro",
@@ -217,10 +217,10 @@ This example uses an Avro schema example similar to the one described in the [of
 }
 ```
 
-This schema defines a record representing a user with three fields: name, favoriteNumber, and favoriteColor. You can find more details at [record specification](https://avro.apache.org/docs/1.10.0/spec.html#schema_record) for how to define an Avro schema.
+This schema defines a record representing a user with three fields: name, favoriteNumber, and favoriteColor. You can find more details at [record specification](https://avro.apache.org/docs/++version++/specification/#schema-record) for how to define an Avro schema.
 
 In the following example, you will create a DataStream containing Parquet records as Avro Generic records.
-It will parse the Avro schema based on the JSON string. There are many other ways to parse a schema, e.g. from java.io.File or java.io.InputStream. Please refer to [Avro Schema](https://avro.apache.org/docs/1.10.0/api/java/org/apache/avro/Schema.html) for details.
+It will parse the Avro schema based on the JSON string. There are many other ways to parse a schema, e.g. from java.io.File or java.io.InputStream. Please refer to [Avro Schema](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/Schema.html) for details.
 After that, you will create an `AvroParquetRecordFormat` via `AvroParquetReaders` for Avro Generic records.
 
 {{< tabs "GenericRecord" >}}
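The schema-parsing step described above can be sketched with plain Avro, outside Flink (a minimal sketch; the JSON string mirrors the `example.avro` User record from this page):

```java
import org.apache.avro.Schema;

public class ParseSchemaSketch {
    public static void main(String[] args) {
        // The User record schema defined earlier on this page, as a JSON string.
        String json =
                "{\"namespace\": \"example.avro\", \"type\": \"record\", \"name\": \"User\","
                        + " \"fields\": ["
                        + "   {\"name\": \"name\", \"type\": \"string\"},"
                        + "   {\"name\": \"favoriteNumber\", \"type\": [\"int\", \"null\"]},"
                        + "   {\"name\": \"favoriteColor\", \"type\": [\"string\", \"null\"]}"
                        + " ]}";
        // Schema.Parser also offers parse(java.io.File) and
        // parse(java.io.InputStream) overloads, as noted above.
        Schema schema = new Schema.Parser().parse(json);
        System.out.println(schema.getFullName()); // example.avro.User

        // In Flink, this schema is then handed to AvroParquetReaders, e.g.:
        // StreamFormat<GenericRecord> format = AvroParquetReaders.forGenericRecord(schema);
    }
}
```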
@@ -284,7 +284,7 @@ Based on the previously defined schema, you can generate classes by leveraging A
 Once the classes have been generated, there is no need to use the schema directly in your programs.
 You can either use `avro-tools.jar` to generate code manually or you could use the Avro Maven plugin to perform
 code generation on any .avsc files present in the configured source directory. Please refer to
-[Avro Getting Started](https://avro.apache.org/docs/1.10.0/gettingstartedjava.html) for more information.
+[Avro Getting Started](https://avro.apache.org/docs/++version++/getting-started-java/) for more information.
 
 The following example uses the example schema {{< gh_link file="flink-formats/flink-parquet/src/test/resources/avro/testdata.avsc" name="testdata.avsc" >}}:
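The Maven-based code generation mentioned above is typically wired up through the `avro-maven-plugin`; a hypothetical `pom.xml` fragment (the version number and directory paths are illustrative, not taken from this repository):

```xml
<plugin>
  <groupId>org.apache.avro</groupId>
  <artifactId>avro-maven-plugin</artifactId>
  <version>1.11.3</version> <!-- illustrative version -->
  <executions>
    <execution>
      <phase>generate-sources</phase>
      <goals>
        <goal>schema</goal>
      </goals>
      <configuration>
        <!-- .avsc files are picked up from here -->
        <sourceDirectory>${project.basedir}/src/main/avro/</sourceDirectory>
        <outputDirectory>${project.basedir}/src/main/java/</outputDirectory>
      </configuration>
    </execution>
  </executions>
</plugin>
```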
@@ -334,7 +334,7 @@ final DataStream<GenericRecord> stream =
 Beyond Avro Generic and Specific record that requires a predefined Avro schema,
 Flink also supports creating a DataStream from Parquet files based on existing Java POJO classes.
 In this case, Avro will use Java reflection to generate schemas and protocols for these POJO classes.
-Java types are mapped to Avro schemas, please refer to the [Avro reflect](https://avro.apache.org/docs/1.10.0/api/java/index.html) documentation for more details.
+Java types are mapped to Avro schemas, please refer to the [Avro reflect](https://avro.apache.org/docs/++version++/api/java/org/apache/avro/reflect/package-summary.html) documentation for more details.
 
 This example uses a simple Java POJO class {{< gh_link file="flink-formats/flink-parquet/src/test/java/org/apache/flink/formats/parquet/avro/Datum.java" name="Datum" >}}:
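The reflection-based schema derivation described above can be sketched with Avro's `ReflectData` (a minimal sketch; the POJO here is a hypothetical stand-in, not the `Datum` test class linked above):

```java
import org.apache.avro.Schema;
import org.apache.avro.reflect.ReflectData;

public class ReflectSketch {
    // A stand-in POJO, analogous in spirit to the Datum test class.
    public static class SimpleDatum {
        public String a;
        public int b;
    }

    public static void main(String[] args) {
        // Avro derives a record schema from the POJO's public fields via reflection.
        Schema schema = ReflectData.get().getSchema(SimpleDatum.class);
        System.out.println(schema.getFields().size()); // 2

        // In Flink, the analogous reader would be created with, e.g.:
        // AvroParquetReaders.forReflectRecord(SimpleDatum.class);
    }
}
```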
docs/content/docs/connectors/table/formats/avro-confluent.md (+1 -1)
@@ -298,4 +298,4 @@ See the [Apache Avro Format]({{< ref "docs/connectors/table/formats/avro" >}}#da
 In addition to the types listed there, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from Flink type.
 
-You can refer to [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.
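As a hypothetical illustration of the nullable mapping described above, a nullable Flink `STRING` field would surface in the Avro schema as a union of the converted type with `null`:

```json
{"name": "favoriteColor", "type": ["string", "null"]}
```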
docs/content/docs/connectors/table/formats/avro.md (+2 -2)
@@ -86,7 +86,7 @@ Format Options
         <td>yes</td>
         <td>binary</td>
         <td>String</td>
-        <td>Serialization encoding to use. The valid enumerations are: <code>binary</code>, <code>json</code>. <a href="https://avro.apache.org/docs/current/specification/#encodings">(reference)</a><br>
+        <td>Serialization encoding to use. The valid enumerations are: <code>binary</code>, <code>json</code>. <a href="https://avro.apache.org/docs/++version++/specification/#encodings">(reference)</a><br>
         Most applications will use the binary encoding, as it results in smaller and more efficient messages, reducing the usage of disk and network resources, and improving performance for high throughput data. <br>
         JSON encoding results in human-readable messages which can be useful during development and debugging, and is useful for compatibility when interacting with systems that cannot process binary encoded data.</td>
       </tr>
@@ -218,4 +218,4 @@ So the following table lists the type mapping from Flink type to Avro type.
 In addition to the types listed above, Flink supports reading/writing nullable types. Flink maps nullable types to Avro `union(something, null)`, where `something` is the Avro type converted from Flink type.
 
-You can refer to [Avro Specification](https://avro.apache.org/docs/current/spec.html) for more information about Avro types.
+You can refer to [Avro Specification](https://avro.apache.org/docs/++version++/specification/) for more information about Avro types.
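The binary-versus-JSON trade-off described for the encoding option above can be sketched directly with Avro's encoders, outside Flink (a minimal sketch; the one-field schema is illustrative):

```java
import java.io.ByteArrayOutputStream;
import java.io.IOException;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.Encoder;
import org.apache.avro.io.EncoderFactory;

public class EncodingSketch {
    public static void main(String[] args) throws IOException {
        Schema schema = new Schema.Parser().parse(
                "{\"type\": \"record\", \"name\": \"User\", \"fields\": ["
                        + "{\"name\": \"name\", \"type\": \"string\"}]}");
        GenericRecord record = new GenericData.Record(schema);
        record.put("name", "Alyssa");

        GenericDatumWriter<GenericRecord> writer = new GenericDatumWriter<>(schema);

        // Binary encoding: compact, not human-readable.
        ByteArrayOutputStream binOut = new ByteArrayOutputStream();
        Encoder bin = EncoderFactory.get().binaryEncoder(binOut, null);
        writer.write(record, bin);
        bin.flush();

        // JSON encoding: larger, but human-readable.
        ByteArrayOutputStream jsonOut = new ByteArrayOutputStream();
        Encoder json = EncoderFactory.get().jsonEncoder(schema, jsonOut);
        writer.write(record, json);
        json.flush();

        // The binary form of this record is smaller than the JSON form.
        System.out.println(binOut.size() < jsonOut.size()); // true
    }
}
```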
flink-formats/flink-avro/src/main/java/org/apache/flink/formats/avro/typeutils/AvroSerializerSnapshot.java (+2 -1)
@@ -166,7 +166,8 @@ public TypeSerializer<T> restoreSerializer() {
  *
  * <p>Checks whenever a new version of a schema (reader) can read values serialized with the old
  * schema (writer). If the schemas are compatible according to {@code Avro} schema resolution