Commit 555f0be

[FLINK-37776][python] Drop python 3.8 support
1 parent: 4c77daa

File tree

31 files changed: +42 -50 lines


docs/content.zh/docs/deployment/cli.md

Lines changed: 3 additions & 3 deletions

@@ -428,11 +428,11 @@ Currently, users are able to submit a PyFlink job via the CLI. It does not requi
 JAR file path or the entry main class, which is different from the Java job submission.

 {{< hint info >}}
-When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.8+.
+When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.9+.
 {{< /hint >}}
 ```bash
 $ python --version
-# the version printed here must be 3.8+
+# the version printed here must be 3.9+
 ```

 The following commands show different PyFlink job submission use-cases:

@@ -576,7 +576,7 @@ related options. Here's an overview of all the Python related options for the ac
 <td>
 Specify the path of the python interpreter used to execute the python UDF worker
 (e.g.: --pyExecutable /usr/local/bin/python3).
-The python UDF worker depends on Python 3.8+, Apache Beam (version >= 2.54.0, <= 2.61.0),
+The python UDF worker depends on Python 3.9+, Apache Beam (version >= 2.54.0, <= 2.61.0),
 Pip (version >= 20.3) and SetupTools (version >= 37.0.0).
 Please ensure that the specified environment meets the above requirements.
 </td>
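
For context on the `--pyExecutable` option touched above: the interpreter used by the Python UDF worker can also be chosen programmatically via the `python.executable` configuration key. A minimal sketch (the config key and `TableConfig.set` call come from the general PyFlink documentation, not from this diff):

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
# Point the Python UDF worker at a specific interpreter; after this change
# the interpreter must be Python 3.9 or newer.
t_env.get_config().set("python.executable", "/usr/local/bin/python3")
```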

docs/content.zh/docs/dev/python/datastream_tutorial.md

Lines changed: 1 addition & 1 deletion

@@ -48,7 +48,7 @@ Apache Flink provides the DataStream API for building robust, stateful streaming
 First, you need to prepare the following environment on your computer:

 * Java 11
-* Python 3.8, 3.9 or 3.10
+* Python 3.9, 3.10, 3.11 or 3.12

 Using the Python DataStream API requires installing PyFlink, which is published on [PyPI](https://pypi.org/project/apache-flink/) and can be installed quickly via `pip`.
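
Since the page above is the DataStream API tutorial, here is a tiny self-contained sketch of the kind of program it builds up to (plain PyFlink DataStream API, not part of this commit):

```python
from pyflink.datastream import StreamExecutionEnvironment

# Build and run a minimal in-memory pipeline.
env = StreamExecutionEnvironment.get_execution_environment()
env.from_collection([1, 2, 3]) \
    .map(lambda x: x * 2) \
    .print()
env.execute("datastream_sketch")
```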

docs/content.zh/docs/dev/python/installation.md

Lines changed: 2 additions & 2 deletions

@@ -29,11 +29,11 @@ under the License.


 ## Environment Requirements
-<span class="label label-info">Note</span> PyFlink requires Python 3.7 or later (3.8, 3.9 or 3.10). Please run the following command to make sure your Python version meets the requirement.
+<span class="label label-info">Note</span> PyFlink requires Python 3.9 or later (3.9, 3.10, 3.11 or 3.12). Please run the following command to make sure your Python version meets the requirement.

 ```bash
 $ python --version
-# the version printed here must be 3.8, 3.9 or 3.10
+# the version printed here must be 3.9, 3.10, 3.11 or 3.12
 ```

 ## Environment Setup
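
To complement the shell check above, the same verification can be done from inside Python; a small sketch using only the standard library plus the `apache-flink` distribution name from PyPI:

```python
import sys
from importlib import metadata

# PyFlink now requires Python 3.9 or later.
assert sys.version_info >= (3, 9), f"need Python 3.9+, found {sys.version.split()[0]}"

# Raises PackageNotFoundError if apache-flink has not been pip-installed yet.
print("apache-flink", metadata.version("apache-flink"))
```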

docs/content.zh/docs/dev/python/python_execution_mode.md

Lines changed: 0 additions & 3 deletions

@@ -121,9 +121,6 @@ Currently, it still doesn't support to execute Python UDFs in `THREAD` execution
 It will fall back to `PROCESS` execution mode in these cases. So it may happen that you configure a job
 to execute in `THREAD` execution mode, however, it's actually executed in `PROCESS` execution mode.
 {{< /hint >}}
-{{< hint info >}}
-`THREAD` execution mode is only supported in Python 3.8+.
-{{< /hint >}}

 ## Execution Behavior
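
The deleted hint concerned the `THREAD` execution mode, which is selected via the `python.execution-mode` option. A rough sketch for reference (option name from the Flink configuration docs, not from this diff):

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

t_env = TableEnvironment.create(EnvironmentSettings.in_streaming_mode())
# Request THREAD mode; as noted above, Flink falls back to PROCESS mode
# for cases that thread mode does not yet support.
t_env.get_config().set("python.execution-mode", "thread")
```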

docs/content.zh/docs/dev/python/table/udfs/python_udfs.md

Lines changed: 1 addition & 1 deletion

@@ -28,7 +28,7 @@ under the License.

 User-defined functions are an important feature because they greatly extend the expressiveness of Python Table API programs.

-**Note:** To execute Python user-defined functions, both the client side and the cluster side need Python 3.7 or later (3.7, 3.8, 3.9 or 3.10) and PyFlink installed.
+**Note:** To execute Python user-defined functions, both the client side and the cluster side need Python 3.9 or later (3.9, 3.10, 3.11 or 3.12) and PyFlink installed.

 <a name="scalar-functions"></a>
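
The page touched here documents general Python UDFs; for orientation, a minimal scalar UDF in the style of that page (standard `pyflink.table.udf` API, independent of this commit):

```python
from pyflink.table import DataTypes
from pyflink.table.udf import udf

@udf(result_type=DataTypes.BIGINT())
def add(i, j):
    # Executed in the Python UDF worker, which now requires Python 3.9+.
    return i + j
```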

docs/content.zh/docs/dev/python/table/udfs/vectorized_python_udfs.md

Lines changed: 1 addition & 1 deletion

@@ -34,7 +34,7 @@ under the License.
 Vectorized user-defined functions are declared in much the same way as [non-vectorized user-defined functions]({{< ref "docs/dev/python/table/udfs/python_udfs" >}});
 users only need to add an extra parameter `func_type="pandas"` when calling the `udf` or `udaf` decorator to mark the function as vectorized.

-**Note:** To execute vectorized Python user-defined functions, both the client side and the cluster side need Python 3.7 or later (3.7, 3.8, 3.9 or 3.10) and PyFlink installed.
+**Note:** To execute vectorized Python user-defined functions, both the client side and the cluster side need Python 3.9 or later (3.9, 3.10, 3.11 or 3.12) and PyFlink installed.

 ## Vectorized Scalar Functions
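
And the vectorized counterpart, differing only in the extra `func_type="pandas"` argument described above (assumes pandas is available in the UDF worker environment):

```python
from pyflink.table import DataTypes
from pyflink.table.udf import udf

@udf(result_type=DataTypes.BIGINT(), func_type="pandas")
def vectorized_add(i, j):
    # i and j arrive as pandas.Series; return a Series of the same length.
    return i + j
```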

docs/content.zh/docs/dev/python/table_api_tutorial.md

Lines changed: 1 addition & 1 deletion

@@ -50,7 +50,7 @@ Apache Flink provides the Table API, a relational API that unifies stream and batch processing, i.e. queries
 To continue our journey, you need a computer with the following:

 * Java 11
-* Python 3.8, 3.9 or 3.10
+* Python 3.9, 3.10, 3.11 or 3.12

 Using the Python Table API requires installing PyFlink, which is published on [PyPi](https://pypi.org/project/apache-flink/); you can install PyFlink as follows:
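
For orientation on the tutorial being updated, a tiny self-contained Table API program (plain PyFlink API, not introduced by this commit):

```python
from pyflink.table import EnvironmentSettings, TableEnvironment

# Batch mode gives a small, deterministic result for this toy table.
t_env = TableEnvironment.create(EnvironmentSettings.in_batch_mode())
table = t_env.from_elements([(1, "hello"), (2, "world")], ["id", "word"])
table.execute().print()
```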

docs/content.zh/docs/dev/table/sqlClient.md

Lines changed: 1 addition & 1 deletion

@@ -321,7 +321,7 @@ Mode "embedded" (default) submits Flink jobs from the local machine.
 --pyExecutable
 /usr/local/bin/python3). The
 python UDF worker depends on
-Python 3.8+, Apache Beam
+Python 3.9+, Apache Beam
 (version >= 2.54.0, <= 2.61.0), Pip
 (version >= 20.3) and SetupTools
 (version >= 37.0.0). Please

docs/content.zh/docs/flinkDev/building.md

Lines changed: 2 additions & 2 deletions

@@ -73,11 +73,11 @@ mvn clean install -DskipTests -Dfast -Pskip-webui-build -T 1C

 If you want to build a PyFlink package that can be installed with pip, you first need to build the Flink project, as described in [Build Flink](#build-flink).

-2. Python version 3.8, 3.9 or 3.10.
+2. Python version 3.9, 3.10, 3.11 or 3.12.

 ```shell
 $ python --version
-# the version printed here must be 3.8, 3.9 or 3.10
+# the version printed here must be 3.9, 3.10, 3.11 or 3.12
 ```

 3. Build PyFlink's Cython extension modules (optional)

docs/content/docs/deployment/cli.md

Lines changed: 3 additions & 3 deletions

@@ -426,11 +426,11 @@ Currently, users are able to submit a PyFlink job via the CLI. It does not requi
 JAR file path or the entry main class, which is different from the Java job submission.

 {{< hint info >}}
-When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.8+.
+When submitting Python job via `flink run`, Flink will run the command "python". Please run the following command to confirm that the python executable in current environment points to a supported Python version of 3.9+.
 {{< /hint >}}
 ```bash
 $ python --version
-# the version printed here must be 3.8+
+# the version printed here must be 3.9+
 ```

 The following commands show different PyFlink job submission use-cases:

@@ -574,7 +574,7 @@ related options. Here's an overview of all the Python related options for the ac
 <td>
 Specify the path of the python interpreter used to execute the python UDF worker
 (e.g.: --pyExecutable /usr/local/bin/python3).
-The python UDF worker depends on Python 3.8+, Apache Beam (version >= 2.54.0,<= 2.61.0),
+The python UDF worker depends on Python 3.9+, Apache Beam (version >= 2.54.0,<= 2.61.0),
 Pip (version >= 20.3) and SetupTools (version >= 37.0.0).
 Please ensure that the specified environment meets the above requirements.
 </td>
