Commit 948d954

xingcan-hu authored and jason-ltc committed
Dcos: fix flink doc typo (#41)
Signed-off-by: xingcan-ltc <[email protected]>
1 parent 85fdae5 commit 948d954

File tree

5 files changed (+6 −6 lines changed)

catalog/flink/apps/streampark.app/README.md (+1 −1)

````diff
@@ -67,7 +67,7 @@ INSERT INTO print_table select f_sequence,f_random,f_random_str from datagen;
 
 #### Run the job
 - On the job management page, click the `Release Application` button of the `datagen-print` job; after a moment the release status becomes `Done` `Success`
-- Click the `Start Application` button of the `datagen-print` job, close `from savepoin` in the pop-up window, click `Apply`; the job is submitted to the Flink session cluster, and the running status changes to `Starting` `Running` `Finished` in turn
+- Click the `Start Application` button of the `datagen-print` job, close `from savepoint` in the pop-up window, click `Apply`; the job is submitted to the Flink session cluster, and the running status changes to `Starting` `Running` `Finished` in turn
 
 
 
````

catalog/flink/apps/streampark.app/i18n/en/README.md (+1 −1)

````diff
@@ -66,5 +66,5 @@ After the addition is successful, it will jump to the job management page
 
 #### Run the job
 - In the job management page, click the `Release Application` button of the `datagen-print` job, wait for a moment, the publish status becomes `Done` `Success`
-- Click the `Start Application` button of the `datagen-print` job, close the `from savepoin` in the pop-up window, click `Apply`, the job will be submitted to the Flink session cluster for running, and the running status will change to `Starting` `Running` `Finished` in turn
+- Click the `Start Application` button of the `datagen-print` job, close the `from savepoint` in the pop-up window, click `Apply`, the job will be submitted to the Flink session cluster for running, and the running status will change to `Starting` `Running` `Finished` in turn
 
````

docs/en/catalog-overview/Flink/developer-guide.md (+1 −1)

````diff
@@ -144,7 +144,7 @@ Two methods are introduced for submitting applications: one is through the Flink
 Enter the flink session cluster container:
 ```shell
 # Change the pod according to the actual situation
-kubuectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
+kubectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
 ```
 
 Execute the following command in the flink session cluster container:
````
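The change above corrects a misspelled CLI name (`kubuectl` → `kubectl`). As a minimal sketch of how a tutorial script could catch this class of typo up front (the `require` helper is hypothetical, not part of any of the docs in this commit):

```shell
#!/usr/bin/env bash
# Hypothetical guard: check that each required CLI exists before running
# the tutorial commands, so a typo such as "kubuectl" fails fast instead
# of erroring halfway through a script.
require() {
  command -v "$1" >/dev/null 2>&1 && echo "found: $1" || echo "missing: $1"
}

require kubuectl   # the misspelling fixed by this commit
```

On a machine without a binary literally named `kubuectl`, the check reports it as missing before any cluster command runs.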

docs/zh/catalog-overview/Flink/developer-guide.md (+2 −2)

````diff
@@ -143,7 +143,7 @@ public class SocketWindowWordCount {
 Enter the flink session cluster container
 ```shell
 # Replace the pod name according to the actual situation
-kubuectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
+kubectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
 ```
 
 Execute the following command in the container:
@@ -334,5 +334,5 @@ group by
 Run the job
 
 - On the job management page, click the job's `Release Application` button; after a moment the release status becomes `Done` `Success`
-- Click the job's `Start Application` button, close `from savepoin` in the pop-up window, click `Apply`; the job is submitted to the Flink session cluster, and the running status changes to `Starting` `Running` in turn
+- Click the job's `Start Application` button, close `from savepoint` in the pop-up window, click `Apply`; the job is submitted to the Flink session cluster, and the running status changes to `Starting` `Running` in turn
 - When the job is no longer needed, click the job's `Stop Application` button to stop it.
````

docs/zh/user-tutorials/import-from-rdbms-to-hive.md (+1 −1)

````diff
@@ -48,7 +48,7 @@ After Hue is installed, it automatically connects to hive server2; no extra configuration is needed.
 kubectl get pods -n kdp-data -l app=flink-session-cluster -l component=jobmanager -o name
 # Enter the flink-session-cluster container
 # Replace flink-session-cluster-xxxxx with the real pod name
-kubuectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
+kubectl exec -it flink-session-cluster-xxxxx -n kdp-data -- bash
 # Start the Flink SQL client
 ./bin/sql-client.sh
 ```
````
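The `kubectl get pods ... -o name` line in the hunk above prints references of the form `pod/<name>`, while the tutorial's `exec` command uses the bare pod name. A small sketch of bridging the two when scripting (the pod name below is a made-up stand-in, not output from a real cluster):

```shell
# Stand-in for one line of `kubectl get pods ... -o name` output;
# a real run against a live cluster would print the actual pod reference.
pod_ref="pod/flink-session-cluster-abc12"

# Strip the leading "pod/" with POSIX parameter expansion.
pod_name="${pod_ref#pod/}"

# The follow-up command from the tutorial (needs a live cluster, so only echoed here).
echo "kubectl exec -it ${pod_name} -n kdp-data -- bash"
```

This keeps the pod lookup and the `exec` step connected without hand-copying the generated pod suffix.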
