
Commit f440c8c

fix: 404 links (#1016)
* chore: retranslate deploying local
* fix: links
* fix: recover
1 parent 2c6e405 commit f440c8c

5 files changed, +76 -72 lines changed


docs/cn/guides/10-deploy/01-deploy/01-non-production/00-deploying-local.md

Lines changed: 1 addition & 1 deletion
@@ -380,4 +380,4 @@ SELECT
 After deploying Databend, you may want to explore the following topics:
 
 - [Load & Unload Data](/guides/load-data): Manage data import/export in Databend.
-- [Visualize](/guides/visualize): Integrate Databend with visualization tools for insights.
+- [Visualize](/guides/visualize): Integrate Databend with visualization tools for insights.

docs/en/guides/40-load-data/02-load-db/flink-cdc.md

Lines changed: 4 additions & 4 deletions
@@ -14,7 +14,7 @@ To download and install the Flink SQL connector for Databend, follow these steps
 
 1. Download and set up Flink: Before installing the Flink SQL connector for Databend, ensure that you have downloaded and set up Flink on your system. You can download Flink from the official website: https://flink.apache.org/downloads/
 
-2. Download the connector: Visit the releases page of the Flink SQL connector for Databend on GitHub: https://github.com/databendcloud/flink-connector-databend/releases. Download the latest version of the connector (e.g., flink-connector-databend-0.0.2.jar).
+2. Download the connector: Visit the releases page of the Flink SQL connector for Databend on GitHub: [https://github.com/databendcloud/flink-connector-databend/releases](https://github.com/databendcloud/flink-connector-databend/releases). Download the latest version of the connector (e.g., flink-connector-databend-0.0.2.jar).
 
 Please note that you can also compile the Flink SQL connector for Databend from source:
 
@@ -56,8 +56,8 @@ CREATE TABLE products (id INT NOT NULL, name VARCHAR(255) NOT NULL, description
 ```
 
 2. Download [Flink](https://flink.apache.org/downloads/) and the following SQL connectors to your system:
-- Flink SQL connector for Databend: https://github.com/databendcloud/flink-connector-databend/releases
-- Flink SQL connector for MySQL: https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3.0/flink-sql-connector-mysql-cdc-2.3.0.jar
+- Flink SQL connector for Databend: [https://github.com/databendcloud/flink-connector-databend/releases](https://github.com/databendcloud/flink-connector-databend/releases)
+- Flink SQL connector for MySQL: [https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3.0/flink-sql-connector-mysql-cdc-2.3.0.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3.0/flink-sql-connector-mysql-cdc-2.3.0.jar)
 3. Move the both connector JAR files to the _lib_ folder in your Flink installation directory.
 4. Start Flink:
 
@@ -118,7 +118,7 @@ You can now open the Apache Flink Dashboard if you go to http://localhost:8081 i
 Welcome! Enter 'HELP;' to list all available commands. 'QUIT;' to exit.
 ```
 
-6. Set the checkpointing interval to 3 seconds, and create corresponding tables with MySQL and Databend connectors in the Flink SQL Client. For the available connection parameters, see https://github.com/databendcloud/flink-connector-databend#connector-options:
+6. Set the checkpointing interval to 3 seconds, and create corresponding tables with MySQL and Databend connectors in the Flink SQL Client. For the available connection parameters, see [https://github.com/databendcloud/flink-connector-databend#connector-options](https://github.com/databendcloud/flink-connector-databend#connector-options):
 
 ```sql
 Flink SQL> SET execution.checkpointing.interval = 3s;
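Step 6 in the hunk above is cut off after the first statement of the Flink SQL session. As a rough, non-authoritative sketch of what that step might continue with, the block below sets the checkpointing interval and declares a MySQL CDC source plus a Databend sink; the host names, credentials, column list, and especially the Databend connector option names are illustrative assumptions, and the linked connector-options page is the authoritative reference.

```sql
-- Hypothetical Flink SQL sketch; connection values and Databend connector
-- option names are assumptions, not copied from the documentation.
SET execution.checkpointing.interval = 3s;

-- Source table: reads change events from MySQL via the mysql-cdc connector.
CREATE TABLE mysql_products (
  id INT NOT NULL,
  name STRING,
  description STRING,
  PRIMARY KEY (id) NOT ENFORCED
) WITH (
  'connector' = 'mysql-cdc',
  'hostname' = '127.0.0.1',
  'port' = '3306',
  'username' = 'root',
  'password' = 'password',
  'database-name' = 'mydb',
  'table-name' = 'products'
);

-- Sink table: writes the change stream into Databend
-- (option names here are placeholders; check the connector-options page).
CREATE TABLE databend_products (
  id INT,
  name STRING,
  description STRING
) WITH (
  'connector' = 'databend',
  'url' = 'databend://localhost:8000',
  'username' = 'databend',
  'password' = 'databend',
  'database-name' = 'default',
  'table-name' = 'products'
);

-- Continuously sync MySQL changes into Databend.
INSERT INTO databend_products
SELECT id, name, description FROM mysql_products;
```

In the Flink SQL Client these statements would be entered at the `Flink SQL>` prompt; the final `INSERT INTO ... SELECT` submits a continuously running job that keeps the Databend table in sync with MySQL.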

docs/en/guides/90-community/02-rfcs/20220425-new_sql_logic_test_framework.md

Lines changed: 2 additions & 2 deletions
@@ -10,7 +10,7 @@ Basically all robust database system needs to be tested in the following scope.
 1. Reasonable high level of unit test coverage.
 2. A large set of query logic tests.(**mainly discussed**)
 3. Distributed system related behavior tests.
-4. Performance tests (https://benchmark.databend.com/clickbench/release/hits.html)
+4. Performance tests [https://benchmark.databend.com/clickbench/release/hits.html](https://benchmark.databend.com/clickbench/release/hits.html)
 
 Currently, our test framework is based on the following design.
 
@@ -25,7 +25,7 @@ However, it has some shortages in current logic test which should be improved.
 
 ## Detailed design
 
-The test input is an extended version of sql logic test(https://www.sqlite.org/sqllogictest/)
+The test input is an extended version of sql logic test[https://www.sqlite.org/sqllogictest/](https://www.sqlite.org/sqllogictest/)
 
 The file is expressed in a domain specific language called test script. and it supports sql statements generate no output or statements intentionally has error
 
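The RFC text in this hunk describes the test-script DSL only in prose, so here is a minimal sketch of the classic sqllogictest record format it extends: statements that succeed with no output, statements that are intentionally expected to error, and queries checked against expected results. The table name, statements, and expected values are made-up illustrations, not records from Databend's actual suite.

```
# Hypothetical sqllogictest-style script (illustrative only).

# Statements expected to succeed; they produce no output.
statement ok
CREATE TABLE t1 (a INT, b INT)

statement ok
INSERT INTO t1 VALUES (1, 10), (2, 20)

# A statement that is intentionally expected to fail (t1 already exists).
statement error
CREATE TABLE t1 (a INT)

# A query returning one integer column; the expected result follows ----.
query I
SELECT a FROM t1 ORDER BY a
----
1
2
```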
