Merged
1 change: 1 addition & 0 deletions docs/content/about.md
@@ -31,6 +31,7 @@ The following table shows the version mapping between Flink<sup>®</sup> CDC Con
| <font color="DarkCyan">2.0.*</font> | <font color="MediumVioletRed">1.13.*</font> |
| <font color="DarkCyan">2.1.*</font> | <font color="MediumVioletRed">1.13.*</font> |
| <font color="DarkCyan">2.2.*</font> | <font color="MediumVioletRed">1.13.\*</font>, <font color="MediumVioletRed">1.14.\*</font> |
| <font color="DarkCyan">2.3.*</font> | <font color="MediumVioletRed">1.13.\*</font>, <font color="MediumVioletRed">1.14.\*</font>, <font color="MediumVioletRed">1.15.\*</font> |

## Features

4 changes: 2 additions & 2 deletions docs/content/connectors/mongodb-cdc.md
@@ -398,7 +398,7 @@ Data Type Mapping
----------------
[BSON](https://docs.mongodb.com/manual/reference/bson-types/), short for **Binary JSON**, is a binary-encoded serialization of a JSON-like format used to store documents and make remote procedure calls in MongoDB.

[Flink SQL Data Type](https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/types/) is similar to the SQL standard’s data type terminology which describes the logical type of a value in the table ecosystem. It can be used to declare input and/or output types of operations.
[Flink SQL Data Type](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/types/) is similar to the SQL standard’s data type terminology which describes the logical type of a value in the table ecosystem. It can be used to declare input and/or output types of operations.

In order to enable Flink SQL to process data from heterogeneous data sources, the data types of heterogeneous data sources need to be uniformly converted to Flink SQL data types.

@@ -518,7 +518,7 @@ Reference
- [Replica set protocol](https://docs.mongodb.com/manual/reference/replica-configuration/#mongodb-rsconf-rsconf.protocolVersion)
- [Connection String Options](https://docs.mongodb.com/manual/reference/connection-string/#std-label-connections-connection-options)
- [BSON Types](https://docs.mongodb.com/manual/reference/bson-types/)
- [Flink DataTypes](https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/dev/table/types/)
- [Flink DataTypes](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/dev/table/types/)

FAQ
--------
2 changes: 1 addition & 1 deletion docs/content/connectors/mysql-cdc(ZH).md
@@ -617,7 +617,7 @@ $ ./bin/flink run \
--fromSavepoint /tmp/flink-savepoints/savepoint-cca7bc-bb1e257f0dab \
./FlinkCDCExample.jar
```
**注意:** 请参考文档 [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/deployment/cli/#command-line-interface) 了解更多详细信息。
**注意:** 请参考文档 [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/cli/#command-line-interface) 了解更多详细信息。

数据类型映射
----------------
2 changes: 1 addition & 1 deletion docs/content/connectors/mysql-cdc.md
@@ -625,7 +625,7 @@ $ ./bin/flink run \
--fromSavepoint /tmp/flink-savepoints/savepoint-cca7bc-bb1e257f0dab \
./FlinkCDCExample.jar
```
**Note:** Please refer the doc [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/deployment/cli/#command-line-interface) for more details.
**Note:** Please refer to the doc [Restore the job from previous savepoint](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/deployment/cli/#command-line-interface) for more details.
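
The stop-then-resume cycle referenced in the note can be sketched as below. This is an illustrative sketch only: the job ID and savepoint path are hypothetical placeholders, and the commented-out `flink stop` invocation is what you would run against a live cluster.

```shell
# Sketch of stopping a job with a savepoint and resuming from it.
# JOB_ID and the savepoint path below are hypothetical placeholders.
JOB_ID="cca7bc00000000000000000000000000"
SAVEPOINT_DIR="/tmp/flink-savepoints"

# Against a live cluster you would run:
#   ./bin/flink stop --savepointPath "$SAVEPOINT_DIR" "$JOB_ID"
# which triggers a savepoint and prints its path, e.g.:
SAVEPOINT_PATH="$SAVEPOINT_DIR/savepoint-cca7bc-bb1e257f0dab"

# Resuming then passes that path back via --fromSavepoint:
resume_cmd="./bin/flink run --fromSavepoint $SAVEPOINT_PATH ./FlinkCDCExample.jar"
echo "$resume_cmd"
```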

Data Type Mapping
----------------
2 changes: 1 addition & 1 deletion docs/content/formats/changelog-json.md
@@ -1,7 +1,7 @@
# Changelog JSON Format

**WARNING:** The CDC format `changelog-json` is deprecated since Flink CDC version 2.2.
The CDC format `changelog-json` was introduced at the point that Flink didn't offer any CDC format. Currently, Flink offers several well-maintained CDC formats i.e.[Debezium CDC, MAXWELL CDC, CANAL CDC](https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/formats/overview/), we recommend user to use above CDC formats.
The CDC format `changelog-json` was introduced before Flink offered any CDC format. Flink now provides several well-maintained CDC formats, e.g. [Debezium CDC, Maxwell CDC, Canal CDC](https://nightlies.apache.org/flink/flink-docs-release-1.15/docs/connectors/table/formats/overview/); we recommend using those formats instead.

### Compatibility Note

2 changes: 1 addition & 1 deletion docs/content/quickstart/db2-tutorial.md
@@ -61,7 +61,7 @@ docker-compose down

*Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.*

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-db2-cdc/2.3-SNAPSHOT/flink-sql-connector-db2-cdc-2.3-SNAPSHOT.jar)

**3. Launch a Flink cluster and start a Flink SQL CLI**
2 changes: 1 addition & 1 deletion docs/content/quickstart/mongodb-tutorial.md
@@ -109,7 +109,7 @@ db.customers.insertMany([

```Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.```

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mongodb-cdc/2.3-SNAPSHOT/flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar)

4. Launch a Flink cluster, then start a Flink SQL CLI and execute the following SQL statements inside:
10 changes: 5 additions & 5 deletions docs/content/quickstart/mysql-postgres-tutorial.md
@@ -73,11 +73,11 @@ This command automatically starts all the containers defined in the Docker Compo
We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kibana is running normally.

### Preparing Flink and JAR package required
1. Download [Flink 1.13.2](https://archive.apache.org/dist/flink/flink-1.13.2/flink-1.13.2-bin-scala_2.11.tgz) and unzip it to the directory `flink-1.13.2`
2. Download following JAR package required and put them under `flink-1.13.2/lib/`:
1. Download [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.15.2`
2. Download the following required JAR packages and put them under `flink-1.15.2/lib/`:

**Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.**
- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- [flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-postgres-cdc/2.3-SNAPSHOT/flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar)
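
The stable-release links above follow Maven Central's standard repository layout, so each URL can be derived from the artifact's coordinates. A small helper makes the pattern explicit; the function name is illustrative, not part of any tooling:

```shell
# Derive a Maven Central download URL from (groupId path, artifactId, version).
# Helper name is hypothetical; the URL pattern is the standard Maven repo layout.
artifact_url() {
  local group_path=$1 artifact=$2 version=$3
  echo "https://repo.maven.apache.org/maven2/${group_path}/${artifact}/${version}/${artifact}-${version}.jar"
}

# Reproduces the Elasticsearch connector link from the list above:
artifact_url org/apache/flink flink-sql-connector-elasticsearch7 1.15.2
# → https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar
```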

@@ -151,7 +151,7 @@ We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kib

1. Use the following command to change to the Flink directory:
```
cd flink-1.13.2
cd flink-1.15.2
```

2. Use the following command to start a Flink cluster:
@@ -311,7 +311,7 @@ After finishing the tutorial, run the following command to stop all containers i
```shell
docker-compose down
```
Run the following command to stop the Flink cluster in the directory of Flink `flink-1.13.2`:
Run the following command to stop the Flink cluster in the directory of Flink `flink-1.15.2`:
```shell
./bin/stop-cluster.sh
```
2 changes: 1 addition & 1 deletion docs/content/quickstart/oceanbase-tutorial.md
@@ -111,7 +111,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),

```Download links are only available for stable releases.```

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oceanbase-cdc/2.3-SNAPSHOT/flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar)

### Use Flink DDL to create dynamic table in Flink SQL CLI
2 changes: 1 addition & 1 deletion docs/content/quickstart/oracle-tutorial.md
@@ -54,7 +54,7 @@ docker-compose down

*Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.*

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oracle-cdc/2.3-SNAPSHOT/flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar)

**3. Launch a Flink cluster and start a Flink SQL CLI**
10 changes: 5 additions & 5 deletions docs/content/quickstart/polardbx-tutorial.md
@@ -63,12 +63,12 @@ This command automatically starts all the containers defined in the Docker Compo
We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kibana is running normally.

### Preparing Flink and JAR package required
1. Download [Flink 1.13.2](https://archive.apache.org/dist/flink/flink-1.13.2/flink-1.13.2-bin-scala_2.11.tgz) and unzip it to the directory `flink-1.13.2`
2. Download following JAR package required and put them under `flink-1.13.2/lib/`:
1. Download [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) and unzip it to the directory `flink-1.15.2`
2. Download the following required JAR packages and put them under `flink-1.15.2/lib/`:

**Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.**
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)

### Preparing data in databases
#### Preparing data in PolarDB-X
@@ -116,7 +116,7 @@ We can also visit [http://localhost:5601/](http://localhost:5601/) to see if Kib

1. Use the following command to change to the Flink directory:
```
cd flink-1.13.2
cd flink-1.15.2
```

2. Use the following command to start a Flink cluster:
@@ -255,7 +255,7 @@ After finishing the tutorial, run the following command to stop all containers i
```shell
docker-compose down
```
Run the following command to stop the Flink cluster in the directory of Flink `flink-1.13.2`:
Run the following command to stop the Flink cluster in the directory of Flink `flink-1.15.2`:
```shell
./bin/stop-cluster.sh
```
2 changes: 1 addition & 1 deletion docs/content/quickstart/sqlserver-tutorial.md
@@ -63,7 +63,7 @@ docker-compose down

*Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.*

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-sqlserver-cdc/2.3-SNAPSHOT/flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar)


2 changes: 1 addition & 1 deletion docs/content/quickstart/tidb-tutorial.md
@@ -116,7 +116,7 @@ docker-compose down

*Download links are available only for stable releases; SNAPSHOT dependencies need to be built by yourself.*

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-tidb-cdc/2.3-SNAPSHOT/flink-sql-connector-tidb-cdc-2.3-SNAPSHOT.jar)


2 changes: 1 addition & 1 deletion docs/content/快速上手/mongodb-tutorial-zh.md
@@ -109,7 +109,7 @@ db.customers.insertMany([

```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mongodb-cdc/2.3-SNAPSHOT/flink-sql-connector-mongodb-cdc-2.3-SNAPSHOT.jar)

4. 然后启动 Flink 集群,再启动 SQL CLI.
10 changes: 5 additions & 5 deletions docs/content/快速上手/mysql-postgres-tutorial-zh.md
@@ -69,11 +69,11 @@ docker-compose up -d
该命令将以 detached 模式自动启动 Docker Compose 配置中定义的所有容器。你可以通过 docker ps 来观察上述的容器是否正常启动了,也可以通过访问 [http://localhost:5601/](http://localhost:5601/) 来查看 Kibana 是否运行正常。

### 下载 Flink 和所需要的依赖包
1. 下载 [Flink 1.13.2](https://archive.apache.org/dist/flink/flink-1.13.2/flink-1.13.2-bin-scala_2.11.tgz) 并将其解压至目录 `flink-1.13.2`
2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.13.2/lib/` 下:
1. 下载 [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.15.2`
2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.15.2/lib/` 下:

**下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译**
- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- [flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-postgres-cdc/2.3-SNAPSHOT/flink-sql-connector-postgres-cdc-2.3-SNAPSHOT.jar)

@@ -147,7 +147,7 @@ docker-compose up -d

1. 使用下面的命令跳转至 Flink 目录下
```
cd flink-1.13.2
cd flink-1.15.2
```

2. 使用下面的命令启动 Flink 集群
@@ -308,7 +308,7 @@ Flink SQL> INSERT INTO enriched_orders
```shell
docker-compose down
```
在 Flink 所在目录 `flink-1.13.2` 下执行如下命令停止 Flink 集群:
在 Flink 所在目录 `flink-1.15.2` 下执行如下命令停止 Flink 集群:
```shell
./bin/stop-cluster.sh
```
2 changes: 1 addition & 1 deletion docs/content/快速上手/oceanbase-tutorial-zh.md
@@ -110,7 +110,7 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),

```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oceanbase-cdc/2.3-SNAPSHOT/flink-sql-connector-oceanbase-cdc-2.3-SNAPSHOT.jar)

### 在 Flink SQL CLI 中使用 Flink DDL 创建表
2 changes: 1 addition & 1 deletion docs/content/快速上手/oracle-tutorial-zh.md
@@ -54,7 +54,7 @@ docker-compose down

*下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译*

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-oracle-cdc/2.3-SNAPSHOT/flink-sql-connector-oracle-cdc-2.3-SNAPSHOT.jar)

**3. 然后启动 Flink 集群,再启动 SQL CLI:**
6 changes: 3 additions & 3 deletions docs/content/快速上手/polardbx-tutorial-zh.md
@@ -105,12 +105,12 @@ VALUES (default, '2020-07-30 10:08:22', 'Jark', 50.50, 102, false),
```

### 下载 Flink 和所需要的依赖包
1. 下载 [Flink 1.13.2](https://archive.apache.org/dist/flink/flink-1.13.2/flink-1.13.2-bin-scala_2.11.tgz) 并将其解压至目录 `flink-1.13.2`
2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.13.2/lib/` 下
1. 下载 [Flink 1.15.2](https://archive.apache.org/dist/flink/flink-1.15.2/flink-1.15.2-bin-scala_2.12.tgz) 并将其解压至目录 `flink-1.15.2`
2. 下载下面列出的依赖包,并将它们放到目录 `flink-1.15.2/lib/` 下

```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```
- 用于订阅PolarDB-X Binlog: [flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-mysql-cdc/2.3-SNAPSHOT/flink-sql-connector-mysql-cdc-2.3-SNAPSHOT.jar)
- 用于写入Elasticsearch: [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- 用于写入Elasticsearch: [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
3. 启动flink服务:
```shell
./bin/start-cluster.sh
2 changes: 1 addition & 1 deletion docs/content/快速上手/sqlserver-tutorial-zh.md
@@ -63,7 +63,7 @@ docker-compose down

```下载链接只对已发布的版本有效, SNAPSHOT 版本需要本地编译```

- [flink-sql-connector-elasticsearch7_2.11-1.13.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7_2.11/1.13.2/flink-sql-connector-elasticsearch7_2.11-1.13.2.jar)
- [flink-sql-connector-elasticsearch7-1.15.2.jar](https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-elasticsearch7/1.15.2/flink-sql-connector-elasticsearch7-1.15.2.jar)
- [flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar](https://repo1.maven.org/maven2/com/ververica/flink-sql-connector-sqlserver-cdc/2.3-SNAPSHOT/flink-sql-connector-sqlserver-cdc-2.3-SNAPSHOT.jar)

