Commit 965cd1d

LinuxGit authored and lilin90 committed

op-guide: update tikv rolling udpate policy (#592)

1 parent 9c458c6 · commit 965cd1d

File tree

2 files changed: +4 -6 lines changed


op-guide/ansible-deployment-rolling-update.md

Lines changed: 1 addition & 1 deletion

@@ -65,7 +65,7 @@ wget http://download.pingcap.org/tidb-v2.0.3-linux-amd64-unportable.tar.gz
 $ ansible-playbook rolling_update.yml --tags=tikv
 ```
 
-When you apply a rolling update to the TiKV instance, Ansible migrates the Region leader to other nodes. The concrete logic is as follows: Call the PD API to add the `evict leader scheduler` -> Inspect the `leader_count` of this TiKV instance every 10 seconds -> Wait the `leader_count` to reduce to below 10, or until the times of inspecting the `leader_count` is more than 12 -> Start closing the rolling update of TiKV after two minutes of timeout -> Delete the `evict leader scheduler` after successful start. The operations are executed serially.
+When you apply a rolling update to the TiKV instance, Ansible migrates the Region leader to other nodes. The concrete logic is as follows: Call the PD API to add the `evict leader scheduler` -> Inspect the `leader_count` of this TiKV instance every 10 seconds -> Wait the `leader_count` to reduce to below 1, or until the times of inspecting the `leader_count` is more than 18 -> Start closing the rolling update of TiKV after three minutes of timeout -> Delete the `evict leader scheduler` after successful start. The operations are executed serially.
 
 
 If the rolling update fails in the process, log in to `pd-ctl` to execute `scheduler show` and check whether `evict-leader-scheduler` exists. If it does exist, delete it manually. Replace `{PD_IP}` and `{STORE_ID}` with your PD IP and the `store_id` of the TiKV instance:
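The evict-and-wait sequence in the changed line above maps onto PD's HTTP API (`/pd/api/v1/schedulers` to add the scheduler, `/pd/api/v1/store/{id}` to read `status.leader_count`). A minimal Python sketch of that loop with the new thresholds, assuming those documented endpoints; this is an illustration of the policy, not Ansible's actual task code, and `PD_URL` keeps the `{PD_IP}` placeholder from the text:

```python
import json
import time
import urllib.request

# Placeholder PD address; substitute your PD IP and client port.
PD_URL = "http://{PD_IP}:2379"


def evict_done(leader_count, checks, threshold=1, max_checks=18):
    # Stop condition from the updated text: leader_count has dropped
    # below 1, or leader_count has been inspected more than 18 times
    # (18 checks x 10 s = the three-minute timeout).
    return leader_count < threshold or checks > max_checks


def wait_for_eviction(store_id):
    # Add the evict-leader scheduler through the PD API
    # (illustrative sketch, not Ansible's implementation).
    body = json.dumps({"name": "evict-leader-scheduler",
                       "store_id": store_id}).encode()
    req = urllib.request.Request(PD_URL + "/pd/api/v1/schedulers", data=body,
                                 headers={"Content-Type": "application/json"})
    urllib.request.urlopen(req)

    checks = 0
    while True:
        checks += 1
        with urllib.request.urlopen(f"{PD_URL}/pd/api/v1/store/{store_id}") as resp:
            leader_count = json.load(resp)["status"]["leader_count"]
        if evict_done(leader_count, checks):
            return
        time.sleep(10)  # inspect leader_count every 10 seconds
```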

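The manual recovery described above can be issued with `pd-ctl` in single-command (`-d`) mode. A sketch, keeping the `{PD_IP}` and `{STORE_ID}` placeholders from the text; the scheduler name `evict-leader-scheduler-{STORE_ID}` assumes PD's per-store naming of the scheduler:

```shell
# List active schedulers and look for evict-leader-scheduler-{STORE_ID}:
./pd-ctl -u "http://{PD_IP}:2379" -d scheduler show

# If it is still present, remove it manually:
./pd-ctl -u "http://{PD_IP}:2379" -d scheduler remove evict-leader-scheduler-{STORE_ID}
```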
tispark/tispark-quick-start-guide.md

Lines changed: 3 additions & 5 deletions

@@ -6,17 +6,17 @@ category: User Guide
 
 # TiSpark Quick Start Guide
 
-To make it easy to [try TiSpark](tispark-user-guide.md), the TiDB cluster integrates Spark, TiSpark jar package and TiSpark sample data by default, in both the Pre-GA and master versions installed using TiDB-Ansible.
+To make it easy to [try TiSpark](tispark-user-guide.md), the TiDB cluster installed using TiDB-Ansible integrates Spark, TiSpark jar package and TiSpark sample data by default.
 
 ## Deployment information
 
 - Spark is deployed by default in the `spark` folder in the TiDB instance deployment directory.
 - The TiSpark jar package is deployed by default in the `jars` folder in the Spark deployment directory.
 
 ```
-spark/jars/tispark-0.1.0-beta-SNAPSHOT-jar-with-dependencies.jar
+spark/jars/tispark-SNAPSHOT-jar-with-dependencies.jar
 ```
-
+
 - TiSpark sample data and import scripts are deployed by default in the TiDB-Ansible directory.
 
 ```
@@ -108,8 +108,6 @@ MySQL [TPCH_001]> show tables;
 
 ## Use example
 
-Assume that the IP of your PD node is `192.168.0.2`, and the port is `2379`.
-
 First start the spark-shell in the spark deployment directory:
 
 ```
