update outdated repo links
Signed-off-by: Ran <[email protected]>
ran-huang committed Feb 10, 2023
1 parent c842282 commit 5b880ae
Showing 3 changed files with 4 additions and 4 deletions.
4 changes: 2 additions & 2 deletions dumpling-overview.md
@@ -6,7 +6,7 @@ aliases: ['/docs/dev/mydumper-overview/','/docs/dev/reference/tools/mydumper/','

# Use Dumpling to Export Data

-This document introduces the data export tool - [Dumpling](https://github.com/pingcap/dumpling). Dumpling exports data stored in TiDB/MySQL as SQL or CSV data files and can be used to make a logical full backup or export. Dumpling also supports exporting data to Amazon S3.
+This document introduces the data export tool - [Dumpling](https://github.com/pingcap/tidb/tree/master/dumpling). Dumpling exports data stored in TiDB/MySQL as SQL or CSV data files and can be used to make a logical full backup or export. Dumpling also supports exporting data to Amazon S3.

<CustomContent platform="tidb">

@@ -443,4 +443,4 @@ SET GLOBAL tidb_gc_life_time = '10m';
| `--status-addr` | Dumpling's service address, including the address for Prometheus to pull metrics and pprof debugging | ":8281" |
| `--tidb-mem-quota-query` | The memory limit of exporting SQL statements by a single line of Dumpling command, and the unit is byte. For v4.0.10 or later versions, if you do not set this parameter, TiDB uses the value of the `mem-quota-query` configuration item as the memory limit value by default. For versions earlier than v4.0.10, the parameter value defaults to 32 GB. | 34359738368 |
| `--params` | Specifies the session variable for the connection of the database to be exported. The required format is `"character_set_client=latin1,character_set_connection=latin1"` |
-| `-c` or `--compress` | Compresses the CSV and SQL data and table structure files exported by Dumpling. It supports the following compression algorithms: `gzip`, `snappy`, and `zstd`. | "" |
+| `-c` or `--compress` | Compresses the CSV and SQL data and table structure files exported by Dumpling. It supports the following compression algorithms: `gzip`, `snappy`, and `zstd`. | "" |
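
The options in this table are passed on a single Dumpling command line together with its connection and output flags. The following is a minimal sketch only, assuming the standard `-u`/`-P`/`-h`/`--filetype`/`-o` flags documented elsewhere in dumpling-overview.md; the host, port, and output path are placeholders.

```shell
# Export all data as gzip-compressed SQL files to /tmp/dumpling-export,
# set two session variables on the export connection, and expose
# metrics/pprof on :8281 (placeholder connection values).
dumpling \
  -u root \
  -P 4000 \
  -h 127.0.0.1 \
  --filetype sql \
  -o /tmp/dumpling-export \
  --compress gzip \
  --params "character_set_client=latin1,character_set_connection=latin1" \
  --status-addr ":8281"
```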
2 changes: 1 addition & 1 deletion tidb-lightning/tidb-lightning-overview.md
@@ -6,7 +6,7 @@ aliases: ['/docs/dev/tidb-lightning/tidb-lightning-overview/','/docs/dev/referen

# TiDB Lightning Overview

-[TiDB Lightning](https://github.com/pingcap/tidb-lightning) is a tool used for importing data at TB scale to TiDB clusters. It is often used for initial data import to TiDB clusters.
+[TiDB Lightning](https://github.com/pingcap/tidb/tree/master/br/pkg/lightning) is a tool used for importing data at TB scale to TiDB clusters. It is often used for initial data import to TiDB clusters.

TiDB Lightning supports the following file formats:

2 changes: 1 addition & 1 deletion tidb-troubleshooting-map.md
@@ -518,7 +518,7 @@ Check the specific cause for busy by viewing the monitor **Grafana** -> **TiKV**

### 6.3 TiDB Lightning

-- 6.3.1 TiDB Lightning is a tool for fast full import of large amounts of data into a TiDB cluster. See [TiDB Lightning on GitHub](https://github.com/pingcap/tidb-lightning).
+- 6.3.1 TiDB Lightning is a tool for fast full import of large amounts of data into a TiDB cluster. See [TiDB Lightning on GitHub](https://github.com/pingcap/tidb/tree/master/br/pkg/lightning).

- 6.3.2 Import speed is too slow.

