importer updates #2198

Merged
merged 1 commit on Jul 28, 2023

50 changes: 33 additions & 17 deletions docs-2.0/nebula-importer/use-importer.md
@@ -40,16 +40,20 @@ Prepare the CSV file to be imported and configure the YAML file to use the tool

!!! note

For details about the YAML configuration file, see the configuration file description at the end of this topic.
For details about the YAML configuration file, see [Configuration File Description](#configuration_file_description) at the end of this topic.

### Download binary package and run

1. Download the executable [binary package](https://github.com/vesoft-inc/nebula-importer/releases/tag/{{importer.tag}}).

2. Start the service.
!!! note

If you installed NebulaGraph Importer from the RPM/DEB package, the installation path is `/usr/bin/nebula-importer`.

2. Under the directory where the binary file is located, run the following command to start importing data.

```bash
$ ./<binary_package_name> --config <yaml_config_file_path>
./<binary_file_name> --config <yaml_config_file_path>
```

### Source code compile and run
@@ -59,7 +63,7 @@ Compiling the source code requires deploying a Golang environment. For details,
1. Clone repository.

```bash
$ git clone -b {{importer.branch}} https://github.com/vesoft-inc/nebula-importer.git
git clone -b {{importer.branch}} https://github.com/vesoft-inc/nebula-importer.git
```

!!! note
@@ -69,45 +73,57 @@ Compiling the source code requires deploying a Golang environment. For details,
2. Access the directory `nebula-importer`.

```bash
$ cd nebula-importer
cd nebula-importer
```

3. Compile the source code.

```bash
$ make build
make build
```

4. Start the service.

```bash
$ ./bin/nebula-importer --config <yaml_config_file_path>
./bin/nebula-importer --config <yaml_config_file_path>
```

### Run in Docker mode

Instead of deploying the Golang environment locally, you can use Docker to pull the [image](https://hub.docker.com/r/vesoft/nebula-importer) of NebulaGraph Importer and mount the local configuration file and CSV data file into the container. The command is as follows:

```bash
$ docker pull vesoft/nebula-importer
$ docker run --rm -ti \
docker pull vesoft/nebula-importer:<version>
docker run --rm -ti \
--network=host \
-v <config_file>:<config_file> \
-v <data_dir>:<data_dir> \
vesoft/nebula-importer:<version>
vesoft/nebula-importer:<version> \
--config <config_file>
```

- `<config_file>`: The absolute path to the YAML configuration file.
- `<csv_data_dir>`: The absolute path to the CSV data file. If the file is not local, ignore this parameter.
- `<data_dir>`: The absolute path to the CSV data file. If the file is not local, ignore this parameter.
- `<version>`: For NebulaGraph 3.x, fill in `v3`.

!!! note
A relative path is recommended. If you use a local absolute path, check that the path maps to the path inside the Docker container.

Example:

```bash
docker pull vesoft/nebula-importer:v4
docker run --rm -ti \
--network=host \
-v /home/user/config.yaml:/home/user/config.yaml \
-v /home/user/data:/home/user/data \
vesoft/nebula-importer:v4 \
--config /home/user/config.yaml
```

## Configuration File Description

Various example configuration files are available in the NebulaGraph Importer [GitHub repository](https://github.com/vesoft-inc/nebula-ng-tools/tree/{{importer.branch}}/importer/examples). The configuration files describe the files to be imported, the {{nebula.name}} server information, and so on. The following sections describe the fields in the configuration file by category.
Various example configuration files are available in the NebulaGraph Importer [GitHub repository](https://github.com/vesoft-inc/nebula-importer/tree/{{importer.branch}}/examples). The configuration files describe the files to be imported, the {{nebula.name}} server information, and so on. The following sections describe the fields in the configuration file by category.
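At a high level, the configuration file groups its settings into a few top-level sections that the rest of this topic walks through. The skeleton below is only a sketch for orientation; the key names and values are drawn from the field categories described here, so treat them as assumptions and check them against the example files linked above.

```yaml
# Rough top-level layout of an importer configuration file (illustrative values).
client:                    # connection to the NebulaGraph service
  version: v3
  address: "127.0.0.1:9669"
  user: root
  password: nebula
manager:                   # import control and pre/post-import statements
  spaceName: basic_string_examples
  batch: 128
log:                       # logging behavior
  level: INFO
sources:                   # data files to import and their schema mappings
  - path: ./person.csv
```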

!!! note

@@ -162,9 +178,9 @@ manager:
- UPDATE CONFIGS storage:rocksdb_column_family_options = { disable_auto_compactions = true };
- statements:
- |
DROP SPACE IF EXISTS basic_int_examples;
CREATE SPACE IF NOT EXISTS basic_int_examples(partition_num=5, replica_factor=1, vid_type=int);
USE basic_int_examples;
DROP SPACE IF EXISTS basic_string_examples;
CREATE SPACE IF NOT EXISTS basic_string_examples(partition_num=5, replica_factor=1, vid_type=int);
USE basic_string_examples;
wait: 10s
after:
- statements:
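
The hunk above is cut off by the diff view. For orientation only, the pre- and post-import statements edited here appear to live under the manager section's hooks; the sketch below assumes that nesting and uses a placeholder where the diff does not show the real content.

```yaml
# Sketch of the hooks portion of the manager section; the nesting is assumed.
manager:
  hooks:
    before:
      - statements:
          - UPDATE CONFIGS storage:rocksdb_column_family_options = { disable_auto_compactions = true };
      - statements:
          - |
            DROP SPACE IF EXISTS basic_string_examples;
            CREATE SPACE IF NOT EXISTS basic_string_examples(partition_num=5, replica_factor=1, vid_type=int);
            USE basic_string_examples;
        wait: 10s
    after:
      - statements:
          - SHOW SPACES;    # placeholder; the real post-import statements are outside this hunk
```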
@@ -204,11 +220,11 @@ log:
|:---|:---|:---|:---|
|`log.level`|`INFO`|No| Specifies the log level. Optional values are `DEBUG`, `INFO`, `WARN`, `ERROR`, `PANIC`, `FATAL`.|
|`log.console`|`true`|No| Whether to print the logs to console synchronously when storing logs.|
|`log.files`|-|No|The log file path.|
|`log.files`|-|No|The log file path. The log directory must exist.|
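
Read together, the fields in this table describe a log section roughly like the one below; the layout is an assumption based on the field names, and the values are illustrative.

```yaml
# Illustrative log section assembled from the fields in the table above.
log:
  level: INFO              # DEBUG, INFO, WARN, ERROR, PANIC, or FATAL
  console: true            # also print logs to the console
  files:                   # log file paths; the directories must already exist
    - logs/nebula-importer.log
```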

### Source configuration

The Source configuration requires configuration of data source information, data processing methods, and Schema mapping.
The Source configuration requires the configuration of data source information, data processing methods, and Schema mapping.
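
Conceptually, each entry under `sources` names one data file, sets how it is read and batched, and maps its columns onto a tag or an edge type. The fragment below is only a hypothetical sketch of that shape; the key names are assumptions for illustration and are not copied from the documented example, which is truncated in this diff.

```yaml
# Hypothetical source entry; key names are assumptions for illustration.
sources:
  - path: ./person.csv     # data source: which file to read
    csv:
      delimiter: ","       # data processing: how the file is parsed
      withHeader: false
    batch: 256
    tags:                  # schema mapping: columns onto a tag and its properties
      - name: Person
        id:
          type: "STRING"
          index: 0
        props:
          - name: "name"
            type: "STRING"
            index: 1
```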

The example configuration is as follows:

@@ -35,7 +35,7 @@ In the YAML configuration file of the cluster instance, you can configure log rotation
spec:
graphd:
config:
# Whether to include a timestamp in the log file name. "true" means yes, "false" means no.
# Whether to include a timestamp in the log file name. "true" means yes, "false" means no. It is "true" by default.
"timestamp_in_logfile_name": "false"
metad:
config: