Adding DBQnA example in GenAIExamples #894

Merged · 47 commits · Oct 14, 2024
Changes from 42 commits

Commits (47)
f783d2f
Adding text to Sql example
Supriya-Krishnamurthi Sep 27, 2024
35749e2
Added docker files
Supriya-Krishnamurthi Sep 29, 2024
93be424
Added Readme file
Supriya-Krishnamurthi Sep 29, 2024
9182059
Added Docker compose up in Readme
Supriya-Krishnamurthi Sep 29, 2024
1ba11bd
Added chinook.sql under docker compose folder
Supriya-Krishnamurthi Sep 30, 2024
e4bc1b9
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 30, 2024
4efb66b
Merge branch 'main' into genAiDbQna
Supriya-Krishnamurthi Sep 30, 2024
98a3f0a
added git clone genaIcomps in test to E2E run
Supriya-Krishnamurthi Sep 30, 2024
e0ab0a9
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Sep 30, 2024
eac5686
Renamed example to dbqna and remove genAICOmps before git clone in test
Supriya-Krishnamurthi Oct 1, 2024
073eb23
Added *.sql in codespell ignore file
Supriya-Krishnamurthi Oct 1, 2024
9f6984b
Merge branch 'main' into genAiDbQna
lvliang-intel Oct 1, 2024
a4b8afc
fixed tests
Oct 1, 2024
96d36fd
Fixed Tests
Oct 1, 2024
1c3a959
kept dynamic hostname in test_compose_on_xeon
Supriya-Krishnamurthi Oct 2, 2024
f1e031e
Updated Model for tests
Oct 3, 2024
4709c8e
Updated Model for tests
Oct 3, 2024
1fb4628
Updated new Model for tests
Oct 3, 2024
9a70904
Updated new Model for tests
Oct 3, 2024
ec0a887
Fixed Tests
Oct 3, 2024
0801554
Fixed Tests to install nodejs
Oct 3, 2024
053832c
Get dynamic host in UI test
Supriya-Krishnamurthi Oct 3, 2024
59f7cde
updating timeout value in test
Supriya-Krishnamurthi Oct 3, 2024
217dbd2
updated endpointUrl in UI test
Supriya-Krishnamurthi Oct 3, 2024
bf89101
updated endpointUrl for UI test
Supriya-Krishnamurthi Oct 3, 2024
32fd20d
Dynamically set hostname for VITE_TEXT_TO_SQL_URL
Supriya-Krishnamurthi Oct 3, 2024
1f9edb4
Updated Query in UI tests
Oct 3, 2024
223a978
Correct ed hyperlinks
Supriya-Krishnamurthi Oct 3, 2024
c891fee
Changed texttosql to dbqna in yaml files, test, readme
Supriya-Krishnamurthi Oct 4, 2024
40e6682
Changed main example from TextToSql to DBQnA
Supriya-Krishnamurthi Oct 4, 2024
b9653eb
Changed main example from TextToSql to DBQnA
Supriya-Krishnamurthi Oct 4, 2024
39a0e84
To Trigger run test in git
Supriya-Krishnamurthi Oct 4, 2024
e5a8f70
used - instead of _ in yaml files, corrected README
Supriya-Krishnamurthi Oct 7, 2024
64cff44
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2024
a48b742
Added Root level README.md file
Supriya-Krishnamurthi Oct 7, 2024
e1eb2de
[pre-commit.ci] auto fixes from pre-commit.com hooks
pre-commit-ci[bot] Oct 7, 2024
10685e7
Corrected Title in Root README
Supriya-Krishnamurthi Oct 9, 2024
1d7cd22
Removed react public folder
Supriya-Krishnamurthi Oct 9, 2024
a79206b
Merge branch 'main' into genAiDbQna
hteeyeoh Oct 9, 2024
58931bb
Removed console.logs
Supriya-Krishnamurthi Oct 9, 2024
9816caa
Merge branch 'main' into genAiDbQna
lvliang-intel Oct 9, 2024
6300a21
Merge branch 'main' into genAiDbQna
hteeyeoh Oct 10, 2024
1f026af
Removed AWS section from README
Supriya-Krishnamurthi Oct 10, 2024
9b43f0b
Merge branch 'main' into genAiDbQna
hteeyeoh Oct 10, 2024
3a65b58
Merge branch 'main' into genAiDbQna
hteeyeoh Oct 10, 2024
9f08a76
Merge branch 'main' into genAiDbQna
hteeyeoh Oct 11, 2024
d73fc2f
Merge branch 'main' into genAiDbQna
yogeshmpandey Oct 14, 2024
17 changes: 17 additions & 0 deletions DBQnA/README.md
@@ -0,0 +1,17 @@
# DBQnA Application

Experience a revolutionary way to interact with your database using our DBQnA app! Harnessing the power of OPEA microservices, our application seamlessly translates natural language queries into SQL and delivers real-time database results, all designed to optimize workflows and enhance productivity for modern enterprises.

---

## 🛠️ Key Features

### 💬 SQL Query Generation

The key feature of the DBQnA app is converting a user's natural language question into a SQL query and automatically executing the generated query against the database to return the relevant results. In short, you can ask questions of your database and receive both the corresponding SQL query and its real-time execution output, all without needing any SQL knowledge.
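As a minimal illustration (assuming the Docker Compose deployment from the Xeon guide below, with the text-to-SQL service exposed on port 9090, the sample Chinook database loaded, and `your_ip` set to the host's IP address), a natural-language question can be posed directly to the service:

```bash
# Sketch only; endpoint, port, and credentials follow the defaults in the Xeon deployment guide.
curl http://${your_ip}:9090/v1/texttosql \
  -X POST \
  -H 'Content-Type: application/json' \
  -d '{"input_text": "Find the total number of Albums.", "conn_str": {"user": "'${POSTGRES_USER}'", "password": "'${POSTGRES_PASSWORD}'", "host": "'${your_ip}'", "port": "5442", "database": "'${POSTGRES_DB}'"}}'
```

The response contains both the generated SQL query and its execution output.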

---

## 📚 Setup Guide

- **[Xeon Guide](./docker_compose/intel/cpu/xeon/README.md)**: Instructions to build Docker images from source and run the application via Docker Compose.
Binary file added DBQnA/assets/img/dbQnA_ui_db_credentials.png
Binary file added DBQnA/assets/img/dbQnA_ui_enter_question.png
Binary file added DBQnA/assets/img/dbQnA_ui_init.png
180 changes: 180 additions & 0 deletions DBQnA/docker_compose/intel/cpu/xeon/README.md
@@ -0,0 +1,180 @@
# Deploy on Intel Xeon Processor

This document outlines the deployment process for the DBQnA application, which generates a SQL query and its output from a natural-language question, using the [GenAIComps](https://github.com/opea-project/GenAIComps.git) microservice pipeline on an Intel Xeon server. The steps include Docker image creation, container deployment via Docker Compose, and service execution to integrate the microservices. We will publish the Docker images to Docker Hub soon, which will simplify the deployment process for this service.

## 🚀 Apply Intel Xeon Server on AWS

To provision an Intel Xeon server on AWS, start by creating an AWS account if you don't have one already. Then head to the [EC2 Console](https://console.aws.amazon.com/ec2/v2/home) to begin the process. Within the EC2 service, select the Amazon EC2 M7i or M7i-flex instance type to leverage 4th Generation Intel Xeon Scalable processors. These instances are optimized for high-performance computing and demanding workloads.

For detailed information about these instance types, you can refer to this [link](https://aws.amazon.com/ec2/instance-types/m7i/). Once you've chosen the appropriate instance type, proceed with configuring your instance settings, including network configurations, security groups, and storage options.

After launching your instance, you can connect to it using SSH (for Linux instances) or Remote Desktop Protocol (RDP) (for Windows instances). From there, you'll have full access to your Xeon server, allowing you to install, configure, and manage your applications as needed.
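For example, connecting to a Linux instance over SSH might look like the following sketch (the key file path and public DNS name are placeholders; substitute your own instance details):

```bash
# Placeholder values: replace the key path and hostname with those of your EC2 instance.
chmod 400 ~/keys/xeon-demo-key.pem
ssh -i ~/keys/xeon-demo-key.pem ubuntu@ec2-xx-xx-xx-xx.compute-1.amazonaws.com
```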

## 🚀 Build Docker Images

First of all, you need to build the Docker images locally. This step can be skipped once the Docker images are published to Docker Hub.

### 1.1 Build Text-to-SQL Service Image

```bash
git clone https://github.com/opea-project/GenAIComps.git
cd GenAIComps
docker build --no-cache -t opea/texttosql:latest -f comps/texttosql/langchain/Dockerfile .
```

### 1.2 Build React UI Docker Image

Build the React-based frontend Docker image with the command below:

```bash
cd GenAIExamples/DBQnA/ui
docker build --no-cache -t opea/dbqna-react-ui:latest -f docker/Dockerfile.react .
```

Then run `docker images`; you should see the following Docker images:

1. `opea/texttosql:latest`
2. `opea/dbqna-react-ui:latest`
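A quick way to confirm both images exist (a sketch, assuming the tags listed above):

```bash
docker images | grep -E 'opea/(texttosql|dbqna-react-ui)'
```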

## 🚀 Start Microservices

### Required Models

The default model is `mistralai/Mistral-7B-Instruct-v0.3`. Change `LLM_MODEL_ID` in the environment variable settings below if you want to use another model.

If you use gated models, you also need to provide a [Hugging Face token](https://huggingface.co/docs/hub/security-tokens) via the `HUGGINGFACEHUB_API_TOKEN` environment variable.
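For example, to switch to a different (possibly gated) model before starting the services, a sketch (the model name below is only an illustration; substitute one you have access to):

```bash
# Illustrative values only.
export HUGGINGFACEHUB_API_TOKEN="<your-huggingface-token>"
export LLM_MODEL_ID="meta-llama/Meta-Llama-3-8B-Instruct"
```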

### 2.1 Setup Environment Variables

Since `compose.yaml` consumes several environment variables, you need to set them up in advance as shown below.

```bash
# your_ip should be your external IP address, do not use localhost.
export your_ip=$(hostname -I | awk '{print $1}')

# Example: no_proxy="localhost,127.0.0.1,192.168.1.1"
export no_proxy=${your_no_proxy},${your_ip}

# If you are in a proxy environment, also set the proxy-related environment variables:
export http_proxy=${your_http_proxy}
export https_proxy=${your_http_proxy}

# Set other required variables

export TGI_PORT=8008
export TGI_LLM_ENDPOINT=http://${your_ip}:${TGI_PORT}
export HF_TOKEN=${HUGGINGFACEHUB_API_TOKEN}
export LLM_MODEL_ID="mistralai/Mistral-7B-Instruct-v0.3"
export POSTGRES_USER=postgres
export POSTGRES_PASSWORD=testpwd
export POSTGRES_DB=chinook
export texttosql_port=9090
```

Note: `your_ip` must be your machine's external IP address; do not use localhost.

### 2.2 Start Microservice Docker Containers

There are two options for starting the microservices:

#### 2.2.1 Start the microservices using Docker Compose

```bash
cd GenAIExamples/DBQnA/docker_compose/intel/cpu/xeon
docker compose up -d
```
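Once the stack is up, a quick check that the containers are running (a sketch; exact container names depend on the compose file):

```bash
# Run from the same directory as compose.yaml.
docker compose ps
# Or list all running containers with their published ports.
docker ps --format 'table {{.Names}}\t{{.Status}}\t{{.Ports}}'
```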

#### 2.2.2 Alternatively, start each microservice as an individual Docker container

**NOTE:** Make sure all of these individual Docker services are stopped before starting them.

Below are the commands to start each Docker service individually.

- Start PostgresDB Service

We will use the [Chinook](https://github.com/lerocha/chinook-database) sample database by default to test the Text-to-SQL microservice. Chinook is a sample database well suited to demos and to testing ORM tools targeting single and multiple database servers.

```bash

# WORKPATH is assumed to point to a checkout that contains comps/texttosql/langchain/chinook.sql
# (for example, the GenAIComps repo cloned in section 1.1).
docker run --name test-texttosql-postgres --ipc=host -e POSTGRES_USER=${POSTGRES_USER} -e POSTGRES_HOST_AUTH_METHOD=trust -e POSTGRES_DB=${POSTGRES_DB} -e POSTGRES_PASSWORD=${POSTGRES_PASSWORD} -p 5442:5432 -d -v $WORKPATH/comps/texttosql/langchain/chinook.sql:/docker-entrypoint-initdb.d/chinook.sql postgres:latest
```
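To verify that the Chinook sample data was loaded, a sketch (assuming the init script creates the standard Chinook `Album` table):

```bash
# Count rows in the Album table inside the Postgres container.
docker exec test-texttosql-postgres psql -U ${POSTGRES_USER} -d ${POSTGRES_DB} -c 'SELECT COUNT(*) FROM "Album";'
```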

- Start TGI Service

```bash

docker run -d --name="test-texttosql-tgi-endpoint" --ipc=host -p $TGI_PORT:80 -v ./data:/data --shm-size 1g -e HUGGINGFACEHUB_API_TOKEN=${HUGGINGFACEHUB_API_TOKEN} -e HF_TOKEN=${HF_TOKEN} -e model=${model} ghcr.io/huggingface/text-generation-inference:2.1.0 --model-id $model
```
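The first start can take several minutes while the model downloads. A simple wait loop, assuming TGI's `/health` endpoint (a sketch):

```bash
# Poll until TGI reports healthy; this may take a while on the first run.
until curl -sf http://${your_ip}:${TGI_PORT}/health > /dev/null; do
  echo "Waiting for TGI to become ready..."
  sleep 10
done
echo "TGI is ready."
```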

- Start Text-to-SQL Service

```bash
unset http_proxy

docker run -d --name="test-texttosql-server" --ipc=host -p ${texttosql_port}:8090 --ipc=host -e http_proxy=$http_proxy -e https_proxy=$https_proxy -e TGI_LLM_ENDPOINT=$TGI_LLM_ENDPOINT opea/texttosql:latest
```

- Start React UI service

```bash
docker run -d --name="test-dbqna-react-ui-server" --ipc=host -p 5174:80 -e no_proxy=$no_proxy -e https_proxy=$https_proxy -e http_proxy=$http_proxy opea/dbqna-react-ui:latest
```
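To confirm the UI container is serving content, a quick check (a sketch):

```bash
# Expect an HTTP 200 once the React UI is up.
curl -s -o /dev/null -w "%{http_code}\n" http://${your_ip}:5174
```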

## 🚀 Validate Microservices

### 3.1 TGI Service

```bash

curl http://${your_ip}:$TGI_PORT/generate \
-X POST \
-d '{"inputs":"What is Deep Learning?","parameters":{"max_new_tokens":17, "do_sample": true}}' \
-H 'Content-Type: application/json'
```

### 3.2 Postgres Microservice

Once the Text-to-SQL microservice is started, you can use the commands below.

#### 3.2.1 Test the Database connection

```bash
curl --location http://${your_ip}:9090/v1/postgres/health \
--header 'Content-Type: application/json' \
--data '{"user": "'${POSTGRES_USER}'","password": "'${POSTGRES_PASSWORD}'","host": "'${your_ip}'", "port": "5442", "database": "'${POSTGRES_DB}'"}'
```

#### 3.2.2 Invoke the microservice

```bash
curl http://${your_ip}:9090/v1/texttosql \
-X POST \
-d '{"input_text": "Find the total number of Albums.","conn_str": {"user": "'${POSTGRES_USER}'","password": "'${POSTGRES_PASSWORD}'","host": "'${your_ip}'", "port": "5442", "database": "'${POSTGRES_DB}'"}}' \
-H 'Content-Type: application/json'
```

### 3.3 Frontend validation

The frontend validation tests the API to check that it returns HTTP status 200 and that the response contains both the generated SQL query and its output.

The test is in `App.test.tsx` under the React root folder `ui/react/`.

Command to run the test:

```bash
npm run test
```
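If the dependencies are not installed yet, a fuller sequence might look like this sketch (assuming the UI sources live under `GenAIExamples/DBQnA/ui/react` and the services from section 2.2 are already running):

```bash
cd GenAIExamples/DBQnA/ui/react
npm install    # install UI and test dependencies
npm run test   # runs App.test.tsx against the running services
```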

## 🚀 Launch the React UI

Open this URL `http://{your_ip}:5174` in your browser to access the frontend.

![project-screenshot](../../../../assets/img/dbQnA_ui_init.png)

Test DB Connection
![project-screenshot](../../../../assets/img/dbQnA_ui_successful_db_connection.png)

Generate the SQL query and output for a given natural-language question
![project-screenshot](../../../../assets/img/dbQnA_ui_succesful_sql_output_generation.png)