Return Mega/Micro Service Version number at runtime #912

Open · wants to merge 1 commit into `main`
16 changes: 16 additions & 0 deletions README.md
@@ -139,6 +139,22 @@ from comps import ChatQnAGateway
self.gateway = ChatQnAGateway(megaservice=self.megaservice, host="0.0.0.0", port=self.port)
```

## Check Mega/Micro Service health status and version number

Use the command below to check the Mega/Micro Service health status.

```bash
curl http://${your_ip}:${service_port}/v1/health_check \
-X GET \
-H 'Content-Type: application/json'
```

If the Mega/Micro Service is working correctly, the output should look like the example below.

```json
{"Service Title":"ChatQnAGateway/MicroService","Version":"1.0","Service Description":"OPEA Microservice Infrastructure"}
```
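In a script, the response body can be parsed directly instead of being read by eye. A minimal Python sketch (the `parse_health` helper is hypothetical; the payload is the example response above):

```python
import json

def parse_health(payload: str) -> tuple[str, str]:
    """Extract the service title and version from a /v1/health_check body."""
    data = json.loads(payload)
    return data["Service Title"], data["Version"]

# Example body, as returned by the health-check endpoint above.
body = '{"Service Title":"ChatQnAGateway/MicroService","Version":"1.0","Service Description":"OPEA Microservice Infrastructure"}'
title, version = parse_health(body)
print(title, version)  # → ChatQnAGateway/MicroService 1.0
```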

## Contributing to OPEA

Welcome to the OPEA open-source community! We are thrilled to have you here and excited about the potential contributions you can bring to the OPEA platform. Whether you are fixing bugs, adding new GenAI components, improving documentation, or sharing your unique use cases, your contributions are invaluable.
4 changes: 3 additions & 1 deletion comps/cores/mega/http_service.py
@@ -69,7 +69,9 @@
 )
 async def _health_check():
     """Get the health status of this GenAI microservice."""
-    return {"Service Title": self.title, "Service Description": self.description}
+    from comps.version import __version__
+
+    return {"Service Title": self.title, "Version": __version__, "Service Description": self.description}

@app.get(
path="/v1/statistics",
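The pattern in this hunk, a health handler that reports a module-level `__version__` next to static metadata, can be sketched without the FastAPI plumbing. This is an illustrative stand-in, not the actual `HTTPService` class; the literal version string substitutes for `comps.version.__version__`:

```python
__version__ = "1.0"  # stands in for comps.version.__version__

class HealthReporter:
    """Hypothetical minimal analogue of the service's health endpoint."""

    def __init__(self, title: str, description: str):
        self.title = title
        self.description = description

    def health_check(self) -> dict:
        # Mirrors the handler in the diff: the version is looked up when the
        # endpoint is called, so the response reflects the installed package.
        return {
            "Service Title": self.title,
            "Version": __version__,
            "Service Description": self.description,
        }

svc = HealthReporter("ChatQnAGateway/MicroService", "OPEA Microservice Infrastructure")
print(svc.health_check()["Version"])  # → 1.0
```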
8 changes: 7 additions & 1 deletion comps/llms/text-generation/README.md
@@ -257,14 +257,20 @@ docker compose -f docker_compose_llm.yaml up -d

## 🚀3. Consume LLM Service

-### 3.1 Check Service Status
+### 3.1 Check Service Status and Its Version

```bash
curl http://${your_ip}:9000/v1/health_check \
-X GET \
-H 'Content-Type: application/json'
```

If the microservice is working correctly, the output should look like the example below.

```json
{"Service Title":"ChatQnAGateway/MicroService","Version":"1.0","Service Description":"OPEA Microservice Infrastructure"}
```
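This check is also useful as a readiness gate before sending real requests. A hedged Python sketch with an injectable fetcher, so the polling logic is separate from the transport (in practice `fetch` would wrap an HTTP GET of the endpoint above; the stub below stands in for it):

```python
import json
import time

def wait_for_service(fetch, retries: int = 5, delay: float = 0.5) -> str:
    """Poll a health-check fetcher until it returns a body with a Version."""
    for _ in range(retries):
        try:
            return json.loads(fetch())["Version"]
        except Exception:
            time.sleep(delay)
    raise TimeoutError("service did not become healthy")

# Stub fetcher standing in for an HTTP GET of /v1/health_check.
stub = lambda: '{"Service Title":"ChatQnAGateway/MicroService","Version":"1.0","Service Description":"OPEA Microservice Infrastructure"}'
print(wait_for_service(stub))  # → 1.0
```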

### 3.2 Verify the LLM Service

#### 3.2.1 Verify the TGI Service
Expand Down