Merge pull request #30 from Arquisoft/fix_llmkey_exposure

Fix llmkey exposure

pglez82 authored Feb 20, 2025
2 parents 25a54d1 + b9e7e1b commit 294aaa3

Showing 12 changed files with 57 additions and 32 deletions.
6 changes: 4 additions & 2 deletions .github/workflows/release.yml
@@ -54,14 +54,13 @@ jobs:
         uses: elgohr/Publish-Docker-Github-Action@v5
         env:
           API_URI: http://${{ secrets.DEPLOY_HOST }}:8000
-          LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
         with:
           name: arquisoft/wichat_0/webapp
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
           registry: ghcr.io
           workdir: webapp
-          buildargs: API_URI,LLM_API_KEY
+          buildargs: API_URI
   docker-push-authservice:
     name: Push auth service Docker Image to GitHub Packages
     runs-on: ubuntu-latest
@@ -112,12 +111,15 @@ jobs:
       - uses: actions/checkout@v4
       - name: Publish to Registry
         uses: elgohr/Publish-Docker-Github-Action@v5
+        env:
+          LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
         with:
           name: arquisoft/wichat_0/llmservice
           username: ${{ github.actor }}
           password: ${{ secrets.GITHUB_TOKEN }}
           registry: ghcr.io
           workdir: llmservice
+          buildargs: LLM_API_KEY
 
   docker-push-gatewayservice:
     name: Push gateway service Docker Image to GitHub Packages
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,3 @@
+{
+  "postman.settings.dotenv-detection-notification-visibility": false
+}
4 changes: 2 additions & 2 deletions README.md
@@ -30,9 +30,9 @@ First, clone the project:
 In order to communicate with the LLM integrated in this project, we need to set up an API key. Two integrations are available in this prototype: gemini and empathy. The API key provided must match the LLM provider used.
 
 We need to create two .env files.
-- The first one in the webapp directory (for executing the webapp using ```npm start```). The content of this .env file should be as follows:
+- The first one in the llmservice directory (for executing the llmservice using ```npm start```). The content of this .env file should be as follows:
 ```
-REACT_APP_LLM_API_KEY="YOUR-API-KEY"
+LLM_API_KEY="YOUR-API-KEY"
 ```
 - The second one located in the root of the project (alongside the docker-compose.yml). This .env file is used by docker-compose when launching the app with Docker. The content of this .env file should be as follows:
 ```
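As a quick illustration of what the llmservice .env enables, here is a minimal sketch of loading the key and failing fast when it is missing. It is consistent with the llm-service.js change further down, but is not part of the commit itself; the error message is illustrative.

```javascript
// Minimal sketch: load llmservice/.env and fail fast if the key is absent.
// Uses the dotenv package added by this commit; never log the key itself.
require('dotenv').config();

const apiKey = process.env.LLM_API_KEY;
if (!apiKey) {
  throw new Error('LLM_API_KEY is not set; create llmservice/.env as described above.');
}
console.log('LLM API key loaded (length ' + apiKey.length + ')');
```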
10 changes: 5 additions & 5 deletions docker-compose.yml
@@ -42,7 +42,10 @@ services:
     container_name: llmservice-wichat_0
     image: ghcr.io/arquisoft/wichat_0/llmservice:latest
     profiles: ["dev", "prod"]
-    build: ./llmservice
+    build:
+      context: ./llmservice
+      args:
+        LLM_API_KEY: ${LLM_API_KEY}
     ports:
       - "8003:8003"
     networks:
@@ -71,10 +74,7 @@ services:
     container_name: webapp-wichat_0
     image: ghcr.io/arquisoft/wichat_0/webapp:latest
     profiles: ["dev", "prod"]
-    build:
-      context: ./webapp
-      args:
-        LLM_API_KEY: ${LLM_API_KEY}
+    build: ./webapp
     depends_on:
       - gatewayservice
     ports:
3 changes: 2 additions & 1 deletion llmservice/.dockerignore
@@ -1,2 +1,3 @@
 node_modules
-coverage
\ No newline at end of file
+coverage
+.env
3 changes: 3 additions & 0 deletions llmservice/Dockerfile
@@ -10,6 +10,9 @@ COPY package*.json ./
 # Install app dependencies
 RUN npm install
 
+ARG LLM_API_KEY
+ENV LLM_API_KEY=$LLM_API_KEY
+
 # Copy the app source code to the working directory
 COPY . .
 
11 changes: 9 additions & 2 deletions llmservice/llm-service.js
@@ -6,6 +6,8 @@ const port = 8003;
 
 // Middleware to parse JSON in request body
 app.use(express.json());
+// Load environment variables
+require('dotenv').config();
 
 // Define configurations for different LLM APIs
 const llmConfigs = {
@@ -71,9 +73,14 @@ async function sendQuestionToLLM(question, apiKey, model = 'gemini') {
 app.post('/ask', async (req, res) => {
   try {
     // Check if required fields are present in the request body
-    validateRequiredFields(req, ['question', 'model', 'apiKey']);
+    validateRequiredFields(req, ['question', 'model']);
 
-    const { question, model, apiKey } = req.body;
+    const { question, model } = req.body;
+    // Load the API key from an environment variable
+    const apiKey = process.env.LLM_API_KEY;
+    if (!apiKey) {
+      return res.status(400).json({ error: 'API key is missing.' });
+    }
     const answer = await sendQuestionToLLM(question, apiKey, model);
     res.json({ answer });
 
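After this change, callers of the llmservice no longer send an API key. A minimal sketch of the updated request shape follows; the /ask route and the question/model fields come from the diff above, while the URL and error handling are illustrative only.

```javascript
const axios = require('axios');

// Hypothetical caller: only `question` and `model` are sent; the key stays server-side.
async function askLLM(question, model = 'gemini') {
  const response = await axios.post('http://localhost:8003/ask', { question, model });
  return response.data.answer; // the service replies with { answer }
}

askLLM('What is software architecture?', 'empathy')
  .then(console.log)
  .catch((err) => console.error(err.response?.data || err.message));
```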
5 changes: 4 additions & 1 deletion llmservice/llm-service.test.js
@@ -1,3 +1,6 @@
+//set a fake api key
+process.env.LLM_API_KEY = 'test-api-key';
+
 const request = require('supertest');
 const axios = require('axios');
 const app = require('./llm-service');
@@ -22,7 +25,7 @@ describe('LLM Service', () => {
   it('the llm should reply', async () => {
     const response = await request(app)
       .post('/ask')
-      .send({ question: 'a question', apiKey: 'apiKey', model: 'gemini' });
+      .send({ question: 'a question', model: 'gemini' });
 
     expect(response.statusCode).toBe(200);
     expect(response.body.answer).toBe('llmanswer');
13 changes: 13 additions & 0 deletions llmservice/package-lock.json

Some generated files are not rendered by default.

17 changes: 9 additions & 8 deletions llmservice/package.json
@@ -9,12 +9,13 @@
   "license": "ISC",
   "description": "",
   "homepage": "https://github.com/arquisoft/wichat_0#readme",
-  "dependencies": {
-    "axios": "^1.7.9",
-    "express": "^4.21.2"
-  },
-  "devDependencies": {
-    "jest": "^29.7.0",
-    "supertest": "^7.0.0"
-  }
+  "dependencies": {
+    "axios": "^1.7.9",
+    "dotenv": "^16.4.7",
+    "express": "^4.21.2"
+  },
+  "devDependencies": {
+    "jest": "^29.7.0",
+    "supertest": "^7.0.0"
+  }
 }
2 changes: 0 additions & 2 deletions webapp/Dockerfile
@@ -7,9 +7,7 @@ WORKDIR /app
 RUN npm install --omit=dev
 
 ARG API_URI="http://localhost:8000"
-ARG LLM_API_KEY
 ENV REACT_APP_API_ENDPOINT=$API_URI
-ENV REACT_APP_LLM_API_KEY=$LLM_API_KEY
 
 #Create an optimized version of the webapp
 RUN npm run build
12 changes: 3 additions & 9 deletions webapp/src/components/Login.js
@@ -14,22 +14,16 @@ const Login = () => {
   const [openSnackbar, setOpenSnackbar] = useState(false);
 
   const apiEndpoint = process.env.REACT_APP_API_ENDPOINT || 'http://localhost:8000';
-  const apiKey = process.env.REACT_APP_LLM_API_KEY || 'None';
 
 
   const loginUser = async () => {
     try {
       const response = await axios.post(`${apiEndpoint}/login`, { username, password });
 
       const question = "Please, generate a greeting message for a student called " + username + " that is a student of the Software Architecture course in the University of Oviedo. Be nice and polite. Two to three sentences max.";
       const model = "empathy"
 
-      if (apiKey==='None'){
-        setMessage("LLM API key is not set. Cannot contact the LLM.");
-      }
-      else{
-        const message = await axios.post(`${apiEndpoint}/askllm`, { question, model, apiKey })
-        setMessage(message.data.answer);
-      }
+      const message = await axios.post(`${apiEndpoint}/askllm`, { question, model })
+      setMessage(message.data.answer);
       // Extract data from the response
       const { createdAt: userCreatedAt } = response.data;
 
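With the client-side key check removed, a missing key now surfaces as the llmservice's 400 response ({ error: 'API key is missing.' }). A small sketch, not part of this commit, of how the webapp could surface that error; the helper name is hypothetical, and apiEndpoint matches the constant already defined in Login.js.

```javascript
import axios from 'axios';

// Hypothetical helper the Login component could call instead of posting inline.
export async function fetchGreeting(apiEndpoint, question, model) {
  try {
    const response = await axios.post(`${apiEndpoint}/askllm`, { question, model });
    return response.data.answer;
  } catch (error) {
    // The llmservice replies 400 with { error: 'API key is missing.' } when LLM_API_KEY is unset.
    return error.response?.data?.error || 'Could not contact the LLM.';
  }
}
```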
