
Merge branch 'Back-end' into Back-Ana
UO288302 authored Mar 7, 2025
2 parents 4aa234f + 2d89a2a commit e59d02c
Showing 21 changed files with 895 additions and 149 deletions.
1 change: 1 addition & 0 deletions .github/workflows/build.yml
@@ -1,5 +1,6 @@
name: Build
on:
workflow_dispatch:
push:
branches:
- master
35 changes: 17 additions & 18 deletions .github/workflows/release.yml
@@ -16,7 +16,7 @@ jobs:
- run: npm --prefix users/userservice ci
- run: npm --prefix llmservice ci
- run: npm --prefix gatewayservice ci
- run: npm --prefix webapp ci
- run: npm --prefix webapp ci --legacy-peer-deps
- run: npm --prefix users/authservice test -- --coverage
- run: npm --prefix users/userservice test -- --coverage
- run: npm --prefix llmservice test -- --coverage
@@ -54,14 +54,13 @@ jobs:
uses: elgohr/Publish-Docker-Github-Action@v5
env:
API_URI: http://${{ secrets.DEPLOY_HOST }}:8000
LLM_API_KEY: ${{ secrets.LLM_API_KEY }}
with:
name: arquisoft/wichat_es4a/webapp
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
registry: ghcr.io
workdir: webapp
buildargs: API_URI,LLM_API_KEY
buildargs: API_URI
docker-push-authservice:
name: Push auth service Docker Image to GitHub Packages
runs-on: ubuntu-latest
@@ -140,18 +139,18 @@ jobs:
password: ${{ secrets.GITHUB_TOKEN }}
registry: ghcr.io
workdir: gatewayservice
# deploy:
# name: Deploy over SSH
# runs-on: ubuntu-latest
# needs: [docker-push-userservice,docker-push-authservice,docker-push-llmservice,docker-push-gatewayservice,docker-push-webapp]
# steps:
# - name: Deploy over SSH
# uses: fifsky/ssh-action@master
# with:
# host: ${{ secrets.DEPLOY_HOST }}
# user: ${{ secrets.DEPLOY_USER }}
# key: ${{ secrets.DEPLOY_KEY }}
# command: |
# wget https://raw.githubusercontent.com/arquisoft/wichat_es4a/master/docker-compose.yml -O docker-compose.yml
# docker compose --profile prod down
# docker compose --profile prod up -d --pull always
deploy:
name: Deploy over SSH
runs-on: ubuntu-latest
needs: [docker-push-userservice,docker-push-authservice,docker-push-llmservice,docker-push-gatewayservice,docker-push-webapp]
steps:
- name: Deploy over SSH
uses: fifsky/ssh-action@master
with:
host: ${{ secrets.DEPLOY_HOST }}
user: ${{ secrets.DEPLOY_USER }}
key: ${{ secrets.DEPLOY_KEY }}
command: |
wget https://raw.githubusercontent.com/arquisoft/wichat_es4a/master/docker-compose.yml -O docker-compose.yml
docker compose --profile prod down
docker compose --profile prod up -d --pull always
3 changes: 3 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,3 @@
{
"postman.settings.dotenv-detection-notification-visibility": false
}
7 changes: 3 additions & 4 deletions README.md
@@ -30,9 +30,9 @@ First, clone the project:
In order to communicate with the LLM integrated in this project, we need to set up an API key. Two integrations are available in this prototype: Gemini and Empathy. The API key provided must match the LLM provider used.

We need to create two .env files.
- The first one in the webapp directory (for executing the webapp using ```npm start```). The content of this .env file should be as follows:
- The first one in the llmservice directory (for executing the llmservice using ```npm start```). The content of this .env file should be as follows:
```
REACT_APP_LLM_API_KEY="YOUR-API-KEY"
LLM_API_KEY="YOUR-API-KEY"
```
- The second one is located in the root of the project (alongside the docker-compose.yml). This .env file is used by docker-compose when launching the app with Docker. The content of this .env file should be as follows:
```
LLM_API_KEY="YOUR-API-KEY"
```
@@ -41,8 +41,7 @@

Note that these files must NOT be uploaded to the GitHub repository (they are excluded in the .gitignore).

An extra configuration for the LLM to work in the deployed version of the app is to include it as a repository secret (LLM_API_KEY). This secret will be used by GitHub Action when building and deploying the application.

An extra configuration for the LLM to work in the deployed version of the app is to create the same .env file (with the LLM_API_KEY variable) in the virtual machine (in the home directory of the azureuser).

### Launching Using docker
For launching the prototype using docker compose, just type:
21 changes: 17 additions & 4 deletions db/Connection.js
@@ -1,5 +1,7 @@
const mongoose = require("mongoose");
const { MongoMemoryServer } = require('mongodb-memory-server');
import mongoose from "mongoose";
import { MongoMemoryServer } from "mongodb-memory-server";

let mongoServer;

//async -> ensures the code does not block while it waits for MongoDB to connect.
const connect = async() => {
@@ -11,7 +13,7 @@ const connect = async() => {
console.log("MongoDB URL server")
} else {
//if there is no DB_URL environment variable, create an in-memory MongoDB server using MongoMemoryServer
const mongoServer = await MongoMemoryServer.create(); //create an in-memory database server
mongoServer = await MongoMemoryServer.create(); //create an in-memory database server
const mongoUri = mongoServer.getUri(); //get the URL of the in-memory server
await mongoose.connect(mongoUri) //connect to MongoDB

@@ -26,5 +28,16 @@ const connect = async() => {
}
}

const disconnect = async () => {
try {
await mongoose.disconnect();
if (mongoServer) {
await mongoServer.stop();
}
console.log("MongoDB disconnected");
} catch (error) {
console.error("Error al desconectar de MongoDB:", error);
}
};

module.exports = connect;
export {connect, disconnect};
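For reference, a minimal sketch of how the new named exports might be consumed, exercising the in-memory fallback when no DB_URL is set. The file name is hypothetical and not part of this commit; it assumes Node with ESM support:

```
// connection.demo.js (hypothetical)
import mongoose from "mongoose";
import { connect, disconnect } from "./Connection.js";

const main = async () => {
    // With DB_URL unset, connect() starts mongodb-memory-server and connects to it
    await connect();
    console.log("readyState:", mongoose.connection.readyState); // 1 = connected

    // ...run queries against the in-memory database here...

    // disconnect() closes the mongoose connection and stops the in-memory server
    await disconnect();
};

main().catch(console.error);
```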
16 changes: 0 additions & 16 deletions db/Topic.js

This file was deleted.

92 changes: 92 additions & 0 deletions db/crud.js
@@ -0,0 +1,92 @@
import {connect, disconnect} from './Connection.js';
import User from './user.js';
// And the rest of the imports needed

// Class containing the methods to perform CRUD operations on any model in the database
class Crud {

static async createUser(data) {
try {
const newUser = new User(data);
const savedUser = await newUser.save();
return savedUser;
} catch (error) {
error.message = 'Error al crear el usuario: ' + error.message;
throw error;
}
}

static async getAllUsers() {
try {
const users = await User.find();
return users;
} catch (error) {
error.message = 'Error al obtener los usuarios: ' + error.message;
throw error;
}
}

static async getUserById(userId) {
try {
const user = await User.findById(userId);
return user;
} catch (error) {
error.message = 'Error al obtener el usuario: ' + error.message;
throw error;
}
}

static async updateUser(userId, updateData) {
try {
const updatedUser = await User.findByIdAndUpdate(
userId,
updateData,
{ new: true }
);
return updatedUser; // Returns the updated user
} catch (error) {
error.message = 'Error al actualizar el usuario: ' + error.message;
throw error;
}
}

static async deleteUser(userId) {
try {
const deletedUser = await User.findByIdAndDelete(userId);
return deletedUser; // Returns the deleted user
} catch (error) {
error.message = 'Error al eliminar el usuario: ' + error.message;
throw error;
}
}
}

connect()
.then(() => {
console.log("Conexión establecida");
// Create the user
return Crud.createUser({
username: 'user1',
password: '123456'
});
})
.then(createdUser => {
console.log("Usuario creado:", createdUser);
// Look up the newly created user by its _id
return Crud.getUserById(createdUser._id);
})
.then(foundUser => {
console.log("Usuario encontrado:", foundUser);
console.log(`ID: ${foundUser._id}`);
console.log(`Username: ${foundUser.username}`);
console.log(`Friend Code: ${foundUser.friendCode}`);
console.log(`Password: ${foundUser.password}`);
})
.catch(error => {
console.error("Error:", error);
})
.finally(() => {
disconnect();
});

export default Crud;
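For illustration, a minimal sketch of calling the Crud helper from another module. The script name is hypothetical, and it assumes the demo chain at the bottom of crud.js has been removed or has already finished (that chain disconnects when it completes), so the script manages its own connection:

```
// crud.demo.js (hypothetical)
import { connect, disconnect } from "./Connection.js";
import Crud from "./crud.js";

const run = async () => {
    await connect();
    const created = await Crud.createUser({ username: "user2", password: "123456" });
    // updateUser passes { new: true }, so the updated document is returned
    const updated = await Crud.updateUser(created._id, { username: "user2-renamed" });
    console.log("Updated username:", updated.username);
    // deleteUser returns the removed document (or null if it did not exist)
    await Crud.deleteUser(created._id);
    await disconnect();
};

run().catch(console.error);
```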
50 changes: 50 additions & 0 deletions db/user.js
@@ -0,0 +1,50 @@
import { Schema, model } from "mongoose";

// Define the user schema
const userSchema = new Schema({
username: {
type: String,
required: true,
trim: true
},
friendCode: {
type: String,
unique: true,
required: false
},
password: {
type: String,
required: true
}
});

// Function that generates a number between 100000 and 999999 as a String.
function generarCodigoAmigo() {
return Math.floor(100000 + Math.random() * 900000).toString();
}

// Before saving, a unique friendCode is generated and assigned.
// We use `this.constructor` to query within the same model, since "User"
// is not yet defined at the time the hook is created.
userSchema.pre("save", async function (next) {
const user = this;
let codigoValido = false;
let codigoAleatorio;

while (!codigoValido) {
codigoAleatorio = generarCodigoAmigo();

// Check whether another user already exists with this friendCode
const existe = await this.constructor.findOne({ friendCode: codigoAleatorio });
if (!existe) {
codigoValido = true;
}
}

user.friendCode = codigoAleatorio;
next();
});

// Create the model from the schema
const User = model("User", userSchema);
export default User;
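A minimal sketch of the pre("save") hook in action: saving a user without a friendCode and reading the generated one. The script name is hypothetical, and it assumes an open connection (for instance via the connect() helper from Connection.js above):

```
// user.demo.js (hypothetical)
import { connect, disconnect } from "./Connection.js";
import User from "./user.js";

const demo = async () => {
    await connect();
    // friendCode is omitted on purpose; the pre("save") hook generates a unique 6-digit code
    const user = await new User({ username: "alice", password: "secret" }).save();
    console.log("Generated friend code:", user.friendCode); // e.g. "483920"
    await disconnect();
};

demo().catch(console.error);
```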
10 changes: 5 additions & 5 deletions docker-compose.yml
@@ -42,7 +42,10 @@ services:
container_name: llmservice-wichat_es4a
image: ghcr.io/arquisoft/wichat_es4a/llmservice:latest
profiles: ["dev", "prod"]
build: ./llmservice
env_file:
- .env
build:
context: ./llmservice
ports:
- "8003:8003"
networks:
@@ -71,10 +74,7 @@ services:
container_name: webapp-wichat_es4a
image: ghcr.io/arquisoft/wichat_es4a/webapp:latest
profiles: ["dev", "prod"]
build:
context: ./webapp
args:
LLM_API_KEY: ${LLM_API_KEY}
build: ./webapp
depends_on:
- gatewayservice
ports:
3 changes: 2 additions & 1 deletion llmservice/.dockerignore
@@ -1,2 +1,3 @@
node_modules
coverage
coverage
.env
13 changes: 10 additions & 3 deletions llmservice/llm-service.js
@@ -6,6 +6,8 @@ const port = 8003;

// Middleware to parse JSON in request body
app.use(express.json());
// Load environment variables
require('dotenv').config();

// Define configurations for different LLM APIs
const llmConfigs = {
@@ -19,7 +21,7 @@ const llmConfigs = {
empathy: {
url: () => 'https://empathyai.prod.empathy.co/v1/chat/completions',
transformRequest: (question) => ({
model: "qwen/Qwen2.5-Coder-7B-Instruct",
model: "mistralai/Mistral-7B-Instruct-v0.3",
messages: [
{ role: "system", content: "You are a helpful assistant." },
{ role: "user", content: question }
@@ -71,9 +73,14 @@ async function sendQuestionToLLM(question, apiKey, model = 'gemini') {
app.post('/ask', async (req, res) => {
try {
// Check if required fields are present in the request body
validateRequiredFields(req, ['question', 'model', 'apiKey']);
validateRequiredFields(req, ['question', 'model']);

const { question, model, apiKey } = req.body;
const { question, model } = req.body;
//load the API key from an environment variable
const apiKey = process.env.LLM_API_KEY;
if (!apiKey) {
return res.status(400).json({ error: 'API key is missing.' });
}
const answer = await sendQuestionToLLM(question, apiKey, model);
res.json({ answer });

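With this change the API key is read from the LLM_API_KEY environment variable on the server, so clients only send question and model. A minimal sketch of a request against a locally running llmservice (the script is hypothetical; host and port come from the file above; it assumes Node 18+ with global fetch):

```
// ask.demo.js (hypothetical)
const askLLM = async () => {
    const response = await fetch("http://localhost:8003/ask", {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        // apiKey is no longer part of the request body; the service reads LLM_API_KEY itself
        body: JSON.stringify({ question: "What is the capital of France?", model: "empathy" })
    });
    const { answer } = await response.json();
    console.log(answer);
};

askLLM().catch(console.error);
```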