This microservice handles user-related functionality within a larger application. It provides user registration, authentication, and management of personal information, delivery addresses, and shopping cart items.
- User Authentication (a request-flow sketch follows this list):
  - Signup: Allows users to create new accounts.
  - Signin: Enables existing users to log in.
  - Password Reset: Provides a mechanism for users to recover their passwords.
  - Email Confirmation: Requires users to verify their email addresses before full account activation.
- User Profile Management:
  - Read, Update: Users can view and modify their personal information.
- Delivery Address Management:
  - Create, Read, Update, Delete: Users can manage their delivery addresses for shipping purposes.
- Recently Viewed Items:
  - View: Users can see a list of items they have recently viewed.
- Shopping Cart Management:
  - Create, Read, Update, Delete: Users can add, remove, and modify items in their shopping carts.
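To make the authentication features concrete, here is a rough request-flow sketch against a local instance. The endpoint paths, payload fields, and token scheme are assumptions for illustration only; check the service's URL configuration for the actual routes.

```bash
BASE_URL=http://localhost:8000

# 1. Sign up: the account stays inactive until the email address is confirmed
curl -X POST "$BASE_URL/api/accounts/signup/" \
  -H "Content-Type: application/json" \
  -d '{"email": "user@example.com", "password": "S0me_strong_pwd"}'

# 2. Sign in to obtain a JWT
curl -X POST "$BASE_URL/api/accounts/signin/" \
  -H "Content-Type: application/json" \
  -d '{"email": "user@example.com", "password": "S0me_strong_pwd"}'

# 3. Call authenticated endpoints (e.g. the profile) with the returned access token
curl "$BASE_URL/api/accounts/profile/" \
  -H "Authorization: Bearer <access_token>"
```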
Clone the repository:

```bash
git clone https://github.com/GhostMEn20034/SM1L3_SAL3S_user_microservice.git
```
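After cloning, change into the project directory (`git clone` names it after the repository by default):

```bash
cd SM1L3_SAL3S_user_microservice
```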
Create a `.env` file. On Windows (PowerShell), run:

```powershell
New-Item -Path ".env" -ItemType "File"
```

On Unix or macOS, run:

```bash
touch .env
```
Add the following variables to the `.env` file:

```
SUPER_USER_PWD=some_pwd # Postgres user password
SECRET_KEY=some_secret_key # Django's secret key
JWT_SIGNING_KEY=some_key # Signing key for JWT tokens
TWILIO_ACCOUNT_SID=123456 # Twilio account SID
TWILIO_AUTH_TOKEN=123456 # Twilio authentication token
TWILIO_SERVICE_SID_CHANGE_EMAIL=123456 # Service SID for emails sent when the user wants to change their email address
TWILIO_SERVICE_SID_SIGNUP_CONFIRMATION=123456 # Service SID for emails sent when the user needs to confirm their email address
TWILIO_SERVICE_SID_PASSWORD_RESET=123456 # Service SID for emails sent when the user needs to reset their password
DEBUG=0_or_1 # Determines whether debug mode is turned on (1 - on, 0 - off)
ALLOWED_HOSTS=localhost,127.0.0.1,[::1] # Your allowed hosts
CORS_ALLOWED_ORIGINS=http://localhost:3000,http://127.0.0.1:3000 # Allowed CORS origins
CORS_ORIGIN_WHITELIST=http://localhost:3000,http://127.0.0.1:3000,http://localhost:3001 # CORS whitelist
SQL_ENGINE=django.db.backends.postgresql # DB engine
SQL_DATABASE=smile_sales_users # Database name
SQL_USER=smile_sales_usr # Database owner
SQL_PASSWORD=xxxx4444 # DB user's password
SQL_HOST=db # Database host
SQL_PORT=5432 # Database port
SQL_CONN_MAX_AGE=60 # Maximum connection age, in seconds
DB_SSL_MODE=0_or_1 # Determines whether SSL mode is enabled for the DB connection (0 - disabled, 1 - enabled)
DRAMATIQ_BROKER_URL=redis_url # Redis URL for the dramatiq broker
DRAMATIQ_CRONTAB_BROKER_URL=redis_url # Redis URL for the dramatiq crontab (periodic tasks) broker
DRAMATIQ_RESULT_BACKEND_URL=redis_url # Redis URL for the dramatiq result backend
FLUSH_EXPIRED_TOKEN_PERIOD_HOURS=1 # How often expired tokens are flushed, in hours
DELETE_INACTIVE_CARTS_PERIOD_DAYS=1 # How often inactive carts are deleted, in days
AMPQ_CONNECTION_URL=url_rabbit_mq # RabbitMQ (AMQP) connection URL for the message broker
PRODUCT_CRUD_EXCHANGE_TOPIC_NAME=product_replication # Use this value as-is
USERS_DATA_CRUD_EXCHANGE_TOPIC_NAME=users_data_replication # Use this value as-is
ORDER_PROCESSING_EXCHANGE_TOPIC_NAME=order_processing_replication # Use this value as-is
```
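`SECRET_KEY` and `JWT_SIGNING_KEY` should be long, random strings. One way to generate such values, assuming Python 3 is installed on your machine:

```bash
# Prints a random URL-safe string suitable for SECRET_KEY or JWT_SIGNING_KEY
python3 -c "import secrets; print(secrets.token_urlsafe(64))"
```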
If you want to run this app, you have two options:
- Run it using `docker-compose`
- Run it using `k8s`
1.1 Add permission to execute the `init-database.sh` file used by the docker-compose Postgres service:

```bash
chmod +x init-database.sh
```

1.2 Build and start the services:

```bash
docker compose up -d --build
```

1.3 Go to `localhost:8000` or `127.0.0.1:8000` and use the API.
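To confirm the stack started correctly, you can inspect the containers and logs. The `web` service name matches the one used in the test commands later in this README; adjust it if your compose file names the Django service differently.

```bash
docker compose ps            # all services should be up
docker compose logs -f web   # follow the Django application's logs
curl http://localhost:8000/  # the API should respond on port 8000
```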
Note: If you want to run this API using Kubernetes, you need to create and expose the Postgres and Redis servers manually.

```bash
kubectl create namespace smile-sales-users
kubectl create secret generic web-secrets --from-env-file=.env
cd k8s/dev
kubectl apply -f config-maps/
kubectl apply -f persistent-volumes/
kubectl apply -f persistent-volume-claims/
kubectl apply -f jobs/
kubectl get jobs
kubectl apply -f services/
kubectl apply -f deployments/
```
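After applying everything, you can verify the rollout with standard `kubectl` commands. Whether you need the `-n smile-sales-users` flag depends on whether the resources were created in that namespace or in your current context's default namespace.

```bash
kubectl get jobs        # the one-off jobs should reach "Complete"
kubectl get pods        # application pods should be "Running"
kubectl get services    # check how the API is exposed
# If the resources live in the namespace created above:
kubectl get pods -n smile-sales-users
```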
You need to perform the same steps as in the development environment, but there are a few nuances:
- You need to use the k8s resources from the `k8s/production` directory.
- A GCS Fuse bucket is required (it is used as storage for static files).
- At the end of the `k8s/production/persistent-volumes/staticfiles-pv-user-microservice.yaml` file, paste your bucket's name (see the `volumeHandle` key).
- Make sure that your Kubernetes service account `gke-user` (if it's not created, create one) is bound to a GCP service account that has the "StorageAdmin" role and has access to the created bucket (a sketch of this binding follows the list).
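A minimal sketch of that binding, assuming GKE Workload Identity is enabled on the cluster and that `gke-user` lives in the `smile-sales-users` namespace; `PROJECT_ID`, `GSA_NAME`, and `BUCKET_NAME` are placeholders to replace with your own values.

```bash
# Give the GCP service account Storage Admin access to the bucket
gcloud storage buckets add-iam-policy-binding gs://BUCKET_NAME \
  --member="serviceAccount:GSA_NAME@PROJECT_ID.iam.gserviceaccount.com" \
  --role="roles/storage.admin"

# Let the Kubernetes service account impersonate the GCP service account
gcloud iam service-accounts add-iam-policy-binding \
  GSA_NAME@PROJECT_ID.iam.gserviceaccount.com \
  --role="roles/iam.workloadIdentityUser" \
  --member="serviceAccount:PROJECT_ID.svc.id.goog[smile-sales-users/gke-user]"

# Annotate the Kubernetes service account so GKE links the two
kubectl annotate serviceaccount gke-user \
  --namespace smile-sales-users \
  iam.gke.io/gcp-service-account=GSA_NAME@PROJECT_ID.iam.gserviceaccount.com
```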
Note: make sure you're in the project's root directory before running the commands below.
To run the tests, you need to complete 3 steps:
- Run the test environment with another `docker-compose` file:

  ```bash
  docker compose -f docker-compose-test-env.yaml up -d
  ```

- Run the test suite (see the note after this list for running only part of it):

  ```bash
  docker compose exec web python manage.py test -v 2
  ```

- Shut down the `docker-compose` services:

  ```bash
  docker compose -f docker-compose-test-env.yaml down
  ```
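If you only need part of the suite, Django's test runner also accepts app labels and dotted test paths. The `accounts` label below is hypothetical; substitute an app that actually exists in this project.

```bash
# Run the tests of a single app (replace "accounts" with a real app label)
docker compose exec web python manage.py test accounts -v 2
```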