Black Alert is a serverless chat system that leverages Google's Gemini AI API through AWS Lambda. The system provides a scalable, efficient way to interact with Gemini AI's advanced language model, with infrastructure managed through Terraform and automated deployments via BitBucket Pipelines.
- Serverless Architecture: AWS Lambda-based deployment for optimal scaling and cost efficiency
- Gemini AI Integration: Direct integration with Google's latest Gemini AI model
- Infrastructure as Code: Complete Terraform-managed infrastructure
- Automated CI/CD: Streamlined deployment pipeline using BitBucket Pipelines
- Error Handling: Comprehensive error management and logging
- CORS Support: Built-in CORS configuration for web integration (both are illustrated in the handler sketch after this list)
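
To make the error-handling and CORS points concrete, here is a minimal sketch of what a handler along these lines can look like. It is not a copy of this repo's `lambda_function.py`: the `google-generativeai` client, the `gemini-1.5-flash` model name, and the handler/header names are assumptions.

```python
# lambda_function.py (illustrative sketch -- the real handler may differ)
import json
import logging
import os

import google.generativeai as genai  # assumes google-generativeai is in requirements.txt

logger = logging.getLogger()
logger.setLevel(logging.INFO)

genai.configure(api_key=os.environ["GEMINI_API_KEY"])

CORS_HEADERS = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Headers": "Content-Type",
    "Access-Control-Allow-Methods": "POST,OPTIONS",
}

def lambda_handler(event, context):
    try:
        body = json.loads(event.get("body") or "{}")
        message = body["message"]
        model = genai.GenerativeModel("gemini-1.5-flash")  # model name is an assumption
        result = model.generate_content(message)
        return {
            "statusCode": 200,
            "headers": CORS_HEADERS,
            "body": json.dumps({"response": result.text}),
        }
    except KeyError:
        return {
            "statusCode": 400,
            "headers": CORS_HEADERS,
            "body": json.dumps({"error": "Missing 'message' field"}),
        }
    except Exception as exc:  # logged to CloudWatch for debugging
        logger.exception("Gemini call failed")
        return {
            "statusCode": 500,
            "headers": CORS_HEADERS,
            "body": json.dumps({"error": str(exc)}),
        }
```
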
- Python 3.12
- AWS Lambda
- Google Gemini AI API
- Terraform
- BitBucket Pipelines
- AWS API Gateway
- AWS S3
- AWS Account with appropriate permissions
- Google Cloud Platform account with Gemini AI API access
- Terraform installed locally
- Python 3.12
- BitBucket account for CI/CD
- Clone the repository:

  ```bash
  git clone https://github.com/Black-Alert/google-gemini-aws-ai-code.git
  cd black-alert-gemini-demo
  ```

- Create and activate a virtual environment:

  ```bash
  python -m venv venv
  source venv/bin/activate  # On Windows: .\venv\Scripts\activate
  ```

- Install dependencies:

  ```bash
  pip install -r requirements.txt
  ```

- Set up environment variables (a sample `.env` is shown after these steps):

  ```bash
  cp .env.example .env
  # Add your GEMINI_API_KEY and AWS credentials
  ```

- Initialize Terraform:

  ```bash
  cd infra
  terraform init
  ```

- Apply the Terraform configuration:

  ```bash
  terraform plan
  terraform apply
  ```
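
For the environment-variable step above, a `.env` typically ends up looking like the sketch below. Only `GEMINI_API_KEY` is named explicitly in the setup step; the AWS entries mirror the pipeline variables listed under configuration and are assumptions about what `.env.example` actually contains.

```bash
# .env -- illustrative values; see .env.example for the authoritative variable names
GEMINI_API_KEY=your-gemini-api-key
AWS_ACCESS_KEY_ID=your-access-key-id
AWS_SECRET_ACCESS_KEY=your-secret-access-key
AWS_DEFAULT_REGION=us-east-1
```
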
- Configure AWS credentials
- Set up Gemini AI API key
- Update the BitBucket pipeline variables:
  - `AWS_ACCESS_KEY_ID`
  - `AWS_SECRET_ACCESS_KEY`
  - `AWS_DEFAULT_REGION`
  - `LAMBDA_NAME`
  - `ZIP_BUCKET`
Send a POST request to your API Gateway endpoint:

```json
{
  "message": "Your message here"
}
```

Response:

```json
{
  "response": "AI generated response"
}
```
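
For example, with `curl` against the invoke URL that `terraform apply` reports (the `/chat` path is a placeholder; use whatever route `rest_api.tf` actually defines):

```bash
curl -X POST "https://<api-id>.execute-api.<region>.amazonaws.com/<stage>/chat" \
  -H "Content-Type: application/json" \
  -d '{"message": "Your message here"}'
```
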
```
black-alert/
├── infra/                    # Terraform infrastructure code
│   ├── modules/              # Reusable Terraform modules
│   ├── main.tf               # Main Terraform configuration
│   ├── lambdas.tf            # Lambda function configuration
│   └── rest_api.tf           # API Gateway configuration
├── lambda_function.py        # Main Lambda function code
├── requirements.txt          # Python dependencies
├── bitbucket-pipelines.yml   # CI/CD configuration
└── docs/                     # Additional documentation
```
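
The split between `lambdas.tf` and `rest_api.tf` follows the usual Lambda-behind-API-Gateway pattern. A stripped-down sketch of what those files typically declare is shown below; the resource and variable names are placeholders, not a copy of this repo's configuration.

```hcl
# lambdas.tf (illustrative sketch)
resource "aws_lambda_function" "chat" {
  function_name = "black-alert-gemini-chat"        # placeholder name
  runtime       = "python3.12"
  handler       = "lambda_function.lambda_handler"
  s3_bucket     = var.zip_bucket                   # matches the ZIP_BUCKET pipeline variable
  s3_key        = "lambda_function.zip"
  role          = aws_iam_role.lambda_exec.arn     # execution role defined elsewhere

  environment {
    variables = {
      GEMINI_API_KEY = var.gemini_api_key
    }
  }
}

# rest_api.tf (illustrative sketch)
resource "aws_api_gateway_rest_api" "chat" {
  name = "black-alert-chat-api"
}
```
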
- Push changes to the `dev` branch
- BitBucket Pipeline automatically (sketched below):
  - Builds the deployment package
  - Uploads to S3
  - Updates the Lambda function
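
A `bitbucket-pipelines.yml` that performs those three steps usually looks something like this sketch. The image tag, package layout, and the `apt-get`/`awscli` install commands are assumptions rather than a copy of this repo's file; `ZIP_BUCKET` and `LAMBDA_NAME` are the pipeline variables listed under configuration.

```yaml
# bitbucket-pipelines.yml (illustrative sketch, not this repo's actual pipeline)
image: python:3.12

pipelines:
  branches:
    dev:
      - step:
          name: Build and deploy Lambda
          script:
            # Build the deployment package
            - apt-get update && apt-get install -y zip
            - pip install -r requirements.txt -t package
            - cp lambda_function.py package/
            - cd package && zip -r ../lambda_function.zip . && cd ..
            # Upload to S3, then point the function at the new object
            - pip install awscli
            - aws s3 cp lambda_function.zip "s3://$ZIP_BUCKET/lambda_function.zip"
            - aws lambda update-function-code --function-name "$LAMBDA_NAME" --s3-bucket "$ZIP_BUCKET" --s3-key lambda_function.zip
```
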
Test the Lambda function locally using AWS SAM or direct Python execution:

```bash
python lambda_function.py
```
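
For the direct-execution path to do anything useful, `lambda_function.py` needs a small `__main__` block that feeds the handler a fake API Gateway event. If the file does not already have one, a sketch like this works (assuming the handler is named `lambda_handler` and `GEMINI_API_KEY` is set in your environment):

```python
# Append to lambda_function.py for quick local runs
if __name__ == "__main__":
    import json

    # Minimal fake API Gateway proxy event
    fake_event = {"body": json.dumps({"message": "Hello from a local test"})}
    print(lambda_handler(fake_event, None))
```
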
- CloudWatch Logs for Lambda function monitoring (see the CLI example after this list)
- API Gateway metrics
- Terraform state monitoring
- BitBucket Pipeline logs
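
For example, the Lambda function's CloudWatch logs can be followed from the AWS CLI (v2); the log group name below assumes the default `/aws/lambda/<function-name>` convention:

```bash
aws logs tail "/aws/lambda/<your-lambda-name>" --follow
```
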
- Environment variables for sensitive data
- IAM roles with least privilege (see the example policy after this list)
- API Gateway authentication (if configured)
- CORS policy implementation
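
As an example of the least-privilege point, the Lambda execution role typically needs little more than permission to write its own logs. A Terraform sketch of such a policy is shown below; the resource names are placeholders, not this repo's actual configuration.

```hcl
# Illustrative least-privilege logging policy for the Lambda execution role
data "aws_iam_policy_document" "lambda_logs" {
  statement {
    actions = [
      "logs:CreateLogGroup",
      "logs:CreateLogStream",
      "logs:PutLogEvents",
    ]
    resources = ["arn:aws:logs:*:*:*"]
  }
}

resource "aws_iam_role_policy" "lambda_logs" {
  name   = "lambda-logging"
  role   = aws_iam_role.lambda_exec.id  # placeholder role reference
  policy = data.aws_iam_policy_document.lambda_logs.json
}
```
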
- Fork the repository
- Create your feature branch (`git checkout -b feature/AmazingFeature`)
- Commit your changes (`git commit -m 'Add some AmazingFeature'`)
- Push to the branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
For support and queries:
- Create an issue in the repository
- Contact the development team at [[email protected]]
This project is licensed under the MIT License - see the LICENSE file for details.
- Google Cloud Platform and Gemini AI team
- AWS Lambda and Serverless community
- Terraform and HashiCorp
- BitBucket team for CI/CD capabilities