# Deployment Guide

## Overview
This guide covers deploying JobHive to production environments on AWS. The deployment process includes containerization, infrastructure setup, CI/CD pipelines, and monitoring configuration.

## Deployment Architecture
### Production Environment Stack

```
Production Deployment Flow:
GitHub → GitHub Actions → ECR → ECS → Production

Development  →  Staging   →  Production
     ↓             ↓              ↓
   Local        AWS Test      AWS Prod
```
## Prerequisites

### Required Tools and Access

```bash
# AWS CLI
aws --version # aws-cli/2.x.x or higher
# Docker
docker --version # Docker version 20.10.x or higher
# Terraform (for infrastructure)
terraform --version # Terraform v1.5.x or higher
# kubectl (for EKS deployment)
kubectl version --client # v1.28 or higher
```

### AWS Account Setup

```bash
# Configure AWS credentials
aws configure
# AWS Access Key ID: [Your Access Key]
# AWS Secret Access Key: [Your Secret Key]
# Default region name: us-east-1
# Default output format: json
# Verify access
aws sts get-caller-identity
```

### Required AWS Services
- ECS Fargate: Container orchestration
- RDS PostgreSQL: Database
- ElastiCache Redis: Caching and sessions
- Application Load Balancer: Load balancing
- ECR: Container registry
- S3: Static files and media storage
- CloudFront: CDN
- Route 53: DNS management
- Certificate Manager: SSL certificates
- Secrets Manager: Sensitive configuration
- CloudWatch: Monitoring and logging
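Before provisioning anything, it is worth confirming that your configured credentials can actually reach the core services above. A minimal sketch (the default profile and `us-east-1` are assumptions; adjust to your setup):

```bash
#!/bin/bash
# check-aws-access.sh — sanity-check access to the services used in this guide.
set -e
AWS_REGION="us-east-1"

aws sts get-caller-identity --query Arn --output text   # who am I?
aws ecs list-clusters --region $AWS_REGION              # ECS reachable?
aws ecr describe-repositories --region $AWS_REGION      # ECR reachable?
aws rds describe-db-instances --region $AWS_REGION \
  --query 'DBInstances[].DBInstanceIdentifier'          # RDS reachable?
aws s3 ls                                               # S3 reachable?
```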
## Infrastructure Setup

### 1. Terraform Infrastructure Deployment

#### Directory Structure

```
aws-ecs/
├── terraform/
│   ├── main.tf
│   ├── variables.tf
│   ├── outputs.tf
│   ├── vpc.tf
│   ├── ecs.tf
│   ├── rds.tf
│   ├── elasticache.tf
│   ├── load_balancer.tf
│   ├── secrets.tf
│   └── terraform.tfvars
├── scripts/
│   ├── build-and-push.sh
│   ├── deploy-services.sh
│   └── migrate-secrets.sh
└── task-definitions/
    ├── django-task-definition.json
    ├── celery-worker-task-definition.json
    └── celery-beat-task-definition.json
```

#### Initialize Terraform

```bash
cd aws-ecs/terraform
# Initialize Terraform
terraform init
# Create terraform.tfvars file
cat > terraform.tfvars << EOF
# Project Configuration
project_name = "jobhive"
environment = "production"
aws_region = "us-east-1"
# Networking
vpc_cidr = "10.0.0.0/16"
availability_zones = ["us-east-1a", "us-east-1b", "us-east-1c"]
# Database Configuration
db_instance_class = "db.r6g.xlarge"
db_allocated_storage = 1000
db_backup_retention_period = 30
# Cache Configuration
cache_node_type = "cache.r7g.large"
cache_num_nodes = 3
# ECS Configuration
ecs_cpu = 512
ecs_memory = 1024
ecs_desired_count = 3
# Domain Configuration
domain_name = "jobhive.com"
subdomain = "api"
# SSL Certificate ARN (create in ACM first)
ssl_certificate_arn = "arn:aws:acm:us-east-1:123456789:certificate/your-cert-id"
EOF
```

#### Deploy Infrastructure

```bash
# Plan the deployment
terraform plan
# Apply the infrastructure
terraform apply
# Save outputs for later use
terraform output > ../outputs.txt
```
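The outputs file feeds later steps such as DNS and service configuration. For machine-readable values, `terraform output` also supports `-raw` and `-json`; a sketch of consuming them (the output names here are hypothetical and must match what `outputs.tf` actually defines):

```bash
# Hypothetical output names — match them to outputs.tf
ALB_DNS=$(terraform output -raw alb_dns_name)
DB_ENDPOINT=$(terraform output -raw rds_endpoint)
echo "Load balancer: $ALB_DNS"
echo "Database:      $DB_ENDPOINT"

# Or capture everything as JSON for scripting
terraform output -json > ../outputs.json
jq -r 'to_entries[] | "\(.key)=\(.value.value)"' ../outputs.json
```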
### 2. Container Registry Setup

#### Create ECR Repository

```bash
# Create ECR repository
aws ecr create-repository --repository-name jobhive/backend --region us-east-1
# Get login token
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 123456789.dkr.ecr.us-east-1.amazonaws.com
```
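Per-commit tags accumulate untagged layers quickly, so an ECR lifecycle policy is worth creating alongside the repository. A sketch that expires untagged images (the 14-day window is an assumption; tune to taste):

```bash
aws ecr put-lifecycle-policy \
  --repository-name jobhive/backend \
  --lifecycle-policy-text '{
    "rules": [{
      "rulePriority": 1,
      "description": "Expire untagged images after 14 days",
      "selection": {
        "tagStatus": "untagged",
        "countType": "sinceImagePushed",
        "countUnit": "days",
        "countNumber": 14
      },
      "action": { "type": "expire" }
    }]
  }'
```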
#### Build and Push Images

```bash
#!/bin/bash
# build-and-push.sh
set -e
# Configuration
AWS_ACCOUNT_ID="123456789"
AWS_REGION="us-east-1"
ECR_REPOSITORY="jobhive/backend"
IMAGE_TAG="latest"
# Get ECR login
aws ecr get-login-password --region $AWS_REGION | docker login --username AWS --password-stdin $AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com
# Build Docker image
echo "Building Docker image..."
docker build -f compose/production/django/Dockerfile -t $ECR_REPOSITORY:$IMAGE_TAG .
# Tag for ECR
ECR_URI="$AWS_ACCOUNT_ID.dkr.ecr.$AWS_REGION.amazonaws.com/$ECR_REPOSITORY:$IMAGE_TAG"
docker tag $ECR_REPOSITORY:$IMAGE_TAG $ECR_URI
# Push to ECR
echo "Pushing image to ECR..."
docker push $ECR_URI
echo "Image pushed successfully: $ECR_URI"
### 3. Secrets Management

#### Create Secrets in AWS Secrets Manager

```bash
#!/bin/bash
# migrate-secrets.sh

# Database credentials
aws secretsmanager create-secret \
  --name "jobhive/database" \
  --description "PostgreSQL database credentials" \
  --secret-string '{
    "engine": "postgres",
    "host": "jobhive-db-cluster.cluster-xyz.us-east-1.rds.amazonaws.com",
    "username": "jobhive_app",
    "password": "your-secure-database-password",
    "dbname": "jobhive_production",
    "port": 5432
  }'

# Redis credentials
aws secretsmanager create-secret \
  --name "jobhive/redis" \
  --description "Redis cache credentials" \
  --secret-string '{
    "host": "jobhive-redis-cluster.abc123.cache.amazonaws.com",
    "port": 6379,
    "auth_token": "your-redis-auth-token"
  }'

# Django application secrets
aws secretsmanager create-secret \
  --name "jobhive/django" \
  --description "Django application secrets" \
  --secret-string '{
    "secret_key": "your-django-secret-key-50-chars-long",
    "jwt_secret": "your-jwt-signing-secret"
  }'

# External service API keys
aws secretsmanager create-secret \
  --name "jobhive/external" \
  --description "External service API keys" \
  --secret-string '{
    "openai_api_key": "sk-your-openai-api-key",
    "stripe_secret_key": "sk_live_your-stripe-secret-key",
    "stripe_webhook_secret": "whsec_your-webhook-secret",
    "datadog_api_key": "your-datadog-api-key"
  }'
```
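To verify a secret round-trips correctly, read it back (`jq` here is only for pretty-printing):

```bash
aws secretsmanager get-secret-value \
  --secret-id jobhive/database \
  --query SecretString --output text | jq .
```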
## ECS Deployment

### 1. Task Definitions

#### Django Web Application Task Definition

```json
{
  "family": "jobhive-web-task",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "512",
  "memory": "1024",
  "executionRoleArn": "arn:aws:iam::123456789:role/ecsTaskExecutionRole",
  "taskRoleArn": "arn:aws:iam::123456789:role/jobhiveTaskRole",
  "containerDefinitions": [
    {
      "name": "django-web",
      "image": "123456789.dkr.ecr.us-east-1.amazonaws.com/jobhive/backend:latest",
      "essential": true,
      "portMappings": [
        {
          "containerPort": 8000,
          "protocol": "tcp"
        }
      ],
      "environment": [
        {
          "name": "DJANGO_SETTINGS_MODULE",
          "value": "config.settings.production"
        },
        {
          "name": "AWS_DEFAULT_REGION",
          "value": "us-east-1"
        },
        {
          "name": "USE_S3",
          "value": "true"
        }
      ],
      "secrets": [
        {
          "name": "DATABASE_URL",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/database:host::"
        },
        {
          "name": "REDIS_URL",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/redis:host::"
        },
        {
          "name": "DJANGO_SECRET_KEY",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/django:secret_key::"
        }
      ],
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "/ecs/jobhive-web",
          "awslogs-region": "us-east-1",
          "awslogs-stream-prefix": "ecs"
        }
      },
      "healthCheck": {
        "command": [
          "CMD-SHELL",
          "curl -f http://localhost:8000/health/ || exit 1"
        ],
        "interval": 30,
        "timeout": 5,
        "retries": 3,
        "startPeriod": 60
      }
    }
  ]
}
```
#### Celery Worker Task Definition

```json
{
  "family": "jobhive-celery-worker-task",
  "networkMode": "awsvpc",
  "requiresCompatibilities": ["FARGATE"],
  "cpu": "256",
  "memory": "512",
  "executionRoleArn": "arn:aws:iam::123456789:role/ecsTaskExecutionRole",
  "taskRoleArn": "arn:aws:iam::123456789:role/jobhiveTaskRole",
  "containerDefinitions": [
    {
      "name": "celery-worker",
      "image": "123456789.dkr.ecr.us-east-1.amazonaws.com/jobhive/backend:latest",
      "essential": true,
      "command": [
        "celery",
        "-A",
        "config.celery_app",
        "worker",
        "-l",
        "info",
        "--concurrency=2"
      ],
      "environment": [
        {
          "name": "DJANGO_SETTINGS_MODULE",
          "value": "config.settings.production"
        },
        {
          "name": "C_FORCE_ROOT",
          "value": "1"
        }
      ],
      "secrets": [
        {
          "name": "DATABASE_URL",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/database::"
        },
        {
          "name": "REDIS_URL",
          "valueFrom": "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/redis::"
        }
      ],
      "logConfiguration": {
        "logDriver": "awslogs",
        "options": {
          "awslogs-group": "/ecs/jobhive-celery-worker",
          "awslogs-region": "us-east-1",
          "awslogs-stream-prefix": "ecs"
        }
      }
    }
  ]
}
```
### 2. Service Deployment

#### Deploy Services Script

```bash
#!/bin/bash
# deploy-services.sh
set -e

CLUSTER_NAME="jobhive-production"
AWS_REGION="us-east-1"

echo "Deploying JobHive services to ECS..."

# Register task definitions
echo "Registering task definitions..."
aws ecs register-task-definition \
  --cli-input-json file://django-task-definition.json \
  --region $AWS_REGION
aws ecs register-task-definition \
  --cli-input-json file://celery-worker-task-definition.json \
  --region $AWS_REGION
aws ecs register-task-definition \
  --cli-input-json file://celery-beat-task-definition.json \
  --region $AWS_REGION

# Update services
echo "Updating ECS services..."

# Update web service
aws ecs update-service \
  --cluster $CLUSTER_NAME \
  --service jobhive-web \
  --task-definition jobhive-web-task \
  --region $AWS_REGION

# Update celery worker service
aws ecs update-service \
  --cluster $CLUSTER_NAME \
  --service jobhive-celery-workers \
  --task-definition jobhive-celery-worker-task \
  --region $AWS_REGION

# Update celery beat service
aws ecs update-service \
  --cluster $CLUSTER_NAME \
  --service jobhive-celery-beat \
  --task-definition jobhive-celery-beat-task \
  --region $AWS_REGION

echo "Deployment completed!"

# Wait for services to be stable
echo "Waiting for services to stabilize..."
aws ecs wait services-stable \
  --cluster $CLUSTER_NAME \
  --services jobhive-web jobhive-celery-workers jobhive-celery-beat \
  --region $AWS_REGION

echo "All services are stable and running!"
```
## Database Migration and Setup

### 1. Database Migration

```bash
# Run database migrations
aws ecs run-task \
  --cluster jobhive-production \
  --task-definition jobhive-web-task \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
  --overrides '{
    "containerOverrides": [
      {
        "name": "django-web",
        "command": ["python", "manage.py", "migrate"]
      }
    ]
  }'

# Create superuser (Fargate run-task always requires the network configuration)
aws ecs run-task \
  --cluster jobhive-production \
  --task-definition jobhive-web-task \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
  --overrides '{
    "containerOverrides": [
      {
        "name": "django-web",
        "command": ["python", "manage.py", "createsuperuser", "--noinput"],
        "environment": [
          {"name": "DJANGO_SUPERUSER_USERNAME", "value": "admin"},
          {"name": "DJANGO_SUPERUSER_EMAIL", "value": "[email protected]"},
          {"name": "DJANGO_SUPERUSER_PASSWORD", "value": "your-secure-admin-password"}
        ]
      }
    ]
  }'

# Setup initial data
aws ecs run-task \
  --cluster jobhive-production \
  --task-definition jobhive-web-task \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
  --overrides '{
    "containerOverrides": [
      {
        "name": "django-web",
        "command": ["python", "manage.py", "setup_billing"]
      }
    ]
  }'

aws ecs run-task \
  --cluster jobhive-production \
  --task-definition jobhive-web-task \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
  --overrides '{
    "containerOverrides": [
      {
        "name": "django-web",
        "command": ["python", "manage.py", "seed_initial_data"]
      }
    ]
  }'
```
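These one-off tasks write to the same CloudWatch log group as the web service (see the task definition above), so their output can be followed while they run:

```bash
# Follow recent output from the one-off tasks
aws logs tail /ecs/jobhive-web --follow --since 10m
```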
### 2. Static Files Collection

```bash
# Collect static files to S3
aws ecs run-task \
  --cluster jobhive-production \
  --task-definition jobhive-web-task \
  --launch-type FARGATE \
  --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
  --overrides '{
    "containerOverrides": [
      {
        "name": "django-web",
        "command": ["python", "manage.py", "collectstatic", "--noinput"]
      }
    ]
  }'
```
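To confirm the collected files landed in the bucket (the bucket name is taken from the IAM policy later in this guide):

```bash
aws s3 ls s3://jobhive-static-production/ --recursive --summarize | tail -n 2
```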
## CI/CD Pipeline Setup

### 1. GitHub Actions Workflow

#### `.github/workflows/deploy-production.yml`

```yaml
name: Deploy to Production

on:
  push:
    branches: [main]
  workflow_dispatch:

env:
  AWS_REGION: us-east-1
  ECR_REPOSITORY: jobhive/backend
  ECS_CLUSTER: jobhive-production

jobs:
  deploy:
    name: Deploy to Production
    runs-on: ubuntu-latest
    environment: production
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Login to Amazon ECR
        id: login-ecr
        uses: aws-actions/amazon-ecr-login@v1

      - name: Build, tag, and push image to Amazon ECR
        id: build-image
        env:
          ECR_REGISTRY: ${{ steps.login-ecr.outputs.registry }}
          IMAGE_TAG: ${{ github.sha }}
        run: |
          # Build once, then tag with both the commit SHA and latest
          docker build -f compose/production/django/Dockerfile -t $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG .
          docker tag $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG $ECR_REGISTRY/$ECR_REPOSITORY:latest

          # Push both tags to ECR
          docker push $ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG
          docker push $ECR_REGISTRY/$ECR_REPOSITORY:latest

          echo "image=$ECR_REGISTRY/$ECR_REPOSITORY:$IMAGE_TAG" >> $GITHUB_OUTPUT

      - name: Download task definition
        run: |
          aws ecs describe-task-definition --task-definition jobhive-web-task --query taskDefinition > task-definition.json

      - name: Fill in the new image ID in the Amazon ECS task definition
        id: task-def
        uses: aws-actions/amazon-ecs-render-task-definition@v1
        with:
          task-definition: task-definition.json
          container-name: django-web
          image: ${{ steps.build-image.outputs.image }}

      - name: Deploy Amazon ECS task definition
        uses: aws-actions/amazon-ecs-deploy-task-definition@v1
        with:
          task-definition: ${{ steps.task-def.outputs.task-definition }}
          service: jobhive-web
          cluster: ${{ env.ECS_CLUSTER }}
          wait-for-service-stability: true

      - name: Update Celery Worker Service
        run: |
          # Fetch the current celery worker task definition
          aws ecs describe-task-definition --task-definition jobhive-celery-worker-task --query taskDefinition > celery-worker-task-def.json

          # Point it at the new image and strip the read-only fields
          # that register-task-definition rejects
          jq --arg IMAGE "${{ steps.build-image.outputs.image }}" \
            '.containerDefinitions[0].image = $IMAGE
             | del(.taskDefinitionArn, .revision, .status, .requiresAttributes,
                   .compatibilities, .registeredAt, .registeredBy)' \
            celery-worker-task-def.json > updated-celery-worker-task-def.json

          # Register the new revision
          aws ecs register-task-definition --cli-input-json file://updated-celery-worker-task-def.json

          # Roll the service onto it
          aws ecs update-service \
            --cluster ${{ env.ECS_CLUSTER }} \
            --service jobhive-celery-workers \
            --task-definition jobhive-celery-worker-task

      - name: Run Database Migrations
        run: |
          aws ecs run-task \
            --cluster ${{ env.ECS_CLUSTER }} \
            --task-definition jobhive-web-task \
            --launch-type FARGATE \
            --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
            --overrides '{
              "containerOverrides": [
                {
                  "name": "django-web",
                  "command": ["python", "manage.py", "migrate"]
                }
              ]
            }'

      - name: Collect Static Files
        run: |
          aws ecs run-task \
            --cluster ${{ env.ECS_CLUSTER }} \
            --task-definition jobhive-web-task \
            --launch-type FARGATE \
            --network-configuration "awsvpcConfiguration={subnets=[subnet-1a2b3c4d,subnet-5e6f7g8h],securityGroups=[sg-web-application],assignPublicIp=DISABLED}" \
            --overrides '{
              "containerOverrides": [
                {
                  "name": "django-web",
                  "command": ["python", "manage.py", "collectstatic", "--noinput"]
                }
              ]
            }'

      - name: Notify Deployment
        if: always()
        run: |
          if [ "${{ job.status }}" == "success" ]; then
            echo "✅ Deployment successful!"
          else
            echo "❌ Deployment failed!"
          fi
```
### 2. Staging Environment Workflow

#### `.github/workflows/deploy-staging.yml`

```yaml
name: Deploy to Staging

on:
  push:
    branches: [develop]
  pull_request:
    branches: [main]

env:
  AWS_REGION: us-east-1
  ECR_REPOSITORY: jobhive/backend
  ECS_CLUSTER: jobhive-staging

jobs:
  deploy-staging:
    name: Deploy to Staging
    runs-on: ubuntu-latest
    environment: staging
    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v2
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ${{ env.AWS_REGION }}

      - name: Run Tests
        run: |
          # Set up test environment
          python -m venv test-env
          source test-env/bin/activate
          pip install -r requirements/test.txt

          # Run test suite
          python manage.py test --settings=config.settings.test

          # Run linting
          flake8 .
          black --check .

      # - name: Build and Deploy
      #   ... (same build, push, and deploy steps as the production
      #   workflow, targeting the jobhive-staging cluster)
```
## Monitoring and Logging Setup

### 1. CloudWatch Configuration

```bash
# Create log groups
aws logs create-log-group --log-group-name /ecs/jobhive-web
aws logs create-log-group --log-group-name /ecs/jobhive-celery-worker
aws logs create-log-group --log-group-name /ecs/jobhive-celery-beat
# Set retention policies
aws logs put-retention-policy --log-group-name /ecs/jobhive-web --retention-in-days 30
aws logs put-retention-policy --log-group-name /ecs/jobhive-celery-worker --retention-in-days 30
aws logs put-retention-policy --log-group-name /ecs/jobhive-celery-beat --retention-in-days 7
```
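Retention aside, a metric filter over the web logs gives CloudWatch a number to alarm on for application errors. A sketch (the filter pattern and the `JobHive` namespace are assumptions):

```bash
aws logs put-metric-filter \
  --log-group-name /ecs/jobhive-web \
  --filter-name jobhive-web-errors \
  --filter-pattern "ERROR" \
  --metric-transformations \
    metricName=WebErrorCount,metricNamespace=JobHive,metricValue=1,defaultValue=0
```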
### 2. DataDog Integration

Add the following agent sidecar container to the task definitions (`datadog-agent-sidecar.json`):

```json
{
  "name": "datadog-agent",
  "image": "datadog/agent:latest",
  "essential": false,
  "environment": [
    { "name": "DD_API_KEY", "value": "your-datadog-api-key" },
    { "name": "DD_SITE", "value": "datadoghq.com" },
    { "name": "ECS_FARGATE", "value": "true" },
    { "name": "DD_LOGS_ENABLED", "value": "true" },
    { "name": "DD_LOGS_CONFIG_CONTAINER_COLLECT_ALL", "value": "true" }
  ],
  "logConfiguration": {
    "logDriver": "awslogs",
    "options": {
      "awslogs-group": "/ecs/datadog-agent",
      "awslogs-region": "us-east-1",
      "awslogs-stream-prefix": "ecs"
    }
  }
}
```
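The `DD_API_KEY` value is shown inline above for brevity; in practice it belongs in Secrets Manager like the other credentials, referenced from the container's `secrets` block rather than `environment`. A sketch of storing it (the secret name is an assumption):

```bash
# Hypothetical secret name — reference it from the sidecar's "secrets" block
aws secretsmanager create-secret \
  --name "jobhive/datadog" \
  --description "Datadog agent API key" \
  --secret-string '{"api_key": "your-datadog-api-key"}'
```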
## Health Checks and Monitoring

### 1. Application Health Check Endpoint

```python
# In Django settings
HEALTH_CHECK_SETTINGS = {
    'PING_URL': '/health/',
    'DATABASE_CHECK': True,
    'CACHE_CHECK': True,
    'CELERY_CHECK': True,
}

# views.py
from datetime import datetime

from django.core.cache import cache
from django.db import connections
from django.http import JsonResponse


def health_check(request):
    """Comprehensive health check endpoint."""
    health_status = {
        'status': 'healthy',
        'timestamp': datetime.now().isoformat(),
        'checks': {},
    }

    # Database check
    try:
        connections['default'].cursor()
        health_status['checks']['database'] = 'healthy'
    except Exception as e:
        health_status['checks']['database'] = f'unhealthy: {str(e)}'
        health_status['status'] = 'unhealthy'

    # Cache check
    try:
        cache.set('health_check', 'ok', 10)
        cache.get('health_check')
        health_status['checks']['cache'] = 'healthy'
    except Exception as e:
        health_status['checks']['cache'] = f'unhealthy: {str(e)}'
        health_status['status'] = 'unhealthy'

    # Celery check
    try:
        from celery import current_app
        inspect = current_app.control.inspect()
        stats = inspect.stats()
        if stats:
            health_status['checks']['celery'] = 'healthy'
        else:
            health_status['checks']['celery'] = 'no workers available'
    except Exception as e:
        health_status['checks']['celery'] = f'unhealthy: {str(e)}'

    status_code = 200 if health_status['status'] == 'healthy' else 503
    return JsonResponse(health_status, status=status_code)
```
### 2. CloudWatch Alarms

```bash
# High CPU alarm
aws cloudwatch put-metric-alarm \
  --alarm-name "jobhive-web-high-cpu" \
  --alarm-description "Alert when CPU exceeds 80%" \
  --metric-name CPUUtilization \
  --namespace AWS/ECS \
  --statistic Average \
  --period 300 \
  --evaluation-periods 2 \
  --threshold 80 \
  --comparison-operator GreaterThanThreshold \
  --dimensions Name=ServiceName,Value=jobhive-web Name=ClusterName,Value=jobhive-production \
  --alarm-actions arn:aws:sns:us-east-1:123456789:jobhive-alerts

# Database connection alarm
aws cloudwatch put-metric-alarm \
  --alarm-name "jobhive-db-connection-count" \
  --alarm-description "Alert when DB connections are high" \
  --metric-name DatabaseConnections \
  --namespace AWS/RDS \
  --statistic Average \
  --period 300 \
  --evaluation-periods 2 \
  --threshold 80 \
  --comparison-operator GreaterThanThreshold \
  --dimensions Name=DBInstanceIdentifier,Value=jobhive-db \
  --alarm-actions arn:aws:sns:us-east-1:123456789:jobhive-alerts
```
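Both alarms point at an SNS topic that must exist before they can notify anyone. Creating it and subscribing an on-call address is a one-time step (the email address is a placeholder):

```bash
# Create the alerting topic referenced by the alarms above
aws sns create-topic --name jobhive-alerts

# Subscribe an on-call address (confirm via the email AWS sends)
aws sns subscribe \
  --topic-arn arn:aws:sns:us-east-1:123456789:jobhive-alerts \
  --protocol email \
  --notification-endpoint [email protected]
```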
## Security Configuration

### 1. IAM Roles and Policies

ECS task execution role policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "ecr:GetAuthorizationToken",
        "ecr:BatchCheckLayerAvailability",
        "ecr:GetDownloadUrlForLayer",
        "ecr:BatchGetImage",
        "logs:CreateLogStream",
        "logs:PutLogEvents",
        "secretsmanager:GetSecretValue"
      ],
      "Resource": "*"
    }
  ]
}
```
Application task role policy:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject",
        "s3:DeleteObject",
        "s3:ListBucket"
      ],
      "Resource": [
        "arn:aws:s3:::jobhive-media-production/*",
        "arn:aws:s3:::jobhive-static-production/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "secretsmanager:GetSecretValue"
      ],
      "Resource": [
        "arn:aws:secretsmanager:us-east-1:123456789:secret:jobhive/*"
      ]
    }
  ]
}
```
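A sketch of wiring these policies to the roles the task definitions reference; the trust policy and file names here are assumptions:

```bash
# Trust policy letting ECS tasks assume the role
cat > ecs-trust-policy.json << 'EOF'
{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "Service": "ecs-tasks.amazonaws.com" },
    "Action": "sts:AssumeRole"
  }]
}
EOF

aws iam create-role --role-name jobhiveTaskRole \
  --assume-role-policy-document file://ecs-trust-policy.json

aws iam put-role-policy --role-name jobhiveTaskRole \
  --policy-name jobhive-task-policy \
  --policy-document file://task-role-policy.json
```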
### 2. Security Groups

```bash
# Web application security group
aws ec2 create-security-group \
  --group-name jobhive-web-sg \
  --description "Security group for JobHive web application"

# Allow traffic from ALB only
aws ec2 authorize-security-group-ingress \
  --group-id sg-web-application \
  --protocol tcp \
  --port 8000 \
  --source-group sg-alb-external

# Database security group - allow from web app only
aws ec2 authorize-security-group-ingress \
  --group-id sg-rds-database \
  --protocol tcp \
  --port 5432 \
  --source-group sg-web-application
```
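The ElastiCache cluster needs an analogous ingress rule; a sketch assuming a `sg-redis-cache` group guards the Redis nodes:

```bash
# Redis security group - allow from web app only
aws ec2 authorize-security-group-ingress \
  --group-id sg-redis-cache \
  --protocol tcp \
  --port 6379 \
  --source-group sg-web-application
```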
## Backup and Recovery

### 1. Database Backup

```bash
# Automated RDS snapshots are configured via Terraform
# Manual snapshot for deployment
aws rds create-db-snapshot \
  --db-instance-identifier jobhive-db \
  --db-snapshot-identifier jobhive-db-pre-deployment-$(date +%Y%m%d%H%M%S)
```
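Snapshot creation is asynchronous, so a deployment script that depends on it should capture the identifier and wait; a sketch repeating the command above with a variable:

```bash
SNAPSHOT_ID="jobhive-db-pre-deployment-$(date +%Y%m%d%H%M%S)"
aws rds create-db-snapshot \
  --db-instance-identifier jobhive-db \
  --db-snapshot-identifier "$SNAPSHOT_ID"

# Block until the snapshot is usable before continuing the deployment
aws rds wait db-snapshot-available --db-snapshot-identifier "$SNAPSHOT_ID"
```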
### 2. Application Data Backup

```bash
# S3 cross-region replication is configured via Terraform
# Manual backup script
aws s3 sync s3://jobhive-media-production s3://jobhive-backups-$(date +%Y%m%d)
```

## Rollback Procedures

### 1. Application Rollback

```bash
#!/bin/bash
# rollback.sh

PREVIOUS_TASK_DEFINITION_ARN="arn:aws:ecs:us-east-1:123456789:task-definition/jobhive-web-task:123"
CLUSTER_NAME="jobhive-production"

echo "Rolling back to previous task definition..."

# Rollback web service
aws ecs update-service \
  --cluster $CLUSTER_NAME \
  --service jobhive-web \
  --task-definition $PREVIOUS_TASK_DEFINITION_ARN

# Wait for rollback to complete
aws ecs wait services-stable \
  --cluster $CLUSTER_NAME \
  --services jobhive-web

echo "Rollback completed successfully!"
```
### 2. Database Rollback

```bash
# Restore from snapshot (if needed)
aws rds restore-db-instance-from-db-snapshot \
--db-instance-identifier jobhive-db-restored \
--db-snapshot-identifier jobhive-db-pre-deployment-20240215120000
```

## Post-Deployment Verification

### 1. Smoke Tests

```bash
#!/bin/bash
# smoke-tests.sh

API_BASE_URL="https://api.jobhive.com"

echo "Running post-deployment smoke tests..."

# Test health endpoint
echo "Testing health endpoint..."
health_response=$(curl -s -o /dev/null -w "%{http_code}" $API_BASE_URL/health/)
if [ "$health_response" -eq 200 ]; then
  echo "✅ Health check passed"
else
  echo "❌ Health check failed (HTTP $health_response)"
  exit 1
fi

# Test API authentication
echo "Testing API authentication..."
auth_response=$(curl -s -o /dev/null -w "%{http_code}" -X POST $API_BASE_URL/api/auth/login/ \
  -H "Content-Type: application/json" \
  -d '{"email":"[email protected]","password":"wrongpassword"}')
if [ "$auth_response" -eq 400 ]; then
  echo "✅ Authentication endpoint working"
else
  echo "❌ Authentication endpoint failed (HTTP $auth_response)"
  exit 1
fi

# Test database connectivity
echo "Testing database connectivity..."
db_test_response=$(curl -s $API_BASE_URL/health/ | jq -r '.checks.database')
if [ "$db_test_response" = "healthy" ]; then
  echo "✅ Database connectivity confirmed"
else
  echo "❌ Database connectivity failed"
  exit 1
fi

echo "All smoke tests passed! 🎉"
```
### 2. Performance Verification

```bash
# Load testing with Apache Bench
ab -n 100 -c 10 https://api.jobhive.com/health/
# Monitor response times
curl -w "@curl-format.txt" -s -o /dev/null https://api.jobhive.com/api/v1/users/me/
```
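The `-w "@curl-format.txt"` flag reads its output template from a file that this guide doesn't ship; a minimal version covering the usual timing phases (the variables are curl's own):

```bash
cat > curl-format.txt << 'EOF'
     time_namelookup:  %{time_namelookup}s\n
        time_connect:  %{time_connect}s\n
     time_appconnect:  %{time_appconnect}s\n
  time_starttransfer:  %{time_starttransfer}s\n
          time_total:  %{time_total}s\n
EOF
```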
