Backend Deployment - AWS EC2 GitHub Actions Deployment Setup

This guide walks you through setting up GitHub Actions to build, push, and deploy Docker images using AWS ECR and AWS SSM Run Command with a single-branch deployment strategy.

Deployment Strategy Overview

  • Single Branch: All work happens on main
  • Build on Push: Every push to main builds and pushes images to both stage and prod ECR repositories
  • Deploy by Release Type:
      • Pre-release → Deploy to stage environment
      • Release → Deploy to prod environment
  • SHA-based Deployment: Uses git commit SHA to ensure exact code deployment

Prerequisites

Before starting, ensure you have:

  • Working AWS CLI with appropriate permissions
  • Backend repository with Docker Compose files
  • EC2 instances already deployed using the manual deployment guide
  • ECR repositories created during manual deployment

1. Repository Setup

1.1 Branch Protection

  • Enable Require linear history (no merge commits)
  • Protect the main branch from direct pushes
  • Leave "Restrict who can push" unchecked (Enterprise-only feature)

1.2 GitHub Environments

Create two environments in your repository:

Settings → Environments

Stage Environment

Create environment: stage

Add these variables:

| Name | Value | Type | Notes |
| -------------- | ---------------------- | -------- | ------------------------------------------ |
| AWS_ACCOUNT_ID | <your_aws_account_id>  | Variable | AWS account IDs aren't considered secrets  |
| AWS_REGION     | us-east-1              | Variable | Your AWS region                            |
| PROJECT        | mow                    | Variable | Your project name                          |

Production Environment

Create environment: prod

Add the same variables as stage:

| Name | Value | Type | Notes |
| -------------- | ---------------------- | -------- | ------------------------------------------ |
| AWS_ACCOUNT_ID | <your_aws_account_id>  | Variable | AWS account IDs aren't considered secrets  |
| AWS_REGION     | us-east-1              | Variable | Your AWS region                            |
| PROJECT        | mow                    | Variable | Your project name                          |
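
The same variables can also be set from the command line with the GitHub CLI (a sketch; it assumes gh is authenticated for this repository and that both environments already exist):

# Set the shared variables on both GitHub environments
for GH_ENV in stage prod; do
  gh variable set AWS_ACCOUNT_ID --env "$GH_ENV" --body "<your_aws_account_id>"
  gh variable set AWS_REGION     --env "$GH_ENV" --body "us-east-1"
  gh variable set PROJECT        --env "$GH_ENV" --body "mow"
done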

2. AWS Setup

2.1 Environment Variables

Set these variables for use throughout the setup:

PROFILE="admin-cli-sso"
REGION="us-east-1"
PROJECT="mow"
REPO="github.com/your-org/your-repo-name"  # Replace with your actual repo
ACCOUNT_ID=$(aws sts get-caller-identity --query Account --output text --profile "$PROFILE")

2.2 Verify ECR Repositories

The ECR repositories should already exist from the manual deployment. Verify they're present:

# List ECR repositories for both environments
aws ecr describe-repositories \
  --repository-names \
    "${PROJECT}/backend/stage/django" \
    "${PROJECT}/backend/stage/caddy" \
    "${PROJECT}/backend/prod/django" \
    "${PROJECT}/backend/prod/caddy" \
  --region "$REGION" \
  --profile "$PROFILE" \
  --output table

If any repositories are missing, they'll be created automatically when the build workflow runs.
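
If you'd rather create a missing repository up front, a one-off call looks roughly like this (stage Django repository shown; scan-on-push is optional):

# Create a single missing ECR repository ahead of the first build (example: stage Django)
aws ecr create-repository \
  --repository-name "${PROJECT}/backend/stage/django" \
  --image-scanning-configuration scanOnPush=true \
  --region "$REGION" \
  --profile "$PROFILE"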

2.3 Configure GitHub OIDC Trust

One-time Setup

This step is done once per AWS account. If you've already set up GitHub Actions OIDC for other projects, you can skip this step.

Get the GitHub Actions OIDC thumbprint and create the identity provider:

THUMBPRINT=$(echo | openssl s_client -servername token.actions.githubusercontent.com \
  -connect token.actions.githubusercontent.com:443 2>/dev/null \
  | openssl x509 -fingerprint -noout -sha1 \
  | awk -F= '{print tolower($2)}' \
  | tr -d :)

aws iam create-open-id-connect-provider \
  --url https://token.actions.githubusercontent.com \
  --client-id-list sts.amazonaws.com \
  --thumbprint-list "$THUMBPRINT" \
  --profile "$PROFILE"

If you get an "EntityAlreadyExists" error, that's fine - the provider already exists.

2.4 Create GitHub Actions Deploy Roles

Create trust policy template:

cat <<EOF > /tmp/gh-oidc-trust-template.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Federated": "arn:aws:iam::<ACCOUNT_ID>:oidc-provider/token.actions.githubusercontent.com" },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com",
          "token.actions.githubusercontent.com:sub": "repo:<REPO>:environment:<ENVIRONMENT>"
        }
      }
    }
  ]
}
EOF

Create permissions policy template:

cat <<EOF > /tmp/gh-deploy-policy-template.json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "EcrPushPull",
      "Effect": "Allow",
      "Action": [
        "ecr:GetAuthorizationToken",
        "ecr:BatchCheckLayerAvailability",
        "ecr:CompleteLayerUpload",
        "ecr:UploadLayerPart",
        "ecr:InitiateLayerUpload",
        "ecr:PutImage",
        "ecr:BatchGetImage",
        "ecr:GetDownloadUrlForLayer",
        "ecr:DescribeRepositories",
        "ecr:ListImages",
        "ecr:DescribeImages"
      ],
      "Resource": "*"
    },
    {
      "Sid": "SendCommandDocument",
      "Effect": "Allow",
      "Action": "ssm:SendCommand",
      "Resource": [
        "arn:aws:ssm:<REGION>::document/AWS-RunShellScript",
        "arn:aws:ssm:<REGION>:<ACCOUNT_ID>:document/*"
      ]
    },
    {
      "Sid": "SendCommandToTaggedInstances",
      "Effect": "Allow",
      "Action": "ssm:SendCommand",
      "Resource": "arn:aws:ec2:<REGION>:<ACCOUNT_ID>:instance/*",
      "Condition": {
        "StringEquals": {
          "ssm:resourceTag/<PROJECT>:component": "backend",
          "ssm:resourceTag/<PROJECT>:environment": "<ENVIRONMENT>"
        }
      }
    },
    {
      "Sid": "ReadCommandResults",
      "Effect": "Allow",
      "Action": [
        "ssm:GetCommandInvocation",
        "ssm:ListCommands",
        "ssm:ListCommandInvocations"
      ],
      "Resource": "*"
    }
  ]
}
EOF

Create roles for both environments:

for ENV in stage prod; do
  # Create trust policy for this environment
  sed "s/<ACCOUNT_ID>/$ACCOUNT_ID/g; s|<REPO>|$REPO|g; s/<ENVIRONMENT>/$ENV/g" \
    /tmp/gh-oidc-trust-template.json > "/tmp/gh-oidc-trust-$ENV.json"

  # Create permissions policy for this environment  
  sed "s/<REGION>/$REGION/g; s/<ACCOUNT_ID>/$ACCOUNT_ID/g; s/<PROJECT>/$PROJECT/g; s/<ENVIRONMENT>/$ENV/g" \
    /tmp/gh-deploy-policy-template.json > "/tmp/gh-deploy-policy-$ENV.json"

  # Create the role
  aws iam create-role \
    --role-name "${PROJECT}-backend-deploy-${ENV}-gha-deployer" \
    --assume-role-policy-document "file:///tmp/gh-oidc-trust-${ENV}.json" \
    --profile "$PROFILE"

  # Attach the policy
  aws iam put-role-policy \
    --role-name "${PROJECT}-backend-deploy-${ENV}-gha-deployer" \
    --policy-name gha-ecr-ssm-deploy \
    --policy-document "file:///tmp/gh-deploy-policy-${ENV}.json" \
    --profile "$PROFILE"
done

If you get "EntityAlreadyExists" errors, delete the existing roles first:

# For each environment, delete existing roles
for ENV in stage prod; do
  aws iam delete-role-policy \
    --role-name "${PROJECT}-backend-deploy-${ENV}-gha-deployer" \
    --policy-name gha-ecr-ssm-deploy \
    --profile "$PROFILE" || true

  aws iam delete-role \
    --role-name "${PROJECT}-backend-deploy-${ENV}-gha-deployer" \
    --profile "$PROFILE" || true
done
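
In either case, you can confirm both roles exist and carry the expected OIDC trust conditions before wiring up the workflows:

# Show each deploy role's trust policy (check the repo and environment in the "sub" condition)
for ENV in stage prod; do
  aws iam get-role \
    --role-name "${PROJECT}-backend-deploy-${ENV}-gha-deployer" \
    --query "Role.AssumeRolePolicyDocument" \
    --profile "$PROFILE"
done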

2.5 Verify EC2 Instance Permissions

Your EC2 instances should already have the required permissions from the manual deployment. Verify they include:

Required Managed Policies:

  • AmazonEC2ContainerRegistryReadOnly (should already be attached)
  • AmazonSSMManagedInstanceCore (add if missing)

Check current policies:

# Replace with your actual instance role name
INSTANCE_ROLE_NAME="${PROJECT}-backend-ec2-stage-role"  # or prod

aws iam list-attached-role-policies \
  --role-name "$INSTANCE_ROLE_NAME" \
  --profile "$PROFILE"

If AmazonSSMManagedInstanceCore is missing, add it:

# For stage
aws iam attach-role-policy \
  --role-name "${PROJECT}-backend-ec2-stage-role" \
  --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore \
  --profile "$PROFILE"

# For prod  
aws iam attach-role-policy \
  --role-name "${PROJECT}-backend-ec2-prod-role" \
  --policy-arn arn:aws:iam::aws:policy/AmazonSSMManagedInstanceCore \
  --profile "$PROFILE"

2.6 Verify EC2 Instance Tags

Your instances must be tagged correctly for SSM targeting:

Stage Instance Tags:

| Key             | Value   |
| --------------- | ------- |
| mow:component   | backend |
| mow:environment | stage   |

Production Instance Tags:

| Key             | Value   |
| --------------- | ------- |
| mow:component   | backend |
| mow:environment | prod    |

Check instance tags:

# List instances by stage environment
aws ec2 describe-instances \
  --filters "Name=tag:${PROJECT}:environment,Values=stage" \
  --query "Reservations[].Instances[].{ID:InstanceId,State:State.Name,Tags:Tags}" \
  --region "$REGION" \
  --profile "$PROFILE" \
  --output table

# List instances by prod environment
aws ec2 describe-instances \
  --filters "Name=tag:${PROJECT}:environment,Values=prod" \
  --query "Reservations[].Instances[].{ID:InstanceId,State:State.Name,Tags:Tags}" \
  --region "$REGION" \
  --profile "$PROFILE" \
  --output table
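
If a tag is missing, it can be added in place (a sketch; substitute your real instance ID and environment):

# Add the required tags to an instance that is missing them (example: stage)
aws ec2 create-tags \
  --resources i-0123456789abcdef0 \
  --tags "Key=${PROJECT}:component,Value=backend" "Key=${PROJECT}:environment,Value=stage" \
  --region "$REGION" \
  --profile "$PROFILE"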

3. GitHub Actions Workflows

3.1 Workflow Structure

The deployment uses three workflow files in .github/workflows/:

  1. build-images.yml - Builds and pushes images on every push to main
  2. deploy-stage.yml - Deploys to stage on pre-releases
  3. deploy-prod.yml - Deploys to production on releases

3.2 Docker Compose Deploy Configuration

The docker-compose.deploy.yml file uses environment variables for flexible deployments:

services:
  django:
    image: ${ECR_URI}/${PROJECT}/backend/${ENVIRONMENT}/django:${DEPLOY_TAG}
  # ... other services follow the same pattern

Key variables:

  • ECR_URI: Computed from AWS account ID and region
  • PROJECT: From GitHub environment variables
  • ENVIRONMENT: Set by deployment workflow (stage or prod)
  • DEPLOY_TAG: Set to the commit SHA being deployed
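
To make the interplay concrete, here is a rough sketch of what a deploy ultimately runs on the instance; the values are illustrative, the base compose file name is an assumption, and the real commands are issued by the deploy workflow via SSM Run Command:

# Illustrative only: how the deploy-time variables combine on the instance
export AWS_REGION="us-east-1"
export AWS_ACCOUNT_ID="123456789012"          # placeholder account ID
export ECR_URI="${AWS_ACCOUNT_ID}.dkr.ecr.${AWS_REGION}.amazonaws.com"
export PROJECT="mow"
export ENVIRONMENT="stage"
export DEPLOY_TAG="abc123def456"              # commit SHA being deployed

# Authenticate Docker to ECR, then pull and start the exact images for that SHA
aws ecr get-login-password --region "$AWS_REGION" \
  | docker login --username AWS --password-stdin "$ECR_URI"

cd /opt/app
docker compose -f docker-compose.yml -f docker-compose.deploy.yml pull
docker compose -f docker-compose.yml -f docker-compose.deploy.yml up -d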


4. Deployment Process

4.1 Development Workflow

  1. Develop and push changes to main
  2. GitHub Actions automatically:
      • Builds multi-platform Docker images
      • Pushes to both stage and prod ECR repositories
      • Tags with commit SHA and environment names

4.2 Stage Deployment

  1. Create a pre-release on GitHub (a gh CLI alternative is sketched after this list):
      • Go to Releases → Create a new release
      • Tag: Create a new tag (e.g., v1.0.0, v1.2.3-hotfix)
      • Target: Select the latest commit on main
      • Title: Descriptive release title
      • ☑️ Set as a pre-release (this is key!)
      • Publish release

  2. Automatic deployment:
      • Pre-release triggers the stage deploy workflow
      • Workflow resolves git tag to commit SHA
      • Deploys images tagged with that SHA from stage ECR repos
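
The pre-release can also be created from the command line (a sketch; the tag and title are illustrative):

# Create a pre-release targeting the latest commit on main (triggers the stage deploy)
gh release create v1.2.3 \
  --target main \
  --title "v1.2.3 stage rollout" \
  --prerelease \
  --generate-notes

# For production, omit --prerelease so it is published as a regular release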

4.3 Production Deployment

  1. Create a release on GitHub:
      • Same process as stage, but do not check "Set as a pre-release"
      • Leave it as a regular release

  2. Automatic deployment:
      • Release triggers the production deploy workflow
      • Deploys the same SHA to production environment

4.4 Manual Deployment

Both workflows support manual triggering (a CLI equivalent is sketched below the steps):

  1. Go to Actions → Select deploy workflow
  2. Run workflow → Enter the git tag to deploy
  3. Workflow runs with the specified tag
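
The same dispatch works from the GitHub CLI (a sketch; the "tag" input name is an assumption, so match it to your workflow's declared inputs):

# Manually dispatch the stage deploy for a specific tag
gh workflow run deploy-stage.yml --ref main -f tag=v1.2.3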

5. SHA-Based Deployment Explained

When you create a release with tag v1.2.3, the deploy workflow:

# Resolves the git tag to its commit SHA
SHA=$(git rev-list -n 1 "v1.2.3")  # → abc123def456

# Sets DEPLOY_TAG to use that specific SHA
export DEPLOY_TAG="abc123def456"

# Docker Compose pulls exact images
# e.g., mow/backend/stage/django:abc123def456

This ensures you deploy exactly the code that was tagged, using images that definitely exist since the build workflow creates them with SHA tags.
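
Before deploying, you can confirm that the image for a given tag's SHA actually exists in ECR (stage Django repository shown as an example):

# Resolve a release tag to its commit SHA and confirm the corresponding image exists
SHA=$(git rev-list -n 1 "v1.2.3")

aws ecr describe-images \
  --repository-name "${PROJECT}/backend/stage/django" \
  --image-ids imageTag="$SHA" \
  --region "$REGION" \
  --profile "$PROFILE"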


6. Verification and Debugging

6.1 Verify GitHub Actions

Check that workflows are properly configured:

  1. Go to Actions tab in your repository
  2. Look for three workflows (they can also be listed from the CLI, as shown below):
      • "Build & Push Images"
      • "Deploy to Stage"
      • "Deploy to Production"

6.2 Test Build Workflow

Push a small change to main and verify:

  1. Build workflow triggers automatically
  2. Matrix strategy builds for both stage and prod
  3. Images are pushed to ECR with SHA and environment tags
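
To confirm step 3 above, check that freshly pushed tags show up in ECR (stage Django repository shown; the JMESPath sort is just one convenient way to surface the newest images):

# Show the five most recently pushed images in the stage Django repository
aws ecr describe-images \
  --repository-name "${PROJECT}/backend/stage/django" \
  --query "sort_by(imageDetails,&imagePushedAt)[-5:].{Pushed:imagePushedAt,Tags:imageTags}" \
  --region "$REGION" \
  --profile "$PROFILE" \
  --output table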

6.3 Test Stage Deployment

  1. Create a pre-release following the process above
  2. Monitor the deploy workflow in the Actions tab
  3. Check SSM command execution:

# List recent SSM commands
aws ssm list-commands --region "$REGION" --profile "$PROFILE" --max-items 5

# Check command status
aws ssm list-command-invocations \
  --command-id <COMMAND_ID> \
  --details \
  --region "$REGION" \
  --profile "$PROFILE"

6.4 Verify Deployment on EC2

SSH into your instance and check:

# Check running containers
docker ps

# Check container start times (should be recent)
docker inspect -f '{{.Name}} -> {{.State.StartedAt}}' $(docker ps -q) | sort

# Check docker compose status
cd /opt/app
docker compose ps

# Check application logs
docker compose logs --tail=20 django

6.5 Common Issues and Solutions

| Issue | Solution |
| ----------------------------- | -------------------------------------------------- |
| Role assumption fails         | Check OIDC trust policy repo name matches exactly   |
| SSM command not found         | Verify instance tags and SSM agent running          |
| Image pull fails              | Ensure ECR repositories exist and images were built |
| Environment variables missing | Check SSM Parameter Store configuration             |

7. Rollback Process

Since images are tagged with commit SHAs, rollbacks are straightforward:

  1. Find the previous working release in your GitHub releases
  2. Create a new release with a new tag pointing to the old commit
  3. Deploy as normal - the workflow will use the older commit's images

Alternatively, use manual workflow dispatch with the previous tag.


8. Security Considerations

  • OIDC tokens are short-lived and scoped to specific repositories
  • IAM roles use least-privilege permissions
  • ECR repositories are environment-isolated
  • SSM commands target only properly tagged instances
  • GitHub tokens have limited scope and automatic expiration

9. Monitoring and Observability

9.1 GitHub Actions

  • Monitor workflow runs in the Actions tab
  • Set up notifications for workflow failures
  • Review deployment history through release tags

9.2 AWS CloudWatch

  • SSM command execution logs
  • EC2 instance system logs
  • Application logs via Docker logging driver

9.3 Application Monitoring

The deployed stack includes Grafana, Prometheus, and other observability tools configured through the base Docker Compose setup.


Conclusion

This GitHub Actions setup provides:

  • Automated builds on every commit
  • Environment isolation through separate ECR repositories
  • Safe deployments using SHA-based image tagging
  • Flexible release management through GitHub releases
  • Infrastructure as Code approach to CI/CD

The deployment process scales well and provides clear traceability from git commits to deployed applications.

