Chunk 4: Navigating IAC and CI/CD in the AWS Cloud Resume Challenge

This chunk is the last part of the AWS Cloud Resume Challenge, where the working system has to be ported to IAC and backed by CI/CD for both the front-end and back-end processes of the system. I decided to use Terraform for IAC because it is a cloud-agnostic tool. For CI/CD, I chose GitHub Actions, as it is already integrated into GitHub, which is what I use for version control.

Here's the link to the website: Don Angeles | AWS SAA, PMI CAPM

Typing the Infrastructure

Much of the difficulty in my experience with implementing the system through IAC came from choosing to import existing resources into Terraform, as detailed below.

Deciding to import resources instead of destroying everything and recreating it from scratch led me to search for an efficient way to do so, rather than importing each resource one by one. I discovered tools built for this, like Terracognita. Although it does a great job of importing resources, I found that much of the work is abstracted away, whereas I needed to understand more about how and what gets imported where. This meant I had to use Terraform's built-in manual method of importing resources.

Fortunately, I discovered that Terraform had recently released (>= v1.5) import blocks, which enable the generation of resource blocks given the ID of the resource.

    import {
      to = aws_s3_bucket.test_bucket
      id = "tf-import-test-bucket"
    }

Running terraform plan -generate-config-out=generated_resources.tf would then generate a file containing the resource block.

  # __generated__ by Terraform
  # Please review these resources and move them into your main configuration files.

  # __generated__ by Terraform from "tf-import-test-bucket"
  resource "aws_s3_bucket" "test_bucket" {
    bucket              = "tf-import-test-bucket"
    bucket_prefix       = null
    force_destroy       = null
    object_lock_enabled = false
    tags                = {}
    tags_all            = {}
  }

I repeated this step for all resources, but as posted in a previous blog, I also imported existing resources into modules to visually simplify the Terraform configuration files.
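For illustration, importing into a module only changes the address in the import block's to argument. A minimal sketch, where the module name "website" and the resource label are placeholders rather than my actual configuration:

```hcl
# Hypothetical example: "website" and "site_bucket" are placeholder names.
import {
  to = module.website.aws_s3_bucket.site_bucket
  id = "tf-import-test-bucket"
}
```

After running terraform plan with -generate-config-out, the generated resource block still has to be moved into the module's own configuration files by hand.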

I told Gian, a dear friend of mine, that it felt magical the first time I successfully created an S3 bucket using Terraform. Even now, every time I make changes to my Terraform config and apply them, I am still amazed that I don't need to click through the console for the changes to take effect.

Tying Up Everything with CI/CD

The goal of CI/CD is to streamline and accelerate the development and deployment of a system by ensuring that tests run as changes are made and that deployments are rapid and reliable. GitHub Actions enables CI/CD through the automation of workflows directly in a GitHub repository. Workflows to test and deploy can be triggered by events in a repo, such as a push or a pull request. It is important to note that proper AWS credentials are needed for the workflows to successfully run the actions.

I created a back-end workflow that handles changes to the infrastructure and a front-end workflow that handles changes to the website interface. Separating the workflows is necessary so that only the tests and commands related to what's being modified are run.

name: Backend CI/CD

on:
  push:
    paths:
      - '**.tf'
      - '**.py'

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    env:  # Declare environment variables at the job level
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_REGION: 'us-east-1' 

    steps:
      - uses: actions/checkout@v2

      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.x'

      - name: Install Python dependencies
        run: pip install -r requirements.txt

      - name: Run Python tests
        run: pytest

      - name: Set up Terraform
        uses: hashicorp/setup-terraform@v1

      - name: Terraform Init
        run: terraform init

      - name: Terraform Plan
        run: terraform plan

      - name: Terraform Apply
        if: github.ref == 'refs/heads/main'
        run: terraform apply -auto-approve

The above is my back-end workflow: whenever changes are made to my Lambda function, on push to the GitHub repo, Pytest runs to check that the function works as intended, and only if the tests pass does Terraform run and apply the changes in my AWS account.

For the front-end workflow, when any of the website resources, such as the HTML code, is modified, the changes are synced with the S3 bucket and the CloudFront distribution's cache is invalidated so the website reflects the changes almost instantly.
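The front-end workflow follows the same shape as the back-end one. A sketch of what it does is below; the website/ path, bucket name, and distribution ID secrets are placeholders, not my actual values:

```yaml
name: Frontend CI/CD

on:
  push:
    paths:
      - 'website/**'  # assumed location of the HTML/CSS/JS files

jobs:
  sync-and-invalidate:
    runs-on: ubuntu-latest
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_REGION: 'us-east-1'

    steps:
      - uses: actions/checkout@v2

      # Mirror the site files to the bucket, deleting anything removed locally
      - name: Sync website files to S3
        run: aws s3 sync website/ s3://${{ secrets.S3_BUCKET_NAME }} --delete

      # Clear CloudFront's cached copies so the new files are served right away
      - name: Invalidate CloudFront cache
        run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }}
          --paths "/*"
```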

I have always been comfortable simply making changes and running commands by hand each time, but completing this part of the project imbibed in me the value of automated testing and deployment. Employing automation saved a lot of time compared to making manual changes and tests, and helped ensure that fewer errors were deployed. As of writing, I have 17 combined workflow runs, amounting to only around 11 minutes to test and deploy those changes. Each workflow run points exactly to where errors are caught, if any, which allowed me to apply fixes to my code more quickly.

Finally, Complete

The implementation of IAC and the CI/CD pipeline marks the culmination of the challenge, and it is a part I am truly delighted to have learned. Embracing IAC allowed for efficient infrastructure management, while CI/CD streamlined the deployment process, helping ensure robust and error-free releases. As mentioned in the challenge book, this chunk shows how professional Cloud Engineers deploy services and why aspiring ones should adopt a mindset of continuous improvement and automation.