With AWS, Terraform, and GitHub Actions
DevOps is more than just tools — it’s a philosophy aimed at improving collaboration between development and operations. That said, a common question is: “Where do I start?” This guide offers a practical answer.
You won’t find “DevOps in 5 steps” here — but you will find a simple way to start working with tools that can be part of a DevOps culture: AWS as the infrastructure provider, Terraform as the declarative language for provisioning, and GitHub Actions for automation.
Reminder: using these tools alone doesn’t make you a DevOps engineer. But learning them prepares you to contribute to teams that apply real DevOps practices.
The goal: a public S3 bucket, provisioned using Terraform and deployed automatically from GitHub Actions.
Good structure leads to maintainable code. Define a folder layout that supports growth, new environments, and consistent practices.
dev/
├── us-east-1/
│   └── s3/
│       ├── terraform.tf
│       ├── main.tf
│       └── outputs.tf
stage/
├── us-east-1/
│   └── s3/
prod/
├── us-east-1/
│   └── s3/
modules/
└── s3/
    ├── main.tf
    ├── variables.tf
    └── outputs.tf
We define three environments: dev, stage, and prod. We also define a reusable Terraform module named s3.
The first folder inside each environment defines the AWS region. The next level defines the specific module used to deploy resources.
- terraform.tf: defines the backend and providers.
- main.tf: the main logic; instantiates modules and resources.
- variables.tf: declares inputs (e.g., bucket name).
- outputs.tf: declares outputs (e.g., the bucket URL).

First, define the backend; this keeps Terraform’s state safely stored. Add this to terraform.tf:
terraform {
  backend "s3" {
    bucket = "myorg-state-buckets"
    key    = "s3.tfstate"
    region = "us-east-1"
  }
}
Terraform uses a local backend by default. That works on your machine, but the state won’t persist across GitHub Actions runs, which is why we use a remote backend (like S3 or Terraform Cloud). In this example, we’ll use an S3 bucket. Note that the state bucket itself must already exist before terraform init runs; Terraform can’t store its state in a bucket it hasn’t created yet.
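If you don’t have a state bucket yet, a minimal one-off bootstrap config could look like the sketch below (apply it once with local state; the bucket name matches the backend config above, everything else is illustrative):

# bootstrap/main.tf -- hypothetical one-off config, applied once with local state
provider "aws" {
  region = "us-east-1"
}

# The bucket that will hold Terraform state; name must match the backend config
resource "aws_s3_bucket" "tf_state" {
  bucket = "myorg-state-buckets"
}

# Versioning is optional but recommended, so older state can be recovered
resource "aws_s3_bucket_versioning" "tf_state" {
  bucket = aws_s3_bucket.tf_state.id
  versioning_configuration {
    status = "Enabled"
  }
}

With the state bucket in place, the complete terraform.tf also pins the AWS provider: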
terraform {
  backend "s3" {
    ...
  }

  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "5.99.1"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}
Now let’s create the bucket in modules/s3/main.tf:
# The bucket itself
resource "aws_s3_bucket" "example" {
  bucket = var.bucket_name
}

# New buckets disable ACLs by default; BucketOwnerPreferred re-enables them
# so we can attach a public ACL below
resource "aws_s3_bucket_ownership_controls" "example" {
  bucket = aws_s3_bucket.example.id

  rule {
    object_ownership = "BucketOwnerPreferred"
  }
}

# Disable the bucket-level public access blocks so public access is allowed
resource "aws_s3_bucket_public_access_block" "example" {
  bucket = aws_s3_bucket.example.id

  block_public_acls       = false
  block_public_policy     = false
  ignore_public_acls      = false
  restrict_public_buckets = false
}

# Apply the public-read ACL once the two settings above are in place
resource "aws_s3_bucket_acl" "example" {
  depends_on = [
    aws_s3_bucket_ownership_controls.example,
    aws_s3_bucket_public_access_block.example,
  ]

  bucket = aws_s3_bucket.example.id
  acl    = "public-read"
}
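As a side note, ACLs are a legacy mechanism, and AWS generally recommends bucket policies for public access these days. A sketch of the same result via a policy (an alternative, not part of the module above):

# Hypothetical alternative: public read via a bucket policy instead of an ACL
resource "aws_s3_bucket_policy" "public_read" {
  bucket = aws_s3_bucket.example.id

  # The public access block must allow public policies first
  depends_on = [aws_s3_bucket_public_access_block.example]

  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Sid       = "PublicReadGetObject"
      Effect    = "Allow"
      Principal = "*"
      Action    = "s3:GetObject"
      Resource  = "${aws_s3_bucket.example.arn}/*"
    }]
  })
}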
We use var.bucket_name, which is declared in modules/s3/variables.tf:
variable "bucket_name" {
  type        = string
  description = "Name of the bucket to create"
}
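Optionally, a validation block can reject invalid names at plan time rather than at apply time. A sketch (the regex is a simplification of the real S3 naming rules):

variable "bucket_name" {
  type        = string
  description = "Name of the bucket to create"

  # Simplified check: 3-63 chars of lowercase letters, digits, dots, or hyphens
  validation {
    condition     = can(regex("^[a-z0-9.-]{3,63}$", var.bucket_name))
    error_message = "Bucket names must be 3-63 characters: lowercase letters, digits, dots, or hyphens."
  }
}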
Let’s also define an output to get the bucket URL. Add this in modules/s3/outputs.tf:
output "bucket_domain_name" {
  value = aws_s3_bucket.example.bucket_domain_name
}
Your module is ready! You can now deploy the bucket in dev using dev/us-east-1/s3/main.tf:
module "my_awesome_bucket" {
  source      = "../../../modules/s3"
  bucket_name = "dev-awesome-bucket"
}
This code deploys a public S3 bucket in the dev environment.
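The layout above also lists an outputs.tf in the environment folder. To surface the module’s output at the environment level (so terraform output can print it), re-export it, for example in dev/us-east-1/s3/outputs.tf:

# Re-export the module output so it shows up in `terraform output`
output "bucket_domain_name" {
  value = module.my_awesome_bucket.bucket_domain_name
}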
With the infrastructure code done, let’s automate the deployment. First, authenticate GitHub Actions with AWS.
Go to your repo → Settings → Secrets and variables → Actions, and create these secrets:
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
Then add this workflow at .github/workflows/deploy.yml:
name: Deploy S3 Bucket

on:
  push:

jobs:
  deploy:
    runs-on: ubuntu-latest
    defaults:
      run:
        # Run Terraform from the environment folder, not the repo root
        working-directory: dev/us-east-1/s3
    steps:
      - uses: actions/checkout@v4

      - uses: hashicorp/setup-terraform@v3
        with:
          terraform_version: 1.6.6

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4.1.0
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Terraform Init
        run: terraform init

      - name: Terraform Apply
        run: terraform apply -auto-approve
That was your first step. Now you can extend the setup to the stage and prod environments, add more modules, and harden the pipeline.
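For instance, promoting to stage is just another module call, this time in stage/us-east-1/s3/main.tf (a sketch; the bucket name is illustrative):

module "my_awesome_bucket" {
  source      = "../../../modules/s3"
  bucket_name = "stage-awesome-bucket"
}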
Learning DevOps isn’t about memorizing commands — it’s about understanding how to connect tools to build reliable pipelines. And today, you took a great first step!