Automation That Saved My Team Months of Toil
Automate Lambda Layer management with Terraform and GitHub Actions – A complete guide from pain to automation
I started my DevOps career on Google Cloud Platform, using Cloud Functions for event-driven serverless workloads. It was pretty straightforward: package the Python code with its `requirements.txt`, and GCP takes care of everything else.
Then I moved to AWS and discovered Lambda, their popular serverless offering. It was great, except for one headache: managing code dependencies.
I was used to just supplying a `requirements.txt` file, but that doesn’t work with Lambda. Instead, you have two options: bundle all your libraries with your code in a big zip file (which is a pain to manage), or use something called Lambda Layers.
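To see why the first option gets painful, here is a minimal sketch of the bundle-everything approach (the `lambda_function.py` name is a placeholder):

# Install dependencies next to the handler and zip everything together
pip install -r requirements.txt -t package/
cp lambda_function.py package/
cd package && zip -r ../deployment.zip . && cd ..
# deployment.zip is then uploaded as the function's code

Every dependency update means rebuilding and re-uploading this entire zip, which is exactly the toil Layers avoid.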
Before I explain Layers and how to use them effectively, let me give you a quick rundown on Lambda and serverless computing in general.
Table of Contents
- What is Serverless?
- What is AWS Lambda?
- What are Lambda Layers?
- How to Build Lambda Layers
- The Automation Solution
- Implementation Guide
- Running the Automation
What is Serverless?
Don’t let the term “serverless” fool you — it doesn’t mean servers have vanished into thin air. Instead, think of it as “servers you don’t see.” In this model, your cloud provider manages the resources for you, handling all the nitty-gritty of server maintenance.
At its core, serverless computing is about simplifying your life as a developer or business owner. It’s a world where you can focus on what truly matters — your code and business logic — without getting bogged down in the complexities of managing the machines that run it.
That’s the promise of serverless computing — it’s about freeing you to innovate, while your cloud provider handles the heavy lifting of infrastructure management.
What is AWS Lambda?
Lambda is the Function as a Service (serverless) offering by AWS. It uses an event-driven architecture, meaning it runs a piece of code when some event occurs. It integrates well with other AWS services such as DynamoDB, S3, SQS, API Gateway, etc.
Your code may be just a couple of lines or spread across a couple of files, but together with the 15 different packages needed to tie everything together, the bundle you upload is referred to as a “deployment package.”
It is best practice to manage the code dependencies separately and this is where Lambda Layers come in handy.
What are Lambda Layers?
To quote AWS documentation:
A Lambda layer is a .zip file archive that contains supplementary code or data. Layers usually contain library dependencies, a custom runtime, or configuration files.
In simple words, if your Python code needs 5, 10, or 15 libraries, you package those libraries separately, and Lambda makes them available to your function at runtime.
Key benefits of Lambda Layers:
- 📦 Separate dependencies from your application code
- 🔄 Share layers across multiple Lambda functions
- 🌐 Share layers across multiple AWS accounts
- 💾 Reduce deployment package size
- ⚡ Faster deployments and cold starts
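Once a layer version exists, attaching it to a function is a one-liner. A sketch with a placeholder function name and layer ARN:

aws lambda update-function-configuration \
  --function-name my-function \
  --layers arn:aws:lambda:ap-south-1:123456789012:layer:layer1:1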
How to Build Lambda Layers
Manual Method (The Old Way)
The simplest way to build a layer is to install the Python dependencies on your local machine inside a top-level `python/` directory, zip that directory, and upload it as a Lambda layer. Your Lambda function can then use the layer.
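A minimal sketch of this manual flow (the layer name is a placeholder; run it on Linux, as the note below explains):

# Dependencies must sit under a top-level python/ directory in the zip
mkdir -p python
pip install -r requirements.txt -t python/
zip -r layer.zip python/

# Upload it as a layer version
aws lambda publish-layer-version \
  --layer-name my-layer \
  --zip-file fileb://layer.zip \
  --compatible-runtimes python3.10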
⚠️ Important Note: Creating the zip package in a non-Linux environment (even macOS) will NOT work with Lambda, because compiled dependencies must match Lambda’s Linux runtime.
Docker Method
Alternatively, you can use Docker with the `--platform linux/amd64` flag to build the layer with the help of the `sam/build-python` Docker images:

docker run --platform linux/amd64 \
  -v "$PWD":/var/task \
  "public.ecr.aws/sam/build-python3.10" \
  /bin/sh -c "pip install -r requirements.txt -t python/; exit"
Cross-Account Sharing
If you want to share your Layer in a different AWS account, add permissions by running:
aws lambda add-layer-version-permission \
  --layer-name <Layer_name> \
  --statement-id xaccount \
  --action lambda:GetLayerVersion \
  --principal <AWS_ACCOUNT_ID> \
  --version-number <Layer_Version>
The Problem: Managing Lambda Layers manually becomes a real pain when, every other month, you need to bump library versions to keep your security team off your back.
The Automation Solution
Automating Layer Build with Terraform and GitHub Actions
Before we start writing Terraform, create the following structure in your repository (note that `.github/workflows/` must sit at the repository root for GitHub Actions to pick up the workflow):

. (repository root)
├── .github/
│   └── workflows/
│       └── build-lambda-layers.yml
└── lambda-python-layers/
    ├── layers/
    │   ├── layer1/
    │   │   └── requirements.txt
    │   ├── layer2/
    │   │   └── requirements.txt
    │   └── common-scripts/
    │       ├── logger.py
    │       └── db_connection.py
    ├── providers.tf
    ├── variables.tf
    ├── config.tf
    ├── s3.tf
    └── main.tf
Defining Python Dependencies for Layers
layers/layer1/requirements.txt
aws-psycopg2
requests==2.32.3
urllib3==2.2.2
layers/layer2/requirements.txt
pyzipper==0.3.6
pycryptodome==3.20.0
cryptography==41.0.7
PyPDF2==3.0.1
Sometimes Lambda functions also share common scripts, such as a logger or a DB-connection helper. We can package these with the dependencies to avoid bundling them into every function’s code or building separate layers for them. Place those Python scripts in the `layers/common-scripts/` directory; since they end up under the `python/` prefix in the zip, your function code can import them directly (e.g. `import logger`).
Implementation Guide
Writing Terraform Configuration
💡 Note: You can find the complete code for this blog in my public GitHub repo.
providers.tf
Set up the Terraform providers and the remote backend, using an S3 bucket created beforehand to store the Terraform state files:
terraform {
required_version = ">= 1.6"
required_providers {
aws = {
source = "hashicorp/aws"
version = "~> 5.0"
}
}
backend "s3" {
bucket = "your-terraform-state-bucket"
key = "lambda-layers/terraform.tfstate"
region = "ap-south-1"
}
}
provider "aws" {
region = var.aws_region
}
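If the state bucket doesn’t exist yet, you can create it once by hand. A sketch assuming the placeholder bucket name and region from the backend block above:

aws s3api create-bucket \
  --bucket your-terraform-state-bucket \
  --region ap-south-1 \
  --create-bucket-configuration LocationConstraint=ap-south-1
aws s3api put-bucket-versioning \
  --bucket your-terraform-state-bucket \
  --versioning-configuration Status=Enabled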
variables.tf
We will store default values for the Python layers, such as the runtime version used for `compatible_runtimes`:
variable "aws_region" {
description = "AWS region"
type = string
default = "ap-south-1"
}
variable "python_runtime" {
description = "Python runtime version"
type = string
default = "python3.10"
}
config.tf
We will create local variables to store the layer details such as layer name and path to the dependencies:
locals {
layer_definitions = [
{
identifier = "layer1"
path = "${path.module}/layers/layer1"
},
{
identifier = "layer2"
path = "${path.module}/layers/layer2"
}
]
}
s3.tf
Use a data source to fetch the AWS account ID and use it as a prefix to keep the S3 bucket name globally unique; the bucket stores the layer packages in `.zip` format:
data "aws_caller_identity" "current" {}
resource "aws_s3_bucket" "lambda_layers" {
bucket = "${data.aws_caller_identity.current.account_id}-lambda-layers-bucket"
}
resource "aws_s3_bucket_versioning" "lambda_layers" {
bucket = aws_s3_bucket.lambda_layers.id
versioning_configuration {
status = "Enabled"
}
}
main.tf
This uses the community Lambda module (`terraform-aws-modules/lambda/aws`) to build the Python layers. The common scripts are packaged into each layer alongside the dependencies for reusability:
module "layers" {
source = "terraform-aws-modules/lambda/aws"
version = "~> 6.0"
for_each = { for i in local.layer_definitions : i.identifier => i }
create_layer = true
layer_name = each.value.identifier
description = "Lambda layer for ${each.value.identifier}"
compatible_runtimes = [var.python_runtime]
source_path = [
{
path = each.value.path
pip_requirements = true
prefix_in_zip = "python"
},
{
path = "${path.module}/layers/common-scripts"
prefix_in_zip = "python"
}
]
store_on_s3 = true
s3_bucket = aws_s3_bucket.lambda_layers.bucket
}
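After `terraform apply`, you can confirm the published versions from the CLI using the layer names defined in config.tf:

aws lambda list-layer-versions --layer-name layer1
aws lambda list-layer-versions --layer-name layer2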
Cross-Account Layer Sharing (Optional)
If you want to share the layers across multiple accounts, add these configurations:
Append to config.tf:
locals {
# List of layer names
layer_names = [for i in local.layer_definitions : i.identifier]
# Accounts that should have permission on layer version
allowed_accounts = ["AWS_ACCOUNT_ID_1", "AWS_ACCOUNT_ID_2"]
# Mapping layers -> aws accounts for permission on layer version
layers_to_accounts = flatten([
for layer in local.layer_names : [
for account in local.allowed_accounts : {
id = "${layer}-${account}"
layer = layer
account = account
}
]
])
# Map to be used by for_each loop for resource
layers_to_accounts_map = { for item in local.layers_to_accounts : item.id => item }
}
Append to main.tf:
resource "aws_lambda_layer_version_permission" "lambda_layer_permission" {
for_each = local.layers_to_accounts_map
layer_name = module.layers[each.value.layer].lambda_layer_layer_arn
version_number = module.layers[each.value.layer].lambda_layer_version
principal = each.value.account
action = "lambda:GetLayerVersion"
statement_id = "${each.value.layer}-${each.value.account}-${random_integer.random.result}"
}
resource "random_integer" "random" {
min = 1
max = 1000
}
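To verify the grant, you can fetch the layer version from one of the consuming accounts (the ARN below is a placeholder):

aws lambda get-layer-version-by-arn \
  --arn arn:aws:lambda:ap-south-1:123456789012:layer:layer1:1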
GitHub Actions Workflow
Here is the GitHub Actions workflow that will build the layers:
.github/workflows/build-lambda-layers.yml
name: Build Lambda Layers with Terraform
on:
# On push to main branch when there is a change in dependencies
push:
branches:
- main
paths:
      - 'lambda-python-layers/layers/**'
# Run manually
workflow_dispatch:
jobs:
terraform:
runs-on: ubuntu-latest
env:
AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
AWS_REGION: ap-south-1
TERRAFORM_VER: 1.6.6
TERRAFORM_PATH: lambda-python-layers/
PYTHON_VERSION: '3.10'
steps:
- name: Checkout Repository
uses: actions/checkout@v4
- name: Setup Terraform
uses: hashicorp/setup-terraform@v3
with:
terraform_version: ${{ env.TERRAFORM_VER }}
- name: Setup Python
uses: actions/setup-python@v5
with:
python-version: ${{ env.PYTHON_VERSION }}
- name: Terraform Init
working-directory: ${{ env.TERRAFORM_PATH }}
run: terraform init
- name: Terraform Plan
working-directory: ${{ env.TERRAFORM_PATH }}
run: terraform plan -out=tfplan
- name: Terraform Apply
working-directory: ${{ env.TERRAFORM_PATH }}
run: terraform apply -auto-approve tfplan
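Before pushing, it’s worth running the same commands locally to catch syntax errors early (assuming your local AWS credentials can reach the backend S3 bucket):

cd lambda-python-layers
terraform init
terraform fmt -check
terraform validate
terraform plan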
Workflow Explanation
This GitHub Actions workflow deploys Terraform to build the layers whenever a specific path (`lambda-python-layers/layers/`) within the repository changes.
Triggers:
- On push to the main branch, specifically when files within the `lambda-python-layers/layers/` directory are updated
- Manually via `workflow_dispatch`
Job Configuration:
- The job runs on the `ubuntu-latest` runner
- AWS credentials and region are sourced from GitHub Secrets
- Terraform version (`1.6.6`) and Python version (`3.10`) are pinned
Steps:
- Check out the repository using `actions/checkout@v4`
- Install Terraform using `hashicorp/setup-terraform@v3`
- Install Python 3.10 using `actions/setup-python@v5`
- Run Terraform commands to deploy the layers
Running the Automation
Step-by-Step Setup
You can clone the code from my Public GitHub Repo:
# Clone my GitHub repo
git clone https://github.com/akhileshmishrabiz/Devops-zero-to-hero.git

# Create a folder/directory
mkdir lambda-layer-example
cd lambda-layer-example

# Copy the terraform code from the cloned repo (note the ../ — we are inside lambda-layer-example)
cp -r ../Devops-zero-to-hero/lambda-python-layers .

# Create a .github/workflows folder to store the GitHub Actions workflow
mkdir -p .github/workflows

# Copy the workflow code from the cloned repo
cp ../Devops-zero-to-hero/.github/workflows/build-lambda-layers.yml .github/workflows/

# Create a GitHub repository and push the code from lambda-layer-example to your repo
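To finish that last step, a sketch with a placeholder remote URL:

# Still inside lambda-layer-example
git init
git add .
git commit -m "Add Lambda layer automation"
git remote add origin https://github.com/<your-username>/<your-repo>.git
git push -u origin main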
Configure GitHub Secrets
Create access and secret keys and store them as secrets in your GitHub repo:
- Go to your repo → Settings → Secrets and variables → Actions
- Create two secrets:
  - `AWS_ACCESS_KEY_ID` = your AWS Access Key
  - `AWS_SECRET_ACCESS_KEY` = your AWS Secret Key
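If you prefer the terminal, the GitHub CLI can set these too (assuming `gh` is installed and authenticated against your repo); each command prompts for the secret value:

gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY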
Deploy the Automation
Push the code to trigger the workflow, or run it manually from the Actions tab.
You can see the layers in your AWS account: Lambda → Layers
AWS Lambda Layers in Console
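The same check from the CLI:

aws lambda list-layers --compatible-runtime python3.10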
Benefits of This Automation
| Before Automation | After Automation |
|---|---|
| ❌ Manual zip creation | ✅ Automated builds |
| ❌ Platform compatibility issues | ✅ Consistent Linux builds |
| ❌ Version management pain | ✅ Automated versioning |
| ❌ Manual uploads | ✅ Automatic deployment |
| ❌ Cross-account sharing complexity | ✅ Terraform-managed permissions |
Conclusion
This automation solution has saved my team countless hours of manual Lambda Layer management. Instead of spending time on repetitive packaging and uploading tasks, we can focus on writing better Lambda functions.
The combination of Terraform and GitHub Actions provides:
- Consistency across environments
- Reproducibility of builds
- Version control for dependencies
- Automated security updates
The complete Terraform code and the GitHub Actions workflow are available in my public GitHub repo linked above.
About Me
I am Akhilesh Mishra, a self-taught DevOps engineer with 11+ years of experience working with private and public cloud (GCP & AWS) technologies.
Tags
#aws
#lambda
#terraform
#automation
#devops
#serverless
#githubactions
#infrastructure
#iac
#cicd
Found this automation helpful? Follow for more AWS and DevOps content!