Per AWS-Docs, AWS Lambda is a serverless computing service from Amazon Web Services (AWS) that lets developers run code without provisioning or managing servers. With Lambda, you can execute code in response to events such as HTTP requests, database changes, or file uploads, and it scales automatically with usage. It supports various programming languages and integrates seamlessly with other AWS services, enabling flexible and cost-effective application development. In this note, I demonstrate how to create an AWS Lambda function using Terraform and GitHub Actions. To add variety to the use case, I added three more pieces of functionality:
1. Trigger the AWS Lambda function on a schedule.
2. Read a parameter value from the AWS Systems Manager Parameter Store.
3. Log the parameter value to an Amazon CloudWatch log group.
The scheduling aspect of the use case is managed by Amazon EventBridge Scheduler. You can read more about that at AWS-Docs: EventBridge Scheduler. You will also learn how I automated the provisioning process using GitHub Actions.
Please note that you probably won’t encounter this exact use case in a real development scenario. I combined three specific use cases only to show how these AWS Cloud services interact.
Prerequisites: Since this note demonstrates communication between AWS Cloud services and GitHub Actions, I created an AWS IAM role with a trust relationship with this GitHub repository. You can read about that at -create-an-open-ID-connect.
If you are interested in learning about how to integrate Terraform workflow with GitHub Actions, please check out the following note -ci-cd-with-terraform-and-github-actions-to-deploy-to-aws.
Let us now walk through all the resources provisioned in this repository. If you are interested and want to follow along, here is the link to my GitHub repository: kunduso/add-aws-lambda-terraform. Please note that the branch name is add-lambda.
Step 1: Create an AWS KMS key for encryption
In this use case, two AWS Cloud resources and one resource property require encryption: the AWS Systems Manager Parameter Store parameter, the Amazon CloudWatch log group, and the AWS Lambda function’s environment variables. These are all encrypted using the AWS KMS key.
A key policy is attached to the AWS KMS key as a security best practice. Key policies are essential because they ensure that only authorized entities can access and use the cryptographic keys stored in KMS.
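A minimal sketch of such a key and key policy is below. The resource names, the policy statements, and the use of the account-ID and region data sources are my assumptions for illustration, not necessarily what the repository contains.

```hcl
data "aws_caller_identity" "current" {}
data "aws_region" "current" {}

# KMS key that encrypts the SSM parameter, the log group,
# and the Lambda environment variables.
resource "aws_kms_key" "encryption_key" {
  description         = "Encrypts the parameter, log group, and Lambda environment variables"
  enable_key_rotation = true
  policy              = data.aws_iam_policy_document.kms_key_policy.json
}

# Key policy: the account root retains full control, and the
# CloudWatch Logs service is allowed to use the key for the log group.
data "aws_iam_policy_document" "kms_key_policy" {
  statement {
    sid       = "EnableRootAccess"
    actions   = ["kms:*"]
    resources = ["*"]
    principals {
      type        = "AWS"
      identifiers = ["arn:aws:iam::${data.aws_caller_identity.current.account_id}:root"]
    }
  }
  statement {
    sid       = "AllowCloudWatchLogs"
    actions   = ["kms:Encrypt*", "kms:Decrypt*", "kms:GenerateDataKey*", "kms:Describe*"]
    resources = ["*"]
    principals {
      type        = "Service"
      identifiers = ["logs.${data.aws_region.current.name}.amazonaws.com"]
    }
  }
}
```

Without the second statement, CloudWatch Logs cannot use the key and the encrypted log group cannot be created.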
Step 2: Create an Amazon CloudWatch Log group and stream
The Amazon CloudWatch log group and log stream store the logs generated from the AWS Lambda function execution. The AWS Lambda function will log the Systems Manager Parameter Store parameter value into the log stream. The log group is encrypted using the AWS KMS key created in the previous step.
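A sketch of those two resources is below; the names, the retention period, and the KMS key reference (`aws_kms_key.encryption_key`) are my placeholders, not necessarily the repository’s.

```hcl
# Log group for the Lambda function's output, encrypted with the KMS key.
resource "aws_cloudwatch_log_group" "lambda_log_group" {
  name              = "/aws/lambda/app-log-group"
  retention_in_days = 14
  kms_key_id        = aws_kms_key.encryption_key.arn
}

# Log stream inside that group, which the function writes the
# parameter value into.
resource "aws_cloudwatch_log_stream" "lambda_log_stream" {
  name           = "app-log-stream"
  log_group_name = aws_cloudwatch_log_group.lambda_log_group.name
}
```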
Step 3: Create an AWS Systems Manager Parameter Store parameter
This is the AWS Cloud resource whose value the AWS Lambda function reads and then writes to the Amazon CloudWatch Log group’s log stream.
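A sketch of the parameter, assuming placeholder names and an initial value of my choosing; the `SecureString` type is what ties it to the KMS key.

```hcl
# SecureString parameter encrypted with the KMS key from Step 1.
resource "aws_ssm_parameter" "app_parameter" {
  name   = "/app/demo-parameter"
  type   = "SecureString"
  key_id = aws_kms_key.encryption_key.arn
  value  = "initial-value"
}
```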
Step 4: Create an AWS IAM role and policy
An AWS Lambda function requires an IAM role to execute. An IAM role has two policies: the assume role policy and the permission policy. The assume role policy states who or what can assume the role, and the permission policy states which operations the role can perform on which AWS Cloud resources. Below is the code for the assume role policy, which states that the role can be assumed by the service principal lambda.amazonaws.com.
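A minimal sketch of such a role, with a placeholder role name, could look like this:

```hcl
# Execution role that only the Lambda service can assume.
resource "aws_iam_role" "lambda_role" {
  name = "lambda-execution-role"
  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Action    = "sts:AssumeRole"
      Principal = { Service = "lambda.amazonaws.com" }
    }]
  })
}
```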
Below is the code for the permission policy. This policy grants specific access to only three resources: the Systems Manager Parameter Store parameter (to read the parameter value), the Amazon CloudWatch log group (to create a log stream and put log events), and the AWS KMS key (to decrypt the encrypted values).
You might wonder why I added the logs:CreateLogStream statement when a log stream was already created in a previous step. Because I use the same log group for this Lambda function’s execution logs, the function needs permission to create its own log stream there.
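A sketch of that permission policy is below; it assumes the placeholder resource names `aws_ssm_parameter.app_parameter`, `aws_cloudwatch_log_group.lambda_log_group`, and `aws_kms_key.encryption_key`, which may differ from the repository’s.

```hcl
# Permission policy scoped to the three resources the function touches.
resource "aws_iam_role_policy" "lambda_policy" {
  name = "lambda-permission-policy"
  role = aws_iam_role.lambda_role.id
  policy = jsonencode({
    Version = "2012-10-17"
    Statement = [
      {
        Effect   = "Allow"
        Action   = ["ssm:GetParameter"]
        Resource = aws_ssm_parameter.app_parameter.arn
      },
      {
        Effect   = "Allow"
        Action   = ["logs:CreateLogStream", "logs:PutLogEvents"]
        Resource = "${aws_cloudwatch_log_group.lambda_log_group.arn}:*"
      },
      {
        Effect   = "Allow"
        Action   = ["kms:Decrypt"]
        Resource = aws_kms_key.encryption_key.arn
      }
    ]
  })
}
```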
Step 5: Create the Python file
The Lambda function passes a few variables to the Python file, which uses them to create an ssm_client and a logs_client with the boto3 library. The Python file then reads the parameter value through the ssm_client and writes it through the logs_client. It also logs a separate message to the Lambda function’s execution log using the logging.info() function.
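A minimal sketch of that Python file is below. The environment variable names and the build_log_event helper are my assumptions for illustration, not necessarily what the repository uses.

```python
import logging
import os
import time

try:
    import boto3  # available by default in the AWS Lambda Python runtime
except ImportError:  # lets the module be imported where boto3 is absent
    boto3 = None

logger = logging.getLogger()
logger.setLevel(logging.INFO)


def build_log_event(message):
    """Build a single CloudWatch Logs event; timestamps are in milliseconds."""
    return {"timestamp": int(time.time() * 1000), "message": message}


def lambda_handler(event, context):
    # Environment variable names are illustrative; the repository may differ.
    parameter_name = os.environ["PARAMETER_NAME"]
    log_group = os.environ["LOG_GROUP_NAME"]
    log_stream = os.environ["LOG_STREAM_NAME"]

    ssm_client = boto3.client("ssm")
    logs_client = boto3.client("logs")

    # Read the decrypted parameter value from Parameter Store.
    response = ssm_client.get_parameter(Name=parameter_name, WithDecryption=True)
    value = response["Parameter"]["Value"]

    # Write the value to the log stream that Terraform created.
    logs_client.put_log_events(
        logGroupName=log_group,
        logStreamName=log_stream,
        logEvents=[build_log_event(f"Parameter value: {value}")],
    )

    # A separate message that lands in the Lambda execution log.
    logger.info("Wrote the parameter value to %s/%s", log_group, log_stream)
```

Note that put_log_events expects timestamps in milliseconds since the epoch, which is why the helper multiplies time.time() by 1000.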
Step 6: Create the AWS Lambda function
Terraform creates a zip file using the archive_file data source. This zip file is then uploaded to the AWS Lambda function. As you can see in the preceding image, the AWS Lambda function consumes the Python file stored in the zip. It also requires the IAM role created previously, which grants the function limited access to the operations it can perform. The handler specifies where execution starts: PythonFileName.PythonFunctionName. The AWS KMS key is required to encrypt the environment variables used in the function. The logging_config block stores the log_group information for the execution logs, along with the log_format and system_log_level. Finally, the environment block passes the variables shared with the function.
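A sketch of the function resource is below; the function name, file name, runtime, environment variable names, and the referenced resource names are my placeholders, which may differ from the repository’s.

```hcl
# Package the Python file into a zip that Terraform uploads to Lambda.
data "archive_file" "lambda_zip" {
  type        = "zip"
  source_file = "${path.module}/lambda_function.py"
  output_path = "${path.module}/lambda_function.zip"
}

resource "aws_lambda_function" "app_lambda" {
  function_name    = "read-parameter-function"
  role             = aws_iam_role.lambda_role.arn
  runtime          = "python3.12"
  handler          = "lambda_function.lambda_handler" # PythonFileName.PythonFunctionName
  filename         = data.archive_file.lambda_zip.output_path
  source_code_hash = data.archive_file.lambda_zip.output_base64sha256
  kms_key_arn      = aws_kms_key.encryption_key.arn # encrypts the environment variables

  # Send execution logs to the log group created earlier.
  logging_config {
    log_format       = "JSON"
    log_group        = aws_cloudwatch_log_group.lambda_log_group.name
    system_log_level = "INFO"
  }

  # Variables shared with the Python handler.
  environment {
    variables = {
      PARAMETER_NAME  = aws_ssm_parameter.app_parameter.name
      LOG_GROUP_NAME  = aws_cloudwatch_log_group.lambda_log_group.name
      LOG_STREAM_NAME = aws_cloudwatch_log_stream.lambda_log_stream.name
    }
  }
}
```

The source_code_hash argument ensures that Terraform redeploys the function whenever the zipped Python file changes.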
Step 7: Create an Amazon EventBridge Scheduler
The use case required the AWS Lambda function to be triggered on a schedule. Three AWS Cloud resources are needed for that: the aws_cloudwatch_event_rule, which determines the frequency (set to 10 minutes); the aws_cloudwatch_event_target, which is set to the ARN of the AWS Lambda function; and finally, the aws_lambda_permission, which enables Amazon EventBridge to invoke the AWS Lambda function.
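Those three resources could be sketched as follows, assuming my placeholder names and a Lambda resource called `aws_lambda_function.app_lambda`:

```hcl
# Rule that fires every 10 minutes.
resource "aws_cloudwatch_event_rule" "schedule" {
  name                = "invoke-lambda-every-10-minutes"
  schedule_expression = "rate(10 minutes)"
}

# Point the rule at the Lambda function's ARN.
resource "aws_cloudwatch_event_target" "lambda_target" {
  rule = aws_cloudwatch_event_rule.schedule.name
  arn  = aws_lambda_function.app_lambda.arn
}

# Allow the EventBridge service to invoke the function.
resource "aws_lambda_permission" "allow_eventbridge" {
  statement_id  = "AllowExecutionFromEventBridge"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.app_lambda.function_name
  principal     = "events.amazonaws.com"
  source_arn    = aws_cloudwatch_event_rule.schedule.arn
}
```

Without the aws_lambda_permission resource, the rule would fire on schedule but every invocation attempt would be denied.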
Once the GitHub Actions pipeline deployed these resources, I logged into my AWS Console and verified that all the resources were created successfully. I also found two log streams: one had logs from the Lambda function, and the other had logs from the Lambda execution. Below is an image of the log stream that the AWS Lambda function wrote to using the put_log_events() function.
As an experiment, you may change the value of the Systems Manager Parameter Store parameter and see what value is written to the Amazon CloudWatch log on its subsequent execution.
That brings us to the end of this note. I hope you learned how to create an AWS Lambda function and other AWS services using Terraform and GitHub Actions. If you have any questions or suggestions, please feel free to reach out via the comments.