Use API Gateway to Create, Read, and Delete Items in DynamoDB Using Terraform
Contents
1. Project Overview
2. Prerequisites
3. Steps
- 3.1. Terraform code to deploy infrastructure
- 3.2. YAML file for each CodeBuild project
- 3.3. Create a build project for each stage
- 3.4. Create pipeline
4. Output
5. Conclusion
1. Project Overview
- Define a DynamoDB table for storing and retrieving data
- Create a Lambda function for each API endpoint
- Define your API resources and methods in API Gateway
- Use Terraform to define your infrastructure as code
- Set up a CodePipeline to automatically deploy your infrastructure changes
2. Prerequisites
- IAM roles for CodeBuild and CodePipeline with sufficient permissions should already be present
3. Steps
- This project involves multiple steps; I have grouped them below to keep things simple.
3.1. Terraform code to deploy infrastructure
The following steps are required to start working with Terraform on AWS.
Add the files below to your CodeCommit repository.
- Creating a dynamodb.tf : This file defines your tables, attributes, and other configurations in Terraform, which makes it easier to manage your DynamoDB tables in a consistent and scalable way. (A quick verification sketch follows the code.)
resource "aws_dynamodb_table" "basic-dynamodb-table"
name = "student"
billing_mode = "PROVISIONED"
read_capacity = 20
write_capacity = 20
hash_key = "Id"
attribute {
name = "Id"
type = "S"
}
tags = {
Name = "dynamodb-table-1"
Environment = "production"
}
}
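After terraform apply, you can confirm the table exists outside of Terraform. This is a minimal boto3 sketch, assuming default AWS credentials and the us-east-1 region used throughout this project:

import boto3

# a minimal check that the "student" table defined above was created;
# assumes default AWS credentials and region us-east-1
dynamodb = boto3.client("dynamodb", region_name="us-east-1")
response = dynamodb.describe_table(TableName="student")
print(response["Table"]["TableStatus"])  # "ACTIVE" once provisioning finishes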
- Creating a create_item.tf : This file uses Terraform to define and manage the resources needed for the Lambda function that creates an item in the DynamoDB table.
data "archive_file" "zip2"
type = "zip"
source_file = "create_item.py"
output_path = "create_item.zip"
}
resource "aws_lambda_function" "lambda" {
function_name = "terraform-lambda_create"
filename = data.archive_file.zip2.output_path
source_code_hash = data.archive_file.zip2.output_base64sha256
role = aws_iam_role.test_role2.arn
handler = "create_item.lambda_handler"
runtime = "python3.9"
timeout = 900
}
- Creating a get_item.tf : This file uses Terraform to define and manage the resources needed for the Lambda function that retrieves an item from the DynamoDB table.
data "archive_file" "zip"
type = "zip"
source_file = "get_item.py"
output_path = "get_item.zip"
}
resource "aws_lambda_function" "lambda2" {
function_name = "terraform-lambda_get"
filename = data.archive_file.zip.output_path
source_code_hash = data.archive_file.zip.output_base64sha256
role = aws_iam_role.test_role1.arn
handler = "get_item.lambda_handler"
runtime = "python3.9"
timeout = 900
}
{
- Creating a delete_item.tf : This file uses Terraform to define and manage the resources needed for the Lambda function that deletes an item from the DynamoDB table.
data "archive_file" "zip3"
type = "zip"
source_file = "delete_item.py"
output_path = "del_item.zip"
}
resource "aws_lambda_function" "lambda3" {
function_name = "terraform-lambda_delete"
filename = data.archive_file.zip3.output_path
source_code_hash = data.archive_file.zip3.output_base64sha256
role = aws_iam_role.test_role3.arn
handler = "delete_item.lambda_handler"
runtime = "python3.9"
timeout = 900
}
- Creating a create_item.py : This code assumes that you have the necessary permissions to create items in the specified DynamoDB table; the boto3 library is bundled with the Lambda Python runtime, so no extra packaging is needed. (A hypothetical local test follows the code.)
import json
import boto3

def lambda_handler(event, context):
    # API Gateway proxy integrations deliver the request body as a JSON string
    print(event['body'])
    item = json.loads(event['body'])

    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('student')
    table.put_item(Item=item)

    return {
        "statusCode": 200,
        "body": "successfully inserted",
        "headers": {
            "content-type": "application/json"
        }
    }
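Before deploying, you can exercise the handler locally with a hand-built event. This is a hypothetical test, assuming your shell has AWS credentials and the student table already exists; the "name" attribute is just an example, and only the 'body' field the handler actually reads is included:

import json

# hypothetical local test for create_item.py; the file must be importable
from create_item import lambda_handler

# API Gateway proxy integrations deliver the request body as a JSON string;
# "name" is an example attribute, not part of the table schema
event = {"body": json.dumps({"Id": "1", "name": "Alice"})}
print(lambda_handler(event, None))  # expect statusCode 200 on success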
- Creating a get_item.py : This code assumes that you have the necessary permissions to retrieve items from the specified DynamoDB table, and that the boto3 library is available in your Lambda function's environment. (A hypothetical local test follows the code.)
import json
import boto3

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('student')

    # the id to look up arrives as a query string parameter (?id=...)
    response = table.get_item(
        Key={
            'Id': event['queryStringParameters']['id']
        }
    )

    return {
        "statusCode": 200,
        "body": json.dumps(response.get('Item', {})),
        "headers": {
            "content-type": "application/json"
        }
    }
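The read and delete handlers take their input from the query string instead of the body, so a local test event looks slightly different. A hypothetical sketch, assuming the item with Id "1" was inserted earlier (the delete handler can be tested the same way):

# hypothetical local test for get_item.py
from get_item import lambda_handler

# GET/DELETE requests pass the id as a query string parameter (?id=...)
event = {"queryStringParameters": {"id": "1"}}
print(lambda_handler(event, None))  # expect the stored item in the body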
- Creating a delete_item.py : This code assumes that you have the necessary permissions to delete items from the specified DynamoDB table, and that the boto3 library is installed in your Lambda function's environment.
import json
import boto3

def lambda_handler(event, context):
    dynamodb = boto3.resource('dynamodb')
    table = dynamodb.Table('student')

    # the id to delete arrives as a query string parameter (?id=...)
    table.delete_item(
        Key={
            'Id': event['queryStringParameters']['id']
        }
    )

    return {
        "statusCode": 200,
        "body": "successfully deleted",
        "headers": {
            "content-type": "application/json"
        }
    }
- Creating an apigateway.tf : By creating an apigateway.tf file you can manage your API infrastructure as code and keep it version controlled, which makes it easier to deploy, test, and maintain. (A smoke-test sketch follows the code.)
resource "aws_api_gateway_rest_api" "product_apigw"
name = "product_apigw"
description = "Product API Gateway"
endpoint_configuration {
types = ["REGIONAL"]
}
}
resource "aws_api_gateway_resource" "product" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
parent_id = aws_api_gateway_rest_api.product_apigw.root_resource_id
path_part = "read"
}
resource "aws_api_gateway_method" "createproduct" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_resource.product.id
http_method = "POST"
authorization = "NONE"
}
resource "aws_api_gateway_integration" "createproduct-lambda" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_method.createproduct.resource_id
http_method = aws_api_gateway_method.createproduct.http_method
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.lambda2.invoke_arn
}
resource "aws_lambda_permission" "apigw-CreateProductHandler" {
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.lambda2.function_name
principal = "apigateway.amazonaws.com"
source_arn = "${aws_api_gateway_rest_api.product_apigw.execution_arn}/*"
}
resource "aws_api_gateway_resource" "product2" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
parent_id = aws_api_gateway_rest_api.product_apigw.root_resource_id
path_part = "create"
}
resource "aws_api_gateway_method" "createproduct2" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_resource.product2.id
http_method = "POST"
authorization = "NONE"
}
resource "aws_api_gateway_integration" "createproduct-lambda2" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_method.createproduct2.resource_id
http_method = aws_api_gateway_method.createproduct2.http_method
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.lambda.invoke_arn
}
resource "aws_lambda_permission" "apigw-CreateProductHandler2" {
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.lambda.function_name
principal = "apigateway.amazonaws.com"
source_arn = "${aws_api_gateway_rest_api.product_apigw.execution_arn}/*"
}
resource "aws_api_gateway_resource" "product3" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
parent_id = aws_api_gateway_rest_api.product_apigw.root_resource_id
path_part = "Delete"
}
resource "aws_api_gateway_method" "createproduct3" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_resource.product3.id
http_method = "POST"
authorization = "NONE"
}
resource "aws_api_gateway_integration" "createproduct-lambda3" {
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
resource_id = aws_api_gateway_method.createproduct3.resource_id
http_method = aws_api_gateway_method.createproduct3.http_method
integration_http_method = "POST"
type = "AWS_PROXY"
uri = aws_lambda_function.lambda3.invoke_arn
}
resource "aws_lambda_permission" "apigw-CreateProductHandler3" {
action = "lambda:InvokeFunction"
function_name = aws_lambda_function.lambda3.function_name
principal = "apigateway.amazonaws.com"
source_arn = "${aws_api_gateway_rest_api.product_apigw.execution_arn}/*"
}
resource "aws_api_gateway_deployment" "productapistageprod" {
depends_on = [
aws_api_gateway_integration.createproduct-lambda, aws_api_gateway_integration.createproduct-lambda2, aws_api_gateway_integration.createproduct-lambda3
]
rest_api_id = aws_api_gateway_rest_api.product_apigw.id
stage_name = "prod"
}
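Once the deployment succeeds, you can smoke-test the three routes. A hypothetical sketch using the requests library (pip install requests); the invoke URL is a placeholder you must replace with the one shown for the prod stage in the API Gateway console:

import requests

# placeholder: substitute your actual invoke URL for the "prod" stage
API_URL = "https://<rest-api-id>.execute-api.us-east-1.amazonaws.com/prod"

# create an item (the create handler reads the JSON body)
r = requests.post(f"{API_URL}/create", json={"Id": "1", "name": "Alice"})
print(r.status_code, r.text)

# read it back (the get handler reads ?id= from the query string)
r = requests.post(f"{API_URL}/read", params={"id": "1"})
print(r.status_code, r.text)

# delete it (note the path_part above is the capitalized "Delete")
r = requests.post(f"{API_URL}/Delete", params={"id": "1"})
print(r.status_code, r.text)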
- Creating a provider.tf : The provider.tf file defines the provider configuration for your Terraform project and is an essential part of creating and managing resources with Terraform.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "4.64.0"
    }
  }
}

provider "aws" {
  region = "us-east-1"
}

provider "archive" {}
- Creating a role.tf : In Terraform, a role.tf file is typically used to define IAM roles in AWS. An IAM (Identity and Access Management) role is an AWS identity that lets you grant specific permissions to users or AWS resources within your account. Roles are useful in a number of scenarios, including granting access to AWS services, providing permissions to third-party applications, and enforcing security policies.
resource "aws_iam_role" "test_role1"
name = "test_get"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Action = "sts:AssumeRole"
Effect = "Allow"
Sid = ""
Principal = {
Service = "lambda.amazonaws.com"
}
},
]
})
tags = {
tag-key = "tag-value"
}
}
resource "aws_iam_role_policy_attachment" "test-attach" {
role = aws_iam_role.test_role1.name
policy_arn = aws_iam_policy.rule1.arn
}
/* role for create */
resource "aws_iam_role" "test_role2" {
name = "test_create"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Action = "sts:AssumeRole"
Effect = "Allow"
Sid = ""
Principal = {
Service = "lambda.amazonaws.com"
}
},
]
})
tags = {
tag-key = "tag-value"
}
}
resource "aws_iam_role_policy_attachment" "test-attach2" {
role = aws_iam_role.test_role2.name
policy_arn = aws_iam_policy.rule2.arn
}
/* role for delete item */
resource "aws_iam_role" "test_role3" {
name = "test_delete"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
assume_role_policy = jsonencode({
Version = "2012-10-17"
Statement = [
{
Action = "sts:AssumeRole"
Effect = "Allow"
Sid = ""
Principal = {
Service = "lambda.amazonaws.com"
}
},
]
})
tags = {
tag-key = "tag-value"
}
}
resource "aws_iam_role_policy_attachment" "test-attach3" {
role = aws_iam_role.test_role3.name
policy_arn = aws_iam_policy.rule3.arn
}{
- Creating a policy.tf : In Terraform, a policy.tf file is typically used to define IAM policies in AWS. An IAM (Identity and Access Management) policy is a set of permissions that determines what actions are allowed or denied for a specific AWS resource or group of resources. Policies can be attached to users, groups, or roles, and can be used to grant or restrict access to AWS services and resources. Note that the policies below grant broad dynamodb:* access on all resources for simplicity; in production you would scope them down to the specific table and actions each function needs.
resource "aws_iam_policy" "rule1"
name = "role_get_policy"
description = "My test policy"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
policy = jsonencode({
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "dynamodb:*",
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"logs:CreateLogStream",
"logs:CreateLogGroup",
"logs:PutLogEvents"
],
"Resource": "arn:aws:logs:*:*:*"
}
]
})
}
/* policy for create item */
resource "aws_iam_policy" "rule2" {
name = "role_create_policy"
description = "My test policy"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
policy = jsonencode({
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "dynamodb:*",
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"logs:CreateLogStream",
"logs:CreateLogGroup",
"logs:PutLogEvents"
],
"Resource": "arn:aws:logs:*:*:*"
}
]
})
}
/* policy for delete item */
resource "aws_iam_policy" "rule3" {
name = "role_delete_policy"
description = "My test policy"
# Terraform's "jsonencode" function converts a
# Terraform expression result to valid JSON syntax.
policy = jsonencode({
"Version": "2012-10-17",
"Statement": [
{
"Sid": "VisualEditor0",
"Effect": "Allow",
"Action": "dynamodb:*",
"Resource": "*"
},
{
"Effect": "Allow",
"Action": [
"logs:CreateLogStream",
"logs:CreateLogGroup",
"logs:PutLogEvents"
],
"Resource": "arn:aws:logs:*:*:*"
}
]
})
}{
- Creating a backend.tf : The backend.tf file configures the backend used to store the state of your infrastructure. It lets you define the backend, share state data, ensure state consistency, and enable remote operations. (A quick state-check sketch follows the code.)
terraform {
  backend "s3" {
    region  = "us-east-1"
    profile = "default"
    key     = "codebuild/dynamolambda.tfstate"
    bucket  = "rahul030198"
  }
}
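Once the pipeline has applied the configuration at least once, the state file should appear in the bucket. A minimal sketch to confirm it, assuming the bucket and key above and default credentials:

import boto3

# confirm the remote state object exists in the S3 backend bucket
s3 = boto3.client("s3", region_name="us-east-1")
head = s3.head_object(Bucket="rahul030198", Key="codebuild/dynamolambda.tfstate")
print(head["ContentLength"], "bytes of Terraform state")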
3.2. YAML file for each CodeBuild project
- Creating a plan.yaml : This buildspec is used to create the Terraform plan.
version: 0.2
phases:
  install:
    commands:
      - "apt install unzip -y"
      - "wget https://releases.hashicorp.com/terraform/1.0.7/terraform_1.0.7_linux_amd64.zip"
      - "unzip terraform_1.0.7_linux_amd64.zip"
      - "mv terraform /usr/local/bin/"
  pre_build:
    commands:
      - terraform init
  build:
    commands:
      - terraform plan
- Creating an apply.yaml : This buildspec is used to apply the infrastructure.
version: 0.2
phases:
  install:
    commands:
      - "apt install unzip -y"
      - "wget https://releases.hashicorp.com/terraform/1.0.7/terraform_1.0.7_linux_amd64.zip"
      - "unzip terraform_1.0.7_linux_amd64.zip"
      - "mv terraform /usr/local/bin/"
  pre_build:
    commands:
      - terraform init
  build:
    commands:
      - terraform apply -auto-approve
- Creating a destroy.yaml : This buildspec is used to destroy the infrastructure.
version: 0.2
phases:
  install:
    commands:
      - "apt install unzip -y"
      - "wget https://releases.hashicorp.com/terraform/1.0.7/terraform_1.0.7_linux_amd64.zip"
      - "unzip terraform_1.0.7_linux_amd64.zip"
      - "mv terraform /usr/local/bin/"
  pre_build:
    commands:
      - terraform init
  build:
    commands:
      - terraform destroy -auto-approve
3.3. Create a build project for each stage : Now we need to configure our build projects. Go to CodeBuild, click Create build project, and apply the following configuration.
1. Build project for terraform plan :
- Give your build project a suitable name.
- Set the source provider to CodeCommit and select the repository you created earlier. Choose Branch as the reference type and pick your branch.
- Select the environment image and operating system; I went with a managed image and Ubuntu.
- Specify the role we created for CodeBuild and the pipeline, and set the buildspec file; for mine it is plan.yaml.
- Similarly, I created two more build projects for apply and destroy, with buildspec files apply.yaml and destroy.yaml. (These console steps can also be scripted; see the sketch below.)
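If you prefer code over console clicks, the same build projects can be created with boto3. This is a hypothetical sketch, not the exact configuration I used; the repository URL, service role ARN, and project names are placeholders:

import boto3

codebuild = boto3.client("codebuild", region_name="us-east-1")

# one build project per buildspec; names, location, and ARN are placeholders
for name, buildspec in [("tf-plan", "plan.yaml"),
                        ("tf-apply", "apply.yaml"),
                        ("tf-destroy", "destroy.yaml")]:
    codebuild.create_project(
        name=name,
        source={
            "type": "CODECOMMIT",
            "location": "https://git-codecommit.us-east-1.amazonaws.com/v1/repos/my-repo",
            "buildspec": buildspec,
        },
        artifacts={"type": "NO_ARTIFACTS"},
        environment={
            "type": "LINUX_CONTAINER",
            "image": "aws/codebuild/standard:5.0",  # Ubuntu-based managed image
            "computeType": "BUILD_GENERAL1_SMALL",
        },
        serviceRole="arn:aws:iam::123456789012:role/codebuild-role",
    )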
3.4. Create pipeline : AWS CodePipeline will be used for CI/CD (Continuous Integration/Continuous Delivery). Our pipeline consists of five stages: Source, terraform-plan, terraform-apply, approval-for-destroy, and terraform-destroy.
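After the pipeline is created, you can watch its stages without opening the console. A hypothetical sketch; the pipeline name is a placeholder:

import boto3

codepipeline = boto3.client("codepipeline", region_name="us-east-1")

# print the latest status of each stage; "terraform-pipeline" is a placeholder
state = codepipeline.get_pipeline_state(name="terraform-pipeline")
for stage in state["stageStates"]:
    print(stage["stageName"], stage.get("latestExecution", {}).get("status"))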
4. Output
5. Conclusion : Using Terraform to manage your API Gateway and DynamoDB resources allows you to version and track changes to your infrastructure, collaborate with other developers more easily, and make changes more quickly and reliably. With Terraform, you can automate the process of creating and updating your infrastructure, reducing the risk of errors and increasing the speed at which you can deploy new features and updates.
This was pretty long, but I hope the extra details help anyone wanting to do something similar: CI/CD fully integrated into AWS.
Any thoughts? Drop a comment! Like the article? Clap a few times below and let me know how much you enjoyed it.
Thank you!