Manage File Transfer Between S3 Buckets Using Lambda

Diagram

AWS Lambda Basics
Function event: the data that triggers the Lambda function.
Function context: its properties and methods let your function access vital information about its execution environment, which is useful for tasks such as logging, error handling, and resource management (e.g., function_name, invoked_function_arn, aws_request_id, etc.).
Function environment variables: use them for configuration settings so the function code stays reusable.
Layers: let you package libraries and other dependencies separately, reducing the size of deployment archives and making deployments faster.
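A minimal Python sketch tying the first three together (the handler and field names here are illustrative; the event shape assumes an S3 object-created notification):

import os

def lambda_handler(event, context):
    # Function event: the payload that triggered this invocation
    # (for an S3 object-created event, the object key lives here)
    key = event["Records"][0]["s3"]["object"]["key"]

    # Function context: metadata about the execution environment
    print(f"function={context.function_name} request={context.aws_request_id}")

    # Function environment variables: configuration injected at deploy time
    dest = os.environ.get("DEST_BUCKET", "unset")
    return {"key": key, "dest": dest}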
Differences between Synchronous and Asynchronous Invocation:
Synchronous Invocation:
The caller waits for the function to process the event and return a result.
The function's response is returned directly to the caller.
Suitable for real-time applications where immediate feedback is required.
Asynchronous Invocation:
The caller sends an event to Lambda and gets a quick success response, while Lambda processes the event in the background.
Lambda queues the event for processing and returns a 202 Accepted status code.
Ideal for background tasks or operations where immediate results are not critical.
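Both invocation types can be seen from the AWS SDK; a short boto3 sketch (the function name matches the one created later in this task):

import json
import boto3

client = boto3.client("lambda")

# Synchronous: the call blocks until the function finishes and returns its result
sync = client.invoke(
    FunctionName="s3-file-transfer-function",
    InvocationType="RequestResponse",
    Payload=json.dumps({"ping": "sync"}),
)
print(sync["StatusCode"])   # 200; the function's response is in sync["Payload"]

# Asynchronous: Lambda queues the event and returns immediately
async_resp = client.invoke(
    FunctionName="s3-file-transfer-function",
    InvocationType="Event",
    Payload=json.dumps({"ping": "async"}),
)
print(async_resp["StatusCode"])  # 202 Accepted; processing happens in the background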
File Structure
.
├── .terraform/
├── README.md
├── .gitignore
├── .terraform.lock.hcl
├── iam.tf
├── lambda_function.py
├── lambda.tf
├── lambda.zip
├── provider.tf
├── s3.tf
├── terraform.tfvars
└── variable.tf
Steps
# Create External Bucket
resource "aws_s3_bucket" "forgtech-external-bucket" {
  bucket        = "frogtech-us-external"
  force_destroy = true

  tags = {
    Environment = var.environment[0]
    Owner       = var.environment[1]
  }
}

# Create Internal Bucket
resource "aws_s3_bucket" "forgtech-internal-bucket" {
  bucket        = "frogtech-us-internal"
  force_destroy = true

  tags = {
    Environment = var.environment[0]
    Owner       = var.environment[1]
  }
}
Created two S3 buckets (internal and external).
# Trust policy that allows the Lambda service to assume this IAM role
data "aws_iam_policy_document" "lambda-assume-policy" {
  statement {
    actions = ["sts:AssumeRole"]

    principals {
      type        = "Service"
      identifiers = ["lambda.amazonaws.com"]
    }

    effect = "Allow"
  }
}

resource "aws_iam_role" "lambda-s3-transfer-file-role" {
  name               = "lambda-role"
  assume_role_policy = data.aws_iam_policy_document.lambda-assume-policy.json

  tags = {
    Environment = var.environment[0]
    Owner       = var.environment[1]
  }
}
# Inline policy document that allows Lambda to get, upload, and delete objects in both S3 buckets
data "aws_iam_policy_document" "lambda-policy-to-access-s3" {
  statement {
    actions = [
      "s3:GetObject",
      "s3:PutObject",
      "s3:DeleteObject"
    ]

    resources = [
      "${aws_s3_bucket.forgtech-external-bucket.arn}/*",
      "${aws_s3_bucket.forgtech-internal-bucket.arn}/*"
    ]

    effect = "Allow"
  }
}
# Create a managed policy (with an ARN) from the JSON policy document so it can be attached to the Lambda role
resource "aws_iam_policy" "json-arn-lambda-policy" {
  name        = "lambda-policy"
  description = "lambda s3 policy"
  policy      = data.aws_iam_policy_document.lambda-policy-to-access-s3.json

  tags = {
    Environment = var.environment[0]
    Owner       = var.environment[1]
  }
}
# Attach the policy to the Lambda IAM role
resource "aws_iam_role_policy_attachment" "lambda-role-policy" {
  role       = aws_iam_role.lambda-s3-transfer-file-role.name
  policy_arn = aws_iam_policy.json-arn-lambda-policy.arn
}
Created an assume-role (trust) policy so Lambda can use the IAM role I created, then defined an inline policy document granting whoever holds it permission to get, upload, and delete objects in both S3 buckets. The aws_iam_policy resource renders that document into a managed policy with an ARN, which aws_iam_role_policy_attachment attaches to the role.
# Create the Lambda function with the code to transfer files from the external to the internal S3 bucket
resource "aws_lambda_function" "forgtech-file-transfer-function" {
  filename      = "lambda.zip"
  function_name = "s3-file-transfer-function"
  role          = aws_iam_role.lambda-s3-transfer-file-role.arn
  handler       = "lambda_function.lambda_handler"
  runtime       = "python3.9"

  environment { # environment variables read by the function code
    variables = {
      SOURCE_BUCKET = aws_s3_bucket.forgtech-external-bucket.bucket # source: the external bucket objects are read from
      DEST_BUCKET   = aws_s3_bucket.forgtech-internal-bucket.bucket # destination: the internal bucket objects are copied to
    }
  }
}
# Allow S3 event notifications from the external bucket to invoke the Lambda function
resource "aws_lambda_permission" "allow-s3" {
  statement_id  = "AllowS3Invoke"
  action        = "lambda:InvokeFunction"
  function_name = aws_lambda_function.forgtech-file-transfer-function.function_name
  principal     = "s3.amazonaws.com" # S3, not EventBridge, is the service invoking the function
  source_arn    = aws_s3_bucket.forgtech-external-bucket.arn
}
# Trigger: when an object is created in the external bucket, invoke the Lambda function
resource "aws_s3_bucket_notification" "external_bucket_notification" {
  bucket = aws_s3_bucket.forgtech-external-bucket.bucket

  lambda_function {
    events              = ["s3:ObjectCreated:*"]
    lambda_function_arn = aws_lambda_function.forgtech-file-transfer-function.arn
  }

  depends_on = [aws_lambda_permission.allow-s3]
}
Created the aws_lambda_function named s3-file-transfer-function and pointed it at the packaged code in lambda.zip. I used the environment block so the bucket names stay flexible and the code in lambda_function.py stays reusable. Then I created aws_lambda_permission to allow the external S3 bucket to trigger the Lambda, and finally aws_s3_bucket_notification so that uploading an object to the external bucket invokes the function.
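The archive lambda.zip packages lambda_function.py, which isn't shown above. A minimal sketch of what that handler could look like, assuming a straightforward copy of each newly created object (copy_object is one of several ways to do this):

import os
import urllib.parse

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Bucket names injected by the Terraform environment block
    source_bucket = os.environ["SOURCE_BUCKET"]
    dest_bucket = os.environ["DEST_BUCKET"]

    for record in event["Records"]:
        # Object keys arrive URL-encoded in S3 event notifications
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # Copy the new object from the external bucket to the internal bucket
        s3.copy_object(
            Bucket=dest_bucket,
            Key=key,
            CopySource={"Bucket": source_bucket, "Key": key},
        )

    return {"statusCode": 200, "body": "transfer complete"}

The IAM policy above also grants s3:DeleteObject, so a delete_object call on the source bucket could be added after the copy if a move rather than a copy is intended.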
You can check out the whole task here ( Github ).
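To verify the pipeline end to end, a quick boto3 smoke test might look like this (the bucket names are the ones created above):

import time

import boto3

s3 = boto3.client("s3")

# Upload a test object to the external bucket; this fires the S3 event
s3.put_object(Bucket="frogtech-us-external", Key="test.txt", Body=b"hello")

# Give the asynchronous Lambda invocation a moment to run
time.sleep(10)

# Confirm the object was copied into the internal bucket
resp = s3.head_object(Bucket="frogtech-us-internal", Key="test.txt")
print(resp["ContentLength"])  # 5 bytes if the transfer succeeded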
Working Examples

(screenshots of the file transfer in action)
Conclusion
I learned how to connect Lambda with S3 buckets. I used an IAM role and policy to allow Lambda to interact with the S3 buckets. Then, I created a Lambda function with the code to transfer files from one bucket to another. Finally, I set permissions and triggers to invoke the Lambda function when a new object is uploaded to the external S3 bucket.