Minor: Implementation of the aws_lambda_event_source_mapping resource to easily set a stream or a queue as trigger for lambda. #75

Open (wants to merge 18 commits into base: main)
6 changes: 6 additions & 0 deletions README.md
@@ -129,6 +129,7 @@ Available targets:
| [aws_iam_role_policy_attachment.ssm](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_iam_role_policy_attachment.vpc_access](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_iam_role_policy_attachment.xray](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_lambda_event_source_mapping.trigger](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_event_source_mapping) | resource |
| [aws_lambda_function.this](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_function) | resource |
| [aws_lambda_permission.invoke_function](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_permission) | resource |
| [aws_caller_identity.this](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/caller_identity) | data source |
@@ -187,6 +188,11 @@ Available targets:
| <a name="input_s3_key"></a> [s3\_key](#input\_s3\_key) | The S3 key of an object containing the function's deployment package. Conflicts with filename and image\_uri. | `string` | `null` | no |
| <a name="input_s3_object_version"></a> [s3\_object\_version](#input\_s3\_object\_version) | The object version containing the function's deployment package. Conflicts with filename and image\_uri. | `string` | `null` | no |
| <a name="input_source_code_hash"></a> [source\_code\_hash](#input\_source\_code\_hash) | Used to trigger updates. Must be set to a base64-encoded SHA256 hash of the package file specified with either<br/> filename or s3\_key. The usual way to set this is filebase64sha256('file.zip') where 'file.zip' is the local filename<br/> of the lambda function source archive. | `string` | `""` | no |
| <a name="input_source_mapping_arn"></a> [source\_mapping\_arn](#input\_source\_mapping\_arn) | The event source ARN. Can be a Kinesis stream, a DynamoDB stream, or an SQS queue. | `string` | `null` | no |
| <a name="input_source_mapping_batch_size"></a> [source\_mapping\_batch\_size](#input\_source\_mapping\_batch\_size) | The largest number of records that Lambda retrieves from the event source per invocation. AWS defaults to 100 for DynamoDB and Kinesis and 10 for SQS. | `number` | `100` | no |
| <a name="input_source_mapping_enabled"></a> [source\_mapping\_enabled](#input\_source\_mapping\_enabled) | Enables the event source mapping so that a Kinesis stream, a DynamoDB stream, or an SQS queue can trigger the Lambda function. | `bool` | `false` | no |
| <a name="input_source_mapping_starting_position"></a> [source\_mapping\_starting\_position](#input\_source\_mapping\_starting\_position) | The position in the stream where AWS Lambda should start reading. Must be one of AT\_TIMESTAMP (Kinesis only), LATEST, or TRIM\_HORIZON when reading from Kinesis or DynamoDB. Must not be set when the source is SQS. See the AWS DynamoDB Streams and Kinesis API References for details. | `string` | `null` | no |
| <a name="input_source_mapping_starting_position_timestamp"></a> [source\_mapping\_starting\_position\_timestamp](#input\_source\_mapping\_starting\_position\_timestamp) | A timestamp in RFC3339 format of the data record from which to start reading, used when starting\_position is set to AT\_TIMESTAMP. If no record with this exact timestamp exists, the next later record is chosen. If the timestamp is older than the current trim horizon, the oldest available record is chosen. | `string` | `null` | no |
| <a name="input_ssm_parameter_names"></a> [ssm\_parameter\_names](#input\_ssm\_parameter\_names) | List of AWS Systems Manager Parameter Store parameter names. The IAM role of this Lambda function will be enhanced<br/> with read permissions for those parameters. Parameters must start with a forward slash and can be encrypted with the<br/> default KMS key. | `list(string)` | `null` | no |
| <a name="input_stage"></a> [stage](#input\_stage) | ID element. Usually used to indicate role, e.g. 'prod', 'staging', 'source', 'build', 'test', 'deploy', 'release' | `string` | `null` | no |
| <a name="input_tags"></a> [tags](#input\_tags) | Additional tags (e.g. `{'BusinessUnit': 'XYZ'}`).<br/>Neither the tag keys nor the tag values will be modified by this module. | `map(string)` | `{}` | no |
6 changes: 6 additions & 0 deletions docs/terraform.md
@@ -32,6 +32,7 @@
| [aws_iam_role_policy_attachment.ssm](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_iam_role_policy_attachment.vpc_access](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_iam_role_policy_attachment.xray](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |
| [aws_lambda_event_source_mapping.trigger](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_event_source_mapping) | resource |
| [aws_lambda_function.this](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_function) | resource |
| [aws_lambda_permission.invoke_function](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_permission) | resource |
| [aws_caller_identity.this](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/data-sources/caller_identity) | data source |
@@ -90,6 +91,11 @@
| <a name="input_s3_key"></a> [s3\_key](#input\_s3\_key) | The S3 key of an object containing the function's deployment package. Conflicts with filename and image\_uri. | `string` | `null` | no |
| <a name="input_s3_object_version"></a> [s3\_object\_version](#input\_s3\_object\_version) | The object version containing the function's deployment package. Conflicts with filename and image\_uri. | `string` | `null` | no |
| <a name="input_source_code_hash"></a> [source\_code\_hash](#input\_source\_code\_hash) | Used to trigger updates. Must be set to a base64-encoded SHA256 hash of the package file specified with either<br> filename or s3\_key. The usual way to set this is filebase64sha256('file.zip') where 'file.zip' is the local filename<br> of the lambda function source archive. | `string` | `""` | no |
| <a name="input_source_mapping_arn"></a> [source\_mapping\_arn](#input\_source\_mapping\_arn) | The event source ARN. Can be a Kinesis stream, a DynamoDB stream, or an SQS queue. | `string` | `null` | no |
| <a name="input_source_mapping_batch_size"></a> [source\_mapping\_batch\_size](#input\_source\_mapping\_batch\_size) | The largest number of records that Lambda retrieves from the event source per invocation. AWS defaults to 100 for DynamoDB and Kinesis and 10 for SQS. | `number` | `100` | no |
| <a name="input_source_mapping_enabled"></a> [source\_mapping\_enabled](#input\_source\_mapping\_enabled) | Enables the event source mapping so that a Kinesis stream, a DynamoDB stream, or an SQS queue can trigger the Lambda function. | `bool` | `false` | no |
| <a name="input_source_mapping_starting_position"></a> [source\_mapping\_starting\_position](#input\_source\_mapping\_starting\_position) | The position in the stream where AWS Lambda should start reading. Must be one of AT\_TIMESTAMP (Kinesis only), LATEST, or TRIM\_HORIZON when reading from Kinesis or DynamoDB. Must not be set when the source is SQS. See the AWS DynamoDB Streams and Kinesis API References for details. | `string` | `null` | no |
| <a name="input_source_mapping_starting_position_timestamp"></a> [source\_mapping\_starting\_position\_timestamp](#input\_source\_mapping\_starting\_position\_timestamp) | A timestamp in RFC3339 format of the data record from which to start reading, used when starting\_position is set to AT\_TIMESTAMP. If no record with this exact timestamp exists, the next later record is chosen. If the timestamp is older than the current trim horizon, the oldest available record is chosen. | `string` | `null` | no |
| <a name="input_ssm_parameter_names"></a> [ssm\_parameter\_names](#input\_ssm\_parameter\_names) | List of AWS Systems Manager Parameter Store parameter names. The IAM role of this Lambda function will be enhanced<br> with read permissions for those parameters. Parameters must start with a forward slash and can be encrypted with the<br> default KMS key. | `list(string)` | `null` | no |
| <a name="input_stage"></a> [stage](#input\_stage) | ID element. Usually used to indicate role, e.g. 'prod', 'staging', 'source', 'build', 'test', 'deploy', 'release' | `string` | `null` | no |
| <a name="input_tags"></a> [tags](#input\_tags) | Additional tags (e.g. `{'BusinessUnit': 'XYZ'}`).<br>Neither the tag keys nor the tag values will be modified by this module. | `map(string)` | `{}` | no |
11 changes: 7 additions & 4 deletions examples/complete/fixtures.us-east-2.tfvars
@@ -3,7 +3,10 @@ namespace = "eg"
environment = "ue2"
stage = "test"

function_name = "example-complete"
handler = "handler.handler"
runtime = "nodejs20.x"
ephemeral_storage_size = 1024
function_name = "example-complete"
handler = "handler.handler"
runtime = "nodejs20.x"
ephemeral_storage_size = 1024
source_mapping_enabled = true
source_mapping_batch_size = 1
source_mapping_starting_position = "LATEST"
28 changes: 22 additions & 6 deletions examples/complete/main.tf
@@ -74,15 +74,31 @@ resource "aws_iam_role_policy_attachment" "outside" {
policy_arn = aws_iam_policy.outside[0].arn
}

module "dynamodb_table" {
source = "cloudposse/dynamodb/aws"
version = "0.37.0"

name = "first"
hash_key = "HashKey"
range_key = "RangeKey"
enable_autoscaler = false

context = module.this.context
}

module "lambda" {
source = "../.."

filename = join("", data.archive_file.lambda_zip[*].output_path)
function_name = module.label.id
handler = var.handler
runtime = var.runtime
iam_policy_description = var.iam_policy_description
ephemeral_storage_size = var.ephemeral_storage_size
filename = join("", data.archive_file.lambda_zip[*].output_path)
function_name = module.label.id
handler = var.handler
runtime = var.runtime
iam_policy_description = var.iam_policy_description
ephemeral_storage_size = var.ephemeral_storage_size
source_mapping_enabled = var.source_mapping_enabled
source_mapping_batch_size = var.source_mapping_batch_size
source_mapping_arn = module.dynamodb_table.table_stream_arn
source_mapping_starting_position = var.source_mapping_starting_position

custom_iam_policy_arns = [
"arn:aws:iam::aws:policy/job-function/ViewOnlyAccess",
30 changes: 30 additions & 0 deletions examples/complete/variables.tf
@@ -31,3 +31,33 @@ variable "ephemeral_storage_size" {
description = "The amount of storage available to the function at runtime. Defaults to 512."
default = 512
}

variable "source_mapping_enabled" {
type = bool
description = "Enables the event source mapping so that a Kinesis stream, a DynamoDB stream, or an SQS queue can trigger the Lambda function."
default = false
}

variable "source_mapping_batch_size" {
type = number
description = "The largest number of records that Lambda retrieves from the event source per invocation. AWS defaults to 100 for DynamoDB and Kinesis and 10 for SQS."
default = 100
}

variable "source_mapping_arn" {
type = string
description = "The event source ARN. Can be a Kinesis stream, a DynamoDB stream, or an SQS queue."
default = null
}

variable "source_mapping_starting_position" {
type = string
description = "The position in the stream where AWS Lambda should start reading. Must be one of AT_TIMESTAMP (Kinesis only), LATEST, or TRIM_HORIZON when reading from Kinesis or DynamoDB. Must not be set when the source is SQS. See the AWS DynamoDB Streams and Kinesis API References for details."
default = null
}

variable "source_mapping_starting_position_timestamp" {
type = string
description = "A timestamp in RFC3339 format of the data record from which to start reading, used when starting_position is set to AT_TIMESTAMP. If no record with this exact timestamp exists, the next later record is chosen. If the timestamp is older than the current trim horizon, the oldest available record is chosen."
default = null
}
11 changes: 11 additions & 0 deletions main.tf
@@ -107,3 +107,14 @@ data "aws_region" "this" {
data "aws_caller_identity" "this" {
count = local.enabled ? 1 : 0
}

resource "aws_lambda_event_source_mapping" "trigger" {
count = local.enabled && var.source_mapping_enabled ? 1 : 0

function_name = join("", aws_lambda_function.this[*].function_name)
event_source_arn = var.source_mapping_arn
batch_size = var.source_mapping_batch_size
starting_position = var.source_mapping_starting_position
starting_position_timestamp = var.source_mapping_starting_position == "AT_TIMESTAMP" && var.source_mapping_starting_position_timestamp != null ? var.source_mapping_starting_position_timestamp : null
}
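
For context, a minimal root-module invocation wiring an SQS queue as the trigger could look like the sketch below. The queue resource, filename, and all values are illustrative assumptions, not part of this PR; for SQS, `source_mapping_starting_position` must be left unset, since it applies only to streams.

```hcl
# Hypothetical consumer of this module with an SQS queue as the event source.
resource "aws_sqs_queue" "example" {
  name = "example-queue"
}

module "lambda" {
  source = "../.."

  filename      = "lambda.zip"
  function_name = "example-sqs-consumer"
  handler       = "handler.handler"
  runtime       = "nodejs20.x"

  # starting_position is omitted: it must not be set for SQS sources.
  source_mapping_enabled    = true
  source_mapping_arn        = aws_sqs_queue.example.arn
  source_mapping_batch_size = 10
}
```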

30 changes: 30 additions & 0 deletions variables.tf
@@ -254,3 +254,33 @@ variable "invoke_function_permissions" {
description = "Defines which external source(s) can invoke this function (action 'lambda:InvokeFunction'). Attributes map to those of https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/lambda_permission. NOTE: to keep things simple, we only expose a subset of said attributes. If a more complex configuration is needed, declare the necessary lambda permissions outside of this module"
default = []
}

variable "source_mapping_enabled" {
type = bool
description = "Enables the event source mapping so that a Kinesis stream, a DynamoDB stream, or an SQS queue can trigger the Lambda function."
default = false
}

variable "source_mapping_batch_size" {
type = number
description = "The largest number of records that Lambda retrieves from the event source per invocation. AWS defaults to 100 for DynamoDB and Kinesis and 10 for SQS."
default = 100
}

variable "source_mapping_arn" {
type = string
description = "The event source ARN. Can be a Kinesis stream, a DynamoDB stream, or an SQS queue."
default = null
}

variable "source_mapping_starting_position" {
type = string
description = "The position in the stream where AWS Lambda should start reading. Must be one of AT_TIMESTAMP (Kinesis only), LATEST, or TRIM_HORIZON when reading from Kinesis or DynamoDB. Must not be set when the source is SQS. See the AWS DynamoDB Streams and Kinesis API References for details."
default = null
}

variable "source_mapping_starting_position_timestamp" {
type = string
description = "A timestamp in RFC3339 format of the data record from which to start reading, used when starting_position is set to AT_TIMESTAMP. If no record with this exact timestamp exists, the next later record is chosen. If the timestamp is older than the current trim horizon, the oldest available record is chosen."
default = null
}
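
To illustrate the AT_TIMESTAMP path described above, a hypothetical tfvars fragment for a Kinesis source might look like the following; the stream ARN and timestamp are placeholders, not values used anywhere in this PR.

```hcl
# Hypothetical tfvars: start reading a Kinesis stream from a fixed point in time.
source_mapping_enabled                     = true
source_mapping_arn                         = "arn:aws:kinesis:us-east-2:123456789012:stream/example"
source_mapping_batch_size                  = 100
source_mapping_starting_position           = "AT_TIMESTAMP"
source_mapping_starting_position_timestamp = "2024-01-01T00:00:00Z"
```

If the timestamp is older than the stream's trim horizon, Lambda starts from the oldest available record instead.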