Merged

Changes from 5 commits
2 changes: 1 addition & 1 deletion .project_automation/static_tests/Dockerfile
@@ -24,6 +24,6 @@ RUN pip3 install checkov

RUN gem install mdl

-ENV TERRAFORM_DOCS_VERSION=v0.16.0
+ENV TERRAFORM_DOCS_VERSION=v0.20.0
RUN wget https://github.com/terraform-docs/terraform-docs/releases/download/${TERRAFORM_DOCS_VERSION}/terraform-docs-${TERRAFORM_DOCS_VERSION}-linux-amd64.tar.gz && \
tar -C /usr/local/bin -xzf terraform-docs-${TERRAFORM_DOCS_VERSION}-linux-amd64.tar.gz && chmod +x /usr/local/bin/terraform-docs
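The download URL above is assembled from the pinned version variable. A minimal sketch (assuming the linux-amd64 build, and not fetching anything) shows how the pin propagates into the tarball name and URL:

```shell
# Sketch only: rebuild the release URL the Dockerfile downloads, without fetching it.
# Assumes the linux-amd64 build and the same version pin as the Dockerfile above.
TERRAFORM_DOCS_VERSION=v0.20.0
TARBALL="terraform-docs-${TERRAFORM_DOCS_VERSION}-linux-amd64.tar.gz"
URL="https://github.com/terraform-docs/terraform-docs/releases/download/${TERRAFORM_DOCS_VERSION}/${TARBALL}"
echo "$URL"
```

Bumping `TERRAFORM_DOCS_VERSION` is therefore the only change needed to upgrade the tool in CI.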
16 changes: 13 additions & 3 deletions .project_automation/static_tests/entrypoint.sh
@@ -62,14 +62,24 @@ else
fi
#********** Terraform Docs *************
echo 'Starting terraform-docs'
-TDOCS="$(terraform-docs --config ${PROJECT_PATH}/.config/.terraform-docs.yaml --lockfile=false ./)"
-git add -N README.md
+TDOCS="$(terraform-docs --config ${PROJECT_PATH}/.config/.terraform-docs.yaml --lockfile=false ./ --recursive)"
+
+# Process examples directories individually
+for example_dir in examples/*/; do
+  if [ -d "$example_dir" ] && [ -f "${example_dir}main.tf" ]; then
+    echo "Processing terraform-docs for $example_dir"
+    terraform-docs --config ${PROJECT_PATH}/.config/.terraform-docs.yaml --lockfile=false "$example_dir"
+  fi
+done
+
+git add -N README.md examples/*/README.md
GDIFF="$(git diff --compact-summary)"
if [ -z "$GDIFF" ]
then
echo "Success - Terraform Docs creation verified!"
else
-echo "Failure - Terraform Docs creation failed, ensure you have precommit installed and running before submitting the Pull Request"
+echo "Failure - Terraform Docs creation failed. Ensure you have pre-commit installed and running before submitting the Pull Request. Tip: a false error may occur if there are unstaged files in your repo."
echo "$GDIFF"
exit 1
fi
#***************************************
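The check above hinges on `git add -N` (intent-to-add): without it, a freshly generated, untracked README would be invisible to `git diff`, and with it, unstaged files can produce the false error the failure message warns about. A self-contained sketch of that mechanism in a throwaway repo (not this project's script):

```shell
# Demo of the detection mechanism: in a throwaway repo, an untracked file is
# invisible to `git diff` until `git add -N` registers it as intent-to-add.
set -e
REPO="$(mktemp -d)"
cd "$REPO"
git init -q .
echo "# generated docs" > README.md

before="$(git diff --compact-summary)"  # empty: untracked files are not diffed
git add -N README.md                    # intent-to-add; content stays unstaged
after="$(git diff --compact-summary)"   # now reports README.md as a new file

echo "before: '${before}'"
echo "after : '${after}'"
```

This is why the CI step treats any non-empty `git diff --compact-summary` output as a docs-generation failure.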
4 changes: 2 additions & 2 deletions README.md
@@ -237,8 +237,8 @@ No modules.

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
-| <a name="input_efs_locations"></a> [efs\_locations](#input\_efs\_locations) | A list of EFS locations and associated configuration | <pre>list(object({<br> name = string<br> access_point_arn = optional(string)<br> ec2_config_security_group_arns = list(string)<br> ec2_config_subnet_arn = string<br> efs_file_system_arn = string<br> file_system_access_role_arn = optional(string)<br> in_transit_encryption = optional(string)<br> subdirectory = optional(string)<br> tags = optional(map(string))<br> }))</pre> | `[]` | no |
-| <a name="input_s3_locations"></a> [s3\_locations](#input\_s3\_locations) | A list of S3 locations and associated configuration | <pre>list(object({<br> name = string<br> agent_arns = optional(list(string))<br> s3_bucket_arn = string<br> s3_config_bucket_access_role_arn = optional(string)<br> s3_storage_class = optional(string)<br> subdirectory = optional(string)<br> tags = optional(map(string))<br> create_role = optional(bool)<br> }))</pre> | `[]` | no |
+| <a name="input_efs_locations"></a> [efs\_locations](#input\_efs\_locations) | A list of EFS locations and associated configuration | <pre>list(object({<br/> name = string<br/> access_point_arn = optional(string)<br/> ec2_config_security_group_arns = list(string)<br/> ec2_config_subnet_arn = string<br/> efs_file_system_arn = string<br/> file_system_access_role_arn = optional(string)<br/> in_transit_encryption = optional(string)<br/> subdirectory = optional(string)<br/> tags = optional(map(string))<br/> }))</pre> | `[]` | no |
+| <a name="input_s3_locations"></a> [s3\_locations](#input\_s3\_locations) | A list of S3 locations and associated configuration | <pre>list(object({<br/> name = string<br/> agent_arns = optional(list(string))<br/> s3_bucket_arn = string<br/> s3_config_bucket_access_role_arn = optional(string)<br/> s3_storage_class = optional(string)<br/> subdirectory = optional(string)<br/> tags = optional(map(string))<br/> create_role = optional(bool)<br/> }))</pre> | `[]` | no |

## Outputs

4 changes: 2 additions & 2 deletions examples/efs-to-s3/README.md
@@ -17,8 +17,8 @@ This example demonstrates how to create an EFS to S3 replication scenario using

| Name | Version |
|------|---------|
-| <a name="provider_aws"></a> [aws](#provider\_aws) | 5.65.0 |
-| <a name="provider_random"></a> [random](#provider\_random) | 3.6.2 |
+| <a name="provider_aws"></a> [aws](#provider\_aws) | >= 3.72.0 |
+| <a name="provider_random"></a> [random](#provider\_random) | n/a |

## Modules

2 changes: 1 addition & 1 deletion examples/efs-to-s3/efs.tf
@@ -81,7 +81,7 @@ resource "aws_security_group" "MyEfsSecurityGroup" {
cidr_blocks = [var.vpc_cidr_block]
}
#outbound connections for EFS Mount Target to reach AWS services
-#tfsec:ignore:aws-ec2-no-public-egress-sgr
+#checkov:skip=CKV_AWS_382: EFS mount target requires egress to AWS services for DataSync operations
egress {
from_port = 0
to_port = 0
14 changes: 14 additions & 0 deletions examples/s3-to-s3-cross-account/.header.md
@@ -3,3 +3,17 @@
This example demonstrates how to create an S3 to S3 replication scenario across AWS Accounts using the AWS DataSync module.

![AWS Datasync S3 to S3 Cross Account](./datasync-examples-cross-account.png)

In this example, the default source and destination account AWS CLI profiles are named `source-account` and `destination-account` in [variables.tf](./variables.tf). Add the following text to your shared credentials file, replacing the sample values with the credentials you copied. See the [Authenticating with short-term credentials for the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-authentication-short-term.html) documentation for more information.

```bash
[source-account]
aws_access_key_id = EXAMPLE
aws_secret_access_key = EXAMPLEKEY
aws_session_token = LONGSTRINGEXAMPLE

[destination-account]
aws_access_key_id = EXAMPLE
aws_secret_access_key = EXAMPLEKEY
aws_session_token = LONGSTRINGEXAMPLE
```
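A hedged pre-flight sketch (not part of the example) of checking that both profile sections are present before running Terraform, demonstrated against a throwaway credentials file; point `CREDS` at `"$HOME/.aws/credentials"` to check your real setup:

```shell
# Demonstration only: write minimal sample profiles to a temp file and verify
# that both section headers exist, as a pre-flight sanity check.
CREDS="$(mktemp)"
printf '[source-account]\naws_access_key_id = EXAMPLE\n\n[destination-account]\naws_access_key_id = EXAMPLE\n' > "$CREDS"

for profile in source-account destination-account; do
  if grep -q "^\[${profile}\]" "$CREDS"; then
    echo "found profile: ${profile}"
  else
    echo "missing profile: ${profile}" >&2
  fi
done
```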
14 changes: 14 additions & 0 deletions examples/s3-to-s3-cross-account/README.md
@@ -5,6 +5,20 @@ This example demonstrates how to create a S3 to S3 replication scenario across A

![AWS Datasync S3 to S3 Cross Account](./datasync-examples-cross-account.png)

In this example, the default source and destination account AWS CLI profiles are named `source-account` and `destination-account` in [variables.tf](./variables.tf). Add the following text to your shared credentials file, replacing the sample values with the credentials you copied. See the [Authenticating with short-term credentials for the AWS CLI](https://docs.aws.amazon.com/cli/latest/userguide/cli-authentication-short-term.html) documentation for more information.

```bash
[source-account]
aws_access_key_id = EXAMPLE
aws_secret_access_key = EXAMPLEKEY
aws_session_token = LONGSTRINGEXAMPLE

[destination-account]
aws_access_key_id = EXAMPLE
aws_secret_access_key = EXAMPLEKEY
aws_session_token = LONGSTRINGEXAMPLE
```

## Requirements

| Name | Version |
14 changes: 8 additions & 6 deletions modules/datasync-locations/README.md
@@ -17,7 +17,7 @@ To configure one or more S3 Locations use the `s3_locations` variable. It is a l
- `s3_bucket_arn` - (Required) Amazon Resource Name (ARN) of the S3 Bucket.
- `s3_config_bucket_access_role_arn` - (Optional) ARN of the IAM Role used to connect to the S3 Bucket. Must be provided if `create_role` is set to false.
- `s3_storage_class` - (Optional) The Amazon S3 storage class that you want to store your files in when this location is used as a task destination.
-- `3_source_bucket_kms_arn` - (Optional) ARN of the KMS Customer Managed Key used to encrypt the source S3 objects.
+- `s3_source_bucket_kms_arn` - (Optional) ARN of the KMS Customer Managed Key used to encrypt the source S3 objects.
+- `s3_dest_bucket_kms_arn` - (Optional) ARN of the KMS Customer Managed Key used to encrypt the destination S3 objects.
- `subdirectory` - (Optional) Prefix to perform actions as source or destination.
- `tags` - (Optional) Key-value pairs of resource tags to assign to the DataSync Location.
@@ -57,20 +57,22 @@ No modules.
|------|------|
| [aws_datasync_location_efs.efs_location](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/datasync_location_efs) | resource |
| [aws_datasync_location_s3.s3_location](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/datasync_location_s3) | resource |
+| [aws_iam_policy.datasync_role_kms](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_policy) | resource |
| [aws_iam_role.datasync_role_s3](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role) | resource |
+| [aws_iam_role_policy_attachment.datasync_role_kms_policy_attachement](https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_role_policy_attachment) | resource |

## Inputs

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
-| <a name="input_efs_locations"></a> [efs\_locations](#input\_efs\_locations) | A list of EFS locations and associated configuration | <pre>list(object({<br> name = string<br> access_point_arn = optional(string)<br> ec2_config_security_group_arns = list(string)<br> ec2_config_subnet_arn = string<br> efs_file_system_arn = string<br> file_system_access_role_arn = optional(string)<br> in_transit_encryption = optional(string)<br> subdirectory = optional(string)<br> tags = optional(map(string))<br> }))</pre> | `[]` | no |
-| <a name="input_s3_locations"></a> [s3\_locations](#input\_s3\_locations) | A list of S3 locations and associated configuration | <pre>list(object({<br> name = string<br> agent_arns = optional(list(string))<br> s3_bucket_arn = string<br> s3_config_bucket_access_role_arn = optional(string)<br> s3_storage_class = optional(string)<br> subdirectory = optional(string)<br> tags = optional(map(string))<br> create_role = optional(bool)<br> }))</pre> | `[]` | no |
+| <a name="input_efs_locations"></a> [efs\_locations](#input\_efs\_locations) | A list of EFS locations and associated configuration | <pre>list(object({<br/> name = string<br/> access_point_arn = optional(string)<br/> ec2_config_security_group_arns = list(string)<br/> ec2_config_subnet_arn = string<br/> efs_file_system_arn = string<br/> file_system_access_role_arn = optional(string)<br/> in_transit_encryption = optional(string)<br/> subdirectory = optional(string)<br/> tags = optional(map(string))<br/> }))</pre> | `[]` | no |
+| <a name="input_s3_locations"></a> [s3\_locations](#input\_s3\_locations) | A list of S3 locations and associated configuration | <pre>list(object({<br/> name = string<br/> agent_arns = optional(list(string))<br/> s3_bucket_arn = string<br/> s3_config_bucket_access_role_arn = optional(string)<br/> s3_storage_class = optional(string)<br/> s3_source_bucket_kms_arn = optional(string)<br/> s3_dest_bucket_kms_arn = optional(string)<br/> subdirectory = optional(string)<br/> tags = optional(map(string))<br/> create_role = optional(bool)<br/> }))</pre> | `[]` | no |

## Outputs

| Name | Description |
|------|-------------|
-| <a name="output_datasync_role_arn"></a> [datasync\_role\_arn](#output\_datasync\_role\_arn) | DataSync IAM Role |
-| <a name="output_efs_locations"></a> [efs\_locations](#output\_efs\_locations) | DataSync Location ARN for EFS |
-| <a name="output_s3_locations"></a> [s3\_locations](#output\_s3\_locations) | DataSync Location ARN for S3 |
+| <a name="output_datasync_role_arn"></a> [datasync\_role\_arn](#output\_datasync\_role\_arn) | DataSync IAM Role ARN |
+| <a name="output_efs_locations"></a> [efs\_locations](#output\_efs\_locations) | DataSync EFS Location ARN |
+| <a name="output_s3_locations"></a> [s3\_locations](#output\_s3\_locations) | DataSync S3 Location ARN |
<!-- END_TF_DOCS -->
4 changes: 2 additions & 2 deletions modules/datasync-task/README.md
@@ -70,11 +70,11 @@ No modules.

| Name | Description | Type | Default | Required |
|------|-------------|------|---------|:--------:|
-| <a name="input_datasync_tasks"></a> [datasync\_tasks](#input\_datasync\_tasks) | A list of task configurations | <pre>list(object({<br> destination_location_arn = string<br> source_location_arn = string<br> cloudwatch_log_group_arn = optional(string)<br> excludes = optional(object({filter_type = string, value = string}))<br> includes = optional(object({filter_type = string, value = string}))<br> name = optional(string)<br> options = optional(map(string))<br> schedule_expression = optional(string)<br> tags = optional(map(string))<br> }))</pre> | `[]` | no |
+| <a name="input_datasync_tasks"></a> [datasync\_tasks](#input\_datasync\_tasks) | A list of task configurations | <pre>list(object({<br/> destination_location_arn = string<br/> source_location_arn = string<br/> cloudwatch_log_group_arn = optional(string)<br/> excludes = optional(object({ filter_type = string, value = string }))<br/> includes = optional(object({ filter_type = string, value = string }))<br/> name = optional(string)<br/> options = optional(map(string))<br/> schedule_expression = optional(string)<br/> tags = optional(map(string))<br/> }))</pre> | `[]` | no |

## Outputs

| Name | Description |
|------|-------------|
-| <a name="output_datasync_tasks"></a> [datasync\_tasks](#output\_datasync\_tasks) | n/a |
+| <a name="output_datasync_tasks"></a> [datasync\_tasks](#output\_datasync\_tasks) | DataSync Task ARN |
<!-- END_TF_DOCS -->