Adding in creation complete event to state machine
Ignoring check I3042 for cfn-lint

Adding in EventBus to deployment account

Adding in x-ray tracing to pipeline management lambda functions

Changing event content, adding in xray layer to pipeline management lambda functions

Documentation

Mega Lint Fixes

Forgot to hit save :(
StewartW committed Sep 8, 2022
1 parent 3452301 commit e011008
Showing 18 changed files with 150 additions and 10 deletions.
1 change: 1 addition & 0 deletions README.md
@@ -36,3 +36,4 @@ within the AWS Console.
- Refer to the [User Guide](docs/user-guide.md) for using ADF once it is setup.
- Refer to the [Samples Guide](docs/samples-guide.md) for a detailed walk
through of the provided samples.
- Refer to the [Integrations Guide](docs/integrations-guide.md) for information on the events produced by ADF.
38 changes: 38 additions & 0 deletions docs/integrations-guide.md
@@ -0,0 +1,38 @@
# Integrations Guide

## Introduction

The AWS Deployment Framework enables integrations with external workflows via an Event Bus deployed into the organisational root account.

## Account Management Events

Account management events are emitted at various stages during an execution of the Account Management State Machine. Currently, events are emitted for the following states:

- ACCOUNT_PROVISIONED
  Emitted when an AWS account is created.
  Contains the account definition from the `.yml` file as well as the account_id.
- ENTERPRISE_SUPPORT_REQUESTED
  Emitted when the support ticket to AWS Support is raised.
  Contains the account definition from the `.yml` file as well as the account_id.
- ACCOUNT_ALIAS_CONFIGURED
  Emitted when the account's alias is configured by ADF.
  The details section contains the account id and the alias value. The resource field also contains the account id.
- ACCOUNT_TAGS_CONFIGURED
  Emitted when the account's tags are updated by ADF.
  The details section contains the account id and the tags. The resource field also contains the account id.
- DEFAULT_VPC_DELETED
  Emitted when the default VPC in a region is deleted.
  The details section contains the account id and the region of the VPC. The resource field contains the deleted VPC id.
- ACCOUNT_CREATION_COMPLETE
  Emitted when the state machine completes successfully.
  Contains the account definition from the `.yml` file as well as the account_id in the resource field.
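These events are delivered in the standard EventBridge envelope (`source`, `detail-type`, `detail`, `resources`). A minimal consumer sketch follows; the handler name and the detail field names shown are illustrative assumptions based on the descriptions above, not a confirmed payload contract:

```python
import json


def lambda_handler(event, _context):
    # Standard EventBridge envelope fields; `detail` normally arrives as a
    # dict because the producer publishes a JSON string via put_events.
    detail_type = event["detail-type"]
    detail = event["detail"]
    if isinstance(detail, str):  # defensive: tolerate a raw JSON string too
        detail = json.loads(detail)

    if detail_type == "ACCOUNT_ALIAS_CONFIGURED":
        # Field names assumed from the event description above.
        print(f"Account {detail['account_id']} alias set to {detail['alias_value']}")
    return detail_type
```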

## Pipeline Management Events

- CROSS_ACCOUNT_RULE_CREATED_OR_UPDATED
  Emitted when a rule is created to trigger pipelines from a different account.
  The details section contains the source_account_id (the account where the CodeCommit repository is located) and the resource section contains the deployment account id (the account where the CodePipeline is located).
- REPOSITORY_CREATED_OR_UPDATED
  Emitted when a CodeCommit repository is created in a different account than the deployment account.
  The details section contains the repository_account_id (the account where the CodeCommit repository is located) as well as the stack_name (the CloudFormation stack that creates the repository), and the resource section contains the repository account id and the pipeline name.
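To react to these events, an external workflow can attach an EventBridge rule to the ADF event bus. A sketch of building the event pattern follows; note the `source` value passed here is an assumption (inspect an actual event on your bus for the real string), and the rule name is made up:

```python
import json


def adf_event_pattern(detail_types, source):
    """Build an EventBridge event-pattern document matching the given
    ADF detail types from the given event source."""
    return json.dumps({
        "source": [source],
        "detail-type": list(detail_types),
    })


# Usage sketch (not executed here; names are illustrative):
# events = boto3.client("events")
# events.put_rule(
#     Name="notify-on-adf-repository-created",
#     EventBusName="ADF-Event-Bus",
#     EventPattern=adf_event_pattern(
#         ["REPOSITORY_CREATED_OR_UPDATED"], "PipelineManagement"),
# )
```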


@@ -18,7 +18,7 @@
LOGGER = configure_logger(__name__)
ADF_ROLE_NAME = os.getenv("ADF_ROLE_NAME")
AWS_PARTITION = os.getenv("AWS_PARTITION")
-EVENTS = ADFEvents(boto3.client("events"), "AccountManagement.Alias")
+EVENTS = ADFEvents(boto3.client("events"), "AccountManagement")


def delete_account_aliases(account, iam_client, current_aliases):
@@ -80,7 +80,7 @@ def lambda_handler(event, _):
"adf_account_alias_config",
)
ensure_account_has_alias(event, role.client("iam"))
-EVENTS.put_event(detail=json.dumps(event), detailType="ACCOUNT_ALIAS_CONFIGURED", resources=[account_id])
+EVENTS.put_event(detail=json.dumps({"account_id": account_id, "alias_value": event.get("alias")}), detailType="ACCOUNT_ALIAS_CONFIGURED", resources=[account_id])
else:
LOGGER.info(
"Account: %s does not need an alias",
@@ -19,7 +19,7 @@
from events import ADFEvents

patch_all()
-EVENTS = ADFEvents(boto3.client("events"), "AccountManagement.Tags")
+EVENTS = ADFEvents(boto3.client("events"), "AccountManagement")
LOGGER = configure_logger(__name__)


@@ -40,7 +40,7 @@ def lambda_handler(event, _):
event.get("tags"),
organizations,
)
-EVENTS.put_event(detail=json.dumps(event), detailType="ACCOUNT_TAGS_CONFIGURED", resources=[event.get('account_id')])
+EVENTS.put_event(detail=json.dumps({"tags": event.get("tags"), "account_id": event.get("account_id")}), detailType="ACCOUNT_TAGS_CONFIGURED", resources=[event.get('account_id')])
else:
LOGGER.info(
"Account: %s does not need tags configured",
2 changes: 1 addition & 1 deletion src/lambda_codebase/account_processing/create_account.py
@@ -18,7 +18,7 @@

LOGGER = configure_logger(__name__)
ADF_ROLE_NAME = os.getenv("ADF_ROLE_NAME")
-EVENTS = ADFEvents(boto3.client("events"), "AccountManagement.AccountProvisioning")
+EVENTS = ADFEvents(boto3.client("events"), "AccountManagement")



4 changes: 2 additions & 2 deletions src/lambda_codebase/account_processing/delete_default_vpc.py
@@ -17,7 +17,7 @@
LOGGER = configure_logger(__name__)
ADF_ROLE_NAME = os.getenv("ADF_ROLE_NAME")
AWS_PARTITION = os.getenv("AWS_PARTITION")
-EVENTS = ADFEvents(boto3.client("events"), "AccountManagement.VPC")
+EVENTS = ADFEvents(boto3.client("events"), "AccountManagement")



@@ -84,7 +84,7 @@ def lambda_handler(event, _):
)
ec2_resource = role.resource("ec2", region_name=event.get("region"))
delete_default_vpc(ec2_resource, ec2_client, default_vpc_id)
-EVENTS.put_event(detail=json.dumps(event), detailType="DEFAULT_VPC_DELETED", resources=[event.get("account_id"), default_vpc_id])
+EVENTS.put_event(detail=json.dumps({"region": event.get("region"), "account_id":event.get("account_id")}), detailType="DEFAULT_VPC_DELETED", resources=[default_vpc_id])


return {"Payload": event}
@@ -69,6 +69,13 @@ Globals:
CodeUri: lambda_codebase
Runtime: python3.9

Mappings:
OrganisationPartitionRegionMapping:
aws:
region: "us-east-1"
aws-us-gov:
region: "us-gov-west-1"

Resources:
LambdaLayerVersion:
Type: "AWS::Serverless::LayerVersion"
@@ -183,6 +190,7 @@ Resources:
CrossAccountAccessRole: !Ref CrossAccountAccessRole
PipelineBucket: !Ref PipelineBucket
RootAccountId: !Ref MasterAccountId
RootAccountRegion: !FindInMap [OrganisationPartitionRegionMapping, !Ref "AWS::Partition", "region"]
CodeBuildImage: !Ref Image
CodeBuildComputeType: !Ref ComputeType
SharedModulesBucket: !Ref SharedModulesBucket
@@ -5,20 +5,25 @@
"""

import os
import json
import boto3

from cache import Cache
from rule import Rule
from logger import configure_logger
from cloudwatch import ADFMetrics
from events import ADFEvents
from aws_xray_sdk.core import patch_all


patch_all()
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
PIPELINE_MANAGEMENT_STATEMACHINE = os.getenv("PIPELINE_MANAGEMENT_STATEMACHINE_ARN")
CLOUDWATCH = boto3.client("cloudwatch")
METRICS = ADFMetrics(CLOUDWATCH, "PIPELINE_MANAGEMENT/RULE")
EVENTS = ADFEvents(boto3.client("events", region_name=os.getenv("ADF_EVENTBUS_REGION")), "PipelineManagement")

_cache = None

@@ -56,5 +61,6 @@ def lambda_handler(pipeline, _):
METRICS.put_metric_data(
{"MetricName": "CreateOrUpdate", "Value": 1, "Unit": "Count"}
)
EVENTS.put_event(detail=json.dumps({"source_account_id": _source_account_id}), detailType="CROSS_ACCOUNT_RULE_CREATED_OR_UPDATED", resources=[DEPLOYMENT_ACCOUNT_ID])

return pipeline
@@ -4,19 +4,23 @@
"""

import os
import json
import boto3
from repo import Repo

from logger import configure_logger
from cloudwatch import ADFMetrics
from parameter_store import ParameterStore
from events import ADFEvents


CLOUDWATCH = boto3.client("cloudwatch")
METRICS = ADFMetrics(CLOUDWATCH, "PIPELINE_MANAGEMENT/REPO")
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
EVENTS = ADFEvents(boto3.client("events", region_name=os.getenv("ADF_EVENTBUS_REGION")), "PipelineManagement")



def lambda_handler(pipeline, _):
@@ -52,5 +56,15 @@ def lambda_handler(pipeline, _):
METRICS.put_metric_data(
{"MetricName": "CreateOrUpdate", "Value": 1, "Unit": "Count"}
)
EVENTS.put_event(
detail=json.dumps({
"repository_account_id": code_account_id,
"stack_name": repo.stack_name
}),
detailType="REPOSITORY_CREATED_OR_UPDATED",
resources=[
f'{code_account_id}:{pipeline.get("name")}'
]
)

return pipeline
@@ -13,8 +13,10 @@
from sts import STS
from logger import configure_logger
from partition import get_partition
from aws_xray_sdk.core import patch_all


patch_all()
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
@@ -14,8 +14,10 @@
from logger import configure_logger
from deployment_map import DeploymentMap
from parameter_store import ParameterStore
from aws_xray_sdk.core import patch_all


patch_all()
LOGGER = configure_logger(__name__)
S3_BUCKET_NAME = os.environ["S3_BUCKET_NAME"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
@@ -14,8 +14,10 @@
import boto3
from botocore.exceptions import ClientError
from logger import configure_logger
from aws_xray_sdk.core import patch_all


patch_all()
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
@@ -9,8 +9,10 @@
import boto3

from logger import configure_logger
from aws_xray_sdk.core import patch_all


patch_all()
LOGGER = configure_logger(__name__)
DEPLOYMENT_ACCOUNT_REGION = os.environ["AWS_REGION"]
DEPLOYMENT_ACCOUNT_ID = os.environ["ACCOUNT_ID"]
@@ -27,6 +27,10 @@ Parameters:
Type: String
MinLength: "1"

RootAccountRegion:
Type: String
MinLength: "1"

CodeBuildImage:
Type: String
MinLength: "1"
@@ -61,8 +65,19 @@ Globals:
Tracing: Active
Layers:
- !Ref LambdaLayer
- !Ref PipelineManagementLayerVersion

Resources:
PipelineManagementLayerVersion:
Type: "AWS::Serverless::LayerVersion"
Properties:
ContentUri: "../../adf-build/shared/"
CompatibleRuntimes:
- python3.9
Description: "Common dependencies for ADF Pipeline Management Functions"
LayerName: pipeline_management_layer
Metadata:
BuildMethod: python3.9
ADFPipelineMangementLambdaBasePolicy:
Type: "AWS::IAM::ManagedPolicy"
Properties:
@@ -79,6 +94,10 @@ Resources:
- "xray:PutTraceSegments"
- "cloudwatch:PutMetricData"
Resource: "*"
- Effect: Allow
Action:
- "events:PutEvents"
Resource: !Sub "arn:${AWS::Partition}:events:*:${RootAccountId}:event-bus/ADF-Event-Bus"
Roles:
- !Ref DeploymentMapProcessingLambdaRole
- !Ref CreateOrUpdateRuleLambdaRole
@@ -823,6 +842,8 @@ Resources:
ADF_LOG_LEVEL: !Ref ADFLogLevel
PIPELINE_MANAGEMENT_STATE_MACHINE: !Sub "arn:${AWS::Partition}:states:${AWS::Region}:${AWS::AccountId}:stateMachine:ADFPipelineManagementStateMachine"
ADF_ROLE_NAME: !Ref CrossAccountAccessRole
ADF_EVENTBUS_ARN: !Sub "arn:${AWS::Partition}:events:${RootAccountRegion}:${RootAccountId}:event-bus/ADF-Event-Bus"
ADF_EVENTBUS_REGION: !Ref RootAccountRegion
FunctionName: DeploymentMapProcessorFunction
Role: !GetAtt DeploymentMapProcessingLambdaRole.Arn
Events:
@@ -861,6 +882,8 @@ Resources:
ADF_LOG_LEVEL: !Ref ADFLogLevel
ADF_ROLE_NAME: !Ref CrossAccountAccessRole
S3_BUCKET_NAME: !Ref PipelineBucket
ADF_EVENTBUS_ARN: !Sub "arn:${AWS::Partition}:events:${RootAccountRegion}:${RootAccountId}:event-bus/ADF-Event-Bus"
ADF_EVENTBUS_REGION: !Ref RootAccountRegion
FunctionName: ADFPipelineCreateOrUpdateRuleFunction
Role: !GetAtt CreateOrUpdateRuleLambdaRole.Arn

@@ -877,6 +900,8 @@ Resources:
ADF_LOG_LEVEL: !Ref ADFLogLevel
ADF_ROLE_NAME: !Ref CrossAccountAccessRole
S3_BUCKET_NAME: !Ref PipelineBucket
ADF_EVENTBUS_ARN: !Sub "arn:${AWS::Partition}:events:${RootAccountRegion}:${RootAccountId}:event-bus/ADF-Event-Bus"
ADF_EVENTBUS_REGION: !Ref RootAccountRegion
FunctionName: ADFPipelineCreateRepositoryFunction
Role: !GetAtt CreateRepositoryLambdaRole.Arn

@@ -37,5 +37,5 @@ def put_event(self, detailType, detail, resources=[]): # pylint: disable=W0102
}
trace_id = os.getenv("_X_AMZN_TRACE_ID")
if trace_id:
-payload["TraceHeader"] = trace_id
+payload["TraceHeader"] = trace_id.split(";")[0]
self.events.put_events(Entries=[payload])
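The change above trims the trace header before attaching it to the event. In Lambda, the `_X_AMZN_TRACE_ID` environment variable carries the full X-Ray header (`Root=...;Parent=...;Sampled=...`), and splitting on `;` keeps only the leading `Root=` segment. A standalone sketch of the same trimming:

```python
def trace_header_root(trace_id):
    # The full Lambda trace header looks like
    # "Root=1-5759e988-bd862e3fe1be46a994272793;Parent=53995c3f42cd8ad8;Sampled=1";
    # only the leading Root segment is forwarded as the event's TraceHeader.
    return trace_id.split(";")[0]
```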
@@ -59,3 +59,4 @@ pyyaml>=5.4.1
schema~=0.7.5
tenacity==8.0.1
urllib3~=1.26.12
aws-xray-sdk==2.10.0