This sample solution includes two state machines that back up AWS services:
- Single AWS service backup workflow
Executing this state machine requires the following input parameters (consumed by the Lambda functions):
{
  "host_ip": "x.y.z.w",
  "username": "ubuntu",
  "s3-key-bucket": "my_key_bucket",
  "key-path": "keys/my_key.pem",
  "s3-script-bucket": "my_script_bucket",
  "pre-script-path": "step_functions/scripts/pre-script.sh",
  "volume_id": "vol-xxxxx",
  "region_name": "aws_region",
  "post-script-path": "step_functions/scripts/post-script.sh"
}
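As a rough sketch, an execution of the single-service workflow could be started with this input through boto3's Step Functions client. The helper names and the state machine ARN below are hypothetical; only the required key names come from the input document above.

```python
import json

# Required keys for the single-service workflow input (from the sample above).
REQUIRED_KEYS = [
    "host_ip", "username", "s3-key-bucket", "key-path",
    "s3-script-bucket", "pre-script-path", "volume_id",
    "region_name", "post-script-path",
]


def validate_input(params):
    """Return the list of required keys missing from the input document."""
    return [key for key in REQUIRED_KEYS if key not in params]


def start_backup(state_machine_arn, params):
    """Start a Step Functions execution with the given input (hypothetical helper)."""
    import boto3  # imported lazily so validate_input stays dependency-free

    missing = validate_input(params)
    if missing:
        raise ValueError("missing input parameters: " + ", ".join(missing))
    sfn = boto3.client("stepfunctions", region_name=params["region_name"])
    return sfn.start_execution(
        stateMachineArn=state_machine_arn,
        input=json.dumps(params),
    )
```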
- Two AWS services backup (in parallel) workflow
Executing this state machine requires the following input parameters (consumed by the Lambda functions):
{
  "host_ip": "x.y.z.w",
  "username": "ubuntu",
  "s3-key-bucket": "my_key_bucket",
  "key-path": "keys/my_key.pem",
  "s3-script-bucket": "my_script_bucket",
  "pre-script-path": "step_functions/scripts/pre-script.sh",
  "volume_id": "vol-xxxxx",
  "db_instance_id": "my_db_instance_id",
  "region_name": "aws_region",
  "post-script-path": "step_functions/scripts/post-script.sh"
}
The state machines execute the following AWS Lambda functions at each step:
NOTE: These Lambda functions can be packaged (including all dependencies) into a deployment package using Python's virtualenv, as mentioned here
- pre_script_handler.py
  Performs the following operations:
  a. Downloads the EC2 instance SSH key and pre-script from the given locations in the S3 bucket
  b. Copies the script to a temp directory on the EC2 instance, changes its permissions, and executes it
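The copy-and-execute steps could be sketched with paramiko roughly as below. This assumes the SSH key has already been downloaded from S3 to a local file (e.g. via boto3's `download_file`); the function names are illustrative, not the actual handler code.

```python
import os


def build_remote_command(remote_path):
    """Compose the command that makes the uploaded script executable and runs it."""
    return "chmod +x {0} && {0}".format(remote_path)


def run_script_on_instance(host_ip, username, key_file, local_script):
    """Copy a script to /tmp on the instance over SFTP, then execute it (sketch)."""
    import paramiko  # lazy import: only needed when actually connecting

    remote_path = "/tmp/" + os.path.basename(local_script)
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(hostname=host_ip, username=username, key_filename=key_file)
    try:
        sftp = client.open_sftp()
        sftp.put(local_script, remote_path)
        sftp.close()
        _, stdout, _ = client.exec_command(build_remote_command(remote_path))
        return stdout.channel.recv_exit_status()  # 0 on success
    finally:
        client.close()
```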
- volume_snapshot_launch_handler.py
  Launches a volume snapshot for the given <volume_id> and <region_name>, and stores the snapshot id in <volume_snapshot_id>.
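A minimal sketch of this step with boto3's EC2 client might look as follows; the function names are illustrative and the `Description` text is an assumption.

```python
def extract_snapshot_id(response):
    """Pull the new snapshot id out of a create_snapshot response."""
    return response["SnapshotId"]


def launch_volume_snapshot(event):
    """Start an EBS snapshot and return the event with volume_snapshot_id added (sketch)."""
    import boto3  # lazy import keeps the parsing helper testable offline

    ec2 = boto3.client("ec2", region_name=event["region_name"])
    response = ec2.create_snapshot(
        VolumeId=event["volume_id"],
        Description="Step Functions backup of " + event["volume_id"],  # assumed text
    )
    event["volume_snapshot_id"] = extract_snapshot_id(response)
    return event
```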
- volume_snapshot_status_check_handler.py
  Gets the volume snapshot status for the given <volume_snapshot_id>.
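The status check could be sketched with `describe_snapshots`, whose response reports the snapshot state as `pending`, `completed`, or `error` (helper names are illustrative):

```python
def extract_snapshot_state(response):
    """Read the snapshot state from a describe_snapshots response."""
    return response["Snapshots"][0]["State"]


def check_volume_snapshot(event):
    """Return the current state of the snapshot launched earlier (sketch)."""
    import boto3  # lazy import keeps the parsing helper testable offline

    ec2 = boto3.client("ec2", region_name=event["region_name"])
    response = ec2.describe_snapshots(SnapshotIds=[event["volume_snapshot_id"]])
    return extract_snapshot_state(response)
```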
- db_snapshot_launch_handler.py
  Launches a DB snapshot for the given <db_instance_id> and <region_name>, and stores the snapshot id in <db_instance_snapshot_id> (db_instance_id + "%Y-%m-%d-%H-%M-%S").
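This step could be sketched with boto3's RDS client as below. The snapshot identifier follows the format stated above (the instance id with a timestamp appended directly); the function names are illustrative.

```python
from datetime import datetime


def make_db_snapshot_id(db_instance_id, now=None):
    """Build the snapshot identifier: db_instance_id + '%Y-%m-%d-%H-%M-%S'."""
    now = now or datetime.utcnow()
    return db_instance_id + now.strftime("%Y-%m-%d-%H-%M-%S")


def launch_db_snapshot(event):
    """Start an RDS snapshot and return the event with its id added (sketch)."""
    import boto3  # lazy import keeps the id helper testable offline

    rds = boto3.client("rds", region_name=event["region_name"])
    snapshot_id = make_db_snapshot_id(event["db_instance_id"])
    rds.create_db_snapshot(
        DBSnapshotIdentifier=snapshot_id,
        DBInstanceIdentifier=event["db_instance_id"],
    )
    event["db_instance_snapshot_id"] = snapshot_id
    return event
```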
- db_snapshot_status_check_handler.py
  Gets the DB snapshot status for the given <db_instance_snapshot_id>.
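A sketch of this check with `describe_db_snapshots`, whose response reports the snapshot status (e.g. `creating` or `available`); helper names are illustrative:

```python
def extract_db_snapshot_status(response):
    """Read the snapshot status from a describe_db_snapshots response."""
    return response["DBSnapshots"][0]["Status"]


def check_db_snapshot(event):
    """Return the current status of the DB snapshot launched earlier (sketch)."""
    import boto3  # lazy import keeps the parsing helper testable offline

    rds = boto3.client("rds", region_name=event["region_name"])
    response = rds.describe_db_snapshots(
        DBSnapshotIdentifier=event["db_instance_snapshot_id"]
    )
    return extract_db_snapshot_status(response)
```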
- post_script_handler.py
  Performs the following operations:
  a. Downloads the EC2 instance SSH key and post-script from the given locations in the S3 bucket
  b. Copies the script to a temp directory on the EC2 instance, changes its permissions, and executes it
The MIT License (MIT). Please see License File for more information.