damoodamoo/azure_message_processing
Azure Secure Message Processing Pipeline

This repo acts as an exemplar / 'starter for 10' for building a secure message processing solution on Azure.

What you get

Secure Processing Diagram

Message Journey

  • A JSON message is POSTed to the /submit endpoint on API Management (APIM).
  • An APIM policy attaches a client certificate to the request and forwards it on to the ingest function.
  • The ingest function receives the message, validates the attached client certificate, and pushes the message to a Service Bus queue. The ingest function is VNET-integrated, so all outbound traffic flows through the VNET, which allows it to communicate with Service Bus.
  • The processor function is also VNET-integrated and is triggered by Service Bus. It has no public connectivity and is considered part of the 'backend' stage of the processing. In a real-world scenario this stage would likely become more involved, perhaps with many functions.
  • The processor function writes the message to Cosmos DB, via the VNET.

In short, after the message is received at ingest, it only travels via the VNET.

NOTE: If it were required that the ingest function also had no public connectivity, that could be achieved with APIM Premium, which supports VNET integration. In this scenario we use APIM Standard with function key + certificate auth to the ingest function.
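From the client's point of view, the journey above starts with a single HTTP POST. A minimal sketch, assuming a hypothetical APIM gateway hostname (substitute your own deployment's); the actual request is commented out so the sketch stands alone:

```shell
# Hypothetical gateway URL - replace with your APIM instance's hostname.
APIM_URL="https://contoso-apim.azure-api.net/submit"

# A sample JSON message for the pipeline.
PAYLOAD='{"id":"1234","body":"hello pipeline"}'
echo "would POST $PAYLOAD to $APIM_URL"

# Uncomment to actually send the message:
# curl -s -X POST "$APIM_URL" \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```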

Get Going: Clone, Build, Deploy, Test, Load

Follow these steps to get the solution up and running in a subscription of your choice.

  1. Clone the repo locally
  2. Open the folder in VS Code, and elect to build the dev container. If dev containers are new to you, take a minute and see what they are [here](https://code.visualstudio.com/docs/remote/containers).
  3. Run az login in the dev container and make sure to set your target subscription (e.g. az account set --subscription <your-sub-id>).
  4. Set up your local environment variables. Open scripts/environments and copy local.env.example to local.env. Update values as needed:
    • The workspace name (use something simple and unique to you)
    • The location (UK South is the default)
    • Whether or not you want to deploy APIM (takes an extra ~45mins - see below for more on this choice)
  5. All the commands are set in a Makefile. Follow these in order:
    • make vpn: Set up our networking and VPN. NOTE: This can take ~20mins or so - grab a coffee.
    • make vpn-client: Connect to our VPN via a VPN client
    • make build-and-deploy: Build and deploy the code and infrastructure
    • make int-test: Run some integration tests
    • make load-test-with-insights: Run load tests, then wait and automatically verify the results from App Insights and Cosmos
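For step 4, the finished local.env might look like the sketch below. The variable names here are assumptions (apart from TF_VAR_use_apim, which is referenced later in this README) - local.env.example is the authoritative list:

```shell
# scripts/environments/local.env - illustrative values only.
export TF_VAR_workspace=myws01   # something simple and unique to you
export TF_VAR_location=uksouth   # UK South is the default
export TF_VAR_use_apim=true      # false skips the ~45min APIM deployment
```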

That's it: the messaging pipeline is built, deployed and load tested.

To APIM or Not To APIM?

The environment .env files allow you to set TF_VAR_use_apim=false. If you do this, APIM will not be deployed and the ingest function will not require client certificates. The integration and load tests read the use_apim variable and will target the ingest function directly if needed. Everything works with or without APIM.
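The way the tests pick their target can be sketched as below (both URLs are illustrative, not the repo's real endpoints):

```shell
# Illustrative: tests hit APIM when it is deployed, else the ingest function directly.
TF_VAR_use_apim=false
if [ "$TF_VAR_use_apim" = "true" ]; then
  TARGET_URL="https://contoso-apim.azure-api.net/submit"
else
  TARGET_URL="https://contoso-ingest.azurewebsites.net/api/submit"
fi
echo "tests will target $TARGET_URL"
```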

Dev Loop

As you work and make changes to the code, just run make build-and-deploy. This will - you guessed it - build the code, and deploy it with any infrastructure updates.

Local debugging

VS Code tasks are set up for debugging the two functions locally. Bash scripts in the ingest function's test-scripts folder assist with calling it locally or remotely. The process here is:

  • Run make build-and-deploy once to deploy all the infrastructure to Azure (we're not using local emulators; we use the Service Bus and Storage in Azure even when running locally)
  • Run make get-func-settings to download all the settings the functions need to their own local.settings.json files. This avoids you having to set that up manually.
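With the settings downloaded, exercising a locally running ingest function might look like the sketch below (the port is the Functions host default; the route is an assumption - the real helpers live in the ingest function's test-scripts folder):

```shell
# Assumed local route for the ingest function under the Functions host.
LOCAL_URL="http://localhost:7071/api/submit"
echo "would POST a test message to $LOCAL_URL"

# Uncomment once 'func start' (or the VS Code debug task) is running:
# curl -s -X POST "$LOCAL_URL" -H "Content-Type: application/json" -d '{"id":"1"}'
```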

Viewing logs / outputs

  • workbooks / AI / azbrowse - TODO

Cleaning up

  1. Clean up the main infrastructure: make clean-inf
  2. Clean up the networking / VPN: make clean-vpn

Taking to CI / Production

Since all the commands to build and deploy this solution are wrapped up in the Makefile, and run inside the dev container, getting CI set up can be fairly straightforward. Your pipeline would do something like:

  • Set up the environment file for the target environment
  • Build the dev container
  • Run the following commands inside the dev container, passing in the required env vars from the pipeline (see dev.env)
    • make build
    • make test
    • make inf
    • make int-test
  • It's likely you'd want a separate pipeline to run the load tests, perhaps on a nightly schedule.
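The dev container stage of such a pipeline boils down to the same Makefile targets a developer runs locally; a sketch (the env file path is an assumption based on the scripts/environments folder mentioned earlier):

```shell
# Illustrative CI stage: run the Makefile targets in order, stopping on failure.
set -e
# source scripts/environments/dev.env   # env vars supplied by the pipeline
for target in build test inf int-test; do
  echo "running: make $target"
  # make "$target"                      # uncommented in a real pipeline
done
```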

About

Reusable accelerator for scaled and reliable message delivery through Azure