
Commit c173c7a

Refactor: remove role assignment from Terraform (#359)
2 parents 3f49b0f + d04bbe9

File tree

5 files changed: +22 −28 lines

docs/deployment/infrastructure.md

Lines changed: 22 additions & 0 deletions

````diff
@@ -12,6 +12,7 @@ The following things in Azure are managed outside of Terraform:
 - Active Directory (users, groups, service principals, etc.)
 - Service connections
 - Configuration files, stored as blobs
+- Role assignments

 ## Environments

@@ -136,3 +137,24 @@ In general, the steps that must be done manually before the pipeline can be run
 - Create Terraform workspace for each environment
 - Trigger a pipeline run to verify `plan` and `apply`
 - Known chicken-and-egg problem: Terraform both creates the Key Vault and expects a secret within it, so will always fail on the first deploy. Add the Benefits slack email secret and re-run the pipeline.
+
+Once the pipeline has run, there are a few more steps to be done manually in the Azure portal. These are related to configuring the service principal used for ETL:
+
+- [Create the service principal](https://learn.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal#app-registration-app-objects-and-service-principals)
+- Give the ETL service principal access to the `prod` storage account created by the pipeline:
+  - Navigate to the storage account container
+  - Select **Access Control (IAM)**
+  - Select **Add**, then select **Add role assignment**
+  - In the **Role** tab, select `Storage Blob Data Contributor`
+  - In the **Members** tab, select `Select Members` and search for the ETL service principal. Add it to the role.
+  - Also in the **Members** tab, add a description of `This role assignment gives write access only for the path of the hashed data file.`
+  - In the **Conditions** tab, select **Add condition** and change the editor type to `Code`
+  - Add the following condition into the editor, filling in `<filename>` with the appropriate value:
+
+    ```text
+    (
+     (
+      @Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs:path] StringLike '<filename>'
+     )
+    )
+    ```
````
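The portal steps added above can also be scripted. The following is a hedged sketch using the Azure CLI's `az role assignment create` with its ABAC `--condition` support; every angle-bracket value (subscription, resource group, storage account, container, service principal object ID, filename) is a placeholder, not a value from this commit:

```shell
# Sketch: create the conditional "Storage Blob Data Contributor" assignment
# for the ETL service principal. Assumes `az login` has already been run.
# All <angle-bracket> values are placeholders to fill in.
az role assignment create \
  --assignee "<sp-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>" \
  --condition "((@Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs:path] StringLike '<filename>'))" \
  --condition-version "2.0"
```

The `--condition` string is the same expression entered in the portal's `Code` editor, collapsed onto one line.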

terraform/mst/azure-vars.yml

Lines changed: 0 additions & 1 deletion

```diff
@@ -4,4 +4,3 @@ variables:
   TF_VAR_AGENCY_CARD: "mst-courtesy-cards"
   TF_VAR_AGENCY_RESOURCE_GROUP_PREFIX: "courtesy-cards"
   TF_VAR_AGENCY_STORAGE_ACCOUNT_PREFIX: "mstcceligibility"
-  TF_VAR_AGENCY_CARD_DATA_ETL_FILE: "velocity.csv"
```

terraform/roles.tf

Lines changed: 0 additions & 16 deletions
This file was deleted.
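The contents of the deleted roles.tf are not shown in this view. For context, a conditional role assignment of the kind this commit moves to manual setup looks roughly like the following sketch; the resource name and references are illustrative assumptions, not the deleted file's actual contents, though the two `var.*` names come from the variables removed in variables.tf below. The azurerm provider's `azurerm_role_assignment` supports ABAC conditions via `condition` and `condition_version`:

```hcl
# Hypothetical sketch only: names and references are illustrative,
# not recovered from the deleted roles.tf.
resource "azurerm_role_assignment" "etl_blob_writer" {
  scope                = azurerm_storage_container.etl.resource_manager_id
  role_definition_name = "Storage Blob Data Contributor"
  principal_id         = var.AGENCY_CARD_DATA_ETL_APP_OBJECT_ID

  condition_version = "2.0"
  condition         = <<-EOT
    (
     (
      @Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs:path] StringLike '${var.AGENCY_CARD_DATA_ETL_FILE}'
     )
    )
  EOT
}
```

Managing the assignment manually in the portal instead (as the docs change above describes) removes the need for Terraform's service connection to hold role-assignment permissions.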

terraform/sbmtd/azure-vars.yml

Lines changed: 0 additions & 1 deletion

```diff
@@ -4,4 +4,3 @@ variables:
   TF_VAR_AGENCY_CARD: "sbmtd-mobility-pass"
   TF_VAR_AGENCY_RESOURCE_GROUP_PREFIX: "sbmtd-mobility-pass"
   TF_VAR_AGENCY_STORAGE_ACCOUNT_PREFIX: "sbmtdmobilitypass"
-  TF_VAR_AGENCY_CARD_DATA_ETL_FILE: "mobilitypass.csv"
```

terraform/variables.tf

Lines changed: 0 additions & 10 deletions

```diff
@@ -8,16 +8,6 @@ variable "AGENCY_CARD" {
   type = string
 }

-variable "AGENCY_CARD_DATA_ETL_APP_OBJECT_ID" {
-  description = "Object ID from the registered application for the Agency Card server ETL uploading: https://cloudsight.zendesk.com/hc/en-us/articles/360016785598-Azure-finding-your-service-principal-object-ID"
-  type = string
-}
-
-variable "AGENCY_CARD_DATA_ETL_FILE" {
-  description = "The name of the hashed data file that's uploaded to the storage account"
-  type = string
-}
-
 variable "AGENCY_RESOURCE_GROUP_PREFIX" {
   description = "The prefix to the name of the resource group for each environment"
   type = string
```
