Merge branch 'release/1.14.0' into bugfix/OPT-793

PacoCid committed May 15, 2023
2 parents eb496cf + 6566140 commit 32beb61
Showing 129 changed files with 8,988 additions and 1,735 deletions.
18 changes: 18 additions & 0 deletions .coveragerc
@@ -0,0 +1,18 @@
[run]
source = .
branch = false
omit =
    ./venv/*
    */tests/*
    *__init__.py
    setup.py
    run_tests.py

[report]
fail_under = 80

[html]
directory = coveragereport

[xml]
output = coveragereport/coverage.xml
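For a local run, the same configuration can be exercised through coverage's Python API. The sketch below is illustrative only and not part of the commit; it assumes the `coverage` and `pytest` packages are installed, as the CI workflows further down do.

```python
# Illustrative sketch, not part of the commit: drive the new .coveragerc locally.
# Assumes the coverage and pytest packages are installed (see the CI workflows below).
import coverage
import pytest

cov = coverage.Coverage(config_file=".coveragerc")  # picks up [run], [report], [html], [xml]
cov.start()
exit_code = pytest.main([])                          # run the test suite under coverage
cov.stop()
cov.save()

cov.xml_report()      # writes coveragereport/coverage.xml, as configured in [xml]
cov.html_report()     # writes the HTML report into coveragereport/, as configured in [html]
total = cov.report()  # prints the terminal summary and returns the total percentage
print(f"pytest exit code: {exit_code}, total coverage: {total:.1f}% (fail_under is 80)")
```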
19 changes: 19 additions & 0 deletions .github/actions/documentation/action.yml
@@ -0,0 +1,19 @@
name: Config documentation environment
description: 'Config documentation environment'

runs:
  using: "composite"
  steps:

    - name: Set up Python 3.8
      uses: actions/setup-python@v3
      with:
        python-version: "3.8"

    - name: Configure git username
      run: git config user.name 'github-actions[bot]' && git config user.email 'github-actions[bot]@users.noreply.github.com'
      shell: bash

    - name: Install doc dependencies
      run: pip install -r docs/requirements.txt
      shell: bash
18 changes: 18 additions & 0 deletions .github/workflows/documentation-check.yml
@@ -0,0 +1,18 @@
name: Documentation Check

on: [pull_request]

jobs:

  check-documentation:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout the project from Git
        uses: actions/checkout@v3

      - name: Config documentation environment
        uses: ./.github/actions/documentation

      - name: Check doc build
        run: mkdocs build
21 changes: 21 additions & 0 deletions .github/workflows/documentation-deploy.yml
@@ -0,0 +1,21 @@
name: Documentation Deploy

on:
  workflow_dispatch: {}
  push:
    branches: [ main ]

jobs:

  deploy-documentation:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout the project from Git
        uses: actions/checkout@v3

      - name: Config documentation environment
        uses: ./.github/actions/documentation

      - name: Publish docs
        run: mkdocs gh-deploy --force
20 changes: 0 additions & 20 deletions .github/workflows/documentation.yml

This file was deleted.

69 changes: 69 additions & 0 deletions .github/workflows/sonarcloud.yml
@@ -0,0 +1,69 @@
# This workflow uses actions that are not certified by GitHub.
# They are provided by a third-party and are governed by
# separate terms of service, privacy policy, and support
# documentation.

# This workflow triggers a SonarCloud analysis of your code and populates
# GitHub Code Scanning alerts with the vulnerabilities found.
# Free for open source projects.

name: SonarCloud analysis

on:
  pull_request:
    branches: [feature/*]
  workflow_dispatch:

permissions:
  pull-requests: read # allows SonarCloud to decorate PRs with analysis results

jobs:
  Analysis:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout the project from Git
        uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Setup Python 3.8
        uses: actions/setup-python@v3
        with:
          python-version: "3.8"
      - name: Setup Graphviz
        uses: ts-graphviz/setup-graphviz@v1
      - name: Install dependencies
        run: pip install -e ".[setup,test]"
      - name: Run test using coverage
        run: coverage run -m pytest
      - name: Generate coverage report
        run: coverage xml
      - name: Analyze with SonarCloud
        # You can pin the exact commit hash or the version tag.
        # uses: SonarSource/sonarcloud-github-action@<commit hash or tag>
        uses: SonarSource/sonarcloud-github-action@…
        env:
          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }} # Needed to get PR information
          SONAR_TOKEN: ${{ secrets.SONAR_TOKEN }} # Generate a token on Sonarcloud.io, add it to the secrets of this repo with the name SONAR_TOKEN (Settings > Secrets > Actions > add new repository secret)
        with:
          # Additional arguments for the SonarCloud scanner
          args:
            -Dsonar.projectKey=startleft
            -Dsonar.organization=continuumsec
            -Dsonar.python.version=3.8,3.9,3.10,3.11
            -Dsonar.qualitygate.wait=true
            -Dsonar.python.coverage.reportPaths=coveragereport/coverage.xml

# Args explanation
# Unique keys of your project and organization. You can find them in SonarCloud > Information (bottom-left menu).
# Mandatory:
# -Dsonar.projectKey=
# -Dsonar.organization=

# Comma-separated list of supported Python versions, for a more precise analysis.
# -Dsonar.python.version=

# Flag to wait for the Quality Gate result; if it fails, the step is marked as failed too.
# -Dsonar.qualitygate.wait=

# Path of the coverage report used in the SonarCloud analysis; it must be in XML format.
# -Dsonar.python.coverage.reportPaths=
11 changes: 5 additions & 6 deletions .github/workflows/startleft-unit-integration-fast.yml
@@ -27,15 +27,14 @@ jobs:
        with:
          python-version: "3.8"

      - name: Update pip version to 23.0.1
        run: python -m pip install --upgrade pip==23.0.1

      - name: Setup Graphviz
        uses: ts-graphviz/setup-graphviz@v1

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install .
          pip install -e ".[setup,test]"
        run: pip install -e ".[setup,test]"

      - name: Test with pytest
        run: |
          python run_tests.py --log-level debug
        run: python run_tests.py --log-level debug
26 changes: 13 additions & 13 deletions .github/workflows/startleft-unit-integration-full.yml
@@ -26,31 +26,31 @@ jobs:
          git fetch --prune --unshallow
          git fetch --depth=1 origin +refs/tags/*:refs/tags/*
      - name: Setup Graphviz
        uses: ts-graphviz/setup-graphviz@v1

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v3
        with:
          python-version: ${{ matrix.python-version }}

      - if: runner.os == 'Windows'
      - name: Update pip version to 23.0.1
        run: python -m pip install --upgrade pip==23.0.1

      - name: Setup Graphviz
        uses: ts-graphviz/setup-graphviz@v1

      - name: Configure Graphviz in Windows
        if: runner.os == 'Windows'
        shell: bash
        run: |
          pip install --global-option=build_ext --global-option="-IC:\Program files\Graphviz\include" --global-option="-LC:\Program files\Graphviz\lib" pygraphviz
          echo "C:\Program Files\Graphviz\bin" >> $GITHUB_PATH
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install .
          pip install -e ".[setup,test]"
        run: pip install -e ".[setup,test]"

      # This step MUST be after the general installation of StartLeft
      - if: runner.os == 'Windows'
        run: |
          pip install python-magic-bin
      - name: Install libmagic in Windows
        if: runner.os == 'Windows'
        run: pip install python-magic-bin

      - name: Test with pytest
        run: |
          python run_tests.py --log-level debug
        run: python run_tests.py --log-level debug
4 changes: 4 additions & 0 deletions .gitignore
@@ -51,6 +51,10 @@ coverage.xml
.hypothesis/
.pytest_cache/
test-reports/
/coveragereport/

# SonarLint plugin
.scannerwork

# Translations
*.mo
45 changes: 20 additions & 25 deletions _sl_build/modules.py
@@ -2,33 +2,28 @@

ROOT_DIR = os.path.dirname(os.path.dirname(os.path.realpath(__file__)))

PROCESSORS = [
    {'name': 'slp_base', 'type': 'processor',
     'forbidden_dependencies': ['startleft', 'slp_cft', 'slp_tf', 'slp_visio', 'slp_mtmt']},
    {'name': 'slp_cft', 'type': 'processor', 'provider_type': 'CLOUDFORMATION',
     'forbidden_dependencies': ['startleft', 'slp_tf', 'slp_visio', 'slp_mtmt']},
    {'name': 'slp_tf', 'type': 'processor', 'provider_type': 'TERRAFORM',
     'forbidden_dependencies': ['startleft', 'slp_cft', 'slp_visio', 'slp_mtmt']},
    {'name': 'slp_visio', 'type': 'processor', 'provider_type': 'VISIO',
     'forbidden_dependencies': ['startleft', 'slp_cft', 'slp_tf', 'slp_mtmt']},
    {'name': 'slp_visio', 'type': 'processor', 'provider_type': 'LUCID',
     'forbidden_dependencies': ['startleft', 'slp_cft', 'slp_tf', 'slp_mtmt']},
    {'name': 'slp_mtmt', 'type': 'processor', 'provider_type': 'MTMT',
     'forbidden_dependencies': ['startleft', 'slp_cft', 'slp_tf', 'slp_visio']}
]

_general_modules_forbidden_dependencies = ['startleft'] + [processor['name'] for processor in PROCESSORS]
GENERAL_MODULES = [
    {'name': 'sl_util', 'type': 'general', 'forbidden_dependencies': _general_modules_forbidden_dependencies},
    {'name': 'otm', 'type': 'general', 'forbidden_dependencies': _general_modules_forbidden_dependencies}
]

STARTLEFT_MODULE = {'name': 'startleft', 'type': 'general', 'allowed_imports': ['slp_base', 'otm', 'sl_util']}
# TODO Startleft needs to depend on the TF and CFT processors until a decision is taken about the search function
_startleft_forbidden_dependencies = [p['name'] for p in PROCESSORS if 'provider_type' in p and p['name'] not in ['slp_cft', 'slp_tf']]
STARTLEFT_MODULE = [{'name': 'startleft', 'type': 'general', 'forbidden_dependencies': _startleft_forbidden_dependencies}]

ALL_MODULES = PROCESSORS + GENERAL_MODULES + STARTLEFT_MODULE
STARTLEFT_MODULE['allowed_imports'].extend(['slp_cft', 'slp_tf'])

# TODO The dependency between otm and sl_util must be removed
OTM_MODULE = {'name': 'otm', 'type': 'general', 'allowed_imports': ['sl_util']}

SL_UTIL_MODULE = {'name': 'sl_util', 'type': 'general', 'allowed_imports': ['otm']}

_slp_allowed_imports = ['slp_base', 'sl_util', 'otm']
PROCESSORS = [
    {'name': 'slp_base', 'type': 'processor', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_cft', 'type': 'processor', 'provider_type': 'CLOUDFORMATION', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_tf', 'type': 'processor', 'provider_type': 'TERRAFORM', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_tfplan', 'type': 'processor', 'provider_type': 'TFPLAN', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_visio', 'type': 'processor', 'provider_type': 'VISIO', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_visio', 'type': 'processor', 'provider_type': 'LUCID', 'allowed_imports': _slp_allowed_imports},
    {'name': 'slp_mtmt', 'type': 'processor', 'provider_type': 'MTMT', 'allowed_imports': _slp_allowed_imports}
]

"""
All the StartLeft modules are defined here, along with their dependencies. Further information is available at:
https://iriusrisk.github.io/startleft/development/Architecture
"""
ALL_MODULES = [STARTLEFT_MODULE] + [OTM_MODULE] + [SL_UTIL_MODULE] + PROCESSORS
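As a rough illustration of the declaration style introduced above (not part of the commit), a hypothetical additional processor only needs a name, a provider type, and the shared allow list; the `slp_example` / `EXAMPLE` identifiers below are invented for the example.

```python
# Hypothetical sketch, not part of the commit: declaring one more processor module
# in the allowed-imports style introduced above. 'slp_example' and 'EXAMPLE' are invented names.
_slp_allowed_imports = ['slp_base', 'sl_util', 'otm']

EXAMPLE_PROCESSOR = {
    'name': 'slp_example',                     # hypothetical module name
    'type': 'processor',
    'provider_type': 'EXAMPLE',                # hypothetical provider identifier
    'allowed_imports': _slp_allowed_imports,   # may import only the shared base modules
}

# Appending it to PROCESSORS would make ALL_MODULES (and therefore the secure
# importer's allow-list map) pick it up automatically.
```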
12 changes: 4 additions & 8 deletions _sl_build/secure_importer.py
@@ -5,12 +5,12 @@


def _build_dependencies_map():
    module_dependencies = [{module['name']: module['forbidden_dependencies']} for module in ALL_MODULES]
    module_dependencies = [{module['name']: module['allowed_imports']} for module in ALL_MODULES]
    return {name: dependencies for module in module_dependencies for name, dependencies in module.items()}


_module_names = [module['name'] for module in ALL_MODULES]
_forbidden_dependencies = _build_dependencies_map()
_allowed_imports = _build_dependencies_map()


def _get_base_module_name(full_name):
@@ -22,15 +22,11 @@ def _is_module_restricted(importing_module: str, imported_module: str):
    base_imported_module = _get_base_module_name(imported_module)

    if not base_importing_module or not base_imported_module or \
            base_importing_module == base_imported_module or \
            base_importing_module not in _module_names or base_imported_module not in _module_names:
        return False

    forbidden_dependencies = _forbidden_dependencies[base_importing_module]
    for fd in forbidden_dependencies:
        if fd == base_imported_module:
            return True

    return False
    return base_imported_module not in _allowed_imports[base_importing_module]


def _secure_importer(name, globals=None, locals=None, fromlist=(), level=0):
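The body of `_secure_importer` is unchanged by this commit and not shown in the hunk. The sketch below is illustrative only, not the project's exact implementation: it shows how an allow-list import hook of this kind is typically wired through `builtins.__import__`, assuming the `_is_module_restricted` helper defined above.

```python
# Illustrative sketch only, not the commit's implementation: wiring an allow-list
# import hook through builtins.__import__, using the _is_module_restricted helper above.
import builtins

_original_import = builtins.__import__


def _secure_importer(name, globals=None, locals=None, fromlist=(), level=0):
    # Resolve the importing module from the caller's globals (one possible strategy).
    importing_module = (globals or {}).get('__name__', '')
    if _is_module_restricted(importing_module, name):
        raise ImportError(f"Module '{importing_module}' is not allowed to import '{name}'")
    return _original_import(name, globals, locals, fromlist, level)


# Installing the hook makes every subsequent `import` statement go through the check.
builtins.__import__ = _secure_importer
```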
2 changes: 1 addition & 1 deletion deployment/Dockerfile.docs
@@ -2,7 +2,7 @@ FROM squidfunk/mkdocs-material

RUN pip install --upgrade pip

RUN pip install mkdocs-glightbox
RUN pip install -r requirements.txt

COPY /docs ./docs
COPY mkdocs.yml .
6 changes: 3 additions & 3 deletions docs/About.md
@@ -39,12 +39,12 @@ Deploy the documentation using the provided `docker-compose.yml` file inside the
- With docker installed from debian/ubuntu packages (docker.io) and the docker-compose plugin
```shell
cd deployment
docker-compose up -d docs
docker-compose up -d startleft-docs
```
- With docker installed from docker.com packages
```shell
cd deployment
docker compose up -d docs
docker compose up -d startleft-docs
```


@@ -56,7 +56,7 @@ Now you can access the docs in [http://localhost:8000](http://localhost:8000).
## Launch StartLeft documentation by mkdocs serve
Run in the StartLeft root folder:
```shell
pip install -e ".[doc]"
pip install -r docs/requirements.txt
mkdocs serve
```

6 changes: 3 additions & 3 deletions docs/Quickstart-Guide-for-Beginners.md
@@ -174,16 +174,16 @@ files used for External Threat Model conversions as MTMT (Microsoft Threat Model

#### **OTM**
These files may have been generated by StartLeft or handcrafted by any user. To see how to validate
an OTM file, we can download an example from the `examples/manual` folder.
an OTM file, we can download an example from the `examples/otm` folder.
```shell
wget https://raw.githubusercontent.com/iriusrisk/startleft/main/examples/manual/manual.otm
wget https://raw.githubusercontent.com/iriusrisk/startleft/main/examples/otm/manual_threat_model.otm
```

And then validate it by executing:
???+ example "OTM example"

```shell
startleft validate --otm-file manual.otm
startleft validate --otm-file manual_threat_model.otm
```

???+ warning "Mapping file and otm validation"
2 changes: 1 addition & 1 deletion docs/Troubleshooting.md
@@ -16,7 +16,7 @@ library as indicated in the [prerequisites section](Quickstart-Guide-for-Beginne
When trying to launch StartLeft documentation by `mkdocs serve` using IntelliJ, you may get an
error stating that the `glightbox` package is not installed.

This requires re-running the `pip install -e ".[doc]"`
This requires re-running the `pip install -r docs/requirements.txt`
command and restarting the IDE.
---
