Unit testing #676

Merged — 27 commits, May 29, 2023
Commits
All commits by gcglinton:

- `7345051` Start of Redis and Disk Queue unit testing (May 3, 2023)
- `11ed506` Fully functional RedisQueue test suite (May 4, 2023)
- `022ba0e` Fully functional DiskQueue test suite (May 5, 2023)
- `c0db77f` Refined the RedisQueue test framework (May 5, 2023)
- `921a64b` Added pip requirements for unit tests (May 5, 2023)
- `bb89cec` Tweaked unit test output (May 5, 2023)
- `eba5350` Move pytests to tests directory to keep it clean (May 5, 2023)
- `bbf1b57` Bump min version of pytest to 7 (May 5, 2023)
- `73fe205` Add gitignore to obsolete (May 5, 2023)
- `85ada17` Add testing README (May 5, 2023)
- `9be1d9e` Complete retry unit test (May 8, 2023)
- `6d462b4` fix add_option mock (May 8, 2023)
- `ba09129` Updated path separator (May 8, 2023)
- `f71f9ae` More work to retry unit test (May 8, 2023)
- `0451f2a` Preliminary work on NoDupe unit test (May 8, 2023)
- `80af69a` Retry unit test works again (May 8, 2023)
- `c2eab90` Reconfigure Unit testing to be run from repo root (May 15, 2023)
- `e0ca36b` Add more functionality/features to test suite (May 15, 2023)
- `7243913` Unit testing GH workflow (May 15, 2023)
- `d06f53e` Merge branch 'MetPX:main' into UnitTesting (May 16, 2023)
- `ce6ec8a` Add more tests to nodupe (May 19, 2023)
- `a6110e1` Change name of unit testing workflow (May 23, 2023)
- `e8d720c` Install sr3 for testing (May 23, 2023)
- `d103c41` Fix artifact upload names (May 23, 2023)
- `033b191` Add "step"-based testing sample (May 23, 2023)
- `71a92a3` Changed retry testing to compare disk to redis (May 23, 2023)
- `8d8f1d1` Change remaining retry tests to compare drivers (May 24, 2023)
78 changes: 78 additions & 0 deletions .github/workflows/unit-test.yml
@@ -0,0 +1,78 @@
name: Unit Testing

on:
  pull_request:
    types: [opened, edited, reopened]
  push:
    branches:
      - v03_wip

jobs:
  build:

    strategy:
      fail-fast: false
      matrix:
        osver: [ "ubuntu-20.04", "ubuntu-22.04" ]

    runs-on: ${{ matrix.osver }}

    name: Unit test on ${{ matrix.osver }}
    timeout-minutes: 30

    steps:
      - uses: actions/checkout@v3

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip

          pip install -r requirements.txt
          pip install -e .
          pip install -r tests/requirements.txt

      - name: Test with pytest
        run: |
          pytest tests --junitxml=tests/junit/test-results.xml \
            --cov-config=tests/.coveragerc --cov=sarracenia --cov-report=html --cov-report=lcov --cov-report=xml \
            --html=tests/report.html --self-contained-html

      - name: Upload pytest junit results
        uses: actions/upload-artifact@v3
        with:
          name: results-junit-${{ matrix.osver }}
          path: tests/junit/test-results.xml
        # Use always() to always run this step to publish test results when there are test failures
        if: ${{ always() }}

      - name: Upload pytest HTML report
        uses: actions/upload-artifact@v3
        with:
          name: results-report-${{ matrix.osver }}
          path: tests/report.html
        # Use always() to always run this step to publish test results when there are test failures
        if: ${{ always() }}

      - name: Upload code coverage report (HTML)
        uses: actions/upload-artifact@v3
        with:
          name: coverage-report-${{ matrix.osver }}
          path: tests/coverage/html_report
        # Use always() to always run this step to publish test results when there are test failures
        if: ${{ always() }}

      - name: Upload code coverage report (LCOV)
        uses: actions/upload-artifact@v3
        with:
          name: coverage-lcov-${{ matrix.osver }}
          path: tests/coverage/coverage.lcov
        # Use always() to always run this step to publish test results when there are test failures
        if: ${{ always() }}

      - name: Upload code coverage report (XML)
        uses: actions/upload-artifact@v3
        with:
          name: coverage-xml-${{ matrix.osver }}
          path: tests/coverage/coverage.xml
        # Use always() to always run this step to publish test results when there are test failures
        if: ${{ always() }}
1 change: 1 addition & 0 deletions obsolete/.gitignore
@@ -0,0 +1 @@
__pycache__
38 changes: 38 additions & 0 deletions tests/.coveragerc
@@ -0,0 +1,38 @@
# .coveragerc to control coverage.py
[run]
branch = True
data_file = tests/coverage/.coverage
relative_files = True

[report]
# Regexes for lines to exclude from consideration
exclude_also =
    # Don't complain about missing debug-only code:
    def __repr__
    if self\.debug

    # Don't complain if tests don't hit defensive assertion code:
    raise AssertionError
    raise NotImplementedError

    # Don't complain if non-runnable code isn't run:
    if 0:
    if __name__ == .__main__.:

    # Don't complain about abstract methods, they aren't run:
    @(abc\.)?abstractmethod

ignore_errors = True

[html]
directory = tests/coverage/html_report

[xml]
output = tests/coverage/coverage.xml

[json]
output = tests/coverage/coverage.json
pretty_print = True

[lcov]
output = tests/coverage/coverage.lcov
4 changes: 4 additions & 0 deletions tests/.gitignore
@@ -0,0 +1,4 @@
__pycache__
junit
coverage
report.html
60 changes: 60 additions & 0 deletions tests/README.md
@@ -0,0 +1,60 @@
# Unit Testing

## Details
Unit tests are currently set up to use [Pytest](https://docs.pytest.org/en/7.3.x/contents.html).

## Setup

Setting up an environment for testing is quite simple.

From a clean Linux image, run the following commands and you'll be all set.

1. Download the Sarracenia source
`git clone https://github.com/MetPX/sarracenia`
2. Checkout the branch you're working on (optional)
`git checkout <BranchName>`
3. Update PIP
`pip3 install -U pip`
4. Install base requirements for Sarracenia
`pip install -r requirements.txt`
5. Install sarracenia Python modules
`pip3 install -e .`
6. Install requirements for PyTest
`pip install -r tests/requirements.txt`

That's it, you're now ready to run the tests.

## Running

From within the repository root directory, simply run `pytest tests` and you'll see the results. There's a full set of arguments/options that modify the output, outlined [here](https://docs.pytest.org/en/7.3.x/reference/reference.html#ini-options-ref).

As configured, it will output a bit of system information and the total number of tests, followed by one line per file with dots/letters indicating the status of each test in that file.
Letter meanings:
(`f`)ailed, (`E`)rror, (`s`)kipped, (`x`)failed, (`X`)passed, (`p`)assed, (`P`)assed with output, (`a`)ll except passed (`p`/`P`), (`w`)arnings, (`A`)ll

Specifying the `-v` option makes the output more verbose, listing each test and its pass/skip/fail status.

Application logs captured during tests can be output with the `-o log_cli=true` argument.

### Reporting
A basic HTML report of the test results can be generated by adding `--html=tests/report.html --self-contained-html` to the command line.

A code coverage report can be generated by adding `--cov-config=tests/.coveragerc --cov=sarracenia --cov-report=xml --cov-report=html` to the command line.

A JUnit-compatible test report can be generated by adding `--junitxml=tests/junit/test-results.xml` to the command line.
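
The JUnit XML report is mainly consumed by downstream tooling (CI dashboards, result publishers). As a minimal sketch of reading one with only the Python standard library — the XML sample below is hypothetical, merely shaped like pytest's `--junitxml` output:

```python
# Parse a (hypothetical) pytest JUnit XML report with the standard library.
import xml.etree.ElementTree as ET

sample = """<?xml version="1.0" encoding="utf-8"?>
<testsuites>
  <testsuite name="pytest" tests="3" errors="0" failures="1" skipped="0">
    <testcase classname="tests.retry_test" name="test_put" time="0.01"/>
    <testcase classname="tests.retry_test" name="test_get" time="0.02">
      <failure message="assert 1 == 2">details</failure>
    </testcase>
    <testcase classname="tests.nodupe_test" name="test_hash" time="0.01"/>
  </testsuite>
</testsuites>"""

root = ET.fromstring(sample)
suite = root.find("testsuite")
# A test case failed if it carries a <failure> child element.
failed = [tc.get("name") for tc in suite if tc.find("failure") is not None]
print(f"{suite.get('tests')} tests, failed: {failed}")
```

Real reports add timestamps, hostnames, and captured output per case, but the element layout is broadly the same.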


## Docker
You can also run the exact same tests from within a Docker container if you want to avoid having to (re)-provision clean installs.

If the code is already present on the host system, you can use the `python` image, and map the code into the container for the tests:
`docker run --rm -it --name sr3-pytest -v $(pwd):/app -w /app python:3 bash`

Then follow the Setup section above, starting at step 3.

If the code isn't already on your system, the following will get you set up:
- Run container
`docker run --rm -it --name sr3-pytest ubuntu bash`
- Install dependencies (Ubuntu/Debian)
`apt-get update && apt-get install -y git python3 python3-pip`
- Then follow the Setup directions above
11 changes: 11 additions & 0 deletions tests/pytest.ini
@@ -0,0 +1,11 @@
[pytest]
minversion = 7.0
# With Code Coverage
#addopts = --cov-config=tests/.coveragerc --cov=sarracenia --cov-report=html --cov-report=lcov
# With JUnit test report
#addopts = --junitxml=tests/junit/test-results.xml
norecursedirs = obsolete docs debian docker tools
python_files = *_test.py
python_functions = test_*
log_cli = False
testpaths = tests
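
Under these settings, pytest only collects files matching `*_test.py` and functions prefixed `test_`. A minimal sketch of a conforming module follows; the file and function names are hypothetical, not taken from the Sarracenia suite:

```python
# checksum_test.py — hypothetical file name matching python_files = *_test.py
import hashlib


def sha512_hex(data: bytes) -> str:
    """Trivial helper under test (illustrative only)."""
    return hashlib.sha512(data).hexdigest()


def test_sha512_hex_length():
    # Collected by pytest: the name matches python_functions = test_*
    # SHA-512 is 512 bits, i.e. 128 hex characters.
    assert len(sha512_hex(b"hello")) == 128


def sanity_helper():
    # NOT collected as a test: the name lacks the test_ prefix.
    return sha512_hex(b"")


if __name__ == "__main__":
    test_sha512_hex_length()
    print("ok")
```

Because `testpaths = tests`, a bare `pytest` run would look for such files under `tests/` only, skipping the directories listed in `norecursedirs`.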
9 changes: 9 additions & 0 deletions tests/requirements.txt
@@ -0,0 +1,9 @@
pytest>=7.3
pytest-cov>=4.0
pytest-bug>=1.2
pytest-depends>=1.0
pytest-html>=3.2

python-redis-lock>=4
fakeredis>=2.11
fakeredis[lua]>=2.11
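
`fakeredis` lets the Redis-backed classes run against an in-memory stand-in, so the unit tests need no live Redis server. The same isolation idea can be sketched with only the standard library's `unittest.mock`; the `RetryQueue` class and key name below are hypothetical, not Sarracenia's actual API:

```python
# Exercise Redis-dependent code without a server, using a mock client.
from unittest.mock import MagicMock


class RetryQueue:
    """Minimal stand-in for a Redis-backed queue (illustrative only)."""

    def __init__(self, client, key):
        self.client = client
        self.key = key

    def put(self, item):
        # Append to the tail of the Redis list.
        self.client.rpush(self.key, item)

    def qty(self):
        # Report the list length.
        return self.client.llen(self.key)


fake = MagicMock()
fake.llen.return_value = 2  # pretend two items are queued

q = RetryQueue(fake, "sr3.retry")
q.put(b"msg-1")
q.put(b"msg-2")

assert q.qty() == 2
fake.rpush.assert_called_with("sr3.retry", b"msg-2")
print("queue calls verified")
```

A mock only verifies the calls made; `fakeredis` goes further by actually implementing Redis semantics in memory, which is why the suite depends on it.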