
Add additional test checks
Add the definition of additional checks to the spec files and
extend the jsonschema with them.

Signed-off-by: Miroslav Vadkerti <[email protected]>
thrix committed Oct 4, 2022
1 parent 2f4eae4 commit 9485aad
Showing 4 changed files with 91 additions and 22 deletions.
2 changes: 2 additions & 0 deletions spec/plans/discover.fmf
@@ -30,6 +30,8 @@ description: |
result: respect
tag: [tag]
tier: 1
check:
- avc

/test/two:
summary: Short test summary.
31 changes: 29 additions & 2 deletions spec/plans/execute.fmf
@@ -25,6 +25,10 @@ description: |
/test/one:
    result: OUTCOME
    log: PATH
    check:
        CHECK_NAME:
            result: TEST_RESULT
            log: PATH

/test/two:
    result: OUTCOME
@@ -45,8 +49,9 @@ description: |
warn
    A problem appeared during test execution which does
    not affect test results but might be worth checking
-   and fixing. For example test cleanup phase failed.
-   Automation must treat this as a failed test.
+   and fixing. For example, the test cleanup phase failed,
+   or a check failed for an otherwise passing test.
+   Automation must treat this as a failed test.
error
    Undefined problem encountered during test execution.
    Human inspection is needed to investigate whether it
@@ -64,6 +69,13 @@ description: |
The ``DURATION`` is an optional section stating how long the
test ran. Its value is in the ``hh:mm:ss`` format.
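For illustration, a single test entry with a duration could
look like this (a minimal sketch following the wording above;
the exact layout of the results file may differ)::

    /test/one:
        result: OUTCOME
        log: PATH
        duration: 00:03:27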

The ``CHECK_NAME`` specifies a check which was executed
during testing. Multiple checks can be listed here.

Checks can be overridden by the ``check`` property of the
``execute`` step. When set there, it overrides all checks
from the test attribute, as shown in the sketch below.
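A minimal sketch of the two places where checks can be defined
(the test name is hypothetical; the list form of the ``check``
attribute follows the examples elsewhere in this commit). When
both are present, the ``check`` property of the ``execute``
step wins::

    # L1 metadata: checks enabled by the test itself
    /test/example:
        check:
          - avc

    # L2 metadata: the execute step, overriding any
    # test-level checks
    execute:
        check:
          - test-inspector
        script: ./runtest.sh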

/upgrade:
summary: Perform system upgrades during testing
story:
@@ -318,3 +330,18 @@ description: |
systemctl start httpd
echo foo > /var/www/html/index.html
curl http://localhost/ | grep foo


/avc:
    summary: A shell script with the AVC check enabled
    description:
        The ``avc`` check is enabled when running this test.
    example: |
        execute:
            check:
              - avc
            script: |
                dnf -y install httpd curl
                systemctl start httpd
                echo foo > /var/www/html/index.html
                curl http://localhost/ | grep foo
56 changes: 36 additions & 20 deletions spec/plans/report.fmf
@@ -93,43 +93,59 @@ description:
result: OVERALL_RESULT
plans:
    /plan/one:
-       result: PLAN_RESULT
+       result: PLAN_OUTCOME
        tests:
            /test/one:
-               result: TEST_RESULT
+               result: TEST_OUTCOME
                log: LOG_PATH

            /test/two:
-               result: TEST_RESULT
+               result: TEST_OUTCOME
                log:
                    - LOG_PATH
                    - LOG_PATH
                    - LOG_PATH
    /plan/two:
-       result: PLAN_RESULT
+       result: PLAN_OUTCOME
        /test/one:
-           result: TEST_RESULT
+           result: TEST_OUTCOME
            log: LOG_PATH

Where ``OVERALL_RESULT`` is the overall result of all plan
-   results. It is counted the same way as ``PLAN_RESULT``.
-
-   Where ``TEST_RESULT`` is the same as in the `execute`_ step
-   definition:
-
-   * info - test finished and produced only an information
-     message
-   * passed - test finished and passed
-   * failed - test finished and failed
-   * error - a problem encountered during test execution
-
-   Note the priority of test results is as written above,
+   results. It is counted the same way as ``PLAN_OUTCOME``.
+
+   Where ``TEST_OUTCOME`` is the same as ``OUTCOME`` in
+   the `execute`_ step definition:
+
+   pass
+       Test execution successfully finished and passed.
+   info
+       Test finished but only produced an informational
+       message. Represents a soft pass, used for skipped
+       tests and for tests with the :ref:`/spec/tests/result`
+       attribute set to *ignore*. Automation must treat
+       this as a passed test.
+   warn
+       A problem appeared during test execution which does
+       not affect test results but might be worth checking
+       and fixing. For example, the test cleanup phase failed,
+       or a check failed for an otherwise passing test.
+       Automation must treat this as a failed test.
+   error
+       Undefined problem encountered during test execution.
+       Human inspection is needed to investigate whether it
+       was a test bug, infrastructure error or a real test
+       failure. Automation must treat it as a failed test.
+   fail
+       Test execution successfully finished and failed.
+
+   Note the priority of test results is as written above,
with ``info`` having the lowest priority and ``error``
-   the highest. This is important for ``PLAN_RESULT``.
+   the highest. This is important for ``PLAN_OUTCOME``.

-   Where ``PLAN_RESULT`` is the overall result of all test
+   Where ``PLAN_OUTCOME`` is the overall result of all test
results for the plan run. It has the same values as
-   ``TEST_RESULT``. Plan result is counted according to the
+   ``TEST_OUTCOME``. Plan result is counted according to the
priority of the test outcome values. For example:

* if the test results are info, passed, passed - the
24 changes: 24 additions & 0 deletions spec/tests/check.fmf
@@ -0,0 +1,24 @@
summary: Specify additional checks which should be enabled by the test runner.

story: As a tester I want to enable additional checks when running my tests.

description: |
    In some cases we want to run additional checks while running a test.
    A nice example is a check for unexpected SELinux AVCs produced during
    the test. These can point to additional issues the user might run
    into and are very valuable for Red Hat and Fedora based distros.

    These checks can alter the test result and usually provide a separate
    log output. Note that the value set here can be overridden from the
    L2 metadata by the ``check`` property of the ``execute`` step.

Currently the following additional checks are recognized:

avc
    SELinux AVCs are inspected during the test run
test-inspector
    Run the test inspector

By default no additional checks are run.

example: |
    check:
      - avc
      - test-inspector
