
Results are not sent to the test run if there is at least one test without a qase.id #217

Open
Golom3402 opened this issue Jun 6, 2024 · 12 comments


@Golom3402

So, I have three tests: two with a qase.id and one without it.
During the test run, a run is created in the Qase system, but it turns out to be empty.
If I exclude the test without an ID, all results are sent successfully.
While sending a chunk, a 400 error appears:

Exception in thread Thread-2 (_send_results_threaded):
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/threading.py", line 1016, in _bootstrap_inner
self.run()
File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/threading.py", line 953, in run
self._target(*self._args, **self._kwargs)
File ".../venv/lib/python3.10/site-packages/qase/commons/reporters/testops.py", line 76, in _send_results_threaded
self.client.send_results(self.project_code, self.run_id, results)
File ".../venv/lib/python3.10/site-packages/qase/commons/client/api_v1_client.py", line 127, in send_results
api_results.create_result_bulk(
File ".../venv/lib/python3.10/site-packages/pydantic/validate_call_decorator.py", line 58, in wrapper_function
return validate_call_wrapper(*args, **kwargs)
File ".../venv/lib/python3.10/site-packages/pydantic/_internal/_validate_call.py", line 81, in call
res = self.pydantic_validator.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
File ".../venv/lib/python3.10/site-packages/qase/api_client_v1/api/results_api.py", line 445, in create_result_bulk
return self.api_client.response_deserialize(
File ".../venv/lib/python3.10/site-packages/qase/api_client_v1/api_client.py", line 321, in response_deserialize
raise ApiException.from_response(
File ".../venv/lib/python3.10/site-packages/qase/api_client_v1/exceptions.py", line 143, in from_response
raise BadRequestException(http_resp=http_resp, body=body, data=data)
qase.api_client_v1.exceptions.BadRequestException: (400)
Reason: Bad Request
HTTP response headers: HTTPHeaderDict({'Date': 'Thu, 06 Jun 2024 17:03:20 GMT', 'Content-Type': 'application/json', 'Transfer-Encoding': 'chunked', 'Connection': 'keep-alive', 'Cache-Control': 'no-cache, private', 'X-RateLimit-Limit': '1500', 'X-RateLimit-Remaining': '1496', 'Strict-Transport-Security': 'max-age=31536000; includeSubDomains'})
HTTP response body: {"status":false,"errorMessage":"Data is invalid.","errorFields":[{"field":"results.2.case_id","error":"The case id field is required."},{"field":"results.2","error":"Invalid step result definition."}]}

[Qase][19:03:21][info] Run 2228 was completed successfully
[Qase][19:03:21][info] Overhead for Qase Report: 1183ms

Installed packages and versions:
qase-pytest v6.0.1
qase-python-commons v3.0.3
qase-api-client 1.0.1
qase-api-v2-client 1.0.0

And of course, unfortunately, the "Auto create test cases" mechanism does not work either.
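The 400 body above pinpoints the failure: entry results.2 in the bulk payload has no case_id, and the API rejects the whole chunk rather than just that one entry. The sketch below models that rule in plain Python; validate_bulk is a hypothetical stand-in for the server-side check, not actual Qase code.

```python
# Sketch of a bulk-results chunk and the server-side rule it trips.
# validate_bulk is a hypothetical model of the API check, not Qase code.

def validate_bulk(results):
    """Return per-entry errors, mimicking the 400 'errorFields' shape."""
    errors = []
    for i, result in enumerate(results):
        if result.get("case_id") is None:
            errors.append({
                "field": f"results.{i}.case_id",
                "error": "The case id field is required.",
            })
    return errors

# Three results, mirroring the report: two tests with a qase.id, one without.
chunk = [
    {"case_id": 101, "status": "passed"},
    {"case_id": 102, "status": "passed"},
    {"case_id": None, "status": "passed"},  # test without a qase.id
]

errors = validate_bulk(chunk)
print(errors)
# A single invalid entry causes the API to reject the entire chunk,
# which is why the run ends up empty even though two results were valid.
```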

@NickVolynkin
Contributor

Thanks for reporting, we will check it

@Golom3402
Author

Sorry, I just noticed: maybe it is an incompatibility with the pydantic library.
We use pydantic v2.6.1.

@NickVolynkin
Contributor

@Golom3402 did it resolve with a different version of pydantic?

If not, do you have "Auto create test cases" enabled in your repository settings?

@Golom3402
Author

No, using different versions of pydantic does not resolve the problem.
And the "Auto create test cases" option in the test run settings is enabled.

@gibiw
Contributor

gibiw commented Jul 18, 2024

@Golom3402 Can you share your tests? And also share the command you are running the tests with?

@Golom3402
Author

I have one pytest.ini file with a [pytest] section containing an addopts line.
In the addopts line I define the following values:
"--qase-testops-api-token="my_api_token" --qase-testops-project="SKLV" --qase-mode=testops"
I run the tests themselves with the command "pytest -v tests/ --qase-testops-run-title="run_title" --qase-debug=true -n 8".
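Putting the configuration described above together, the setup corresponds roughly to this pytest.ini (the token and run title are the placeholder values from this report, not real credentials):

```ini
[pytest]
addopts = --qase-testops-api-token=my_api_token --qase-testops-project=SKLV --qase-mode=testops
```

The -n 8 flag in the run command runs the tests in parallel via pytest-xdist, so results are collected from several workers before being sent in chunks.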

@Golom3402
Author

The recipe is as follows:

  • create 10 dummy tests
  • add existing qase.id values to 9 of them
  • run them
    ...
    An empty report will be created in Qase TestOps, and the code execution log will show a 400 error when the Qase client tries to send a chunk with the results.

@Golom3402
Author

Golom3402 commented Jul 18, 2024

In addition to a missing qase.id, the problem also happens when there are even small differences in qase.steps. For example, the repository case specifies 4 steps while the autotest defines only 3 qase.step calls. In this case, sending the chunk with the result set also fails, and an empty report is created in Qase TestOps.
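The step mismatch can be modeled the same way. The check below is a hypothetical sketch of the behavior described above (a result is rejected when the reported step count does not line up with the steps defined on the repository case), not the actual Qase validation code.

```python
# Hypothetical model of the step-count check described in this comment:
# the reported step results must match the steps defined on the test case.

def steps_match(case_step_count, reported_steps):
    """Return True when the autotest reports exactly as many step
    results as the repository case defines."""
    return len(reported_steps) == case_step_count

# Repository case defines 4 steps; the autotest reports only 3 qase.step results.
case_steps_in_repo = 4
reported = ["step 1 passed", "step 2 passed", "step 3 passed"]

if not steps_match(case_steps_in_repo, reported):
    # Same wording as the second errorFields entry in the 400 body above.
    print("Invalid step result definition.")
```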

@Golom3402
Author

And another important detail: to reproduce this, the "Auto create test cases" option in the project settings of the Qase TestOps service should be disabled.

@gibiw
Contributor

gibiw commented Aug 1, 2024

@Golom3402 This is currently the correct behavior. If the "Auto create test cases" option is disabled and the test does not have a QaseID or the number of steps is different, then we cannot load this result.

We will work on this issue in the near future.

@Golom3402
Author

Wait, are you saying that if I have 100 autotests and one of them does not have a qase.id (for example, it has just been written), then the results for the remaining 99 cases may not be accepted? Is that normal, in your opinion?

@gibiw
Contributor

gibiw commented Aug 1, 2024

We think this is wrong behaviour, but the current implementation works this way, and we are working on fixing this issue.
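Until the reporters handle this, one possible client-side approach (a sketch of the idea only, not the actual fix shipped in qase-python-commons) is to partition each chunk and send only the entries that have a case_id, so a single unmapped test cannot block the other 99 results:

```python
# Sketch of a chunk-partitioning workaround: send valid entries,
# skip (and log) the ones the API would reject. Hypothetical code,
# not the implementation inside qase-python-commons.

def split_chunk(results):
    """Partition a results chunk into sendable entries (with a case_id)
    and skipped entries (without one)."""
    valid = [r for r in results if r.get("case_id") is not None]
    skipped = [r for r in results if r.get("case_id") is None]
    return valid, skipped

# 99 mapped autotests plus one freshly written test without a qase.id.
chunk = [{"case_id": i, "status": "passed"} for i in range(1, 100)]
chunk.append({"case_id": None, "status": "passed"})

valid, skipped = split_chunk(chunk)
print(len(valid), len(skipped))  # the 99 mapped results can still be uploaded
```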
