Redirect luatest stderr to stdout with new option #410

Draft: wants to merge 2 commits into master

Conversation


@ochaplashkin ochaplashkin commented Sep 18, 2023

Description

By default, test-run captures the stderr stream and redirects it to the log file:

$ cat /tmp/t/001_foo-luatest/foo_test.log
[001] Some error log
[001] My warning log
...

Use the new option --show-capture (abbr. -c) to redirect stderr to stdout. Use it instead of the deprecated --verbose option.

The --verbose option is now ignored; when given, the following message is printed:

Argument ['--verbose'] is deprecated and is ignored.
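
For illustration only, a minimal sketch of how the two options could be wired together (this is not the PR's actual code; test-run's real option handling may be structured differently):

import argparse

parser = argparse.ArgumentParser(prog='test-run.py')
parser.add_argument('-c', '--show-capture', action='store_true',
                    help='Show test logs (the captured stderr) in stdout.')
# The old flag is still accepted so that existing invocations do not break,
# but its value is never used.
parser.add_argument('--verbose', action='store_true', help=argparse.SUPPRESS)

args = parser.parse_args()
if args.verbose:
    print("Argument ['--verbose'] is deprecated and is ignored.")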

What's new in usage

🟢 -c, --show-capture - a new option: show test logs in stdout.

$ ./test-run.py -c engine-luatest/log_test.lua
or
$ ./test-run.py --show-capture engine-luatest/log_test.lua
...
WORKR TEST                                            PARAMS          RESULT
---------------------------------------------------------------------------------
[001] engine-luatest/log_test.lua                                     
[001] Tarantool version is 3.0.0-alpha1-176-ga5d7f3429
[001] PRINT INTO test
[001] TAP version 13
[001] 1..1
[001] # Started on Mon Sep 18 18:27:13 2023
[001] # Starting group: engine-luatest.log
[001] PRINT INTO TEST INTERNAL
[001]  log
[001] [WARNING] log
[001] [INFO] log
[001] ok     1	engine-luatest.log.test_log
[001] # Ran 1 tests in 0.006 seconds, 1 succeeded, 0 failed
[001] [ pass ]

🔴 --verbose has been deprecated.

$ ./test-run.py --verbose engine-luatest/log_test.lua
Argument ['--verbose'] is deprecated and is ignored.
Started ./test-run.py --verbose engine-luatest/log_test.lua
...

Close tarantool/luatest#308

@ochaplashkin force-pushed the add-show-capture-option branch 2 times, most recently from 16c2ad2 to fe1d6dc on September 18, 2023 13:59
@coveralls commented Sep 18, 2023

Coverage Status

coverage: 62.62% (+0.1%) from 62.479% when pulling dfc7559 on ochaplashkin:add-show-capture-option into 8ebb3aa on tarantool:master.

@ochaplashkin ochaplashkin marked this pull request as draft September 18, 2023 14:07
@Totktonada (Member) commented:

What is the point of renaming the --verbose option?

I would also highlight that mixing the stdout and stderr streams reopens #392.

tarantool/luatest#308 is too generic, in my opinion. What is the test-run problem you want to solve with this change? Showing stderr content from a luatest test when --verbose is given? If so, let's just show the log file content instead of mixing it into stdout, which is fed to a TAP13 parser.
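
For illustration, the "show the log file content" alternative could be as simple as the following hypothetical helper (not code from this PR; the path is the example path from the description):

def show_test_log(log_path):
    # Dump the worker's log file to stdout after the test finishes,
    # instead of interleaving stderr with the TAP13 stream.
    try:
        with open(log_path) as f:
            for line in f:
                print('[log] ' + line, end='')
    except OSError as exc:
        print('Cannot read {}: {}'.format(log_path, exc))

show_test_log('/tmp/t/001_foo-luatest/foo_test.log')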

@ochaplashkin force-pushed the add-show-capture-option branch 3 times, most recently from ca0b2dd to de998a4 on September 25, 2023 10:40
@ochaplashkin (Author) commented:

What's new in the second commit?

We have two tests with logs:

Passed test
...
print('PRINT INTO test')
g.test_log = function()
    print('PRINT INTO TEST INTERNAL')
    log.error(' log')
    log.warn('[WARNING] log')
    log.info('[INFO] log')
    log.verbose('[VERBOSE] log')
    log.debug('[DEBUG] log')
    g.server:exec(function()
        require('log').error('[ERROR] server log')
        require('log').warn('[WARNING] server log')
        require('log').info('[INFO] server log')
        require('log').verbose('[VERBOSE] server log')
        require('log').debug('[DEBUG] server log')
    end)
end
Failed test
...
print('PRINT INTO test')
g.test_log = function()
    print('PRINT INTO TEST INTERNAL')
    log.error(' log')
    log.warn('[WARNING] log')
    log.info('[INFO] log')
    log.verbose('[VERBOSE] log')
    log.debug('[DEBUG] log')
    g.server:exec(function()
        require('log').error('[ERROR] server log')
        require('log').warn('[WARNING] server log')
        require('log').info('[INFO] server log')
        require('log').verbose('[VERBOSE] server log')
        require('log').debug('[DEBUG] server log')
    end)
    t.assert(false) -- Boom!
end

So I tried to add the same logic that exists in luatest: duplicating stderr using dup2_io.
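
Roughly, the context manager could look like the following (a sketch only; the names are illustrative and the actual implementation in the commit may differ):

import contextlib
import os
import sys

@contextlib.contextmanager
def redirect_stderr_to_stdout():
    # Point file descriptor 2 (stderr) at the same place as stdout, so the
    # stderr written by child processes (e.g. a tarantool instance started
    # by luatest) ends up in the stream test-run already shows.
    sys.stderr.flush()
    saved_stderr_fd = os.dup(2)            # remember the original stderr
    try:
        os.dup2(sys.stdout.fileno(), 2)    # fd 2 now writes where stdout goes
        yield
    finally:
        sys.stderr.flush()
        os.dup2(saved_stderr_fd, 2)        # restore the original stderr
        os.close(saved_stderr_fd)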

What will we see when the test passes?

  1. Run without options
$ ./test-run.py passed_test.lua
[001] engine-luatest/passed_test.lua                                     [ pass ]
  2. Run with -c option
$ ./test-run.py -c passed_test.lua
log
[WARNING] log
[INFO] log
[001] engine-luatest/passed_test.lua                                     [ pass ]
  3. Run with -c -v options
$ ./test-run.py -c -v passed_test.lua
 log
[WARNING] log
[INFO] log
[001] engine-luatest/passed_test.lua                                     
[001] Tarantool version is 3.0.0-alpha1-176-ga5d7f3429
[001] PRINT INTO test
[001] TAP version 13
[001] 1..1
[001] # Started on Mon Sep 25 14:52:56 2023
[001] # Starting group: engine-luatest.log
[001] PRINT INTO TEST INTERNAL
[001] ok     1	engine-luatest.passed.test_log
[001] # Ran 1 tests in 0.001 seconds, 1 succeeded, 0 failed
[001] [ pass ]

What will we see when the test fails?

  1. Run without options
$ ./test-run.py failed_test.lua
[001] engine-luatest/failed_test.lua                                    [ fail ]
[001] Test failed! Output from reject file /tmp/t/rejects/engine-luatest/log2.reject:
[001] Tarantool version is 3.0.0-alpha1-176-ga5d7f3429
[001] TAP version 13
[001] 1..1
[001] # Started on Mon Sep 25 14:56:55 2023
[001] # Starting group: engine-luatest.log2
[001] not ok 1	engine-luatest.failed.test_log
[001] #   ...workspace/vk/tarantool/test/engine-luatest/failed_test.lua:26: expected: a value evaluating to true, actual: false
[001] #   stack traceback:
[001] #   	...workspace/vk/tarantool/test/engine-luatest/failed_test.lua:26: in function 'engine-luatest.log2.test_log'
[001] #   	...
[001] #   	[C]: in function 'xpcall'
[001] #   artifacts:
[001] #   server -> /tmp/t/001_engine-luatest/artifacts/server-lMMUEtMlRl5d
[001] # Ran 1 tests in 0.007 seconds, 0 succeeded, 1 failed
[001] 
[001] [test-run server "luatest_server"] The log file /tmp/t/001_engine-luatest/failed_test.log has zero size
[Main process] Got failed test; gently terminate all workers...
[001] Worker "001_engine-luatest" got failed test; stopping the server...
  2. Run with -c option
$ ./test-run.py -c failed_test.lua
 log
[WARNING] log
[INFO] log
[001] engine-luatest/failed_test.lua                                    [ fail ]
[001] Test failed! Output from reject file /tmp/t/rejects/engine-luatest/failed.reject:
[001] Tarantool version is 3.0.0-alpha1-176-ga5d7f3429
[001] PRINT INTO test
[001] TAP version 13
[001] 1..1
[001] # Started on Mon Sep 25 14:58:09 2023
[001] # Starting group: engine-luatest.failed
[001] PRINT INTO TEST INTERNAL
[001] not ok 1	engine-luatest.failed.test_log
  3. Run with -c -v options

The result is the same as when running with just the -c option.

We synchronize the options of the two tools: test-run and luatest.
When luatest runs a separate tarantool instance, it duplicates the streams,
so we can see logs (warning, error, etc.) and prints. We have added
a context manager to get the same behavior as luatest.
Close tarantool/luatest#308
@ochaplashkin force-pushed the add-show-capture-option branch from de998a4 to dfc7559 on September 25, 2023 11:04
@ochaplashkin ochaplashkin marked this pull request as ready for review September 26, 2023 15:57
@ylobankov ylobankov marked this pull request as draft October 10, 2023 09:17