`pytest` output missing when test triggers a segfault
I'm seeing strange behavior with pytest's output when nesting it inside an entr call for reruns. My command is:

(rapids) rapids@compose:~/cuspatial/kvikio/python$ find . -name "*.py" | entr -s "pytest -s tests/test_nvcomp.py::test_lz4_newlib 2>&1 | head -n15 -"
============================= test session starts ==============================
platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/tcomer/mnt/NVIDIA/rapids-docker/cuspatial/kvikio/python
plugins: cases-3.6.13, benchmark-3.4.1, forked-1.4.0, xdist-2.5.0, hypothesis-6.47.1
collected 1 item

tests/test_nvcomp.py Fatal Python error: Segmentation fault

Current thread 0x00007ff36f52b740 (most recent call first):

The segfault is expected. What's not expected is that pytest's output is being partially removed by head. If I drop the head statement I get the full output:

(rapids) rapids@compose:~/cuspatial/kvikio/python$ find . -name "*.py" | entr -s "pytest -s tests/test_nvcomp.py::test_lz4_newlib"
============================ test session starts ============================
platform linux -- Python 3.8.13, pytest-7.1.2, pluggy-1.0.0
benchmark: 3.4.1 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/tcomer/mnt/NVIDIA/rapids-docker/cuspatial/kvikio/python
plugins: cases-3.6.13, benchmark-3.4.1, forked-1.4.0, xdist-2.5.0, hypothesis-6.47.1
collected 1 item                                                            

tests/test_nvcomp.py compressor = libnvcomp.LZ4Compressor(stream=s)
-- Create _lib._LZ4Compressor python object --
0
-- compress(data)
self.config
{'uncompressed_buffer_size': 1024, 'max_compressed_buffer_size': 65888, 'num_chunks': 1}
140088516281856
65888
[0 0 0 ... 0 0 0]
[140088516281856]
out_buffer_final
Fatal Python error: Segmentation fault
...

I'm stumped as to why some of my output disappears when I redirect stderr to stdout in the first command (the one piped through head). Any suggestions?

Saleme answered 9/8, 2022 at 15:28 Comment(2)
Pytest's output is probably buffered. The buffer isn't flushed when the segmentation fault happens (see the sketch below these comments). – Regardful
Awesome! It is actually Python buffering all the output, which is then lost in the segfault. I show how to disable it in the answer. – Saleme
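
To see the buffering effect in isolation, here is a minimal, self-contained sketch (the file name and contents are illustrative, not from the original question). It prints one line and then dies of a genuine SIGSEGV. Run as python buffer_demo.py | head and nothing is printed, because stdout is block-buffered when it is a pipe; run as python -u buffer_demo.py | head (or add flush=True to the print) and the line survives the crash.

# buffer_demo.py -- illustrative sketch, not part of the original question.
import ctypes

# When stdout is a pipe, this line sits in Python's block buffer instead of
# reaching the pipe immediately.
print("you only see this if stdout is unbuffered or flushed")

# Dereference a NULL pointer to force a real segmentation fault; the process
# dies before the buffer is ever flushed.
ctypes.string_at(0)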

The issue is that Python itself buffers all of the output, and that buffer is discarded in the segfault (see here). The solution is:

find . -name "*.py" | entr -s "python -u -m pytest -s --sw tests/test_nvcomp.py::test_lz4_newlib 2>&1 | head -n25 -"
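
If changing the command line isn't convenient, the buffering can also be relaxed from inside the test session. A hedged sketch of a conftest.py that does this (an assumption of mine, not from the original answer; it relies on the run keeping -s so pytest is not capturing output, and on Python 3.7+ for reconfigure()):

# conftest.py -- hypothetical alternative to python -u; assumes -s so
# sys.stdout is the real stream, and Python >= 3.7 for reconfigure().
import sys

# Pytest's capture objects don't have reconfigure(), so only touch a real
# TextIOWrapper.
if hasattr(sys.stdout, "reconfigure"):
    # Flush on every newline so prints reach the pipe before a segfault.
    sys.stdout.reconfigure(line_buffering=True)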

The entr command above lets me rerun the test over and over, any time a .py file changes, and see only the first 25 lines of output. I'm doing this because debugging Cython/C++ code from under pytest is a whole different can of worms, so I'm tracking flow control with print statements on a bug involving input buffers that aren't set up correctly and cause the segfault, as sketched below.
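
Since the crash happens inside native code, the last flushed print that makes it through the pipe marks roughly where the crash occurred. A self-contained sketch of that breadcrumb pattern (the helper and the test body are illustrative stand-ins, not the real kvikio test):

# trace_sketch.py -- hypothetical illustration of flushed breadcrumb prints
# used to narrow down where a native-extension segfault happens.
import ctypes
import functools

# flush=True so every breadcrumb reaches the pipe immediately, even without -u.
trace = functools.partial(print, flush=True)

def flaky_native_call():
    # Stand-in for the Cython/C++ call that segfaults in the real test.
    ctypes.string_at(0)

def test_lz4_newlib():
    trace("-- build compressor --")
    trace("-- compress(data) --")
    flaky_native_call()
    trace("-- done --")  # never reached; the last breadcrumb printed marks the crash site

if __name__ == "__main__":
    test_lz4_newlib()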

Saleme answered 9/8, 2022 at 15:58 Comment(0)
