Python doctest: skip a test conditionally

I know how to skip a doctest using # doctest: +SKIP, but I can't figure out how to skip a test sometimes, based on a runtime condition. For example:

>>> if os.path.isfile("foo"):
...    open("foo").readlines()
... else:
...    pass # doctest: +SKIP
['hello', 'world']

That's the sort of thing I want to do. I would also accept a solution which runs the test, but changes the expected result to an exception with traceback if the condition is not met (i.e. run the test unconditionally but modify the expected result).

Louis answered 10/3, 2017 at 1:56 Comment(6)
For a proper test, you should have control over whether foo does or does not exist, allowing you to know with 100% certainty what the output should be. – Rummage
@chepner: I have a diverse set of machines on which my large test suite runs. A few tests require large files which do not exist on some specific machines. I do not have control over whether foo exists simply because there's no way to deploy it to some specific machines. If you prefer, I could rephrase the requirement as "Ignore specific test cases on specific machines which are known to be unable to run them." – Louis
@JohnZwinck, did you make any progress on this? I would be very interested. – Diakinesis
@JonasAdler: No, it's basically impossible. You can use silly hacks like writing a wrapper function that does whatever you want and returns True if it passed (or if the test was skipped); see the sketch after these comments. – Louis
How do you run your test suite? If you are using unittest as detailed here: docs.python.org/2/library/doctest.html#unittest-api you may have more control over which tests are executed and when. – Zirconia
@PatrickMevzek: I do it from the command line, e.g. python -m doctest myfile.py, not using the API. – Louis
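
(For reference, a minimal sketch of the wrapper hack mentioned in the comments above; the helper name maybe_readlines, the file path and the expected contents are illustrative assumptions, not from the original post:)

import os

def maybe_readlines(path):
    """Return True if the check passes, or if it cannot run on this machine.

    >>> maybe_readlines("foo")
    True
    """
    if not os.path.isfile(path):
        return True  # file not deployed on this machine: treat the test as skipped
    return open(path).readlines() == ['hello\n', 'world\n']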

You can return a special value when you don't want the output to be tested. Let's call this special value _skip:

  • if the appropriate flag is set and the value you got is _skip, the test is a success no matter what was expected;
  • otherwise (i.e. no flag, or a normal return value), perform the test as usual.

You need a custom OutputChecker:

import doctest

# Sentinel value an example can return to mean "don't check my output".
_skip = object()
COND_SKIP = doctest.register_optionflag('COND_SKIP')

class CondSkipChecker(doctest.OutputChecker):
    def check_output(self, want, got, optionflags):
        # If the example is flagged +COND_SKIP and it evaluated to the _skip
        # sentinel, accept the output regardless of what was expected.
        if optionflags & COND_SKIP and got.strip() == str(_skip):
            return True
        else:
            return super(CondSkipChecker, self).check_output(want, got, optionflags)

Here's a proof of concept (the doctest API is a bit cumbersome here: one would like to be able to pass a checker argument to testmod):

"""
>>> 1 if True else _skip
2
>>> 1 if False else _skip
2
>>> 1 if True else _skip # doctest: +COND_SKIP
2
>>> 1 if False else _skip # doctest: +COND_SKIP
2
"""

import doctest, sys
_skip = object()
COND_SKIP = doctest.register_optionflag('COND_SKIP')

class CondSkipChecker(doctest.OutputChecker):
    def check_output(self, want, got, optionflags):
        if optionflags & COND_SKIP and got.strip() == str(_skip):
            return True
        else:
            return super(CondSkipChecker, self).check_output(want, got, optionflags)

# Find and run the doctests in this module with the custom checker
# (testmod() does not accept a checker argument, hence the manual setup).
finder = doctest.DocTestFinder()
runner = doctest.DocTestRunner(CondSkipChecker())
m = sys.modules.get('__main__')
for test in finder.find(m, m.__name__):
    runner.run(test)
print(runner.summarize())

Output:

**********************************************************************
File "temp.py", line 2, in __main__
Failed example:
    1 if True else _skip
Expected:
    2
Got:
    1
**********************************************************************
File "temp.py", line 4, in __main__
Failed example:
    1 if False else _skip
Expected:
    2
Got:
    <object object at 0x0033B8A8>
**********************************************************************
File "temp.py", line 6, in __main__
Failed example:
    1 if True else _skip # doctest: +COND_SKIP
Expected:
    2
Got:
    1
**********************************************************************
1 items had failures:
   3 of   4 in __main__
***Test Failed*** 3 failures.
TestResults(failed=3, attempted=4)

The first two tests, which lack the +COND_SKIP annotation, fail as expected. Note: you can easily add a warning if _skip is used without the COND_SKIP flag (see the sketch below).

The third test fails (got 1 vs expected 2), but the fourth is a success: with +COND_SKIP set, a got value of _skip matches any expected output.
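
For the warning mentioned above, a minimal sketch that reuses the _skip and COND_SKIP names defined earlier (the warning message is my own wording, not from the original answer):

import warnings

class CondSkipChecker(doctest.OutputChecker):
    def check_output(self, want, got, optionflags):
        if got.strip() == str(_skip):
            if optionflags & COND_SKIP:
                return True   # conditionally skipped: accept any expected output
            # _skip leaked out of an example that was not marked +COND_SKIP
            warnings.warn("_skip returned without the +COND_SKIP flag")
        return super(CondSkipChecker, self).check_output(want, got, optionflags)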

Median answered 2/4, 2019 at 17:57 Comment(1)
So it works in IDE testing, I needed to put this into conftest.py. – Zilvia

You could template the docstring so that the expected output is filled in only when the feature flag is set:

def myFunction():
  '''
  >>> if TEST_AVAILABLE:
  ...   do_one_thing
  ... else:
  ...   do_other_thing
  %s
  '''
  ...
myFunction.__doc__ %= ('''expected result''' if TEST_AVAILABLE else '')

Drawback: won't work under python -OO, which strips docstrings.
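
As a concrete sketch of this approach applied to the file check from the question (TEST_AVAILABLE, read_foo and the expected file contents are assumptions for illustration, not from the original answer):

import os

TEST_AVAILABLE = os.path.isfile("foo")  # runtime condition, evaluated at import time

def read_foo():
    '''
    >>> if TEST_AVAILABLE:
    ...     print(read_foo())
    ... else:
    ...     pass   # prints nothing, matching the empty expectation below
    %s
    '''
    return open("foo").readlines()

# Substitute the expected output only when the file is actually present;
# otherwise the example expects no output at all.
read_foo.__doc__ %= ("['hello\\n', 'world\\n']" if TEST_AVAILABLE else '')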

Lag answered 24/9, 2021 at 12:8 Comment(0)
