Python 3.6.8 (default, Oct 7 2019, 12:59:55)
Type 'copyright', 'credits' or 'license' for more information
IPython 7.9.0 -- An enhanced Interactive Python. Type '?' for help.
In [1]: def yield_from_generator():
   ...:     yield from (i for i in range(10000))
   ...:
In [2]: def yield_from_list():
   ...:     yield from [i for i in range(10000)]
   ...:
In [3]: import timeit
In [4]: timeit.timeit(lambda: list(yield_from_generator()), number=10000)
Out[4]: 5.3820097140014695
In [5]: timeit.timeit(lambda: list(yield_from_list()), number=10000)
Out[5]: 4.333915593000711
I ran the `yield from` generator and the `yield from` list versions many times. The list version always performs better, while my intuition suggests the opposite: building the list requires, for example, a memory allocation up front. Why do we see such a performance difference?
`next()` (and potentially handling `StopIteration`) can be expensive – Sprague
Is `yield from mylist` somehow special-cased? Does that not do `next`/`StopIteration`? I also tried `yield from iter([i for i in range(10000)])` (i.e., with explicit `iter(...)`) and that didn't make it any slower... – Scevour
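For plain iteration, `yield from x` behaves roughly like the hand-written loop sketched below (this follows the simple case of the PEP 380 semantics, with `send()`/`throw()` delegation left out, and is not the actual CPython implementation); it shows where the per-element `next()` calls come from:

def yield_from_equivalent(iterable):
    # Rough equivalent of `yield from iterable` for plain iteration
    # (PEP 380 also delegates send()/throw()/close(), omitted here).
    it = iter(iterable)
    while True:
        try:
            item = next(it)
        except StopIteration:
            break
        yield item

# When `iterable` is a list, each next(it) is a cheap C-level step of a
# list iterator; when it is a generator expression, each next(it) has to
# resume a second generator frame, which is the extra work being measured.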
`def yield_from_generator(): yield from range(10000)` and `def yield_from_list(): yield from list(range(10000))` :-) – Sheathbill
`def yield_from_list(lst=list(range(10000))): yield from lst` – Delanos
`yield_from_generator` is (slightly) faster – Nanette
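A sketch of how the variants suggested in the comments above could be timed side by side (the wrapper names and the driver loop are mine, not from the thread):

import timeit

def yield_from_range():
    # No genexpr and no per-call list allocation.
    yield from range(10000)

def yield_from_fresh_list():
    # Builds the list on every call, then iterates it.
    yield from list(range(10000))

def yield_from_prebuilt_list(lst=list(range(10000))):
    # The list is built once, at definition time.
    yield from lst

for fn in (yield_from_range, yield_from_fresh_list, yield_from_prebuilt_list):
    print(fn.__name__, timeit.timeit(lambda fn=fn: list(fn()), number=10000))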
`yield from` is irrelevant - a plain `return` gives essentially the same result. This question seems to be a duplicate of this one: #11964630. – Ivers
… `i for i in`, which saved a genexpr and sped things up). @Ivers I agree with that dup – Nanette
… the `in` operator on it, which is handled by the `__contains__` method and thus iterates "from the inside". – Scevour
If `yield from` doesn't play a role (which I am not sure about), yes, that's a dup. – Commutative
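The `__contains__` point raised here can be illustrated in a few lines (purely illustrative, not from the thread): a list implements `__contains__`, so `x in lst` is handled inside the list object, while a generator object has no `__contains__`, so `in` falls back to iterating it from the outside.

lst = [i for i in range(10000)]
gen = (i for i in range(10000))

print(hasattr(lst, "__contains__"))   # True: list.__contains__ does the scan itself
print(hasattr(gen, "__contains__"))   # False: `in` falls back to plain iteration

print(9999 in lst)   # handled by list.__contains__
print(9999 in gen)   # consumes the generator element by element via next()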
Besides `yield from` not existing in Python 2, list comps are slightly different in Python 3: they now create their own scope, but IIRC some improvements have been made so that having their own scope doesn't have a big impact on speed. So I'm hesitant to dupe-close this question to a Python 2 dupe target. – Epithalamium
… a generator has no `__contains__` method, and the `in` operator falls back to "from the outside" iteration. – Scevour
… `yield from`, which has distinct advantages over `yield` in a Python loop. – Epithalamium
… `__contains__` are separate from that, which is reflected in the two answers given. – Ivers
`yield from` adds extra overhead, but it affects list-comps and generators in exactly the same way - so I don't think it has any real relevance to the question. – Ivers
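A sketch of the "plain `return`" check mentioned in the comments: drop the generator functions entirely and hand each iterable straight to `list()`; if roughly the same gap remains, `yield from` itself is not the cause (exact numbers will vary by machine and Python version):

import timeit

def return_generator():
    # Returns a lazy generator expression; list() drives it via next().
    return (i for i in range(10000))

def return_list():
    # Builds the whole list up front; list() then copies a plain list.
    return [i for i in range(10000)]

print(timeit.timeit(lambda: list(return_generator()), number=10000))
print(timeit.timeit(lambda: list(return_list()), number=10000))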