How to join two generators (or other iterables) in Python?

15

273

I want to change the following code

for directory, dirs, files in os.walk(directory_1):
    do_something()

for directory, dirs, files in os.walk(directory_2):
    do_something()

to this code:

for directory, dirs, files in os.walk(directory_1) + os.walk(directory_2):
    do_something()

I get the error:

unsupported operand type(s) for +: 'generator' and 'generator'

How to join two generators in Python?

Fin answered 9/7, 2010 at 8:29 Comment(0)
361

itertools.chain() should do it. It takes a number of iterables and yields from each of them one by one, roughly equivalent to:

def chain(*iterables):
    for it in iterables:
        for element in it:
            yield element

Usage example:

from itertools import chain

g = (c for c in 'ABC')  # Dummy generator, just for example
c = chain(g, 'DEF')  # Chain the generator and a string
for item in c:
    print(item)

Output:

A
B
C
D
E
F
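
Applied to the loops in the question, that would look something like this (a sketch; directory_1, directory_2 and do_something come from the question):

import os
from itertools import chain

# One loop over both trees: chain exhausts the first walk, then the second
for directory, dirs, files in chain(os.walk(directory_1), os.walk(directory_2)):
    do_something()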
Steep answered 9/7, 2010 at 8:30 Comment(6)
One should keep in mind that itertools.chain() does not return a types.GeneratorType instance, just in case the exact type is crucial.Darindaring
See @andrew-pate's answer referencing itertools.chain.from_iterable() for returning a types.GeneratorType instance.Demolish
itertools.chain() would give all the elements in one directory and then shift to the other directory. Now, how do we pick the first elements of both directories and perform some operations, and then shift to the next pair and so on? Any idea would be appreciated.Jackscrew
@Jackscrew Iterate over those directories manually using the built-in function next.Cecilia
@Jackscrew you might like zip. It does precisely that, pick out the first, second etc. values and put them in tuples.Aleppo
@Randelung: Given the two trees likely don't have the same number of elements, zip is a bad idea, as it will truncate the longer set of outputs. itertools.zip_longest would get them paired, and consume everything (leaving a None or chosen default when the shorter iterable is exhausted). But usually, this sort of thing is best solved by the itertools recipe for a roundrobin function, which would alternate single entries from each source until one runs out, then produce the remaining values from the longer source.Anjaanjali
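
For reference, a minimal sketch of such a roundrobin function, in the spirit of the itertools recipe mentioned in the last comment (this simplified version is illustrative rather than the exact recipe from the docs):

def roundrobin(*iterables):
    # Yield one item from each iterable in turn, dropping iterables as
    # they are exhausted: roundrobin('ABC', 'D', 'EF') -> A D E B F C
    iterators = [iter(it) for it in iterables]
    while iterators:
        still_active = []
        for it in iterators:
            try:
                item = next(it)
            except StopIteration:
                continue
            still_active.append(it)
            yield item
        iterators = still_active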
114

A code example:

from itertools import chain

def generator1():
    for item in 'abcdef':
        yield item

def generator2():
    for item in '123456':
        yield item

generator3 = chain(generator1(), generator2())
for item in generator3:
    print(item)
Sunup answered 1/4, 2015 at 18:54 Comment(4)
Why not add this example to the already existing, highly upvoted itertools.chain() answer?Mace
Um. Because it would have cost him 850 rep. The guy has 851. You do you, cesio.Dietrich
@Jean-FrançoisCorbett the person who wrote the "already existing" answer could have done that really... okay? :)Faubourg
An example has been added to the top answer, making this redundant.Mundy
92

In Python 3.3 or greater (where yield from was introduced) you can do:

def concat(a, b):
    yield from a
    yield from b
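
Used on the question's generators it might look like this (a sketch; the names come from the question):

import os

for directory, dirs, files in concat(os.walk(directory_1), os.walk(directory_2)):
    do_something()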
Urina answered 1/12, 2017 at 11:13 Comment(8)
So much pythonic.Stites
More general: def chain(*iterables): for iterable in iterables: yield from iterable (Put the def and for on separate lines when you run it.)Mundy
Is everything from a yielded before anything from b is yielded or are they being alternated?Kuhns
@problemofficer Yup. Only a is checked until everything is yielded from it, even if b isn't an iterator. The TypeError for b not being an iterator will come up later.Bergerac
@Mundy just out of curiosity about efficiency: if you have an iterable of iterables and use * to unpack it so it can be used in this function, it is less efficient than iterating over a single argument that is an iterable, e.g. def f(): return tuple(x for x in chain(*(str(s).split() for s in range(10000)))). When you change your func to def chain(iterable): ... and pass the iterables as one arg instead of *args, the difference using %timeit f() is 5.89 ms ± 122µs vs 4.5 ms ± 67.3µs per loop (mean ± std. dev. of 7 runs, 100 loops each).Roundlet
@Roundlet Sorry, I'm not sure what you're talking about. I never said anything about unpacking, and you can't substitute *iterables with iterable in the function I wrote.Mundy
@Mundy instead of passing it like chain(a, b), you can pass it as chain((a, b)), so when using it with an iterable of things to chain you don't have to unpack it like chain(*(a, b)), am I wrong? It's just a note, not so important, and depends on where you want to use it.Roundlet
@Roundlet Oh OK, I see what you're saying. It looks like you made a typo, which confused me: def chain(iterable) should be def chain(iterables). (Also, x for x in is redundant.) Anyway, there's already a tool in the stdlib that does that: itertools.chain.from_iterable. And beyond performance, if you had an infinite iterable of iterables, it wouldn't be possible to use unpacking.Mundy
41

Simple example:

from itertools import chain
x = iter([1,2,3])      #Create Generator Object (listiterator)
y = iter([3,4,5])      #another one
result = chain(x, y)   #Chained x and y
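
Consuming the chained iterator shows the combined sequence:

print(list(result))    # [1, 2, 3, 3, 4, 5]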
Gilbart answered 23/7, 2017 at 9:51 Comment(5)
Why not add this example to the already existing, highly upvoted itertools.chain() answer?Mace
This isn't quite right, since itertools.chain returns an iterator, not a generator.Empiricism
Can't you just do chain([1, 2, 3], [3, 4, 5])?Abyssal
To be pedantic, a list_iterator isn't a generator, but it is an iterator, which is what OP's effectively actually asking about, since generators don't behave any differently from iterators in this context.Mundy
An example has been added to the top answer, making this redundant.Mundy
15

Here it is using a generator expression with nested fors:

range_a = range(3)
range_b = range(5)
result = ( item
           for one_range in (range_a, range_b)
           for item in one_range )
assert list(result) == [0, 1, 2, 0, 1, 2, 3, 4]

The for ... in ... clauses are evaluated left to right. The identifier after each for establishes a new variable. one_range is used in the following for ... in ... clause, while item from the second clause is used in the "final" expression, of which there is only one (at the very beginning).
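
Applied to the question's os.walk generators, the same pattern might look like this (a sketch; do_something comes from the question):

import os

walks = ( entry
          for walk in (os.walk(directory_1), os.walk(directory_2))
          for entry in walk )
for directory, dirs, files in walks:
    do_something()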

Related question: How do I make a flat list out of a list of lists?.

Portamento answered 30/4, 2018 at 15:29 Comment(0)
13

With itertools.chain.from_iterable you can do things like:

import itertools

def genny(start):
  for x in range(start, start+3):
    yield x

y = [1, 2]
ab = [o for o in itertools.chain.from_iterable(genny(x) for x in y)]
print(ab)
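
This prints [1, 2, 3, 2, 3, 4]: genny(1) yields 1, 2, 3 and genny(2) yields 2, 3, 4.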
Whacking answered 15/1, 2016 at 11:19 Comment(7)
You're using an unnecessary list comprehension. You're also using an unnecessary generator expression on genny when it already returns a generator. list(itertools.chain.from_iterable(genny(x))) is much more concise.Abyssal
The list comprehension was an easy way to create the two generators, as per the question. Maybe my answer is a little convoluted in that respect.Whacking
I guess the reason I added this answer to the existing ones was to help those who happen to have lots of generators to deal with.Whacking
It isn't an easy way, there are many easier ways. Using generator expressions on an existing generator will lower performance, and the list constructor is much more readable then the list comprehension. Your method is much more unreadable in those regards.Abyssal
Corman, I agree your list constructor is indeed more readable. It would be good to see your 'many easier ways' though ... I think wjandrea's comment above looks to do the same as itertools.chain.from_iterable; it would be good to race them and see which is fastest.Whacking
The two easier ways, as mentioned before, are using list and genny(x) over a list comprehension and a generator. The speed race would almost certainly favor the list comprehension because you're doing less computations.Abyssal
Corman is right, the list constructor would be more readable, although some say list comprehensions are more pythonic. The performance is the same. It is worth pointing out, though, that the bit (genny(x) for x in y) was simply there to create two generators (the first going 1, 2, 3, the second going 2, 3, 4). I think I was involved with methods that created an iterator of iterables at the time. The answer by wjandrea below is much clearer.Whacking
8

2020 update: Works in both Python 3 and Python 2

import itertools

iterA = range(10,15)
iterB = range(15,20)
iterC = range(20,25)

First option:

for i in itertools.chain(iterA, iterB, iterC):
    print(i)

# 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

Alternative option, introduced in Python 2.6:

for i in itertools.chain.from_iterable( [iterA, iterB, iterC] ):
    print(i)

# 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24

itertools.chain() is the basic tool.

itertools.chain.from_iterable() is handy if you have an iterable of iterables, for example a list of files per subdirectory like [ ["src/server.py", "src/readme.txt"], ["test/test.py"] ].
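
For example (files_by_dir is just an illustrative name):

import itertools

files_by_dir = [ ["src/server.py", "src/readme.txt"], ["test/test.py"] ]
for path in itertools.chain.from_iterable(files_by_dir):
    print(path)

# src/server.py src/readme.txt test/test.py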

Twopence answered 27/9, 2020 at 16:11 Comment(1)
Python 2 went EOL on January 1, 2020, so I'm surprised you mention itMundy
4

One can also use the unpacking operator *:

concat = (*gen1(), *gen2())

NOTE: Works most efficiently for 'non-lazy' iterables. It can also be used with different kinds of comprehensions. The preferred way to concatenate generators would be the one from @Uduse's answer.
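
A small sketch of what that looks like, including the eager evaluation the note warns about (gen1 and gen2 are stand-ins, as in the answer):

def gen1():
    yield from (1, 2)

def gen2():
    yield from (3, 4)

concat = (*gen1(), *gen2())   # both generators are fully consumed right here
print(concat)                 # (1, 2, 3, 4)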

Kwang answered 19/9, 2019 at 14:12 Comment(2)
It's sad that there is no lazy evaluation of *generator, because it would have made this a marvelous solution...Asis
–1 this will immediately consume both generators into a tuple!Madrigalist
2

If you want to keep the generators separate but still iterate over them at the same time you can use zip():

NOTE: Iteration stops at the shorter of the two generators

For example:

for (root1, dir1, files1), (root2, dir2, files2) in zip(os.walk(path1), os.walk(path2)):

    for file in files1:
        ...  # do something with the first list of files

    for file in files2:
        ...  # do something with the second list of files
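
If you need to keep going after the shorter walk is exhausted, itertools.zip_longest pads the missing side with None (a rough sketch, reusing path1 and path2 from above):

import itertools
import os

for walk1, walk2 in itertools.zip_longest(os.walk(path1), os.walk(path2)):
    if walk1 is not None:
        root1, dirs1, files1 = walk1   # handle the first tree's files here
    if walk2 is not None:
        root2, dirs2, files2 = walk2   # handle the second tree's files here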
Viosterol answered 10/8, 2017 at 17:2 Comment(0)
2

I would say that, as suggested in comments by user "wjandrea", the best solution is

def concat_generators(*gens):
    for gen in gens:
        yield from gen

It does not change the returned type and is really Pythonic.
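
A quick check of the point about the returned type, with itertools.chain shown for comparison:

from itertools import chain

print(type(concat_generators('ab', 'cd')))   # <class 'generator'>
print(type(chain('ab', 'cd')))               # <class 'itertools.chain'>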

Inessential answered 10/7, 2020 at 12:39 Comment(3)
Which is what itertools.chain.from_iterable() will do for you. See @andrew-pate's answer.Demolish
Don't reinvent the wheel, use itertools.chain. My comment wasn't meant to suggest "the best solution", it was just to improve a mediocre solution. Anyway, you also changed the names and made them confusing: concat_generators can work on any iterable, not just generators, so it should be renamed along with gen; and args is vague, so I'd use iterables instead (or gens, following your incorrect naming scheme).Mundy
Oops, actually, I take most of that back. If you're using generator-specific features, like .send(), .throw(), and .close(), then this is the better solution because it actually lets you use them, which itertools.chain doesn't. But in OP's case, they're not using any of those features, so it's simpler to use chain. (Also, I should have linked generator iterator instead of generator. The glossary is arguably wrong for this term.)Mundy
2

(Disclaimer: Python 3 only!)

Something with syntax similar to what you want is to use the splat operator to expand the two generators:

for directory, dirs, files in (*os.walk(directory_1), *os.walk(directory_2)):
    do_something()

Explanation:

This effectively performs a single-level flattening of the two generators into an N-tuple of 3-tuples (from os.walk) that looks like:

((directory1, dirs1, files1), (directory2, dirs2, files2), ...)

Your for-loop then iterates over this N-tuple.

Of course, by simply replacing the outer parentheses with brackets, you can get a list of 3-tuples instead of an N-tuple of 3-tuples:

for directory, dirs, files in [*os.walk(directory_1), *os.walk(directory_2)]:
    do_something()

This yields something like:

[(directory1, dirs1, files1), (directory2, dirs2, files2), ...]

Pro:

The upside to this approach is that you don't have to import anything and it's not a lot of code.

Con:

The downside is that you dump two generators into a collection and then iterate over that collection, effectively doing two passes and potentially using a lot of memory.

Chairmanship answered 7/4, 2021 at 5:33 Comment(2)
This is not flattening at all. Rather, it is a zip.Mcclean
A bit puzzled by your comment @jpaugh. This concatenates two iterables. It doesn't create pairs from them. Maybe the confusion is from the fact that os.walk already yields 3-tuples?Chairmanship
0

Let's say we have two generators (gen1 and gen2) and we want to perform some extra calculation that requires the outcome of both. We can return the outcome of such a function/calculation through the map function, which in turn returns an iterator that we can loop over.

In this scenario, the function/calculation needs to be implemented via a lambda function. The tricky part is what we aim to do inside map and its lambda function.

General form of proposed solution:

def function(gen1, gen2):
    for item in map(lambda x, y: do_something(x, y), gen1, gen2):
        yield item
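
For instance, with addition standing in as the calculation (a small sketch):

gen1 = (x for x in range(3))          # 0, 1, 2
gen2 = (x * 10 for x in range(3))     # 0, 10, 20

def function(gen1, gen2):
    for item in map(lambda x, y: x + y, gen1, gen2):
        yield item

print(list(function(gen1, gen2)))     # [0, 11, 22]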
Musty answered 5/11, 2017 at 21:57 Comment(0)
0

If you would like to get the lists of file paths from known 'before' and 'after' directories, you can do this:

import os

after_flist = []
before_flist = []

for r, d, f in os.walk(current_dir):
    for dir in d:
        if dir == 'after':
            after_dir = os.path.abspath(os.path.join(current_dir, dir))
            for r2, d2, f2 in os.walk(after_dir):
                after_flist.append([os.path.join(r2, file) for file in f2 if file.endswith('json')])
        elif dir == 'before':
            before_dir = os.path.abspath(os.path.join(current_dir, dir))
            for r2, d2, f2 in os.walk(before_dir):
                before_flist.append([os.path.join(r2, file) for file in f2 if file.endswith('json')])

I know there are better answers; this just felt like simple code to me.

Attributive answered 19/9, 2022 at 17:41 Comment(0)
-1

You can put any generator into a list, and while you can't add two generators with +, you can add lists. The con of this is that you actually create three lists in memory, but the pros are that it is very readable, requires no imports, and is a single-line idiom.

Solution for the OP:

for directory, dirs, files in list(os.walk(directory_1)) + list(os.walk(directory_2)):
    do_something()

a = range(20)
b = range(10,99,3)
for v in list(a) + list(b):
    print(v) 
Dietrich answered 28/8, 2022 at 23:31 Comment(0)
-2

If you just need to do it once and do not wish to import one more module, there is a simple solution...

just do:

for dir in directory_1, directory_2:
    for directory, dirs, files in os.walk(dir):
        do_something()

If you really want to "join" both generators, then do:

for directory, dirs, files in (
        x for osw in [os.walk(directory_1), os.walk(directory_2)] 
               for x in osw
        ):
    do_something()
Asis answered 4/2, 2019 at 22:29 Comment(3)
The second snippet of code gives an indentation error. It can be fixed with surrounding the list comprehension with parentheses: the opening parenthesis should be on the same line as in and the closing after the list comp ends. Regardless of this error, I think this is a bad example to follow. It reduces readability by mixing up indentation. The itertools.chain answers are massively more readable and easier to use.Spontaneous
You don't need to add parentheses. I just moved the opening bracket to the previous line to solve this. By the way, you may not like my example, but I still think it's a good idea to know how to do things by yourself, because it makes you able to write the library yourself instead of resorting to someone else's work when you need it.Asis
Sure, it is a good idea to learn how to do things by yourself. I never debated that. Sorry if I was unclear. The use of a list comprehension here reduces readability and is not really needed. List comprehensions are cool, but long list comprehensions become hard to read and fix. The code could be improved by creating the list before and then iterating over it. Sorry about my parentheses comment if it was incorrect.Spontaneous
