Start async function without importing the asyncio package
Is it possible to start a function like this

async def foo():
    while True:
        print("Hello!")

without importing the asyncio package (and getting the event loop)?

I am looking for something similar to Go's goroutines, where one can launch a coroutine with only a go statement.

Edit: The reason I'm not importing the asyncio package is simply that I think it should be possible to launch a coroutine without an explicit event loop. I don't understand why async def and similar statements are part of the core language (even part of the syntax) while the only way to launch the coroutines they create is through a package.

Stunk answered 23/2, 2016 at 19:0 Comment(3)
Regarding true coroutines per Python 3.5, I doubt it, because something has to crank the wheel, so to speak. Though you could try to implement a coroutine the 'old-fashioned' way by creating a Python generator and feeding it with send() callsEcthyma
Why don't you want to import asyncio?Meshed
@dim because (presumably) OP wants to understand how async await works in Python. I have the same question. In other languages it is possible to dispatch asynchronous code without importing an event loop runtime. NodeJS has an event loop baked into the interpreter, so everything is async by default. Low level languages like C++ allow you to create a std::future from a std::promise, or by running a function asynchronously, either via a thread or via a coroutine.Kavita
Of course it is possible to start an async function without explicitly using asyncio. After all, asyncio is written in Python, so everything it does, you can do too (though you might sometimes need other modules like selectors or threading if you intend to wait for external events concurrently, or execute other code in parallel).

In this case, since your function has no await points inside, it just needs a single push to get going. You push a coroutine by sending None into it.

>>> foo().send(None)
Hello!
Hello!
...

Of course, if your function (coroutine) had yield expressions inside, it would suspend execution at each yield point, and you would need to push additional values into it (by coro.send(value) or next(gen)) - but you already know that if you know how generators work.

import types

@types.coroutine
def bar():
    to_print = yield 'What should I print?'
    print('Result is', to_print)
    to_return = yield 'And what should I return?'
    return to_return

>>> b = bar()
>>> next(b)
'What should I print?'
>>> b.send('Whatever you want')
Result is Whatever you want
'And what should I return?'
>>> b.send(85)
Traceback...
StopIteration: 85

Now, if your function had await expressions inside, it would suspend at evaluating each of them.

async def baz():
    first_bar, second_bar = bar(), bar()
    print('Sum of two bars is', await first_bar + await second_bar)
    return 'nothing important'

>>> t = baz()
>>> t.send(None)
'What should I print?'
>>> t.send('something')
Result is something
'And what should I return?'
>>> t.send(35)
'What should I print?'
>>> t.send('something else')
Result is something else
'And what should I return?'
>>> t.send(21)
Sum of two bars is 56
Traceback...
StopIteration: nothing important

Now, all these .sends are starting to get tedious. It would be nice to have them semiautomatically generated.

import random, string

def run_until_complete(t):
    prompt = t.send(None)
    try:
        while True:
            if prompt == 'What should I print?':
                prompt = t.send(random.choice(string.ascii_uppercase))
            elif prompt == 'And what should I return?':
                prompt = t.send(random.randint(10, 50))
            else:
                raise ValueError(prompt)
    except StopIteration as exc:
        print(t.__name__, 'returned', exc.value)
        t.close()

>>> run_until_complete(baz())
Result is B
Result is M
Sum of two bars is 56
baz returned nothing important

Congratulations, you just wrote your first event loop! (Didn't expect it to happen, did you?;) Of course, it is horribly primitive: it only knows how to handle two types of prompts, it doesn't enable t to spawn additional coroutines that run concurrently with it, and it fakes events by a random generator.

(In fact, if you want to get philosophical: what we did manually above could also be called an event loop: the Python REPL was printing prompts to a console window, and it relied on you to provide events by typing t.send(whatever) into it.:)

asyncio is just an immensely generalized variant of the above: prompts are replaced by Futures, multiple coroutines are kept in queues so that each of them eventually gets its turn, and the events are much richer, including network/socket communication, filesystem reads/writes, signal handling, thread/process side-execution, and so on. But the basic idea is still the same: you grab some coroutines and juggle them in the air, routing the Futures from one to another, until they all raise StopIteration. When all coroutines have nothing to do, you go to the external world and grab some additional events for them to chew on, and continue.
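To make that "juggling" concrete, here is a minimal round-robin scheduler in the spirit of run_until_complete above. It is a hypothetical sketch, not asyncio's actual machinery: the names pause, task, and round_robin are made up for this example.

```python
# Hypothetical minimal round-robin scheduler: keep coroutines in a queue,
# push each one forward with send(None), and collect the return values
# they deliver via StopIteration.
import types
from collections import deque

@types.coroutine
def pause():
    # A bare suspension point: yield once, handing control to the scheduler.
    yield

async def task(name, steps):
    for i in range(steps):
        print(name, 'step', i)
        await pause()  # give the other coroutines a turn
    return name + ' done'

def round_robin(*coros):
    queue = deque(coros)
    results = []
    while queue:
        coro = queue.popleft()
        try:
            coro.send(None)     # run until the next suspension point
            queue.append(coro)  # not finished yet: back of the queue
        except StopIteration as exc:
            results.append(exc.value)
    return results

print(round_robin(task('A', 2), task('B', 2)))
# interleaves A and B, then prints ['A done', 'B done']
```

The only real difference from run_until_complete is that several coroutines share the queue, so they appear to run concurrently even though everything is a single thread of send() calls.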

I hope it's all much clearer now. :-)

Rajkot answered 28/7, 2016 at 6:7 Comment(2)
Wonderful, I can't wait to test it!Stunk
If you have any more questions, just ask. I'll be glad to answer.Rajkot
Coroutines should be able to:

  1. run

  2. yield control to the caller (optionally producing some intermediate results)

  3. receive some information from the caller and resume

So here is a small demo of async functions (aka native coroutines) that does this without asyncio or any other module/framework providing an event loop. Python 3.5 or later is required. See the comments inside the code.

#!/usr/bin/env python

import types

# two simple async functions
async def outer_af(x):
    print("- start outer_af({})".format(x))
    val = await inner_af(x)  # Normal way to call native coroutine.
                             # Without `await` keyword it wouldn't
                             # actually start
    print("- inner_af result: {}".format(val))
    return "outer_af_result"


async def inner_af(x):
    print("-- start inner_af({})".format(x))
    val = await receiver()  # 'await' can be used not only with native
                            # coroutines, but also with `generator-based`
                            # coroutines!
    print("-- received val {}".format(val))
    return "inner_af_result"


# To yield execution control to the caller it's necessary to use a
# 'generator-based' coroutine: one created with the types.coroutine
# decorator
@types.coroutine
def receiver():
    print("--- start receiver")
    # suspend execution / yield control / communicate with caller
    r = yield "value request"
    print("--- receiver received {}".format(r))
    return r

def main():
    # We want to call 'outer_af' async function (aka native coroutine)
    # 'await' keyword can't be used here!
    # It can only be used inside another async function.
    print("*** test started")
    c = outer_af(42)  # just prepare coroutine object. It's not running yet.
    print("*** c is {}".format(c))

    # To start coroutine execution call 'send' method.
    w = c.send(None)  # The first call must have argument None

    # Execution of the coroutine is now suspended. The execution point is
    # on the 'yield' statement inside the 'receiver' coroutine.
    # It is waiting for another 'send' call to continue.
    # The yielded value can give us a hint about what exactly the coroutine
    # expects to receive from us.
    print("*** w = {}".format(w))

    # After the next 'send' the coroutine's execution will finish.
    # Even though the native coroutine object is not an iterator, it
    # signals its exit by raising StopIteration!
    try:
        w = c.send(25)
        # w here would not get any value. This is unreachable.
    except StopIteration as e:
        print("*** outer_af finished. It returned: {}".format(e.value))


if __name__ == '__main__':
    main()

Output looks like:

*** test started
*** c is <coroutine object outer_af at 0x7f4879188620>
- start outer_af(42)
-- start inner_af(42)
--- start receiver
*** w = value request
--- receiver received 25
-- received val 25
- inner_af result: inner_af_result
*** outer_af finished. It returned: outer_af_result

Additional comment. It seems it's not possible to yield control from inside a native coroutine: yield is not permitted as a suspension point inside async functions! So it is necessary to import types and use the coroutine decorator. It does some black magic! Frankly speaking, I do not understand why yield is prohibited there, making this mixture of native and generator-based coroutines necessary.
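As an aside, there is one more way to build the lowest-level suspension point without types.coroutine: any object whose __await__ method returns an iterator is awaitable. A sketch (the Receiver class and demo function are names made up for this example):

```python
# Sketch: a class-based awaitable. __await__ may itself be a generator
# function, so it can yield to suspend the awaiting coroutine and
# receive a value back via send().
class Receiver:
    def __await__(self):
        r = yield 'value request'  # suspends the awaiting coroutine
        return r

async def demo():
    val = await Receiver()
    return val * 2

d = demo()
print(d.send(None))  # prints: value request
try:
    d.send(21)
except StopIteration as e:
    print('returned', e.value)  # prints: returned 42
```

This is the same protocol the types.coroutine decorator plugs into, just spelled as a class instead of a decorated generator.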

Sequela answered 13/12, 2017 at 7:56 Comment(1)
It is certainly very strange that a native coroutine can't do what the decorator version can do. Why should the deprecated syntax be able to do more than the replacement syntax can? I also don't fully understand this yetKavita

Python coroutines are syntactic sugar for generators, with some added restrictions on their behavior (so that their purpose is explicitly different and the two don't mix). You can't do:

>>> next(foo())
TypeError: 'coroutine' object is not an iterator

because it's disabled explicitly. However you can do:

>>> foo().send(None)
Hello!
Hello!
Hello!
...

This is equivalent to calling next() on a generator.
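A quick check of that equivalence on an ordinary generator (a throwaway example):

```python
# For a plain generator, gen.send(None) and next(gen) do the same thing:
# both resume the generator and deliver the next yielded value.
def counter():
    yield 1
    yield 2

g = counter()
print(next(g))       # prints: 1
print(g.send(None))  # prints: 2 -- identical effect to next(g)
```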

Oligosaccharide answered 28/5, 2016 at 20:47 Comment(0)

The other answers here are pretty great, but I think it's helpful to understand some low-level systems programming to get a full appreciation for what coroutines in Python are doing.


Before digging into this in more detail it is helpful to consider how asynchronous dispatch works in a range of languages.

C / C++:

Broadly speaking it works like this:

  • All code is specified by functions
  • There is no asynchronous version of a function - functions just describe code to be executed; how that code is executed (multiple threads, process fork, single thread, synchronously, asynchronously) is not specified by the function itself
  • You can create a thing called a std::future, which represents the result of some computation that may complete in the future, either on the same thread or on a different one
  • There are additional flexibilities: not only can a std::future be created and resolved by dispatching some code with std::async, it can also be resolved through a std::promise, an object you can "move" between threads
  • This is a very explicit, and very flexible, way of programming. There are lots of options and possibilities
  • There are no event loops. If you want one of those, you either have to write it yourself or compile someone else's library

For more detail consult the concurrency support section of C++ language documentation at cppreference.

Node.js:

Broadly speaking, it's the other extreme:

  • The NodeJS runtime (interpreter) is an event loop
  • async and await manage events which are placed on a queue and then later resolved by the event loop
  • The runtime contains a few threads for different purposes. Some perform io, and one thread is responsible for running the user code
  • It's very structured, but rigid
  • It is pretty much the case that all code ends up being async by default: at some point you probably want to call an asynchronous function, which means you have to await it; await can't be used outside of an async function, so the "chicken-and-egg" problem is fixed by making everything at all levels of the call stack async. You can of course write non-async functions and call them like regular code from within an async function, but what ends up happening is that function main() necessarily ends up being async function main().
  • BUT this is ok, because unlike Python, NodeJS knows how to call an asynchronous function by default. The event loop is built in, you can just call main(), regardless of whether main is an async function or just a plain old function.

Python:

  • Initially, Python seems a bit weird
  • Just like with NodeJS, you can't simply call an async function and have it run to completion without awaiting it
  • You can't use await outside of an async def
  • Doesn't this mean, like with NodeJS, that if we have an asynchronous function call somewhere in the call stack, that inevitably the top level function ends up being async?

Well - yes it does. What makes Python seem a bit odd is it sits part way between the C++ way of doing things and the NodeJS way of doing things. We don't have an event loop baked in by default, but on the other hand the explicitness of the low level way of doing things in C++ land is hidden or abstracted away from us.

We are left with a language which has async and await as keywords by default (baked into the core language) but that we don't seem to be able to use very easily.

If you search for tutorials about Python async/await everything talks about event loops and asyncio.

So what's going on?


Let's discuss Python in a little more detail: Python is a high level language. Everything it has to offer typically wraps some lower level (systems-level) code. Let's consider some examples:

  • sockets and network programming
  • file io
  • threads and multiprocess (fork)
  • asynchronous code

There are probably other examples which could be added to this list.

The key point is this: In each case, these high level interfaces wrap some lower level interface, which is typically written in C and talks to the OS.

Without going into too much detail: reading and writing files use the same 'w' and 'r' mode flags as the C-level interface, and network sockets use the same AF_INET constant. I add these examples merely to indicate that there is a very tight similarity between Python and C when it comes to interfacing with the OS.

What Python provides is an interpreted environment layered on top of some lower level systems environment. It provides you with a garbage collector and a memory model based around references to objects, which removes the need for the programmer to think deeply about manually allocating and freeing memory. (As would be required in a systems level language like C or C++.)

A key point to note is the Python interpreter is a single threaded executor. There isn't that much difference between some interpreted Python code being executed and some lower level compiled language code like C or C++ being executed. They both run in a synchronous way as a single thread.

Given all of the above:

  • we should expect coroutines and async / await to function in pretty much the same way as in some other language like C++

But they don't appear to: at least not if you read the documentation for asyncio which talks in terms of the higher level concept of event loops. What are these mysterious event loops?


Other answers have already provided part of the answer. Unlike NodeJS where the event loop is baked into the runtime, in Python we have to import it.

  • Ok great, so this explains why async / await looks like the async / await in NodeJS, but an async def main() can't be called by default in Python, whereas it can be called by default in NodeJS.
  • To make things work in Python we need to import an event loop and run our async code on it, or do some manual work to get the async code running ourselves

Instinctively, we now know we should be able to do this.

  • We know Python provides a high level interface around the C code which our operating system is (most likely) written in and so we should be able to do the same things as can be done in C++ (which itself is built on top of C, in exactly the same way as Python is)
  • The OS manages execution of program code

So how do we write this C++ code in Python?

// https://en.cppreference.com/w/cpp/thread/future

#include <future>
#include <iostream>

int main() {

    // future from an async()
    // Note: The latter part is a lambda function: `[] { return 8; }`

    std::future<int> f2 = 
        std::async(
            std::launch::async,
            []{ return 8; }
        );

    f2.wait();
    std::cout << "Done: f2.get() = " << f2.get() << std::endl;
}

Here's how we do it.

#!/usr/bin/env python3


async def lambda_function(arg:int) -> int:
    
    print(f'in lambda substitute function: arg={arg}')

    return arg + 1


def main():

    coroutine = lambda_function(42)
    
    try:
        coroutine.send(None)
    except StopIteration as stop_exception:
        returned_value = stop_exception.value
        print(f'returned_value={returned_value}')
        # Note: also works, but less explicit:
        # print(stop_exception)


if __name__ == '__main__':
    main()

Output:

in lambda substitute function: arg=42
returned_value=43

You can see that all of the concepts we have in C++ are there in the Python code too. coroutine.send(None) does the same thing as f2.wait() for example.
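Incidentally, for a closer analogue of the std::async snippet itself, the standard library also has a future-based API that is independent of asyncio: concurrent.futures. A sketch:

```python
# concurrent.futures mirrors the C++ future API: submit() plays the role
# of std::async with a lambda, and result() combines f2.wait() + f2.get().
from concurrent.futures import ThreadPoolExecutor

with ThreadPoolExecutor() as pool:
    f2 = pool.submit(lambda: 8)  # dispatch the "lambda" to another thread
    print('Done: f2.result() =', f2.result())  # prints: Done: f2.result() = 8
```

Unlike the coroutine version, this really does run the code on another thread, which is closer to what std::launch::async requests.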

We can even dig into it a little further with the use of this helper function.

def utility_print_public_attributes(obj:object, name:str) -> None:

    print(f'attributes of {name}:')

    for d in dir(obj):
        if not d.startswith('__'):
            print(d)
            print(type(getattr(obj, d)))

    print()


async def baz(arg):

    print(f'bar -> arg={arg}')
    return 'return value from baz'


baz_coro = baz(42)
print(type(baz_coro))
utility_print_public_attributes(baz_coro, 'baz_coro')

Output:

<class 'coroutine'>
attributes of baz_coro:
close
<class 'builtin_function_or_method'>
cr_await
<class 'NoneType'>
cr_code
<class 'code'>
cr_frame
<class 'frame'>
cr_origin
<class 'NoneType'>
cr_running
<class 'bool'>
cr_suspended
<class 'bool'>
send
<class 'builtin_function_or_method'>
throw
<class 'builtin_function_or_method'>

So we can use baz_coro.close(), baz_coro.send() and baz_coro.throw() to control what the coroutine does.

Similarly, for the returned value:

print(f'launching coro')

try:
    baz_coro.send(None)
except StopIteration as stop_iteration:
    print(type(stop_iteration))

    utility_print_public_attributes(stop_iteration, 'stop_iteration')

    print(stop_iteration.value)
    print(stop_iteration)

Output:

launching coro
bar -> arg=42
<class 'StopIteration'>
attributes of stop_iteration:
add_note
<class 'builtin_function_or_method'>
args
<class 'tuple'>
value
<class 'str'>
with_traceback
<class 'builtin_function_or_method'>

We can access the attributes args and value to get the inputs and outputs to the coroutine.
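The send/StopIteration dance above can be packaged into a small helper for coroutines that never suspend (run_coro and add_one are names made up for this sketch):

```python
# Hypothetical helper: drive a coroutine that has no suspension points
# and hand back the value carried by StopIteration.
def run_coro(coro):
    try:
        coro.send(None)
    except StopIteration as exc:
        return exc.value
    raise RuntimeError('coroutine suspended; a real event loop is needed')

async def add_one(x):
    return x + 1

print(run_coro(add_one(41)))  # prints: 42
```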

Kavita answered 23/2, 2024 at 12:21 Comment(0)

No, that is not possible. You need an event loop. Take a look at what happens if you just call foo():

>>> f = foo()
>>> print(f)
<coroutine object foo at 0x7f6e13edac50>

So you get a coroutine object, and nothing gets executed right now! Only by passing it to an event loop does it get executed. You can use asyncio or another event loop like Curio.
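For completeness, since Python 3.7 the standard entry point is asyncio.run, which creates and tears down the loop for you. A sketch (the question's infinite while True is bounded here so the example terminates):

```python
import asyncio

async def foo():
    for _ in range(3):          # bounded variant of the question's while True
        print("Hello!")
        await asyncio.sleep(0)  # yield control back to the event loop

asyncio.run(foo())  # prints Hello! three times
```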

Dowd answered 24/2, 2016 at 13:57 Comment(0)

© 2022 - 2025 — McMap. All rights reserved.