Python 3.6 added the ability to create Asynchronous Comprehensions and Asynchronous Generators. You can read about asynchronous comprehensions in PEP 530, while asynchronous generators are described in PEP 525. The documentation states that you can now create asynchronous list, set and dict comprehensions and generator expressions. Their example looks like this:
result = [i async for i in aiter() if i % 2]

Basically you just need to add Python's new async keyword into your expression and call a callable that has implemented __aiter__. Trying to follow this syntax will actually result in a SyntaxError though:
>>> result = [i async for i in range(100) if i % 2]
  File "<stdin>", line 1
    result = [i async for i in range(100) if i % 2]
                      ^
SyntaxError: invalid syntax

This is actually by design. If you look in PEP 530, you will see that it states the following: "Asynchronous comprehensions are only allowed inside an async def function." Of course, you cannot put Python's await in a comprehension either, as that keyword may only be used inside of an async def function's body. Just for fun, I tried defining an async def function to see if I could make my idea work:
import asyncio

async def test():
    return [i async for i in range(100) if i % 2]

loop = asyncio.get_event_loop()
loop.run_until_complete(test())

If you run this code, you will get a TypeError: 'async for' requires an object with __aiter__ method, got range. What you really want to do is call another async def function instead of calling range directly. Here's an example:
import asyncio

async def numbers(count):
    for i in range(count):
        yield i
        await asyncio.sleep(0.5)

async def main():
    odd_numbers = [i async for i in numbers(10) if i % 2]
    print(odd_numbers)

if __name__ == '__main__':
    event_loop = asyncio.get_event_loop()
    try:
        event_loop.run_until_complete(main())
    finally:
        event_loop.close()

Technically the numbers function is an asynchronous generator that is yielding values to our asynchronous list comprehension.
Wrapping Up
Creating an asynchronous list comprehension is quite a bit different from creating a regular list comprehension. As you can see, it takes a lot more code to make it work. Other than adding the ability to do more asynchronously out of the box, the new asynchronous generator is actually supposed to be about 2x faster than an equivalent implemented as an asynchronous iterator, according to PEP 525. While I currently don't have a use case for this new functionality, it is really neat to see what I can do now, and I'll be filing these new concepts away for some future application.
Related Reading
This proposal introduces the concept of asynchronous generators to Python.
This specification presumes knowledge of the implementation of generators and coroutines in Python (PEP 342, PEP 380 and PEP 492).
A Python generator is any function containing one or more yield expressions:
def func():            # a function
    return

def genfunc():         # a generator function
    yield
We propose to use the same approach to define asynchronous generators:
async def coro():      # a coroutine function
    await smth()

async def asyncgen():  # an asynchronous generator function
    await smth()
    yield 42
The result of calling an asynchronous generator function is an asynchronous generator object, which implements the asynchronous iteration protocol defined in PEP 492.
It is a SyntaxError to have a non-empty return statement in an asynchronous generator.
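This restriction can be checked directly at compile time. The sketch below (the function name agen is made up for illustration; the exact error message varies across CPython versions) shows that a return with a value is rejected, while a bare return is fine:

```python
# A 'return' with a value inside an async generator is a compile-time
# SyntaxError; a bare 'return' is still allowed.
bad_source = (
    "async def agen():\n"
    "    yield 1\n"
    "    return 42\n"
)

ok_source = (
    "async def agen():\n"
    "    yield 1\n"
    "    return\n"
)

try:
    compile(bad_source, "<example>", "exec")
    error = None
except SyntaxError as exc:
    error = exc          # holds the SyntaxError for the non-empty return

compile(ok_source, "<example>", "exec")  # compiles without error
```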
PEP 492 requires an event loop or a scheduler to run coroutines. Because asynchronous generators are meant to be used from coroutines, they also require an event loop to run and finalize them.
Asynchronous generators can have try..finally blocks, as well as async with. It is important to provide a guarantee that, even when partially iterated, and then garbage collected, generators can be safely finalized. For example:
async def square_series(con, to):
    async with con.transaction():
        cursor = con.cursor(
            'SELECT generate_series(0, $1) AS i', to)
        async for row in cursor:
            yield row['i'] ** 2

async for i in square_series(con, 1000):
    if i == 100:
        break
The above code defines an asynchronous generator that uses async with to iterate over a database cursor in a transaction. The generator is then iterated over with async for, which interrupts the iteration at some point.
The square_series() generator will then be garbage collected, and without a mechanism to asynchronously close the generator, the Python interpreter would not be able to do anything about it.
To solve this problem we propose to do the following:
- Implement an aclose() method on asynchronous generators returning a special awaitable. When awaited, it throws a GeneratorExit into the suspended generator and iterates over it until either a GeneratorExit or a StopAsyncIteration occurs. This is very similar to what the close() method does to regular Python generators, except that an event loop is required to execute aclose().
- Raise a RuntimeError when an asynchronous generator executes a yield expression in its finally block (using await is fine, though):

      async def gen():
          try:
              yield
          finally:
              await asyncio.sleep(1)   # Can use 'await'.
              yield                    # Cannot use 'yield',
                                       # this line will trigger a
                                       # RuntimeError.
- Add two new methods to the sys module: set_asyncgen_hooks() and get_asyncgen_hooks().
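To see the first proposal in action, here is a minimal sketch (the generator name ticker and the log list are made up for illustration) that partially iterates an asynchronous generator and then finalizes it explicitly with aclose(), which lets its finally block run:

```python
import asyncio

log = []

async def ticker():
    # A hypothetical infinite async generator with cleanup logic.
    try:
        while True:
            yield len(log)
            await asyncio.sleep(60)
    finally:
        # Runs when aclose() throws GeneratorExit into the generator.
        log.append("closed")

async def main():
    gen = ticker()
    log.append(await gen.__anext__())   # iterate one step; yields 0
    await gen.aclose()                  # throw GeneratorExit, run finally

# asyncio.run() (added in 3.7) is used here for brevity.
asyncio.run(main())
```

Because the generator is suspended at its yield inside the try block, aclose() is what gives the finally clause a chance to execute before the object goes away.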
The idea behind sys.set_asyncgen_hooks() is to allow event loops to intercept asynchronous generators iteration and finalization, so that the end user does not need to care about the finalization problem, and everything just works.
sys.set_asyncgen_hooks() accepts two arguments:
- firstiter: a callable which will be called when an asynchronous generator is iterated for the first time.
- finalizer: a callable which will be called when an asynchronous generator is about to be GCed.
When an asynchronous generator is iterated for the first time, it stores a reference to the current finalizer.
When an asynchronous generator is about to be garbage collected, it calls its cached finalizer. The assumption is that the finalizer will schedule an aclose() call with the loop that was active when the iteration started.
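Both hooks can be observed without an event loop by driving an asynchronous generator by hand. The sketch below assumes CPython's reference-counting GC (so dropping the last reference finalizes the object immediately); the events list and lambdas are made up for illustration:

```python
import gc
import sys

events = []

async def agen():
    yield 1

old_hooks = sys.get_asyncgen_hooks()
sys.set_asyncgen_hooks(
    firstiter=lambda g: events.append("firstiter"),
    finalizer=lambda g: events.append("finalizer"),
)
try:
    g = agen()
    # Drive one step by hand: since the generator awaits nothing, the
    # yielded value surfaces as StopIteration's argument.
    try:
        g.__anext__().send(None)
    except StopIteration as exc:
        events.append(exc.value)
    # Dropping the partially-iterated generator invokes the finalizer
    # cached when iteration started.
    del g
    gc.collect()
finally:
    sys.set_asyncgen_hooks(*old_hooks)
```

A real finalizer, like asyncio's, would schedule gen.aclose() on the loop that was active when iteration began, rather than just recording the event.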
For instance, here is how asyncio is modified to allow safe finalization of asynchronous generators:
# asyncio/base_events.py

class BaseEventLoop:

    def run_forever(self):
        ...
        old_hooks = sys.get_asyncgen_hooks()
        sys.set_asyncgen_hooks(finalizer=self._finalize_asyncgen)
        try:
            ...
        finally:
            sys.set_asyncgen_hooks(*old_hooks)
            ...

    def _finalize_asyncgen(self, gen):
        self.create_task(gen.aclose())
The second argument, firstiter, allows event loops to maintain a weak set of asynchronous generators instantiated under their control. This makes it possible to implement “shutdown” mechanisms to safely finalize all open generators and close the event loop.
sys.set_asyncgen_hooks() is thread-specific, so several event loops running in parallel threads can use it safely.
sys.get_asyncgen_hooks() returns a namedtuple-like structure with firstiter and finalizer fields.
The asyncio event loop will use sys.set_asyncgen_hooks() API to maintain a weak set of all scheduled asynchronous generators, and to schedule their aclose() coroutine methods when it is time for generators to be GCed.
To make sure that asyncio programs can finalize all scheduled asynchronous generators reliably, we propose to add a new event loop coroutine method loop.shutdown_asyncgens(). The method will schedule all currently open asynchronous generators to close with an aclose() call.
After calling the loop.shutdown_asyncgens() method, the event loop will issue a warning whenever a new asynchronous generator is iterated for the first time. The idea is that after requesting all asynchronous generators to be shut down, the program should not execute code that iterates over new asynchronous generators.
An example of how shutdown_asyncgens coroutine should be used:
try:
    loop.run_forever()
finally:
    loop.run_until_complete(loop.shutdown_asyncgens())
    loop.close()
The object is modeled after the standard Python generator object. Essentially, the behaviour of asynchronous generators is designed to replicate the behaviour of synchronous generators, with the only difference in that the API is asynchronous.
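That equivalence is easy to demonstrate side by side. In this sketch (the names sync_squares and async_squares are made up for illustration), the same loop body is written as a synchronous generator and as an asynchronous generator, and both produce the same values:

```python
import asyncio

def sync_squares(n):
    # Ordinary synchronous generator.
    for i in range(n):
        yield i ** 2

async def async_squares(n):
    # Asynchronous counterpart; only the iteration protocol differs.
    for i in range(n):
        yield i ** 2

async def collect():
    return [value async for value in async_squares(4)]

async_result = asyncio.run(collect())   # asyncio.run() is 3.7+
sync_result = list(sync_squares(4))
# Both are [0, 1, 4, 9].
```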
The following methods and properties are defined:
- agen.__aiter__(): Returns agen.
- agen.__anext__(): Returns an awaitable that performs one asynchronous generator iteration when awaited.
- agen.asend(val): Returns an awaitable that pushes the val object into the agen generator. When agen has not yet been iterated, val must be None.

Example:
async def gen():
    await asyncio.sleep(0.1)
    v = yield 42
    print(v)
    await asyncio.sleep(0.2)

g = gen()

await g.asend(None)      # Will return 42 after sleeping
                         # for 0.1 seconds.

await g.asend('hello')   # Will print 'hello' and
                         # raise StopAsyncIteration
                         # (after sleeping for 0.2 seconds.)
- agen.athrow(typ, [val, [tb]]): Returns an awaitable that throws an exception into the agen generator.

Example:
async def gen():
    try:
        await asyncio.sleep(0.1)
        yield 'hello'
    except ZeroDivisionError:
        await asyncio.sleep(0.2)
        yield 'world'

g = gen()
v = await g.asend(None)
print(v)                  # Will print 'hello' after
                          # sleeping for 0.1 seconds.

v = await g.athrow(ZeroDivisionError)
print(v)                  # Will print 'world' after
                          # sleeping 0.2 seconds.
- agen.aclose(): Returns an awaitable that throws a GeneratorExit exception into the generator. The awaitable can either return a yielded value, if agen handled the exception, or agen will be closed and the exception will propagate back to the caller.
- agen.__name__ and agen.__qualname__: readable and writable name and qualified-name attributes.
- agen.ag_await: The object that agen is currently awaiting, or None. This is similar to the currently available gi_yieldfrom for generators and cr_await for coroutines.
- agen.ag_frame, agen.ag_running, and agen.ag_code: defined in the same way as the similar attributes of standard generators.
StopIteration and StopAsyncIteration are not propagated out of asynchronous generators, and are replaced with a RuntimeError.
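This mirrors the PEP 479 behaviour for regular generators. A short sketch (the name leaky is made up for illustration) shows the replacement happening:

```python
import asyncio

async def leaky():
    yield 1
    # Raising StopAsyncIteration (or StopIteration) inside the body does
    # not silently end iteration -- it is converted to a RuntimeError.
    raise StopAsyncIteration

async def main():
    g = leaky()
    await g.__anext__()          # yields 1 normally
    try:
        await g.__anext__()
    except RuntimeError as exc:
        return type(exc).__name__

caught = asyncio.run(main())     # asyncio.run() is 3.7+
```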
Asynchronous generator object (PyAsyncGenObject) shares the struct layout with PyGenObject. In addition to that, the reference implementation introduces three new objects:
- PyAsyncGenASend: the awaitable object that implements the __anext__ and asend() methods.
- PyAsyncGenAThrow: the awaitable object that implements the athrow() and aclose() methods.
- _PyAsyncGenWrappedValue: every directly yielded object from an asynchronous generator is implicitly boxed into this structure. This is how the generator implementation can separate objects that are yielded using the regular iteration protocol from objects that are yielded using the asynchronous iteration protocol.
PyAsyncGenASend and PyAsyncGenAThrow are awaitables (they have __await__ methods returning self) and are coroutine-like objects (implementing __iter__, __next__, send() and throw() methods). Essentially, they control how asynchronous generators are iterated:
PyAsyncGenASend is a coroutine-like object that drives __anext__ and asend() methods and implements the asynchronous iteration protocol.
agen.asend(val) and agen.__anext__() return instances of PyAsyncGenASend (which hold references back to the parent agen object.)
The data flow is defined as follows:
- When PyAsyncGenASend.send(val) is called for the first time, val is pushed to the parent agen object (using existing facilities of PyGenObject). Subsequent iterations over the PyAsyncGenASend object push None to agen. When a _PyAsyncGenWrappedValue object is yielded, it is unboxed, and a StopIteration exception is raised with the unwrapped value as an argument.
- When PyAsyncGenASend.throw(*exc) is called for the first time, *exc is thrown into the parent agen object. Subsequent iterations over the PyAsyncGenASend object push None to agen. When a _PyAsyncGenWrappedValue object is yielded, it is unboxed, and a StopIteration exception is raised with the unwrapped value as an argument.
- return statements in asynchronous generators raise a StopAsyncIteration exception, which is propagated through the PyAsyncGenASend.send() and PyAsyncGenASend.throw() methods.
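The unboxing step can be observed from plain, non-async code: because the object returned by __anext__() is coroutine-like, it can be driven with send(), and when the generator yields without awaiting anything, the value comes back as the argument of StopIteration. A minimal sketch (the name gen is made up for illustration):

```python
async def gen():
    yield "spam"

g = gen()
awaitable = g.__anext__()        # a PyAsyncGenASend instance
try:
    awaitable.send(None)         # start iteration by pushing None
    result = None
except StopIteration as exc:
    # The wrapped yielded value was unboxed into StopIteration's value.
    result = exc.value

# Drive aclose() the same way to leave the generator cleanly closed.
try:
    g.aclose().send(None)
except StopIteration:
    pass
```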
PyAsyncGenAThrow is very similar to PyAsyncGenASend. The only difference is that PyAsyncGenAThrow.send(), when called first time, throws an exception into the parent agen object (instead of pushing a value into it.)
The proposal is fully backwards compatible.
In Python 3.5 it is a SyntaxError to define an async def function with a yield expression inside, therefore it’s safe to introduce asynchronous generators in 3.6.