The growth of the Internet and general connectivity has triggered the
proportionate need for responsive and scalable code. This proposal
aims to answer that need by making writing explicitly asynchronous,
concurrent Python code easier and more Pythonic.

It is proposed to make coroutines a proper standalone concept in
Python, and introduce new supporting syntax. The ultimate goal
is to help establish a common, easily approachable, mental
model of asynchronous programming in Python and make it as close to
synchronous programming as possible.

This PEP assumes that the asynchronous tasks are scheduled and
coordinated by an Event Loop similar to the stdlib
asyncio.events.AbstractEventLoop class. While the PEP is not tied to any
specific Event Loop implementation, it is relevant only to the kind of
coroutine that uses yield as a signal to the scheduler, indicating
that the coroutine will be waiting until an event (such as IO) is
completed.

We believe that the changes proposed here will help keep Python
relevant and competitive in a quickly growing area of asynchronous
programming, as many other languages have adopted, or are planning to
adopt, similar features: [2], [5], [6], [7], [8], [10].

Feedback on the initial beta release of Python 3.5 resulted in a
redesign of the object model supporting this PEP to more clearly
separate native coroutines from generators - rather than being a
new kind of generator, native coroutines are now their own
completely distinct type (implemented in [17]).

This change was made primarily due to problems encountered when
attempting to integrate support for native coroutines into the
Tornado web server (reported in [18]).

In CPython 3.5.2, the __aiter__ protocol was updated.

Before 3.5.2, __aiter__ was expected to return an awaitable
resolving to an asynchronous iterator. Starting with 3.5.2,
__aiter__ should return asynchronous iterators directly.

If the old protocol is used in 3.5.2, Python will raise a
PendingDeprecationWarning.

In CPython 3.6, the old __aiter__ protocol will still be
supported with a DeprecationWarning being raised.

In CPython 3.7, the old __aiter__ protocol will no longer be
supported: a RuntimeError will be raised if __aiter__
returns anything but an asynchronous iterator.
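Under the new protocol, a minimal asynchronous iterator can be sketched as follows (Ticker and the asyncio.run driver, added in Python 3.7, are illustrative and not part of this PEP):

```python
import asyncio

class Ticker:
    """Minimal asynchronous iterator using the 3.5.2+ protocol:
    __aiter__ returns the asynchronous iterator itself, not an
    awaitable resolving to one."""

    def __init__(self, n):
        self.i = 0
        self.n = n

    def __aiter__(self):
        return self  # new protocol: return the iterator directly

    async def __anext__(self):
        if self.i >= self.n:
            raise StopAsyncIteration
        self.i += 1
        return self.i - 1

async def collect():
    result = []
    async for value in Ticker(3):
        result.append(value)
    return result

print(asyncio.run(collect()))  # [0, 1, 2]
```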

Current Python supports implementing coroutines via generators (PEP
342), further enhanced by the yield from syntax introduced in PEP
380. This approach has a number of shortcomings:

It is easy to confuse coroutines with regular generators, since they
share the same syntax; this is especially true for new developers.

Whether or not a function is a coroutine is determined by the
presence of yield or yield from statements in its body, which can
lead to unobvious errors when such statements appear in, or disappear
from, the function body during refactoring.

Support for asynchronous calls is limited to expressions where
yield is allowed syntactically, limiting the usefulness of
syntactic features, such as with and for statements.

This proposal makes coroutines a native Python language feature, and
clearly separates them from generators. This removes
generator/coroutine ambiguity, and makes it possible to reliably define
coroutines without reliance on a specific library. This also enables
linters and IDEs to improve static code analysis and refactoring.

Native coroutines and the associated new syntax features make it
possible to define context manager and iteration protocols in
asynchronous terms. As shown later in this proposal, the new async
with statement lets Python programs perform asynchronous calls when
entering and exiting a runtime context, and the new async for
statement makes it possible to perform asynchronous calls in iterators.

This proposal introduces new syntax and semantics to enhance coroutine
support in Python.

This specification presumes knowledge of the implementation of
coroutines in Python (PEP 342 and PEP 380). Motivation for the syntax
changes proposed here comes from the asyncio framework (PEP 3156) and
the "Cofunctions" proposal (PEP 3152, now rejected in favor of this
specification).

From this point in this document we use the term native coroutine to
refer to functions declared using the new syntax. generator-based
coroutine is used where necessary to refer to coroutines that are
based on generator syntax. coroutine is used in contexts where both
definitions are applicable.

The following new await expression is used to obtain a result of
coroutine execution:

async def read_data(db):
    data = await db.fetch('SELECT ...')
    ...

await, similarly to yield from, suspends execution of the
read_data coroutine until the db.fetch awaitable completes and
returns the result data.

It uses the yield from implementation with an extra step of
validating its argument. await only accepts an awaitable, which
can be one of:

A native coroutine object returned from a native coroutine
function.

A generator-based coroutine object returned from a function
decorated with types.coroutine().

An object with an __await__ method returning an iterator.
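The three kinds of awaitables can be illustrated with the following sketch (all names are hypothetical; asyncio.run, from Python 3.7, merely drives the example):

```python
import asyncio
import types

async def native():
    # 1) Calling a native coroutine function returns a native
    #    coroutine object.
    return 'native'

@types.coroutine
def gen_based():
    # 2) A generator decorated with types.coroutine() returns a
    #    generator-based coroutine object.
    yield  # a bare yield gives control back to the event loop once
    return 'generator-based'

class FutureLike:
    # 3) An object whose __await__ method returns an iterator.
    def __await__(self):
        yield
        return 'future-like'

async def main():
    a = await native()
    b = await gen_based()
    c = await FutureLike()
    return a, b, c

print(asyncio.run(main()))  # ('native', 'generator-based', 'future-like')
```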

Any yield from chain of calls ends with a yield. This is a
fundamental mechanism of how Futures are implemented. Since,
internally, coroutines are a special kind of generators, every
await is suspended by a yield somewhere down the chain of
await calls (please refer to PEP 3156 for a detailed
explanation).

To enable this behavior for coroutines, a new magic method called
__await__ is added. In asyncio, for instance, the only change needed
to enable Future objects in await expressions is to add an
__await__ = __iter__ line to the asyncio.Future class.

Objects with __await__ method are called Future-like objects in
the rest of this PEP.
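A hand-driven sketch of a Future-like object shows the yield at the bottom of the await chain; MiniFuture below mimics the __await__ = __iter__ trick and is hypothetical:

```python
class MiniFuture:
    """Hypothetical Future-like object using asyncio.Future's trick:
    __await__ is simply an alias for __iter__."""

    def __init__(self):
        self.result = None
        self.done = False

    def __iter__(self):
        if not self.done:
            yield self  # the yield at the bottom of the await chain
        return self.result

    __await__ = __iter__

async def consumer(fut):
    return await fut

fut = MiniFuture()
coro = consumer(fut)

# Drive the coroutine by hand, as a scheduler would:
yielded = coro.send(None)       # runs until the future yields itself
assert yielded is fut

fut.result, fut.done = 42, True
try:
    coro.send(None)             # resume; the coroutine now completes
except StopIteration as exc:
    print(exc.value)            # 42
```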

The key difference between await and the yield and yield from
operators is that await expressions do not require parentheses around
them most of the time.

Also, yield from allows any expression as its argument, including
expressions like yield from a() + b(), that would be parsed as
yield from (a() + b()), which is almost always a bug. In general,
the result of any arithmetic operation is not an awaitable object.
To avoid this kind of mistake, it was decided to make await
precedence lower than [], (), and ., but higher than **
operators.
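This precedence choice can be verified with the ast module; the helper below is hypothetical (ast.unparse requires Python 3.9+):

```python
import ast

def await_operand(src):
    """Return the source of the sub-expression that an Await node
    wraps inside 'async def f(): return <src>'."""
    tree = ast.parse("async def f():\n    return " + src)
    for node in ast.walk(tree):
        if isinstance(node, ast.Await):
            return ast.unparse(node.value)

print(await_operand("await a + b"))    # a       -> parsed as (await a) + b
print(await_operand("await a.b.c()"))  # a.b.c() -> [], () and . bind tighter
print(await_operand("await a ** 2"))   # a       -> parsed as (await a) ** 2
```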

The following is a utility class that transforms a regular iterable to
an asynchronous one. While this is not a very useful thing to do, the
code illustrates the relationship between regular and asynchronous
iterators.
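A sketch of such a wrapper class (close in spirit to the example accompanying this PEP; asyncio.run, from Python 3.7, is used only to drive the demonstration):

```python
import asyncio

class AsyncIteratorWrapper:
    """Wraps a regular iterable so it can be used with 'async for'."""

    def __init__(self, obj):
        self._it = iter(obj)

    def __aiter__(self):
        return self

    async def __anext__(self):
        try:
            value = next(self._it)
        except StopIteration:
            # translate the end-of-iteration signal into its
            # asynchronous counterpart
            raise StopAsyncIteration
        return value

async def demo():
    chars = []
    async for char in AsyncIteratorWrapper("abc"):
        chars.append(char)
    return chars

print(asyncio.run(demo()))  # ['a', 'b', 'c']
```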

This section applies only to native coroutines with CO_COROUTINE
flag, i.e. defined with the new async def syntax.

The behavior of existing *generator-based coroutines* in asyncio
remains unchanged.

Great effort has been made to make sure that coroutines and
generators are treated as distinct concepts:

Native coroutine objects do not implement __iter__ and
__next__ methods. Therefore, they cannot be iterated over or
passed to iter(), list(), tuple() and other built-ins.
They also cannot be used in a for..in loop.

An attempt to use __iter__ or __next__ on a native
coroutine object will result in a TypeError.

Plain generators cannot yield from native coroutines:
doing so will result in a TypeError.
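Both restrictions can be observed directly (a minimal sketch; the coroutine objects are closed explicitly to suppress "never awaited" warnings):

```python
async def coro():
    return 1

c = coro()

# Native coroutine objects are not iterable:
try:
    iter(c)
except TypeError:
    print("iter() on a coroutine raises TypeError")

# Plain generators cannot 'yield from' native coroutine objects:
c2 = coro()

def plain_gen():
    yield from c2

g = plain_gen()
try:
    g.send(None)
except TypeError:
    print("'yield from' a coroutine in a plain generator raises TypeError")

# Close the coroutine objects to suppress "never awaited" warnings.
c.close()
c2.close()
```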

Coroutines are based on generators internally, thus they share the
implementation. Similarly to generator objects, coroutines have
throw(), send() and close() methods. StopIteration and
GeneratorExit play the same role for coroutines (although
PEP 479 is enabled by default for coroutines). See PEP 342, PEP 380,
and Python Documentation [11] for details.

The throw() and send() methods for coroutines are used to push
values and raise errors into Future-like objects.

A common beginner mistake is forgetting to use yield from on
coroutines:

@asyncio.coroutine
def useful():
    asyncio.sleep(1)  # this will do nothing without 'yield from'

For debugging this kind of mistake there is a special debug mode in
asyncio, in which the @coroutine decorator wraps all functions with a
special object with a destructor logging a warning. Whenever a wrapped
generator gets garbage collected, a detailed logging message is
generated with information about where exactly the decorated function
was defined, the stack trace of where it was collected, etc. The
wrapper object also provides a convenient __repr__ function with
detailed information about the generator.

The only problem is how to enable these debug capabilities. Since
debug facilities should be a no-op in production mode, the @coroutine
decorator decides whether to wrap based on the OS environment variable
PYTHONASYNCIODEBUG. This way it is
possible to run asyncio programs with asyncio's own functions
instrumented. EventLoop.set_debug, a different debug facility, has
no impact on @coroutine decorator's behavior.

With this proposal, coroutines are a native concept, distinct from
generators. In addition to a RuntimeWarning being raised on
coroutines that were never awaited, it is proposed to add two new
functions to the sys module: set_coroutine_wrapper and
get_coroutine_wrapper. This is to enable advanced debugging
facilities in asyncio and other frameworks (such as displaying where
exactly a coroutine was created, and a more detailed stack trace of
where it was garbage collected).

inspect.iscoroutine(obj) returns True if obj is a
native coroutine object.

inspect.iscoroutinefunction(obj) returns True if obj is a
native coroutine function.

inspect.isawaitable(obj) returns True if obj is an
awaitable.

inspect.getcoroutinestate(coro) returns the current state of
a native coroutine object (mirrors
inspect.getgeneratorstate(gen)).

inspect.getcoroutinelocals(coro) returns the mapping of a
native coroutine object's local variables to their values
(mirrors inspect.getgeneratorlocals(gen)).

sys.set_coroutine_wrapper(wrapper) allows intercepting the creation
of native coroutine objects. wrapper must be either a callable that
accepts one argument (a coroutine object), or None. None
resets the wrapper. If called twice, the new wrapper replaces the
previous one. The function is thread-specific. See Debugging
Features for more details.

sys.get_coroutine_wrapper() returns the current wrapper object.
Returns None if no wrapper was set. The function is
thread-specific. See Debugging Features for more details.
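The inspect helpers above can be sketched in action as follows (state names are the documented CORO_* strings, mirroring generator states):

```python
import inspect

async def fetch():
    return 'row'

assert inspect.iscoroutinefunction(fetch)

coro = fetch()
assert inspect.iscoroutine(coro)
assert inspect.isawaitable(coro)

# State tracking mirrors inspect.getgeneratorstate():
print(inspect.getcoroutinestate(coro))  # CORO_CREATED
coro.close()
print(inspect.getcoroutinestate(coro))  # CORO_CLOSED
```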

Generator-based coroutine function

A coroutine function based on generator syntax. The most common
example is a function decorated with @asyncio.coroutine.

Generator-based coroutine

Returned from a generator-based coroutine function.

Coroutine

Either native coroutine or generator-based coroutine.

Coroutine object

Either native coroutine object or generator-based coroutine
object.

Future-like object

An object with an __await__ method, or a C object with
tp_as_async->am_await function, returning an iterator. Can be
consumed by an await expression in a coroutine. A coroutine
waiting for a Future-like object is suspended until the Future-like
object's __await__ completes, and returns the result. See
Await Expression for details.

Because plain generators cannot yield from native coroutine
objects (see Differences from generators section for more details),
it is advised to make sure that all generator-based coroutines are
decorated with @asyncio.coroutine before starting to use the new
syntax.

async is mostly used by asyncio. We are addressing this by
renaming the async() function to ensure_future() (see the asyncio
section for details).

Another use of the async keyword is in Lib/xml/dom/xmlbuilder.py,
which defines an async = False attribute for the DocumentLS class.
There is no documentation or tests for it, and it is not used anywhere
else in CPython. It is replaced with a getter that raises a
DeprecationWarning, advising use of the async_ attribute instead.

async and await names will be softly deprecated in CPython 3.5
and 3.6. In 3.7 we will transform them to proper keywords. Making
async and await proper keywords before 3.7 might make it harder
for people to port their code to Python 3.

There is no equivalent of __cocall__ in this PEP. In PEP 3152,
__cocall__ is called and its result is passed to yield from in the
cocall expression. The await keyword expects an awaitable object,
validates the type, and executes yield from on it. The __await__
method is similar to __cocall__, but is only used to define
Future-like objects.

await is defined in almost the same way as yield from in the
grammar (it is later enforced that await can only be inside
async def). It is possible to simply write await future,
whereas cocall always requires parentheses.

To make asyncio work with PEP 3152 it would be required to modify
@asyncio.coroutine decorator to wrap all functions in an object
with a __cocall__ method, or to implement __cocall__ on
generators. To call cofunctions from existing generator-based
coroutines it would be required to use the costart(cofunc, *args,
**kwargs) built-in.

Since it is impossible to call a cofunction without a cocall
keyword, it automatically prevents the common mistake of forgetting
to use yield from on generator-based coroutines. This proposal
addresses this problem with a different approach, see Debugging
Features.

A shortcoming of requiring a cocall keyword to call a coroutine
is that if it is decided to implement coroutine-generators --
coroutines with yield or async yield expressions -- we
wouldn't need a cocall keyword to call them. So we'll end up
having __cocall__ and no __call__ for regular coroutines,
and having __call__ and no __cocall__ for coroutine-
generators.

Requiring parentheses grammatically also introduces a whole lot
of new problems.

With async for keyword it is desirable to have a concept of a
coroutine-generator -- a coroutine with yield and yield from
expressions. To avoid any ambiguity with regular generators, we would
likely require an async keyword before yield, and
async yield from would raise a StopAsyncIteration exception.

While it is possible to implement coroutine-generators, we believe that
they are out of scope of this proposal. It is an advanced concept that
should be carefully considered and balanced, with non-trivial changes
in the implementation of current generator objects. This is a matter
for a separate PEP.

While it is possible to just implement the await expression and
treat all functions with at least one await as coroutines, this
approach makes API design, code refactoring, and long-term support
harder.

Let's pretend that Python only has await keyword:

def useful():
    ...
    await log(...)
    ...

def important():
    await useful()

If the useful() function is refactored and someone removes all
await expressions from it, it would become a regular Python
function, and all code that depends on it, including important(),
would be broken. To mitigate this issue a decorator similar to
@asyncio.coroutine has to be introduced.

For some people bare async name(): pass syntax might look more
appealing than async def name(): pass. It is certainly easier to
type. But on the other hand, it breaks the symmetry between async
def, async with and async for, where async is a modifier,
stating that the statement is asynchronous. It is also more consistent
with the existing grammar.

async keyword is a statement qualifier. A good analogy to it are
"static", "public", "unsafe" keywords from other languages. "async
for" is an asynchronous "for" statement, "async with" is an
asynchronous "with" statement, "async def" is an asynchronous function.

Having "async" after the main statement keyword might introduce some
confusion: for example, "for async item in iterator" could be read as
"for each asynchronous item in iterator".

Having async keyword before def, with and for also
makes the language grammar simpler. And "async def" better separates
coroutines from regular functions visually.

Transition Plan section explains how tokenizer is modified to treat
async and await as keywords only in async def blocks.
Hence async def fills the role that a module level compiler
declaration like from __future__ import async_await would otherwise
fill.

New asynchronous magic methods __aiter__, __anext__,
__aenter__, and __aexit__ all start with the same prefix "a".
An alternative proposal is to use "async" prefix, so that __anext__
becomes __async_next__. However, to align the new magic methods with
the existing ones, such as __radd__ and __iadd__, it was decided
to use the shorter version.

The vision behind existing generator-based coroutines and this proposal
is to make it easy for users to see where the code might be suspended.
Making the existing "for" and "with" statements recognize asynchronous
iterators and context managers would inevitably create implicit suspend
points, making it harder to reason about the code.

New __await__ method for Future-like objects, and new
tp_as_async.am_await slot in PyTypeObject.

New syntax for asynchronous context managers: async with. And
associated protocol with __aenter__ and __aexit__ methods.
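A minimal asynchronous context manager under this protocol might look like the following sketch (AsyncResource is hypothetical; asyncio.sleep(0) stands in for real asynchronous acquire/release work, and asyncio.run is from Python 3.7):

```python
import asyncio

class AsyncResource:
    """Hypothetical asynchronous context manager: both entering and
    exiting the runtime context may perform asynchronous calls."""

    async def __aenter__(self):
        await asyncio.sleep(0)  # stand-in for an asynchronous acquire
        return 'resource'

    async def __aexit__(self, exc_type, exc, tb):
        await asyncio.sleep(0)  # stand-in for an asynchronous release
        return False            # do not suppress exceptions

async def use():
    async with AsyncResource() as res:
        return res

print(asyncio.run(use()))  # resource
```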

New syntax for asynchronous iteration: async for. And
associated protocol with __aiter__, __anext__ and new built-
in exception StopAsyncIteration. New tp_as_async.am_aiter
and tp_as_async.am_anext slots in PyTypeObject.

C API changes: new PyCoro_Type (exposed to Python as
types.CoroutineType) and PyCoroObject.
PyCoro_CheckExact(o) tests whether o is a native coroutine object.

While the list of changes and new things is not short, it is important
to understand that most users will not use these features directly.
They are intended to be used in frameworks and libraries to provide
users with convenient, unambiguous APIs built on the async def,
await, async for and async with syntax.