Created March 25, 2018 09:17
Asyncio task hierarchy using an Aiohttp server plus an Aioredis pool.
import asyncio
import json

import aioredis
from aiohttp import web

tasks = {}
task_id = 0


def task_factory(loop, coro):
    # Record every new task in a tree: a task spawned from within another
    # task is attached as a child of that task's entry.
    global task_id
    current_task = asyncio.Task.current_task(loop=loop)
    task = asyncio.tasks.Task(coro, loop=loop)
    if task._source_traceback:
        del task._source_traceback[-1]
    task_repr = {
        'name': coro.__qualname__,
        'children': {}
    }
    task.task_repr = task_repr
    if current_task:
        current_task.task_repr['children'][task_id] = task_repr
    else:
        tasks[task_id] = task_repr
    task_id += 1
    return task


class Redis:
    def __init__(self):
        self._pool = None

    async def ping(self):
        # Lazily create the pool on the first request.
        if not self._pool:
            self._pool = await aioredis.create_pool(
                'redis://localhost',
                minsize=1, maxsize=11)
        await self._pool.execute('ping')


redis = Redis()


async def handle(request):
    await redis.ping()
    return web.Response(text='hello world')


asyncio.get_event_loop().set_task_factory(task_factory)
app = web.Application()
app.router.add_get('/', handle)
web.run_app(app, access_log=None, host="127.0.0.1", port=5000)
print(json.dumps(tasks, indent=4))
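The task-tracking idea can be exercised without Aiohttp or Aioredis. The sketch below is a simplified version of the factory above on a bare event loop; the `parent`/`child` coroutines are illustrative names, not part of the gist, and it uses `asyncio.current_task(loop=...)` instead of `asyncio.Task.current_task`, which was removed in Python 3.12.

```python
import asyncio
import json

tasks = {}
task_id = 0


def task_factory(loop, coro, **kwargs):
    # Simplified version of the gist's factory; **kwargs absorbs the extra
    # arguments (e.g. context) that newer event loops may pass through.
    global task_id
    current = asyncio.current_task(loop=loop)  # None outside any task
    task = asyncio.Task(coro, loop=loop, **kwargs)
    task_repr = {'name': coro.__qualname__, 'children': {}}
    task.task_repr = task_repr
    if current is not None:
        current.task_repr['children'][task_id] = task_repr
    else:
        tasks[task_id] = task_repr
    task_id += 1
    return task


async def child():
    await asyncio.sleep(0)


async def parent():
    # Tasks spawned here are recorded as children of the parent's entry.
    await asyncio.gather(child(), child())


loop = asyncio.new_event_loop()
loop.set_task_factory(task_factory)
loop.run_until_complete(parent())
loop.close()
print(json.dumps(tasks, indent=4))
```

Running it prints a single top-level `parent` entry with two nested `child` entries, the same shape the server code produces for its request tasks.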
First, hit the endpoint published by the Aiohttp server. The hierarchy of tasks created by Asyncio, and the coroutines related to each task, can be seen in the following output.
The tasks identified by ids 3 and 6 are the ones created to attend the two `curl` requests; the other tasks at the first level are just the setup and tear-down tasks started by Aiohttp.

For each connection a new task is created, 3 and 6 respectively. These tasks [1] are temporary and ephemeral: they are started by Asyncio internally and destroyed once the connection has completed. The read callback [2], which is in charge of reading all of the data for the opened connection, is executed using the context of that task.

For each new connection, Aiohttp creates a new task, identified by ids 4 and 7 in the previous output; these tasks handle the incoming requests. Here appears one of the structural differences between Asyncio and other frameworks such as Trio or Curio: the task context of the callback in charge of the input/output might differ from the task context that is performing the read.

The first time that we receive a request, the Redis pool is created and the connection is bound to task number 5. Hence, when the second request comes into the system, with the pool already initialized, the context used to read the response of the `ping` method is the same context that was used to create the connection and register the read callback, task 5. Asyncio, by design, allows these situations.
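This split between the task that awaits a reply and the task that actually delivers it can be shown with a toy example. The names `reader` and `caller` below are hypothetical, standing in for the connection reader and the request handler respectively:

```python
import asyncio


async def reader(fut):
    # Stands in for the connection task that receives the bytes from the
    # socket: it is the one that actually resolves the future.
    await asyncio.sleep(0)
    fut.set_result('PONG')


async def caller(fut):
    # Stands in for the request handler: it awaits the reply, but the
    # read itself happens in another task's context.
    return await fut


async def main():
    fut = asyncio.get_running_loop().create_future()
    result, _ = await asyncio.gather(caller(fut), reader(fut))
    return result


reply = asyncio.run(main())
print(reply)  # PONG
```

In Trio or Curio the awaiting task would perform its own read; in Asyncio, as in the aioredis case above, the wake-up can come from a different task entirely.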
[1] https://github.com/python/cpython/blob/master/Lib/asyncio/selector_events.py#L199
[2] https://github.com/python/cpython/blob/master/Lib/asyncio/selector_events.py#L757