All the examples I have seen for fetching multiple urls with aiohttp suggest doing the following:
    import asyncio
    import ssl

    import aiohttp


    async def fetch(session, url):
        async with session.get(url, ssl=ssl.SSLContext()) as response:
            return await response.json()


    async def fetch_all(urls, loop):
        async with aiohttp.ClientSession(loop=loop) as session:
            results = await asyncio.gather(*[fetch(session, url) for url in urls],
                                           return_exceptions=True)
            return results


    if __name__ == '__main__':
        loop = asyncio.get_event_loop()
        urls = url_list
        htmls = loop.run_until_complete(fetch_all(urls, loop))
        print(htmls)
(https://stackoverflow.com/a/51728016/294103)
In practice, however, I typically have a generator (which can also be async) returning domain objects from the db. One attribute of each object is a url, but I also need access to the other attributes later in the loop:
    async for domain_obj in generator:
        url = domain_obj.url
        response = xxx  # need to fetch a single url here in an async manner
        # do something with response
Of course I could batch-collect the domain_objs into a list and fetch them all as in the example above, but this doesn't feel right.
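For what it's worth, one pattern that keeps per-object attributes in scope while still fetching concurrently is to schedule a task per item as the generator yields it, then gather the tasks at the end. Below is a minimal sketch using only asyncio; `DomainObj`, `generate_objs`, and the simulated `fetch` are hypothetical stand-ins (a real `fetch` would use a shared `aiohttp.ClientSession` as in the example above), and the semaphore bound of 10 is an arbitrary choice:

```python
import asyncio
from dataclasses import dataclass


@dataclass
class DomainObj:
    """Hypothetical stand-in for a domain object loaded from the db."""
    name: str
    url: str


async def generate_objs():
    # Hypothetical async generator; in practice this would stream rows from the db.
    for i in range(3):
        yield DomainObj(name=f"obj{i}", url=f"https://example.com/{i}")


async def fetch(url, sem):
    # Stand-in for an aiohttp request; the semaphore caps concurrent fetches.
    async with sem:
        await asyncio.sleep(0.01)  # simulate network latency
        return f"response for {url}"


async def process(obj, sem):
    response = await fetch(obj.url, sem)
    # The other attributes of obj are still in scope here.
    return obj.name, response


async def main():
    sem = asyncio.Semaphore(10)  # arbitrary concurrency cap
    tasks = []
    async for obj in generate_objs():
        # Schedule each fetch immediately rather than awaiting inside the
        # loop; awaiting here would run the requests one at a time.
        tasks.append(asyncio.create_task(process(obj, sem)))
    return await asyncio.gather(*tasks)


if __name__ == "__main__":
    print(asyncio.run(main()))
```

The key point is that `asyncio.create_task` starts each fetch as soon as the generator yields an object, so iteration and fetching overlap instead of serializing.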