I'm writing an app that scans a directory every second, checks for new files, and, if any appeared, sends them via POST request and archives them. Assuming the number of files that can appear in the directory is between 10 and 100, I decided to use asyncio and aiohttp to send the requests concurrently.

Code:

import asyncio
import os

import aiohttp
from aiohttp.client import ClientSession

BASE_DIR = '/path/to'
ARCHIVE_DIR = '/path/to/archive'
URL = 'http://example.com/upload'  # placeholder for the real endpoint

async def scan():
    while True:
        await asyncio.sleep(1)
        for file in os.listdir(BASE_DIR):
            # Check the file name itself for the extension
            if file.endswith('.jpg'):
                asyncio.ensure_future(publish_file(file))


async def publish_file(file):
    async with ClientSession(loop=loop) as session:
        # Open the file in a with-block so the handle is closed after the request
        with open(os.path.join(BASE_DIR, file), 'rb') as photo:
            async with session.post(url=URL, data={'photo': photo}) as response:
                if response.status == 200:
                    await move_to_archive(file)

async def move_to_archive(file):
    os.rename(os.path.join(BASE_DIR, file), os.path.join(ARCHIVE_DIR, file))

loop = asyncio.get_event_loop()

coros = [
    asyncio.ensure_future(scan())
]
loop.run_until_complete(asyncio.wait(coros))

So the question is: if I want to send requests concurrently, is it good practice to add coroutines to the loop like this: asyncio.ensure_future(publish_file(file))?

1 Answer

Yes, it's correct.

P.S. It's better to share the same session (perhaps with a limited number of parallel connections) than to recreate a connection pool on every POST request:

session = aiohttp.ClientSession(connector=aiohttp.TCPConnector(limit=10))
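As a sketch, the code from the question could create that shared session once in scan() and pass it down to publish_file(), so every upload reuses one connection pool (the limit of 10 parallel connections here is illustrative, not required):

async def scan():
    async with aiohttp.ClientSession(
            connector=aiohttp.TCPConnector(limit=10)) as session:
        while True:
            await asyncio.sleep(1)
            for file in os.listdir(BASE_DIR):
                if file.endswith('.jpg'):
                    asyncio.ensure_future(publish_file(session, file))

async def publish_file(session, file):
    # Reuse the shared session instead of creating a new one per request
    with open(os.path.join(BASE_DIR, file), 'rb') as photo:
        async with session.post(url=URL, data={'photo': photo}) as response:
            if response.status == 200:
                await move_to_archive(file)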