
I have built an application (Python 3.6) that keeps checking a directory for incoming files; when files arrive, it spawns some Python processes to work on them. This work involves DB calls, and as of now there is no connection pooling. In the near future the load is going to get so heavy that we can't sustain it without a DB connection pool. Is there a way to share a connection pool across multiple processes in Python? I have gone through the Python documentation and Stack Overflow but didn't find anything solid. At a high level, this is how I want it to work ...

Thanks in advance for your suggestions.
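For context, a socket-backed DB connection can't cross process boundaries, so the usual pattern is not one shared pool but one small pool per worker process, built in a pool initializer. A minimal sketch of that pattern, assuming PostgreSQL via psycopg2; the DSN, pool sizes, file list, and query are hypothetical placeholders:

```python
# Minimal sketch only: assumes PostgreSQL via psycopg2; the DSN, pool sizes,
# file list, and query are hypothetical placeholders.
import multiprocessing

import psycopg2.pool

_pool = None  # one pool per worker process, created by the initializer below


def init_worker(dsn):
    # Runs once inside each child process. Sockets can't be shared across
    # processes, so every worker builds its own small pool.
    global _pool
    _pool = psycopg2.pool.SimpleConnectionPool(1, 5, dsn)


def process_file(path):
    conn = _pool.getconn()
    try:
        with conn, conn.cursor() as cur:
            cur.execute("SELECT 1")  # stand-in for the real per-file work
    finally:
        _pool.putconn(conn)  # return the connection; the pool keeps it open


if __name__ == "__main__":
    dsn = "dbname=mydb user=me"  # hypothetical connection string
    files = ["a.csv", "b.csv"]   # normally found by the directory watcher
    with multiprocessing.Pool(4, initializer=init_worker, initargs=(dsn,)) as pool:
        pool.map(process_file, files)
```

The key point is that connections live entirely inside the process that uses them; only the work items (file paths) cross process boundaries.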

  • In short, no. You can't share a single (network) connection between multiple processes. It's not clear to me, though, why you can't have e.g. a single DB connection in each child process. It's also not clear what you mean by "... load is going to go so heavy that we can't sustain without db connection pool" — what load are you talking about, and why do you think a DB connection pool will help? Commented Oct 10, 2019 at 22:02
  • Normally you can't do that ... maybe you can explore a lead with gevent (asynchronous jobs)? Commented Oct 10, 2019 at 22:03
  • @TomDalton Today I am opening and closing connections to process each file, but later the number of files is going to be huge, and it wouldn't make sense to open/close a connection for each file. Commented Oct 14, 2019 at 15:31
  • Can you have a connection (or pool) per worker process? Commented Oct 19, 2019 at 17:28
  • @TomDalton The problem is that I keep creating new processes: the main process gets the work, then creates n worker processes and distributes the work to them (a long-lived worker pattern that avoids this is sketched below). Commented Oct 25, 2019 at 19:00
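If the main process currently forks fresh workers for every batch, one way around the per-file connect/disconnect cost hinted at in the comments above is to keep n long-lived workers and feed them file paths over a queue, so each worker connects once and reuses that connection. A rough sketch under the same psycopg2 assumption; the DSN and the per-file query are placeholders:

```python
# Rough sketch: long-lived workers that each open one connection and reuse it.
# Assumes psycopg2; the DSN and the per-file query are hypothetical.
import multiprocessing

import psycopg2


def worker(dsn, queue):
    conn = psycopg2.connect(dsn)  # opened once, reused for every file
    try:
        for path in iter(queue.get, None):  # None acts as a shutdown sentinel
            with conn, conn.cursor() as cur:
                cur.execute("SELECT 1")  # stand-in for the real per-file work
    finally:
        conn.close()


if __name__ == "__main__":
    dsn = "dbname=mydb user=me"  # hypothetical connection string
    queue = multiprocessing.Queue()
    workers = [multiprocessing.Process(target=worker, args=(dsn, queue))
               for _ in range(4)]
    for w in workers:
        w.start()
    for path in ["a.csv", "b.csv"]:  # normally produced by the directory watcher
        queue.put(path)
    for _ in workers:
        queue.put(None)  # one sentinel per worker so all of them exit
    for w in workers:
        w.join()
```

Here the workers outlive any single file, so the open/close cost is paid once per worker rather than once per file.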
