
I can use multiprocessing to easily set up parallel calls to "func" like this:

import multiprocessing

def func(tup):
    (a, b) = tup
    return str(a+b)

pool = multiprocessing.Pool()
tups = [ (1,2), (3,4), (5,6), (7,8)]
results = pool.imap(func, tups)
print ", ".join(results)

Giving the result:

3, 7, 11, 15

The problem is that my actual function "func" is more complicated than this example, so I don't want to call it with a single "tup" argument; I want multiple positional arguments and keyword arguments as well. What I want to do is something like the code below, but "*" unpacking inside a list doesn't work (and doesn't support keywords either):

import multiprocessing

def func(a, b):
    return str(a+b)

pool = multiprocessing.Pool()
tups = [ *(1,2), *(3,4), *(5,6), *(7,8)]
results = pool.imap(func, tups)
print ", ".join(results)

So... is there a way to get all the power of Python function calls while doing parallel processing?

  • Keep in mind that (i)map will try to distribute your arguments evenly, so any list of a has to match up with any list of b. In that case, a list of tuples, as you're doing now, may be the best way to go. Commented Apr 14, 2016 at 5:49
  • Could you give a clearer example of what those other arguments, including the keyword arguments, are? Commented Apr 14, 2016 at 5:49
  • On second thoughts, I think I can live without the keyword arguments. At least at the moment, what I want is the ability to pass multiple parameters to function "func", without "func" having to do the slightly messy job of unpacking the tuple. I know it's not a big deal, but I'll be showing the code to non-python people, so any simplifying helps. Commented Apr 14, 2016 at 5:58
  • Possible duplicate of Python multiprocessing pool.map for multiple arguments Commented Apr 14, 2016 at 7:39

2 Answers


Can't you just use dicts or objects?

import multiprocessing
def func(a):
    print(str(a['left'] + a['right']))

pool = multiprocessing.Pool()
i1 = {'left': 2, 'right': 5}
i2 = {'left': 3, 'right': 4}
list(pool.imap(func, [i1, i2]))  # consume the iterator so both calls actually run

This way the keywords aren't declared in the function signature, but you can at least refer to them by name inside the function body. The same goes for building the input: compared with packing and unpacking tuples, this is much more readable.
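If plain dicts feel too loose, a namedtuple gives the same named access for the "objects" half of the suggestion. Here is a minimal sketch along those lines; the Pair name is purely illustrative:

import multiprocessing
from collections import namedtuple

# Illustrative container; any module-level class or namedtuple that the pool can pickle works.
Pair = namedtuple('Pair', ['left', 'right'])

def func(p):
    return str(p.left + p.right)

if __name__ == '__main__':
    pool = multiprocessing.Pool()
    inputs = [Pair(1, 2), Pair(3, 4), Pair(5, 6), Pair(7, 8)]
    print(", ".join(pool.imap(func, inputs)))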


1 Comment

Thanks for that - good point. I lost most of a day due to a bug in my code packing/unpacking the tuples. So now I pass an object to "func", which is almost as good as what I asked for.

HarryPotfleur's comment is correct: if you're using Python 3.3+, you can use starmap:

starmap(func, iterable[, chunksize])
Like map() except that the elements of the iterable are expected to be iterables that are unpacked as arguments.

Hence an iterable of [(1,2), (3, 4)] results in [func(1,2), func(3,4)].

Just replace imap with starmap.
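Applied to the example from the question, that would look roughly like this (note that starmap, unlike imap, returns a list rather than a lazy iterator):

import multiprocessing

def func(a, b):
    return str(a + b)

if __name__ == '__main__':
    pool = multiprocessing.Pool()
    tups = [(1, 2), (3, 4), (5, 6), (7, 8)]
    results = pool.starmap(func, tups)  # each tuple is unpacked into func(a, b)
    print(", ".join(results))  # 3, 7, 11, 15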

If you're using a lower version, there's no direct way of doing this. Instead, you can use an intermediate function:

def inter(tup):
    return func(*tup)

def func(a, b, c):
    return str(a+b+c)
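To show how the wrapper is used, and to cover the keyword-argument part of the question, here is a rough sketch; the call_with_kwargs helper and its (args, kwargs) payload convention are illustrative additions, not part of the answer:

import multiprocessing

def func(a, b, c):
    return str(a + b + c)

def inter(tup):
    # unpack a plain tuple into positional arguments
    return func(*tup)

def call_with_kwargs(payload):
    # payload is an (args, kwargs) pair -- illustrative convention, not from the answer
    args, kwargs = payload
    return func(*args, **kwargs)

if __name__ == '__main__':
    pool = multiprocessing.Pool()
    print(", ".join(pool.imap(inter, [(1, 2, 3), (4, 5, 6)])))  # 6, 15
    print(", ".join(pool.imap(call_with_kwargs, [((1, 2), {'c': 3}), ((4, 5), {'c': 6})])))  # 6, 15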

