Question or problem about Python programming:
concurrent.futures.Executor.map takes a variable number of iterables and calls the given function with one item drawn from each. How should I call it if I have a generator that produces tuples that are normally unpacked in place?
The following doesn’t work because each of the generated tuples is given as a different argument to map:
args = ((a, b) for (a, b) in c)

for result in executor.map(f, *args):
    pass
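To make the failure concrete, here is a minimal sketch (c, f and the values are hypothetical stand-ins); the built-in map zips its iterables the same way Executor.map does:

c = [(1, 10), (2, 20), (3, 30)]
args = ((a, b) for (a, b) in c)

# *args hands each generated tuple to map as its own iterable, and map then
# zips across them, so the calls become f(1, 2, 3) and f(10, 20, 30) instead
# of f(1, 10), f(2, 20), f(3, 30).
print(list(map(lambda *xs: xs, *args)))   # [(1, 2, 3), (10, 20, 30)]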
Without the generator, the desired arguments to map might look like this:
executor.map(
    f,
    (i[0] for i in args),
    (i[1] for i in args),
    ...,
    (i[N] for i in args),
)
How to solve the problem:
Solution 1:
You need to remove the * on the map call:
args = ((a, b) for b in c)

for result in executor.map(f, args):
    pass
This will call f len(args) times, where f should accept one parameter.
If you want f to accept two parameters, you can use a lambda call like:
args = ((a, b) for b in c)

for result in executor.map(lambda p: f(*p), args):   # (*p) does the unpacking part
    pass
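Put together as a runnable sketch (with a hypothetical add function and ThreadPoolExecutor; note that a lambda cannot be pickled, so with ProcessPoolExecutor you would need a module-level wrapper function instead):

from concurrent.futures import ThreadPoolExecutor

def add(a, b):
    return a + b

c = [(1, 10), (2, 20), (3, 30)]
args = ((a, b) for (a, b) in c)

with ThreadPoolExecutor() as executor:
    # Each generated tuple p is unpacked into add's two parameters.
    for result in executor.map(lambda p: add(*p), args):
        print(result)   # 11, then 22, then 33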
Solution 2:
One argument that is repeated, one argument in c:

from itertools import repeat

for result in executor.map(f, repeat(a), c):
    pass
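For instance, a minimal sketch (f, a and c here are hypothetical stand-ins); repeat(a) supplies the same first argument for every element of c:

from concurrent.futures import ThreadPoolExecutor
from itertools import repeat

def f(a, b):
    return a * b

a = 10
c = [1, 2, 3]

with ThreadPoolExecutor() as executor:
    # Calls f(10, 1), f(10, 2), f(10, 3)
    print(list(executor.map(f, repeat(a), c)))   # [10, 20, 30]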
Need to unpack items of c, and can unpack c:

for result in executor.map(f, *zip(*c)):
    pass

(The original answer used itertools.izip; in Python 3 the builtin zip is already lazy, so no import is needed.)
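A minimal sketch of this case (again with hypothetical f and c); zip(*c) transposes the pairs into one iterable per parameter, which map then zips back together:

from concurrent.futures import ThreadPoolExecutor

def f(a, b):
    return a + b

c = [(1, 10), (2, 20), (3, 30)]

with ThreadPoolExecutor() as executor:
    # zip(*c) -> (1, 2, 3) and (10, 20, 30)
    print(list(executor.map(f, *zip(*c))))   # [11, 22, 33]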
Need to unpack items of c, can’t unpack c:

- Change f to take a single argument and unpack the argument in the function.
- If each item in c has a variable number of members, or you’re calling f only a few times:

  executor.map(lambda args, f=f: f(*args), c)

  It defines a new function that unpacks each item from c and calls f. Using a default argument for f in the lambda makes f local inside the lambda and so reduces lookup time.
- If you’ve got a fixed number of arguments, and you need to call f a lot of times (a short usage sketch follows this list):

  from collections import deque

  def itemtee(iterable, n=2):
      def gen(it=iter(iterable), items=deque(), next=next):
          popleft = items.popleft
          extend = items.extend
          while True:
              if not items:
                  try:
                      extend(next(it))
                  except StopIteration:
                      # In Python 3.7+ (PEP 479) a StopIteration escaping a
                      # generator becomes a RuntimeError, so return instead.
                      return
              yield popleft()
      return [gen()] * n

  executor.map(f, *itemtee(c, n))

  Where n is the number of arguments to f. This is adapted from itertools.tee.
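Here is that usage sketch (f, c and the values are hypothetical): because itemtee returns the same generator n times, map pulls one member per iterable and so hands each tuple’s members to f in order without ever unpacking c itself:

from concurrent.futures import ThreadPoolExecutor

def f(a, b):
    return a + b

c = [(1, 10), (2, 20), (3, 30)]

with ThreadPoolExecutor() as executor:
    # itemtee(c, 2) returns the same generator twice; map alternates between
    # the two references, so each pair is rebuilt as f's two arguments.
    print(list(executor.map(f, *itemtee(c, 2))))   # [11, 22, 33]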
Solution 3:
You can use currying to create a new function via functools.partial in Python:
from concurrent.futures import ThreadPoolExecutor
from functools import partial

def some_func(param1, param2):
    # some code
    ...

# currying: some_func's repeated argument 'a' is fixed in advance
func = partial(some_func, a)

with ThreadPoolExecutor() as executor:
    executor.map(func, list_of_args)
    ...
If you need to pass more than one repeated parameter, you can pass them all to partial:
func = partial(some_func, a, b, c)
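For instance, a self-contained sketch (the scale function and its values are hypothetical):

from concurrent.futures import ThreadPoolExecutor
from functools import partial

def scale(factor, value):
    return factor * value

# Fix the repeated first argument; only 'value' varies between calls.
times_ten = partial(scale, 10)

with ThreadPoolExecutor() as executor:
    print(list(executor.map(times_ten, [1, 2, 3])))   # [10, 20, 30]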
Solution 4:
So suppose you have a function that takes 3 arguments, and all 3 arguments are dynamic and keep changing with every call. For example:
def multiply(a, b, c):
    print(a * b * c)
To call this multiple times using threading, I would first create a list of tuples where each tuple is a version of a, b, c:
arguments = [(1,2,3), (4,5,6), (7,8,9), ....]
We know that concurrent.futures's map function accepts the target function as its first argument and, as its second argument, the list of arguments for each version of the function that will be executed. Therefore, you might make a call like this:
for _ in executor.map(multiply, arguments):   # Error
    pass
But this will give you an error saying that the function expected 3 arguments but got only 1. To solve this problem, we create a helper function:
def helper(numbers):
    multiply(numbers[0], numbers[1], numbers[2])
Now, we can call this function using the executor as follows:
with ThreadPoolExecutor() as executor:
    for _ in executor.map(helper, arguments):
        pass
That should give you the desired results.
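Putting the pieces together as one runnable sketch (threads here; for processes, multiply and helper would both need to live at module level so they can be pickled):

from concurrent.futures import ThreadPoolExecutor

def multiply(a, b, c):
    print(a * b * c)

def helper(numbers):
    multiply(numbers[0], numbers[1], numbers[2])

arguments = [(1, 2, 3), (4, 5, 6), (7, 8, 9)]

with ThreadPoolExecutor() as executor:
    for _ in executor.map(helper, arguments):
        pass   # prints 6, 120, 504 (the print order may interleave)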
Solution 5:
For ProcessPoolExecutor.map():
Similar to map(func, *iterables) except:
- the iterables are collected immediately rather than lazily;
- func is executed asynchronously and several calls to func may be made concurrently.
Try running the following snippet under Python 3, and it will become quite clear:
from concurrent.futures import ProcessPoolExecutor

def f(a, b):
    print(a + b)

with ProcessPoolExecutor() as pool:
    pool.map(f, (0, 1, 2, 3, 4, 5, 6, 7, 8, 9), (0, 1, 2))   # 0, 2, 4

array = [(i, i) for i in range(3)]

with ProcessPoolExecutor() as pool:
    pool.map(f, *zip(*array))   # 0, 2, 4
Solution 6:
I have seen so many answers here, but none of them is as straightforward as using a lambda expression:
def foo(x, y):
    pass
Want to call the above method 10 times, with the same values, i.e. xVal and yVal?
with concurrent.futures.ThreadPoolExecutor() as executor:
    for _ in executor.map(lambda _: foo(xVal, yVal), range(0, 10)):
        pass
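As a complete sketch with hypothetical placeholder values for xVal and yVal:

import concurrent.futures

def foo(x, y):
    print(x + y)

xVal, yVal = 2, 3   # hypothetical values

with concurrent.futures.ThreadPoolExecutor() as executor:
    # The lambda ignores the value coming from range() and always calls foo(xVal, yVal).
    for _ in executor.map(lambda _: foo(xVal, yVal), range(0, 10)):
        pass   # prints 5 ten times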