I am practicing using shared values for multiprocessing. I have an existing Process-based function that works using a shared value:
def run_procs_with_loop(lock):
    # this is my shared value
    shared_number = Value('i', 0)
    print(__name__, 'shared_value in the beginning', shared_number.value)

    # create a process list to append each process spawned by the for-loop
    processes = []
    for _ in range(2):
        p = Process(target=add_100_locking, args=(shared_number, lock))
        processes.append(p)
        p.start()

    # join each spawned process, not just the last one
    for p in processes:
        p.join()

    print('shared_value at the end', shared_number.value)
The code above is directed to spawn TWO processes, and each process runs the target function with the args (shared_number, lock). The function ran as expected.
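For context, this is roughly how I invoke it from my full script (the imports and the `__main__` guard below are a sketch of my setup, not shown in the snippet above):

from multiprocessing import Process, Value, Lock

if __name__ == '__main__':
    # the lock is created once in the main process and passed down
    lock = Lock()
    run_procs_with_loop(lock)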
I tried to convert it to a multiprocessing Pool - I attempted to pass the argument `[(shared_number, lock)] * 2` in my pool.map() statement (I want the Pool to spawn just two processes), but Python is rejecting it:
def run_procs_with_pool(lock):
    shared_number = Value('i', 0)
    print(__name__, 'shared_value in the beginning', shared_number.value)

    # create processes using multiprocessing.Pool
    pool = Pool()
    pool.map(add_100_with_lock, [(shared_number, lock)] * 2)

    print('shared_value at the end', shared_number.value)
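As far as I can tell, part of the problem is that pool.map hands each element of the list to the worker as a single argument, so the tuple never gets unpacked into the function's two parameters. A quick standalone check (with a throwaway function `show` I made up just for this) seems to confirm that:

from multiprocessing import Pool

def show(a, b=None):
    # just echoes back what the worker actually received
    return (a, b)

if __name__ == '__main__':
    with Pool(2) as pool:
        # each (1, 2) tuple arrives as the single argument `a`; `b` stays None
        print(pool.map(show, [(1, 2)] * 2))   # [((1, 2), None), ((1, 2), None)]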
Thanks in advance for any helpful input.
Update:
Someone suggested that I use starmap instead of map, but then I get the error `RuntimeError: Synchronized objects should only be shared between processes through inheritance`. It looks like multiprocessing.Pool does not allow shared values to be passed in this way?
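For reference, the starmap attempt looked roughly like this (everything else in run_procs_with_pool unchanged, only the map line replaced):

    # inside run_procs_with_pool, in place of the pool.map call above
    pool.starmap(add_100_with_lock, [(shared_number, lock)] * 2)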
Thought I'd share the task function `add_100_with_lock`, shown below:
import time

def add_100_with_lock(num, lock):
    for _ in range(100):
        time.sleep(0.001)
        with lock:
            num.value += 1
Is there a way to make passing shared values to multiprocessing.Pool work?