I have code like this:
from multiprocessing import Pool

items = ...  # a huge nested list

def do_stuff(idx):
    for item in items[idx:idx + 20]:
        pass  # do stuff with item

pool = Pool(5)
pool.map(do_stuff, range(0, len(items), 20))
pool.close()
pool.join()
The issue is that the pool does not share items between workers: each worker process gets its own copy of the list, which hogs memory since the list is huge. Is there a way to implement this so that items is actually shared? I found some examples using a global variable that work with the basic threading library, but that approach does not seem to carry over to multiprocessing.
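For reference, this is roughly the kind of threading example I found (names and data are made up for illustration): all threads read the same items object because threads share the process's memory, which is exactly the behavior I want from the process pool.

```python
import threading

items = [[i, i + 1] for i in range(100)]  # stand-in for the huge nested list
processed = []
lock = threading.Lock()

def do_stuff(idx):
    # every thread sees the same shared `items` list, no copying
    chunk = items[idx:idx + 20]
    with lock:
        processed.append(len(chunk))

threads = [threading.Thread(target=do_stuff, args=(idx,))
           for idx in range(0, len(items), 20)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

With multiprocessing, the same global-variable pattern seems to trigger a per-process copy instead.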
Thanks!