I am trying to use a Python multiprocessing pool to pull values from a shared queue and consume them in worker processes.
import multiprocessing

def myTask(queue):
    # Keep pulling values until the queue is drained
    while not queue.empty():
        value = queue.get()
        print("Process {} Popped {} from the shared Queue".format(multiprocessing.current_process().pid, value))
        queue.task_done()
def main():
    m = multiprocessing.Manager()
    sharedQueue = m.Queue()
    sharedQueue.put(2)
    sharedQueue.put(3)
    sharedQueue.put(4)
    sharedQueue.put(5)
    sharedQueue.put(6)

    pool = multiprocessing.Pool(processes=3)
    # A single apply_async call submits just one task to the pool
    pool.apply_async(myTask, args=(sharedQueue,))
    pool.close()
    pool.join()

if __name__ == '__main__':
    main()
From the output, I can see that only a single process was started and it took all the values out of the queue. How can I spawn multiple processes in parallel that keep taking values from the queue? I do have a maximum limit on the number of processes, which may be greater than the queue size. Any guidance would be appreciated.
PS: This is just an example. The actual task migrates data from one form to another and does some heavy-lifting operations on the data.
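To make the intent concrete, here is roughly what I am aiming for: one long-running task per worker, with a sentinel value (None) per worker to tell it when to stop. This is only an untested sketch of the idea; the sentinel handling and the num_workers variable are my own additions, not code I have actually run.

import multiprocessing

def myTask(queue):
    # Each worker keeps pulling values until it sees a sentinel (None)
    while True:
        value = queue.get()
        if value is None:
            break
        print("Process {} Popped {} from the shared Queue".format(multiprocessing.current_process().pid, value))

def main():
    num_workers = 3
    m = multiprocessing.Manager()
    sharedQueue = m.Queue()
    for value in (2, 3, 4, 5, 6):
        sharedQueue.put(value)
    # One sentinel per worker so every submitted task eventually returns
    for _ in range(num_workers):
        sharedQueue.put(None)

    pool = multiprocessing.Pool(processes=num_workers)
    # Submit exactly one long-running task per worker process
    for _ in range(num_workers):
        pool.apply_async(myTask, args=(sharedQueue,))
    pool.close()
    pool.join()

if __name__ == '__main__':
    main()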
Update: I made the following modifications and they seem to work, except that pool.join() blocks the main process from exiting even after all the child processes have finished.
pool = multiprocessing.Pool(processes=4)
while not sharedQueue.empty():
    pool.apply_async(myTask, args=(sharedQueue,))
pool.close()
# pool.join()  # commented out because it never returns
def myTask(queue):
    value = queue.get()
    print("Process {} Popped {} from the shared Queue".format(multiprocessing.current_process().pid, value))
    queue.task_done()
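My guess is that the while-not-empty loop submits more tasks than there are items, and the extra tasks then block forever on queue.get(), which would keep pool.join() from returning. A variant I am considering (again only a sketch I have not run; the items list is just the example data from above) submits exactly one task per queued item instead:

import multiprocessing

def myTask(queue):
    value = queue.get()
    print("Process {} Popped {} from the shared Queue".format(multiprocessing.current_process().pid, value))

def main():
    m = multiprocessing.Manager()
    sharedQueue = m.Queue()
    items = [2, 3, 4, 5, 6]
    for value in items:
        sharedQueue.put(value)

    pool = multiprocessing.Pool(processes=4)
    # One task per queued item: every task gets exactly one value,
    # so none of them block on an empty queue and join() can return
    for _ in range(len(items)):
        pool.apply_async(myTask, args=(sharedQueue,))
    pool.close()
    pool.join()

if __name__ == '__main__':
    main()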