I have a binary file that contains a variable number of images (each image is 1024×768). I put each image into a JoinableQueue and analyze it with multiprocessing, and it works perfectly with small files, but I get a MemoryError when I try to read huge files. Does anybody know how I can store big files in a buffer/queue (as strings)? (Unfortunately I can't use Manager or Pool.)
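A minimal sketch of the kind of setup described, assuming one byte per pixel; the file name, worker count, and analysis step are hypothetical placeholders:

import multiprocessing as mp

IMAGE_SIZE = 1024 * 768  # bytes per image, assuming one byte per pixel

def worker(queue):
    while True:
        image = queue.get()
        if image is None:      # sentinel: no more work
            queue.task_done()
            break
        # analyze(image) would go here (hypothetical)
        queue.task_done()

if __name__ == '__main__':
    queue = mp.JoinableQueue()
    workers = [mp.Process(target=worker, args=(queue,)) for _ in range(4)]
    for p in workers:
        p.start()
    with open('images.bin', 'rb') as fp:  # hypothetical file name
        while True:
            data = fp.read(IMAGE_SIZE)
            if not data:
                break
            queue.put(data)  # with a huge file, all of it piles up here
    for _ in workers:
        queue.put(None)
    queue.join()
    for p in workers:
        p.join()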
- Have you considered chunking the input with a generator, as in this [answer](https://stackoverflow.com/questions/519633/lazy-method-for-reading-big-file-in-python)? – Dodge Feb 08 '19 at 15:40
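The pattern from that linked answer is roughly the following sketch, with the chunk size set to one image's worth of bytes (the file name is a placeholder):

def read_in_chunks(file_object, chunk_size=1024 * 768):
    """Lazily yield fixed-size chunks from a file object."""
    while True:
        data = file_object.read(chunk_size)
        if not data:
            break
        yield data

with open('images.bin', 'rb') as fp:  # hypothetical file name
    for image in read_in_chunks(fp):
        pass  # enqueue or analyze one image at a time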
2 Answers
Did you have a look at the io.BytesIO module? You can find it here: https://docs.python.org/release/3.1.3/library/io.html#binary-i-o You can set your buffer size; that solved a memory problem for me once.
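A minimal sketch of buffered binary reading with an explicit buffer size, assuming the file is a raw concatenation of single-channel images (the file name is a placeholder):

import io

IMAGE_SIZE = 1024 * 768  # bytes per single-channel 1024x768 image

with open('images.bin', 'rb', buffering=0) as raw:  # unbuffered raw stream
    reader = io.BufferedReader(raw, buffer_size=IMAGE_SIZE)
    while True:
        image = reader.read(IMAGE_SIZE)
        if not image:  # end of file
            break
        # process `image` here, one image at a time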
 
    
    
– CLpragmatics
        
- You can read about buffers here.
- If your memory is small, you can try to force garbage collection, like this:
import gc

SIZE = 1024 * 768  # bytes per image, if your image is single channel
MEMOSIZE = 1024    # how many images your memory budget allows at once

with open('xxx', 'rb') as fp:  # open the file
    i = 0  # count images held since the last collection
    queue = []
    while True:
        x = fp.read(SIZE)
        if not x:  # end of file
            break
        queue.append(x)
        i += 1
        # do something with x here
        if i >= MEMOSIZE:  # memory budget reached: drop the batch
            queue = []     # release the references
            gc.collect()   # then force a collection
            i = 0
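Note that in CPython, rebinding queue to an empty list is what actually frees the big byte strings, since their reference counts drop to zero; gc.collect() mainly helps when reference cycles are involved, so the explicit call is cheap insurance rather than a requirement.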
 
    
    
– Rouzip
        
