I have a directory with around a million images. I want to create a batch generator to train my CNN, since I cannot hold all of those images in memory at once.
So I wrote a generator function to do this:
import os

import cv2
import numpy as np

def batch_generator(image_paths, batch_size, isTraining):
    while True:
        batch_imgs = []
        batch_labels = []

        type_dir = 'train' if isTraining else 'test'

        for i in range(len(image_paths)):
            print(i)
            print(os.path.join(data_dir_base, type_dir, image_paths[i]))
            # read as grayscale and normalise to [0, 1]
            img = cv2.imread(os.path.join(data_dir_base, type_dir, image_paths[i]), 0)
            img = np.divide(img, 255)
            img = img.reshape(28, 28, 1)
            batch_imgs.append(img)
            # the label is encoded in the filename after the underscore
            label = image_paths[i].split('_')[1].split('.')[0]
            batch_labels.append(label)
            if len(batch_imgs) == batch_size:
                yield (np.asarray(batch_imgs), np.asarray(batch_labels))
                batch_imgs = []
                batch_labels = []
        # yield any leftover partial batch in the same (images, labels) format
        if batch_imgs:
            yield (np.asarray(batch_imgs), np.asarray(batch_labels))
When I call this statement:
index = next(batch_generator(train_dataset, 10, True))
it prints the same index values and paths each time, i.e. it returns the same first batch on every call to next().
How do I fix this?
I used this question as a reference for the code: how to split an iterable into constant-size chunks
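To double-check what I'm seeing, here is a minimal toy reproduction (the counting generator below is a hypothetical stand-in for my batch_generator, not my real code). Constructing a fresh generator inside every next() call seems to restart it from the beginning, whereas calling next() repeatedly on one generator object advances it:

```python
def counter(n):
    # toy stand-in for batch_generator: yields 0, 1, ..., n-1
    for i in range(n):
        yield i

# a brand-new generator on every call: each one starts over
first_calls = [next(counter(5)) for _ in range(3)]
print(first_calls)       # [0, 0, 0] - always the first item

# one generator object, advanced by repeated next() calls
gen = counter(5)
advancing_calls = [next(gen) for _ in range(3)]
print(advancing_calls)   # [0, 1, 2] - the same object advances
```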