I'm facing a dead kernel problem.
I'm trying to read over 2000 .txt files into a list of lists.
my_path contains the paths to these 2000+ files. I tried a try/except as shown below, but it didn't help.
The kernel seems to die at random: I tried to pin down the files where it breaks, but it breaks on files that weren't a problem during a previous run.
my_list = []
for path in my_path:
    try:
        with open(path) as f:       # the with-block closes the file automatically, no explicit f.close() needed
            lines = f.read().splitlines()
            my_list.append(lines)
    except Exception:
        print(path)                 # report the file that failed to open/read
I also tried opening the files where the kernel died on their own (roughly as sketched below), and they read fine. I assume something is wrong with my loop?
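The spot check looked roughly like this (problem_path is just a placeholder for one of the paths where the kernel died):

problem_path = my_path[0]   # placeholder: substitute a path where the kernel died
with open(problem_path) as f:
    lines = f.read().splitlines()
print(len(lines))           # the file reads fine on its own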
UPD: I'm using EndeavourOS, Jupyter in VS Code, 16 GB of RAM.
I split the paths into chunks, and it looks like I'm running out of memory. I tried del ... and gc.collect(), but without success: the memory isn't freed, and once usage goes over 12 GB the kernel crashes.
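The chunked attempt looked roughly like this (the chunk size of 500 is only an example, not the exact value I used):

import gc

my_list = []
chunk_size = 500                                   # example value
for start in range(0, len(my_path), chunk_size):
    chunk = my_path[start:start + chunk_size]
    for path in chunk:
        with open(path) as f:
            my_list.append(f.read().splitlines())
    del chunk                                      # attempt to release memory between chunks
    gc.collect()                                   # doesn't actually bring usage down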
