In the code below I'm merging all CSV files whose names start with a certain date, stored in the variable `file_date`. The code works perfectly for small and moderately sized CSV files but crashes on very large ones.
    import glob
    import pandas as pd

    # match every CSV for the given date, e.g. <file_date>*-details.csv*
    path = '/Users/Documents/' + file_date + '*-details.csv*'
    allFiles = glob.glob(path)

    list_ = []
    for file_ in allFiles:
        frame = pd.read_csv(file_, index_col=None, header=0)
        print(frame.shape)
        list_.append(frame)
        df = pd.concat(list_)  # re-concatenates the whole list on every pass
        print(df.shape)
    df.to_csv('/Users/Documents/' + file_date + '-details.csv', sep=',', index=False)
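One improvement I can already see is to call `concat` once after the loop instead of on every iteration, which avoids repeatedly re-copying the growing list, but it still holds every file in memory at the same time:

    import glob
    import pandas as pd

    path = '/Users/Documents/' + file_date + '*-details.csv*'
    # read every matching file, then concatenate a single time
    frames = [pd.read_csv(f, index_col=None, header=0) for f in glob.glob(path)]
    df = pd.concat(frames, ignore_index=True)
    df.to_csv('/Users/Documents/' + file_date + '-details.csv', sep=',', index=False)

So that alone won't stop the crash on very large files.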
Can I process each file in chunks? If so, how do I do that?
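Something like the sketch below is what I have in mind: `read_csv` with a `chunksize` returns an iterator of DataFrames, so each chunk could be appended straight to the output file without ever holding a whole file in memory. The chunk size of 100000 rows is just a guess, and I'm assuming the merged output file doesn't itself match the glob pattern when the script starts:

    import glob
    import pandas as pd

    out_path = '/Users/Documents/' + file_date + '-details.csv'
    files = glob.glob('/Users/Documents/' + file_date + '*-details.csv*')

    first = True
    for file_ in files:
        # chunksize makes read_csv yield DataFrames of at most 100000 rows
        for chunk in pd.read_csv(file_, index_col=None, header=0, chunksize=100000):
            # write the header only once, then append every later chunk
            chunk.to_csv(out_path, sep=',', index=False,
                         header=first, mode='w' if first else 'a')
            first = False

Is this a reasonable way to do it, or is there a better approach?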