I am trying to convert a very large list to a DataFrame.
The length of the list, `len(rows_list)`, is 15,347,782, which is pretty big.
This worked well with smaller lists:

```python
import pandas as pd

df = pd.DataFrame(rows_list)
```
But with a list of this size, it fails with a MemoryError.
Is there any way to build the DataFrame in chunks, the way the chunksize parameter works when writing a big DataFrame to CSV or reading a big file with read_csv?
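For what it's worth, this is roughly what I was imagining (just a sketch; `CHUNK_SIZE` and the slicing are my own illustration, not an existing pandas option, and I suspect the final `pd.concat` still needs enough memory to hold the whole frame):

```python
import pandas as pd

CHUNK_SIZE = 500_000  # illustrative chunk size, tuned to available memory

# Build one DataFrame per slice of the list, then concatenate once at the end.
chunks = (
    pd.DataFrame(rows_list[i:i + CHUNK_SIZE])
    for i in range(0, len(rows_list), CHUNK_SIZE)
)
df = pd.concat(chunks, ignore_index=True)
```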
Or is there some other smooth way to do this?
Thanks in advance!