I want to read a huge CSV file containing around 500,000 rows. I am using the OpenCSV library for it. My code looks like this:
    // CsvToBean.parse() reads the whole file and builds every row as a
    // User object in memory at once
    CsvToBean<User> csvConvertor = new CsvToBean<User>();
    List<User> list = null;
    try {
        list = csvConvertor.parse(strategy, new BufferedReader(new FileReader(filepath)));
    } catch (FileNotFoundException e) {
        e.printStackTrace();
    }
Up to about 200,000 records, the data is read into the list of User bean objects without a problem. But for anything more than that I get:

    java.lang.OutOfMemoryError: Java heap space
I have these memory settings in the "eclipse.ini" file:

    -Xms256m
    -Xmx1024m
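As an aside, I am not sure these settings even apply here: as far as I know, eclipse.ini only sizes the heap of the Eclipse IDE itself, while a program launched from Eclipse runs in its own JVM whose heap is set through the Run Configuration's VM arguments, for example:

    -Xmx1024m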
One solution I am considering is splitting the huge file into several smaller files and reading those one by one, which I think is a lengthy solution.
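To clarify what I mean by splitting, here is a rough sketch (the chunk size and the ".partN" file naming are arbitrary choices of mine, and it assumes no quoted field contains an embedded newline):

    import java.io.*;

    public class CsvSplitter {
        // Split the big CSV into chunks of rowsPerChunk data rows, repeating
        // the header line in each chunk so every part can be parsed on its own.
        static void split(String filepath, int rowsPerChunk) throws IOException {
            BufferedReader reader = new BufferedReader(new FileReader(filepath));
            String header = reader.readLine();      // keep the header for every chunk
            PrintWriter out = null;
            String line;
            int row = 0, chunk = 0;
            while ((line = reader.readLine()) != null) {
                if (row % rowsPerChunk == 0) {      // start a new part file
                    if (out != null) out.close();
                    out = new PrintWriter(new FileWriter(filepath + ".part" + chunk++));
                    out.println(header);
                }
                out.println(line);
                row++;
            }
            if (out != null) out.close();
            reader.close();
        }
    }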
Is there any other way by which I can avoid the OutOfMemoryError?