I keep getting `GC overhead limit exceeded` errors at seemingly random points in a for loop that iterates over 200K+ records queried from Postgres. How do I prevent this error?
Here's the loop where the OutOfMemoryError occurs while going through the 200K+ records:
    final List<CustomerCSV> customerCSVs = new ArrayList<>();
    for (Record record : results) {
        final Iterable<String> tags = Iterables.transform(
                record.get(CUSTOMER.TAGS),
                part -> StringUtils.chop(part.toString().substring(1)));
        customerCSVs.add(new CustomerCSV(
                record.get("name"),
                record.get("email"),
                Joiner.on(",").join(tags)));
    }
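For reference, here is a minimal sketch of the alternative I'm considering: writing each CSV row out as soon as it is built instead of accumulating all 200K+ `CustomerCSV` objects in a list. The jOOQ record source is replaced by a plain `Iterable<String[]>` (one `{name, email, joinedTags}` array per row) so the snippet stands alone; `StreamingCsvWriter` is a name I made up for illustration.

```java
import java.io.IOException;
import java.io.StringWriter;
import java.io.Writer;
import java.util.List;

public class StreamingCsvWriter {
    // Streams rows to the writer one at a time, so only the current row
    // is live on the heap instead of the whole 200K+ element list.
    // Each row is {name, email, joinedTags}.
    public static void writeRows(Iterable<String[]> rows, Writer out) throws IOException {
        for (String[] row : rows) {
            out.write(String.join(",", row));
            out.write('\n');
        }
    }

    public static void main(String[] args) throws IOException {
        StringWriter out = new StringWriter();
        writeRows(List.of(new String[] {"Alice", "alice@example.com", "vip"}), out);
        System.out.print(out);
    }
}
```

If I understand jOOQ correctly, combining this with `fetchLazy()` (which returns a `Cursor` instead of materializing the full result set) should keep both the query results and the output rows off the heap, but I'm not sure that alone solves the GC overhead problem.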