In my Java app I have a loop that adds object arrays into a list:
List<Object[]> list = new ArrayList<Object[]>();
I noticed that NOT giving an initial capacity was actually faster than giving the capacity: in my real app it's a 23 second vs. 15 second difference.
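Concretely, the only change between the slow run and the fast run is the constructor argument (the two lines below are alternatives, and expectedRowCount is just an illustrative name for the known row count):

List<Object[]> list = new ArrayList<Object[]>(expectedRowCount); // ~23 s total in my app
List<Object[]> list = new ArrayList<Object[]>();                 // ~15 s total in my app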
I've created this dummy JUnit test to mimic that, and it gives a 6 second vs. 6.7 second difference, again favoring NO initial capacity:
import java.util.ArrayList;
import java.util.List;
import java.util.function.BiFunction;

import org.junit.Test;

@Test
public void asdf() {
    int rowSize = 20;
    int rowCount = 10000000;
    // DATA PREP: a rowCount x rowSize matrix of random digits
    int[][] matrix = new int[rowCount][];
    for (int i = 0; i < matrix.length; i++) {
        matrix[i] = new int[rowSize];
        for (int j = 0; j < rowSize; j++) {
            matrix[i][j] = (int) (Math.random() * 10);
        }
    }
    // accessor that boxes each int cell (stands in for my resultset reader)
    BiFunction<Integer, Integer, Object> getValue = (row, col) -> matrix[row][col];
    // END DATA PREP
    long start = System.currentTimeMillis();
    // THIS IS THE DIFFERENCE!!!!
    //List<Object[]> lst = new ArrayList<>(rowCount);
    List<Object[]> lst = new ArrayList<>();
    for (int i = 0; i < rowCount; i++) {
        Object[] row = new Object[rowSize];
        for (int j = 0; j < rowSize; j++) {
            row[j] = getValue.apply(i, j);
        }
        lst.add(row);
    }
    long totalTime = System.currentTimeMillis() - start;
    System.out.println(totalTime);
}
How does this make any sense?
I know this code is generally crap; it's just trying to mimic my real case, reading rows off a resultset (not the JDBC one).
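In case my naive System.currentTimeMillis() timing is part of the problem, this is roughly how I'd set up the same comparison with JMH. It's only a sketch and I haven't run it; it assumes the org.openjdk.jmh dependency is on the classpath, it strips the loop down to just allocating and adding rows (which is what the capacity question is about), and the class and method names (CapacityBench, withCapacity, noCapacity, fill) are made up for illustration:

import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.BenchmarkMode;
import org.openjdk.jmh.annotations.Fork;
import org.openjdk.jmh.annotations.Mode;
import org.openjdk.jmh.annotations.OutputTimeUnit;

// Sketch only: names are made up, sizes match the test above
@BenchmarkMode(Mode.AverageTime)
@OutputTimeUnit(TimeUnit.MILLISECONDS)
@Fork(1)
public class CapacityBench {

    private static final int ROW_SIZE = 20;
    private static final int ROW_COUNT = 10000000;

    @Benchmark
    public List<Object[]> withCapacity() {
        return fill(new ArrayList<>(ROW_COUNT));
    }

    @Benchmark
    public List<Object[]> noCapacity() {
        return fill(new ArrayList<>());
    }

    // returning the list keeps JMH from dead-code eliminating the loop
    private static List<Object[]> fill(List<Object[]> lst) {
        for (int i = 0; i < ROW_COUNT; i++) {
            lst.add(new Object[ROW_SIZE]);
        }
        return lst;
    }
}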
Thanks!