import java.util.HashMap;
import java.util.Map;
import java.util.concurrent.*;
public class TestLock {
    private static ExecutorService executor = Executors.newCachedThreadPool();
    // Initial capacity of 1,000,000 means no resize until ~750,000 entries
    // at the default load factor of 0.75
    private static Map<Integer, Integer> map = new HashMap<>(1000000);
    private static CountDownLatch doneSignal = new CountDownLatch(1000);
    public static void main(String[] args) throws Exception {
        for (int i = 0; i < 1000; i++) {
            final int j = i;
            executor.execute(new Runnable() {
                @Override
                public void run() {
                    map.put(j, j);
                    doneSignal.countDown();
                }
            });
        }
        doneSignal.await();
        executor.shutdown(); // let the JVM exit instead of waiting for idle pool threads to time out
        System.out.println("done,size:" + map.size());
    }
}
People say that concurrent insertion into a HashMap is unsafe because of the resize (rehash) operation. But I set the initial capacity to 1,000,000 here, so the table won't resize until roughly 750,000 entries (at the default load factor of 0.75). I only insert 1,000 entries, so no resize ever happens and there should be no problem. Yet the printed size is almost always less than 1000. What went wrong?
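Resizing is only one of HashMap's concurrency hazards. Even with no resize, `put` itself is not atomic: two threads that hash to the same bucket can both read the same head node, and whichever writes last silently overwrites the other thread's link, losing an entry. That is consistent with a size below 1000. A minimal sketch of the usual fix, swapping in `ConcurrentHashMap` (the class name `TestConcurrentMap` and the helper `concurrentInsert` are my own, for illustration):

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class TestConcurrentMap {
    // Inserts n distinct keys from n concurrent tasks and returns the final size.
    static int concurrentInsert(int n) throws InterruptedException {
        // ConcurrentHashMap makes each put atomic (per-bin locking in Java 8+),
        // so no insert can be lost to a racing thread.
        Map<Integer, Integer> map = new ConcurrentHashMap<>();
        ExecutorService executor = Executors.newCachedThreadPool();
        CountDownLatch doneSignal = new CountDownLatch(n);
        for (int i = 0; i < n; i++) {
            final int j = i;
            executor.execute(() -> {
                map.put(j, j);
                doneSignal.countDown();
            });
        }
        doneSignal.await();
        executor.shutdown();
        return map.size();
    }

    public static void main(String[] args) throws InterruptedException {
        // All 1000 inserts survive, unlike the plain HashMap version.
        System.out.println("done,size:" + concurrentInsert(1000));
    }
}
```

The same reasoning explains why pre-sizing the HashMap cannot help: it removes the resize race but not the per-bucket write race.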