I tried the following code:
public class Test {
    public static void main(String[] args) {
        int x = 9, y = 9, z = 0;
        long startTime = System.currentTimeMillis();
        // System.out.println("loop one start time = " + startTime);
        for (int i = 0; i < 10000; i++) {
            for (int j = 0; j < 10000; j++) {
                z = x + y;
            }
        }
        System.out.println("loop one use time = " + (System.currentTimeMillis() - startTime) + ",z = " + z);
        startTime = System.currentTimeMillis();
        // System.out.println("loop two start time = " + startTime);
        for (int i = 0; i < 10000; i++) {
            for (int j = 0; j < 10000; j++) {
                z = sum(x, y);
            }
        }
        System.out.println("loop two use time = " + (System.currentTimeMillis() - startTime) + ",z = " + z);
    }
    public static int sum(int x, int y) {
        int t;
        t = x + y;
        return t;
    }
}
The output to the console is:
loop one use time = 216,z = 18
loop two use time = 70,z = 18
It seems that the second loop took less time than the first one! I don't understand why this happens. Thanks for your help.
Update: I swapped the order of the two loops, and now loop one takes less time!
loop two use time = 219,z = 18
loop one use time = 69,z = 18
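I'm not sure whether it's the run order itself that matters (maybe some kind of warmup effect?), but here is a sketch of what I mean by a fairer comparison: run both loops once without timing them, then measure each one. The class and method names (TestWarmup, runInline, runCall) are just placeholders I made up for this sketch, not anything from my real code. Would this be a more reliable way to measure?

public class TestWarmup {
    public static void main(String[] args) {
        int x = 9, y = 9, z = 0;

        // Untimed warmup: run both variants once before any measurement,
        // in case the first timed loop is being penalized somehow.
        z = runInline(x, y);
        z = runCall(x, y);

        long start = System.currentTimeMillis();
        z = runInline(x, y);
        System.out.println("inline loop time = " + (System.currentTimeMillis() - start) + ", z = " + z);

        start = System.currentTimeMillis();
        z = runCall(x, y);
        System.out.println("method-call loop time = " + (System.currentTimeMillis() - start) + ", z = " + z);
    }

    // Same nested loops as above, with the addition written inline.
    static int runInline(int x, int y) {
        int z = 0;
        for (int i = 0; i < 10000; i++) {
            for (int j = 0; j < 10000; j++) {
                z = x + y;
            }
        }
        return z;
    }

    // Same nested loops, but the addition goes through a method call.
    static int runCall(int x, int y) {
        int z = 0;
        for (int i = 0; i < 10000; i++) {
            for (int j = 0; j < 10000; j++) {
                z = sum(x, y);
            }
        }
        return z;
    }

    static int sum(int x, int y) {
        return x + y;
    }
}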