I have written the same simple arithmetic loop in both C and Java, but the C version takes nearly 23.4 s to finish while the Java version takes around 4 s. The question is not about how I measure the time (the measurement is already in the code below); it is about the execution itself. The C code is as follows:
#include <stdio.h>
#include <time.h>

int main(void) {
    clock_t begin = clock();

    long i, temp = 0;
    for (i = 0; i < 10000000000L; i++)
        temp = i * 5;
    printf("temp : %ld\n", temp);

    clock_t end = clock();
    printf("time : %lf\n", (double) (end - begin) / CLOCKS_PER_SEC);
    return 0;
}
The output for C is:
temp : 49999999995
time : 23.477688
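In case the measurement itself is suspected: clock() reports CPU time rather than elapsed wall-clock time. Below is a minimal wall-clock variant of the same loop using clock_gettime (a sketch that assumes a POSIX system where CLOCK_MONOTONIC is available; I include it only to show that the comparison does not hinge on clock()):

#define _POSIX_C_SOURCE 199309L  /* expose clock_gettime/CLOCK_MONOTONIC */
#include <stdio.h>
#include <time.h>

int main(void) {
    struct timespec begin, end;
    clock_gettime(CLOCK_MONOTONIC, &begin);  /* wall-clock, monotonic */

    long i, temp = 0;
    for (i = 0; i < 10000000000L; i++)
        temp = i * 5;
    printf("temp : %ld\n", temp);

    clock_gettime(CLOCK_MONOTONIC, &end);
    double elapsed = (end.tv_sec - begin.tv_sec)
                   + (end.tv_nsec - begin.tv_nsec) / 1e9;
    printf("time : %lf\n", elapsed);
    return 0;
}

For a single-threaded busy loop like this one, CPU time and wall time should agree closely, so I would not expect this variant to change the numbers meaningfully.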
The Java code is as follows:
public class Test {
    public static void main(String[] args) {
        long startTime = System.currentTimeMillis();
        long num = 5, temp = 0;
        for(long i = 0; i < 10000000000L; i++)
            temp = num * i;
        System.out.println(temp);
        long endTime   = System.currentTimeMillis();
        long totalTime = endTime - startTime;
        System.out.println("Execution time : " + totalTime);
    }
}
The output for Java is:
49999999995
Execution time : 4194
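Similarly on the Java side, here is the same measurement sketched with System.nanoTime(), which is intended for measuring elapsed intervals (System.currentTimeMillis() can jump if the system clock is adjusted; the class name TestNano is arbitrary):

public class TestNano {
    public static void main(String[] args) {
        long startTime = System.nanoTime();  // monotonic, meant for interval timing

        long num = 5, temp = 0;
        for (long i = 0; i < 10000000000L; i++)
            temp = num * i;
        System.out.println(temp);

        long endTime = System.nanoTime();
        // convert nanoseconds to milliseconds for comparison with the original output
        System.out.println("Execution time : " + (endTime - startTime) / 1_000_000 + " ms");
    }
}

Either way, I would not expect the choice of timing API to account for a roughly 6x gap between the two programs.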
Am I missing something, or is Java really that much more efficient than C at executing this loop?