I have a program where I test different data sets and configurations, and a script that runs all of those combinations for me.
Imagine my code as:
double start = omp_get_wtime();
function();
double end = omp_get_wtime();
printf("%f\n", end - start);
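(For context, here is a self-contained sketch of what the timed program looks like, compiled with gcc -fopenmp. The workload function and the way the three options are read are simplified placeholders, not my real code:)

#include <stdio.h>
#include <omp.h>

/* placeholder for the real workload; the real signature is different */
static void function(const char *a, const char *b, const char *c)
{
    (void)a; (void)b; (void)c;
    /* ... actual computation ... */
}

int main(int argc, char **argv)
{
    if (argc < 4)
        return 1;

    double start = omp_get_wtime();
    function(argv[1], argv[2], argv[3]);   /* a, b, c come from the script */
    double end = omp_get_wtime();

    printf("%f\n", end - start);           /* the value that ends up in logs.out */
    return 0;
}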
and the bash script as:
for a in "${first_option[@]}"
do 
  for b in "${second_option[@]}"
  do 
    for c in "${third_option[@]}"
    do
       printf '%s %s %s\n' "$a" "$b" "$c"
       ./exe "$a" "$b" "$c" >> logs.out
    done 
  done
done 
Now, when I execute the exact same configurations by hand, I get varying results ranging from 10 seconds down to 0.05 seconds. When I run them through the script, the slow configurations give the same results, but for some reason I never get any timing below about 1 second: all the configurations that take less than a second when run manually get written to the file as 1.001, 1.102, 0.999, etc.
Any ideas of what is going wrong?
Thanks