I am running 1000 simulations in a loop. All simulations should take a similar amount of computational time because they all run the same procedure, yet the measured times are { 6.496, 7.680, 9.464, 10.976, ..., 141.460, 145.276, 143.148}. They increase badly as the loop progresses.
My guess is that this has something to do with running out of temporary memory, but I know very little about computer science. I suspect I just need to add an extra step within the loop that deletes the garbage using up the memory (a kind of reset that keeps the previous calculations), and that this would get rid of the unnecessary waste of time.
I would appreciate a solution for this, but also a short explanation of why it happens in case there is no fix available in R.
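To make my guess concrete, what I have in mind is something like the sketch below, where the temporary objects are explicitly removed and the garbage collector is called at the end of each iteration. The object name big_temp and the function heavy_step are only placeholders, not part of my actual code, and I do not know whether this is the right approach:

 for (i in 1:n_it){
   big_temp <- heavy_step()   # placeholder for whatever intermediate object grows
   # ... keep only the small result I actually need ...
   rm(big_temp)               # drop the temporary object
   gc()                       # ask R to reclaim the freed memory
 }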
The code I am using is
 ptm <- proc.time()
 init_pars <- c(0.8, 0.0175, 0.1)
 pars <- init_pars
 n_it <- 50
 M <- matrix(nrow = n_it, ncol = 3)   # one row of parameter estimates per iteration
 for (i in 1:n_it){
   print(c(pars[1], pars[2], pars[3]))   # current parameter values
   n_it <- 10   # number of simulated trees per iteration (1:n_it of the outer loop was already evaluated)
   S <- list()
   for (j in 1:n_it){
     rec_tree <- reconst_tree(bt = s2$t, pars = pars, tt = 15)   # one independent simulation
     S[[j]] <- rec_tree
   }
   pars <- mle_dd_setoftrees(S)   # estimate parameters from the set of simulated trees
   pars <- c(pars$lambda, pars$beta, pars$mu)
   M[i, ] <- c(pars[1], pars[2], pars[3])
   print(proc.time() - ptm)   # elapsed time for this iteration
   ptm <- proc.time()
 }
The function reconst_tree creates independent simulations and mle_dd_setoftrees calculates estimates from a set of simulations; I then store the estimates in the matrix M.
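In case it helps with diagnosis, I could also log memory use at the end of each iteration to see whether it really keeps growing. This is only a sketch of what I would try (using the totals reported by gc()), not something I have run yet:

 mem_log <- numeric(n_it)
 for (i in 1:n_it){
   # ... the simulation and estimation steps shown above ...
   mem_log[i] <- sum(gc()[, 2])   # total Mb in use after garbage collection
   print(mem_log[i])
 }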