There is an array of approaches to measuring function execution time in R (system.time(), microbenchmark(), proc.time(), profvis(), etc.), but they all involve wrapping a timing function around the individual call you want to measure, e.g.
system.time(a <- rnorm(10e7))  
#  user  system elapsed 
#   5.22    0.28    5.50 
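The same pattern applies to the others; for instance, a quick sketch with the microbenchmark package (assuming it is installed):
microbenchmark::microbenchmark(
  rnorm(1e5),  # the expression to time
  times = 10   # repeat the call 10 times and summarise the timings
)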
This is fine when you know in advance that you want to time something, but quite often I run a piece of code, realise it has been running for some time, and only then remember that I forgot to wrap it in a timing function.
Is there a way to time all calls automatically? A crude estimate is all I am looking for. I suspect this is challenging, so even printing the local time after every call would be useful (e.g. calling format(Sys.time(), "%H:%M:%S") after each call). Is there a way to set this up at the top level, perhaps via .Rprofile?
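For instance, I imagine something along these lines could go in .Rprofile: a rough sketch using base R's addTaskCallback(), which runs a function after every completed top-level task (the callback name "timestamp" is just my own label):
addTaskCallback(function(expr, value, ok, visible) {
  # print the wall-clock time after each completed top-level call
  cat("Finished at:", format(Sys.time(), "%H:%M:%S"), "\n")
  TRUE  # returning TRUE keeps the callback registered
}, name = "timestamp")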
A similar question has been asked here. Although the solution proposed by @hadley is useful, it requires all your calls to be in a script and source()d. The question and answer are quite old now, and I am wondering whether there are any newer, cleverer ways to do this. I am using RStudio, in case that is relevant.
Thanks
EDIT
@Allan Cameron makes a good point. I wouldn't want the start time printed for every function call, only at the top level. So if I ran the following chunk, I would want just one time printed (the current time when the overall chunk started), not a separate one for group_by(), summarise(), n(), etc.
library(dplyr)
starwars %>%
  group_by(species) %>%
  summarise(
    n = n(),
    mass = mean(mass, na.rm = TRUE)
  ) %>%
  filter(
    n > 1,
    mass > 50
  )
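Since addTaskCallback() fires once per completed top-level task, I imagine a crude sketch like the following would report a single elapsed time for the whole pipeline. Note that the gap between successive callbacks also includes any idle time at the prompt, so it is only the rough estimate I am after (the .timer environment and the callback name "elapsed" are my own inventions):
.timer <- new.env()
.timer$last <- Sys.time()
addTaskCallback(function(expr, value, ok, visible) {
  now <- Sys.time()
  # time since the previous top-level task finished (includes idle time)
  cat("Elapsed:",
      round(as.numeric(difftime(now, .timer$last, units = "secs")), 2),
      "secs\n")
  .timer$last <- now
  TRUE  # keep the callback active
}, name = "elapsed")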