Occasionally, when connecting to my Oracle database through ROracle and dbplyr, I run a dplyr::collect operation that fetches more data than expected, and more than R can handle.
This can make R crash, and it is often a sign that I should have filtered or aggregated the data further before fetching it.
It would be great to be able to check the size of the result before choosing whether to fetch it (without running the query twice).
Let's call collect2 the variant of collect that would allow this.
Expected behavior:

small_t <- con %>% tbl("small_table") %>%
  filter_group_etc %>%
  collect2(n_max = 5e6) # works fine

big_t <- con %>% tbl("big_table") %>%
  filter_group_etc %>%
  collect2(n_max = 5e6) # Error: query returned 15,486,245 rows, n_max set to 5,000,000
Would this be possible?
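To make the idea concrete, here is a rough, untested sketch of how collect2 might look, built on dplyr::collect(), which accepts an n argument for remote (dbplyr) tables. It fetches one row past the limit and fails if the limit is exceeded, so it can only report that the result is too big, not the exact row count:

library(dplyr)

collect2 <- function(lazy_tbl, n_max = 5e6) {
  # collect() on a remote table takes n, so pull at most n_max + 1 rows
  # (dbplyr may warn that the result was truncated before we stop)
  out <- collect(lazy_tbl, n = n_max + 1)
  if (nrow(out) > n_max) {
    stop("query returned more than ", n_max, " rows, n_max set to ", n_max)
  }
  out
}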
I'm also open to a solution using ROracle / DBI without dplyr, e.g.:
dbGetQuery2(con, my_big_sql_query, n_max = 5e6) # Error: query returned 15,486,245 rows, n_max set to 5,000,000
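Something along these lines is roughly what I picture for dbGetQuery2, sketched with plain DBI calls (dbSendQuery(), dbFetch(), dbClearResult()). Again, this is untested with ROracle, and like the sketch above it only detects that the limit was exceeded rather than reporting the true row count:

library(DBI)

dbGetQuery2 <- function(con, query, n_max = 5e6) {
  res <- dbSendQuery(con, query)
  on.exit(dbClearResult(res), add = TRUE)
  # fetch at most n_max + 1 rows so overflow can be detected without
  # pulling the full result set into memory
  out <- dbFetch(res, n = n_max + 1)
  if (nrow(out) > n_max) {
    stop("query returned more than ", n_max, " rows, n_max set to ", n_max)
  }
  out
}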
EDIT:
See below for a partial solution, posted as an answer; it is not optimal because some time is still wasted fetching data I have no use for.