I have a large dataset on a remote DB that is accessible only over a very slow network link. I regularly query the whole dataset overnight and then work from a local copy. Up to now I have been using a simple script that queries the data with DBI::dbGetQuery(...) and then saves the resulting data.frame to disk. However, the dataset has grown to the point where I am running out of memory to hold the data.frame. I would instead like to "pipe" the queried data straight into an SQLite database. Is there a way to do this using DBI functions alone?
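For reference, my current approach looks roughly like the sketch below (driver, connection details, table and file names are placeholders, not my actual setup); the problem is that dbGetQuery materialises the entire result set in memory at once:

```r
library(DBI)

# Placeholder connection; substitute your actual driver and credentials
con <- dbConnect(RPostgres::Postgres(), host = "remote-host", dbname = "mydb")

# Pulls the whole result set into one data.frame, then saves it to disk
dat <- dbGetQuery(con, "SELECT * FROM big_table")
saveRDS(dat, "local_copy.rds")

dbDisconnect(con)
```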
Update: there is a useful example here showing how to use DBI::dbSendQuery and then retrieve the result in small chunks with repeated calls to DBI::dbFetch. I have not tried this yet, but it looks like the solution.
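If I understand that example correctly, the idea would be something like the following minimal sketch (untested; the driver, connection details, table name, and chunk size are assumptions for illustration). Only one chunk is held in memory at a time, and each chunk is appended to the local SQLite table as it arrives:

```r
library(DBI)
library(RSQLite)

# Placeholder remote connection; substitute your actual driver and credentials
remote <- dbConnect(RPostgres::Postgres(), host = "remote-host", dbname = "mydb")

# Local SQLite file that will receive the data
local <- dbConnect(RSQLite::SQLite(), "local_copy.sqlite")

# Send the query without fetching any rows yet
res <- dbSendQuery(remote, "SELECT * FROM big_table")

first <- TRUE
while (!dbHasCompleted(res)) {
  chunk <- dbFetch(res, n = 10000)  # chunk size chosen arbitrarily
  # First chunk creates (or replaces) the local table; later chunks append to it
  dbWriteTable(local, "big_table", chunk, overwrite = first, append = !first)
  first <- FALSE
}

dbClearResult(res)
dbDisconnect(remote)
dbDisconnect(local)
```

The chunk size would presumably need tuning against the slow link and available memory, but the peak memory use should be bounded by one chunk rather than the whole dataset.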