I'm reading in a number of large (compressed) CSV files using read_csv. The files are successfully being pulled into R, but they leave behind very large temporary files in my %tmp% directory. Is there a way to clean these up automatically from within R?
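A minimal sketch of what I've tried so far, assuming the leftovers land in R's per-session temp directory (tempdir()), which is where read_csv decompresses .gz/.zip inputs; the filename is hypothetical:

```r
library(readr)

df <- read_csv("big_file.csv.gz")  # decompressed copy lands under tempdir()

# tempdir() is unique to this R session, so deleting its contents
# shouldn't touch other users' files in the shared %tmp%
leftovers <- list.files(tempdir(), full.names = TRUE)
unlink(leftovers, recursive = TRUE, force = TRUE)
```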
I've run into this issue when asking R to run multiple imports, each of which uses lapply inside a function to read and filter several files. I wonder whether I've prevented R from destroying the objects correctly.
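For reference, the pattern looks roughly like this (a sketch, assuming readr 2.x; the filtering step and the `value` column are stand-ins for my real logic). Passing lazy = FALSE and forcing a gc() before deleting seems relevant, since readr's lazy reading can keep the temp copy open until the tibble is garbage collected:

```r
import_and_filter <- function(paths) {
  out <- lapply(paths, function(p) {
    df <- readr::read_csv(p, lazy = FALSE)  # lazy = FALSE so readr doesn't hold the temp copy open
    df[df$value > 0, ]                      # stand-in for the real filtering step
  })
  gc()                                      # release any file handles readr still holds
  unlink(list.files(tempdir(), full.names = TRUE),
         recursive = TRUE, force = TRUE)    # purge this session's temp copies only
  out
}
```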
@startz That might work if I set a different temp dir for R to use; I don't want to purge all of %tmp% and screw things up for other users.
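Something like this is what I have in mind (a sketch; the TMPDIR path is hypothetical, and TMPDIR has to be set before R starts, e.g. in ~/.Renviron, since tempdir() is fixed for the session):

```r
# In ~/.Renviron (read at R startup), point the session temp dir somewhere private:
#   TMPDIR=C:/Users/me/r-tmp        # hypothetical path
tempdir()  # should now resolve inside the private directory

# ...run the imports...

# Purge only R's own session temp dir; the rest of %tmp% is untouched
unlink(tempdir(), recursive = TRUE)
tempdir(check = TRUE)  # recreate it if the session keeps running (R >= 3.5.0)
```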