I am working on a capstone for a certificate program and I can't even read two out of my 12 CSV files without crashing R because the RAM usage goes through the roof. Is there a way to get through this project without paying $100 to upgrade my RAM? I don't have the resources to make a purchase like that right now, but I also can't finish my course with R the way it is... any help is appreciated!
Do the CSVs have extraneous data that you could dispense with? I.e., you could stream them in, dropping what you don't need as you go, and perhaps what's left is small enough to fit in your memory.
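As a first cut, if only a handful of columns matter for your analysis, you can skip the rest at read time. A minimal sketch (the file name and column names here are made up, and `col_select` needs readr >= 2.0; on older readr, `col_types = cols_only(...)` does the same job):

```r
library(readr)

# Only parse the columns you actually use; everything else is
# skipped instead of being loaded into RAM.
# "202101-trips.csv" and these column names are placeholders.
trips <- read_csv(
  "202101-trips.csv",
  col_select = c(ride_id, started_at, ended_at, member_casual)
)
```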
readr has read_csv_chunked for that sort of thing.
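A rough sketch of how that looks: it reads the file 10,000 rows at a time, trims each chunk down to what you need, and row-binds the survivors. The file name, column names, and filter condition are all placeholders for your own:

```r
library(readr)
library(dplyr)

# Process the file in 10,000-row chunks so the full CSV is never
# in memory at once; each chunk is trimmed before being kept.
result <- read_csv_chunked(
  "202101-trips.csv",  # placeholder file name
  callback = DataFrameCallback$new(function(chunk, pos) {
    chunk %>%
      select(ride_id, started_at, ended_at) %>%  # drop unused columns
      filter(!is.na(started_at))                 # drop unusable rows
  }),
  chunk_size = 10000
)
```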
Also, disk.frame exists for doing dplyr work on bigger-than-memory data; it might get you part of the way to where you need to go. See the disk.frame README for details.
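Very roughly, the workflow is: convert each CSV into disk.frame's chunked on-disk format once, run your dplyr verbs against it chunk by chunk, and `collect()` only the (hopefully much smaller) result into RAM. Paths, column names, and the filter below are placeholders, so treat this as a sketch rather than a drop-in:

```r
library(disk.frame)
library(dplyr)

# One-time setup: spin up multiple workers for parallel chunk processing.
setup_disk.frame()
options(future.globals.maxSize = Inf)

# Convert the CSV into a folder of on-disk chunks (paths are placeholders).
trips.df <- csv_to_disk.frame("202101-trips.csv", outdir = "202101-trips.df")

# dplyr verbs run chunk by chunk on disk; only collect() pulls the
# filtered result into memory.
result <- trips.df %>%
  filter(!is.na(started_at)) %>%   # placeholder filter
  select(ride_id, started_at) %>%
  collect()
```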