Problem joining large CSV files in R: system crash

Dear community,
This is my first post, I'm a beginner, trying to learn and create my first project, so thanks for your help and guidance.

I'm trying to analyze a project that includes a full year's operation report. I was given one .csv file per month, so I created a list of the files, used lapply to read them all, and bind_rows to join everything together.

At this point the merged data frame has around 5.5 million rows, so when I try to view it the system just crashes, and the same happens when I try to create new columns. I'm working in RStudio Desktop.
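For context, here is a minimal sketch of the workflow described above; the `"data"` folder and the file pattern are assumptions for illustration, not the original poster's actual paths:

```r
library(dplyr)

# Assumed layout: one CSV per month in a "data" folder
files   <- list.files("data", pattern = "\\.csv$", full.names = TRUE)
monthly <- lapply(files, read.csv)   # one data frame per month
yearly  <- bind_rows(monthly)        # ~5.5 million rows combined

# View(yearly) on a table this size is what tends to freeze RStudio
```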

My best guess is that you are simply running out of RAM on your machine. It is hard to give specific advice without a reprex or a sample of the code you are using, but you could try memory-efficient packages like data.table, "on-disk" approaches like the arrow package, or RDBMSes like PostgreSQL and the serverless SQLite and duckdb.
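As a rough illustration of two of those options, assuming the same hypothetical "data" folder of monthly CSVs (column names like `month` are placeholders):

```r
# Option 1: data.table -- fread() is fast and memory-frugal,
# and rbindlist() stacks the monthly tables with minimal copying
library(data.table)
files  <- list.files("data", pattern = "\\.csv$", full.names = TRUE)
yearly <- rbindlist(lapply(files, fread))

# Option 2: arrow -- treat the folder of CSVs as an on-disk dataset,
# push the computation down, and only collect() the result into RAM
library(arrow)
library(dplyr)
ds <- open_csv_dataset("data")
per_month <- ds |>
  group_by(month) |>        # "month" column is an assumption
  summarise(n = n()) |>
  collect()                 # only the small summary enters memory
```

With the arrow approach the full 5.5 million rows never have to sit in RAM at once, which is usually what makes the difference on a desktop machine.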

Hi Andresrcs,
Thanks for taking the time to answer. It is surely about the RAM; any code I write takes a long time to run. I'll look into the other options you mention.

This topic was automatically closed 42 days after the last reply. New replies are no longer allowed.

If you have a query related to it or one of the replies, start a new topic and refer back with a link.