Error: vector memory exhausted (limit reached?)

I'm manipulating some big data in RStudio. I found a lot of tips on the internet, but I preferred to ask here. I was using a function (fill_gaps) on my time-series dataset and I got the message:

Error: vector memory exhausted (limit reached?)

What is the best approach to fix it?


R operates in available RAM. So, for example, if only 4GB is available once all the other background processes are taken into account, and R has to bring a large object into memory and perform some operation on it, the available memory can be exhausted.
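To get a feel for how much memory an object actually occupies, you can inspect it directly in R. A minimal sketch (the object `x` here is just an illustration):

```r
# A numeric vector of one million doubles: roughly 8 bytes each
x <- rnorm(1e6)
print(object.size(x), units = "MB")  # about 8 MB

# gc() reports current and peak memory use, and triggers a collection
gc()
```

Doing this on the objects in your own session before calling fill_gaps can tell you whether you are genuinely near the limit or whether one object is unexpectedly large.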

Even with a machine with abundant RAM, not all of it is necessarily available, due to settings. On a Unix derivative, such as Linux or macOS, this can be checked from the terminal with $ ulimit, which is often set to 8GB by default. Ways of raising this vary by operating system; see man ulimit on the Unices. For Windows, I can't say.
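For example, from a terminal (flags and units vary by shell and operating system, so treat this as a sketch rather than a recipe):

```shell
# Show the current per-process limits for this shell session
ulimit -a

# Note: on macOS, this particular R error is reportedly governed by the
# R_MAX_VSIZE environment variable rather than ulimit. If that applies to
# you, it can be raised in ~/.Renviron, e.g. (the 32Gb value is only an
# example; size it to your machine):
#   echo 'R_MAX_VSIZE=32Gb' >> ~/.Renviron
```

After changing ~/.Renviron you would need to restart R/RStudio for the new limit to take effect.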

Even if the objects fit into the memory made available by ulimit, and R releases the memory when finished, the operating system may not always "take it back." This is difficult to provide advice for in the abstract, because it depends on the totality of the circumstances.
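Within a session, the practical step is to drop large objects you no longer need and then collect; whether the OS reclaims the pages is up to it. A sketch (`big_df` is a hypothetical large object):

```r
rm(big_df)  # remove the reference so it becomes collectable
gc()        # ask R to free the memory; the OS may still report high usage
```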

Actually, I just checked: my setup here is unlimited. Even with this configuration, RStudio is unable to process my files.

Is sparklyr an option here? I don't know if it supports packages like fable, tsibble, or lubridate.
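For what it's worth, the usual pattern with sparklyr is not to run fable or tsibble on Spark directly (they expect an in-memory tsibble), but to do the heavy filtering/aggregation in Spark and then collect() a smaller result back into R. A hedged sketch, assuming a local Spark install and a placeholder data frame `my_data`:

```r
library(sparklyr)
library(dplyr)

sc <- spark_connect(master = "local")
ts_tbl <- copy_to(sc, my_data)        # my_data is a placeholder name

# Aggregate out-of-memory in Spark, then bring only the result into R
small <- ts_tbl %>%
  group_by(id, month) %>%             # hypothetical grouping columns
  summarise(value = sum(value)) %>%
  collect()                           # now a regular tibble

spark_disconnect(sc)
```

Once collected, `small` can be converted with tsibble::as_tsibble() and handled with fable/lubridate as usual, since at that point it is ordinary in-memory data.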

Could be, but I don't have any direct experience to offer.

