Memory allocation issue in R

I found a couple of ways to fix memory allocation errors when creating new data frames in R. These can help during data wrangling and modeling tasks performed in R.

The solutions below worked for me, so I'm sharing them with a wider audience.

  1. Use of parallel processing packages (a small sketch follows the library calls below):

library(bigmemory)

library(biganalytics)

library(bigtabulate)
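
For example, a file-backed big.matrix keeps the data on disk instead of in RAM, so a large matrix can be filled and summarised without exhausting memory. This is only a minimal sketch; the dimensions and file names below are made up for illustration.

# create a file-backed matrix (data lives on disk, not in RAM);
# the backing/descriptor file names here are hypothetical
x <- filebacked.big.matrix(nrow = 1e6, ncol = 3, type = "double",
                           backingfile = "big_data.bin",
                           descriptorfile = "big_data.desc")

x[, 1] <- rnorm(1e6)   # fill the first column

colmean(x, 1)          # column mean from biganalytics, computed directly on the big.matrix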

  2. Ranking objects by size with gdata's ll() (a follow-up snippet is shown after these calls):

library(gdata)

ll()  # returns a data frame with object names as row names, and class and size (in KB) as columns

subset(ll(), KB > 1000)  # objects larger than 1000 KB

ll()[order(ll()$KB), ]  # sort by size (ascending)
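
Since ll() returns the object names as row names, its output can also be fed straight into rm() to drop everything above a size threshold. A small sketch (the 1000 KB cutoff is arbitrary):

big_ones <- rownames(subset(ll(), KB > 1000))   # names of objects over 1000 KB
rm(list = big_ones)                             # drop them from the session
gc()                                            # let R release the freed memory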

  3. Memory defragmentation: save the workspace, clear it, and reload it. This releases and reloads the objects in the current R session, which helps avoid memory errors (a helper-function sketch follows these commands).

ls()

save.image(file="business_model_temp.RData")

rm(list=ls())

load(file="business_model_temp.RData")

ls()
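
If you end up doing this often, the save / clear / reload cycle can be wrapped in a small helper. This is just a sketch that assumes everything lives in the global environment; defrag_session and the default file name are made-up names.

defrag_session <- function(file = "session_temp.RData") {
  save.image(file = file)                                   # snapshot the global environment
  rm(list = ls(envir = globalenv()), envir = globalenv())   # clear it
  gc()                                                      # run garbage collection on the freed objects
  load(file = file, envir = globalenv())                    # reload the snapshot
  invisible(ls(envir = globalenv()))
}

defrag_session("business_model_temp.RData")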

  4. Object size identification: know the size of each object in the current R session, then drop the large ones to continue creating new data frames without errors (an example follows these commands):

sort(sapply(ls(), function(x) object.size(get(x))))  # sizes (in bytes) of all objects in the current session, ascending

object.size(object_name)  # size of a single object in the current session
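
As an illustration of combining these, the snippet below finds the single largest object, prints its size in MB, and removes it; which object that turns out to be depends entirely on your session.

sizes <- sapply(ls(), function(x) object.size(get(x)))   # size in bytes of every object
biggest <- names(which.max(sizes))                       # name of the largest object
print(format(object.size(get(biggest)), units = "Mb"))   # its size in MB
rm(list = biggest)                                       # drop it, then reclaim the memory
gc()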

Feel free to share if you've come across anything else.

Regards


Sometimes garbage collection helps:

gc()
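
To make that concrete, a typical pattern is to drop an object you no longer need and then call gc(); big_df here is just a hypothetical name.

rm(big_df)  # remove a large object you no longer need (hypothetical name)
gc()        # garbage collection frees the memory and prints Ncells/Vcells usage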
