How much RAM is available?

I'm trying to do some analysis online using RStudio Cloud. My data is of this size:

> object.size(data)
361824760 bytes
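The raw byte count is easier to reason about with readable units; a quick sketch (the `data.frame` here is a tiny stand-in, since the original object isn't available):

```r
# convert the reported byte count to megabytes
361824760 / 1024^2                       # ≈ 345.1 MB

# object.size() also has a print method with a units argument
data <- data.frame(x = runif(10))        # stand-in for the real data
print(object.size(data), units = "Mb")
```

So the object itself is about 345 MB, before any copies made by filtering or subsetting.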

I have to do some filtering and subsetting to derive subsamples, which I keep. Then I delete the full data set again to free up memory.
Something like:

test.X = data %>% filter(type == "test") %>% select(one_of(modelvars))
test.y = data %>% filter(type == "test") %>% select(one_of(modelvars)) 

train.X = data %>% filter(type == "train") %>% select(one_of(modelvars))
train.y = data %>% filter(type == "train") %>% select(one_of(modelvars)) 


My session crashes before I can even run the `rm()` statement.
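One way to lower the peak memory use is to filter each split only once, reuse the result, and drop the large objects as soon as the subsets exist. A sketch, assuming dplyr is loaded and `data` has a `type` column plus the columns named in `modelvars`; the response column `y` is a stand-in, since the code above selects `modelvars` for both `.X` and `.y`:

```r
library(dplyr)

# toy stand-ins for the real objects in the question
modelvars <- c("v1", "v2")
data <- tibble(type = rep(c("test", "train"), each = 5),
               v1 = rnorm(10), v2 = rnorm(10), y = rnorm(10))

# filter once per split, instead of re-filtering for every subset
test  <- data %>% filter(type == "test")
train <- data %>% filter(type == "train")

test.X  <- test  %>% select(one_of(modelvars))
test.y  <- test  %>% pull(y)     # the response column, not the model variables
train.X <- train %>% select(one_of(modelvars))
train.y <- train %>% pull(y)

rm(data, test, train)  # release the big objects as soon as possible
gc()                   # ask R to return freed memory to the OS
```

This keeps at most one extra copy of each split alive at a time, rather than four chained `filter()` calls each materialising an intermediate copy of the filtered data.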

Should I be able to perform such manipulations with the resources given? Can I somehow increase the memory in my session?
My setup is attached below.
Thank you!

> cat(readLines("/proc/meminfo"), sep = "\n")
MemTotal:       30865528 kB
MemFree:        21066276 kB
MemAvailable:   24994484 kB
Buffers:         1222280 kB
Cached:          2152752 kB
SwapCached:            0 kB
Active:          7132728 kB
Inactive:         918356 kB
Active(anon):    4674556 kB
Inactive(anon):     1660 kB
Active(file):    2458172 kB
Inactive(file):   916696 kB
Unevictable:         240 kB
Mlocked:             240 kB
SwapTotal:             0 kB
SwapFree:              0 kB
Dirty:              3920 kB
Writeback:            40 kB
AnonPages:       4670244 kB
Mapped:           239648 kB
Shmem:              6240 kB
Slab:            1392940 kB
SReclaimable:     945020 kB
SUnreclaim:       447920 kB
KernelStack:       44656 kB
PageTables:        47320 kB
NFS_Unstable:          0 kB
Bounce:                0 kB
WritebackTmp:          0 kB
CommitLimit:    15432764 kB
Committed_AS:   25613128 kB
VmallocTotal:   34359738367 kB
VmallocUsed:           0 kB
VmallocChunk:          0 kB
HardwareCorrupted:     0 kB
AnonHugePages:   1462272 kB
CmaTotal:              0 kB
CmaFree:               0 kB
HugePages_Total:       0
HugePages_Free:        0
HugePages_Rsvd:        0
HugePages_Surp:        0
Hugepagesize:       2048 kB
DirectMap4k:       88064 kB
DirectMap2M:     4106240 kB
DirectMap1G:    28311552 kB


We're currently limiting memory to 1GB. We have plans to allow users to increase this, which is something we're hoping to have ready in the next 3-6 months.



Great news. I'm really looking forward to using the service with more RAM! Best, Richard

Additionally, you can check the available memory in your session by reading `/proc/meminfo`, as shown above.
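The `/proc/meminfo` dump above can also be parsed directly from R to get just the figure you care about. A small sketch (Linux only, which is what cloud sessions run on):

```r
# read MemAvailable from /proc/meminfo and report it in GB
meminfo    <- readLines("/proc/meminfo")
avail_line <- grep("^MemAvailable:", meminfo, value = TRUE)
avail_kb   <- as.numeric(gsub("[^0-9]", "", avail_line))
cat(sprintf("MemAvailable: %.1f GB\n", avail_kb / 1024^2))
```

Note this reports the host's memory, which can be much larger than the per-session limit enforced by the service.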


Thank you, that's useful!

Is this still in the cards? Or is it already possible to increase the memory now?

Yes. We are currently working on adding the ability to change the memory limit. We intend to have that feature in beta before the end of the year. If you're interested in sharing your use case with us, we would love to hear how you would use a higher memory limit.


I use the RSQLite package in RStudio, and I work with tables on the order of 10^6 rows by 20 columns. So far the 1 GB memory limit has pretty much prevented me from working with these data on RStudio Cloud. Looking forward to the memory limit upgrade!
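For SQLite-backed tables of that size, one workaround within a tight memory limit is to push filtering and aggregation into SQLite itself, so only the (much smaller) result ever enters R's memory. A sketch using an in-memory database and a small toy table as stand-ins for the real file and data:

```r
library(DBI)
library(RSQLite)

# in-memory database as a stand-in for the real ~10^6-row SQLite file
con <- dbConnect(RSQLite::SQLite(), ":memory:")
dbWriteTable(con, "big",
             data.frame(id  = 1:1000,
                        grp = rep(c("a", "b"), 500),
                        val = rnorm(1000)))

# aggregate inside SQLite; only the summary result is loaded into R
res <- dbGetQuery(con,
                  "SELECT grp, AVG(val) AS mean_val FROM big GROUP BY grp")
dbDisconnect(con)
res
```

For a real on-disk database you would pass the file path instead of `":memory:"`; the key point is that `dbGetQuery()` returns only the query result, not the full table.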