Hi, I am running into an annoying issue. I've noticed that the "vector memory exhausted (limit reached?)" error comes up very often when running R and RStudio on my MacBook Pro M1 with 16 GB of RAM. In my case, it happens whenever I try to load data or environments larger than ~4 GB. This is quite surprising, given that my older MacBook Air with only 8 GB of RAM could load larger objects than this, and the issue makes it very hard for me to work on the M1 since many of my projects are typically >4 GB.
Any suggestions on how to handle this? Very much appreciated.
Hi PJ, thanks for sharing this. I had already seen this and other suggestions for similar problems, but the suggested solution doesn't really work here. I ran into this error before on my older MacBook and was able to fix it there using the suggested steps. On the M1, though, it happens quite often even for relatively small datasets: on my old Mac I wouldn't encounter it until the dataset size got close to 8–10 GB, but on the M1 it starts happening at around 4 GB, and I can't seem to solve it no matter what I do.
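For anyone landing on this thread: the fix usually suggested for this error on macOS is to raise R's vector heap ceiling by setting `R_MAX_VSIZE` in the per-user `~/.Renviron` file, which R reads at startup. The `100Gb` value below is only illustrative (it is a virtual-memory cap, not a physical-RAM allocation), and whether this helps on Apple Silicon is exactly what's being debated above:

```shell
# Append a higher vector-memory cap to R's per-user startup environment file.
# R reads ~/.Renviron on startup; restart R/RStudio for this to take effect.
# 100Gb is illustrative -- it caps virtual memory, it does not reserve RAM.
echo 'R_MAX_VSIZE=100Gb' >> ~/.Renviron
```

You can confirm the new limit inside R afterwards with `Sys.getenv("R_MAX_VSIZE")`.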