I am working with RStudio on the desktop with fairly large data sets. When running a simple regression I got this error. So my question is: would it be a good idea to move to cloud computing or a server, or to increase the memory? The laptop has 16 GB of RAM and I'm using 64-bit RStudio.
Try re-running your script in a fresh R session first. These sorts of messages are common when certain operations are run multiple times and objects accumulate in memory.
Moving to the cloud on its own won't help. What matters is the amount of RAM, whether it's cloud-based or on the desktop.
However, I have found these error messages to be accurate on some occasions (i.e. the RAM really does need to be increased to the amount stated) and spurious on others, caused by garbage accumulating in memory and not being released, or by repeatedly knitting an R Markdown document. Hence it's worth checking whether a fresh session helps first. If it doesn't, redesigning your script or increasing your RAM should be the next step.
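Short of restarting, you can check what is actually occupying memory and release it. A minimal sketch (the object name `big_vec` is just an illustration):

```r
# Hypothetical large object standing in for your data
big_vec <- numeric(1e7)  # roughly 80 MB

# Inspect how much memory a given object uses
print(object.size(big_vec), units = "MB")

# Remove objects you no longer need, then ask R to
# release freed memory back to the operating system
rm(big_vec)
gc()
```

If `gc()` reports far less memory in use than your OS says the R process holds, a restart of the session is usually the cleaner fix.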
Here is a nice answer on a related topic about dealing with large datasets in R.
To summarise, you have other options, such as using more RAM-efficient libraries like data.table, or changing your approach to an out-of-memory solution like a SQL database.
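As a quick sketch of the data.table route (the file and column names here are made up for illustration): `fread()` reads files faster and more frugally than `read.csv()`, `select=` loads only the columns you need, and `:=` modifies columns by reference instead of copying the whole table.

```r
library(data.table)

# Write a tiny example file, standing in for your large CSV on disk
tmp <- tempfile(fileext = ".csv")
fwrite(data.table(id = 1:5, y = rnorm(5), x = runif(5), junk = "unused"), tmp)

# Read only the columns the regression needs
dt <- fread(tmp, select = c("y", "x"))

# Add a column by reference -- no copy of the table is made
dt[, x2 := x^2]

fit <- lm(y ~ x + x2, data = dt)
```

For data that genuinely exceeds RAM even with these tricks, pushing the joins and aggregation into a database and pulling back only the reduced result is the usual next step.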
This error can also be accurate but the result of a coding mistake, e.g. you intend to do an inner join of two large data sets but mistakenly perform a cross join. Which line of your code throws the error?
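To make that failure mode concrete, here is a toy sketch using base R's `merge()` (the table names are invented). Duplicate keys multiply matching rows, and dropping `by` entirely produces a full Cartesian product, which on two large tables can easily exhaust memory:

```r
# Two small tables sharing a key that is NOT unique on either side
orders   <- data.frame(id = c(1, 1, 2), item   = c("a", "b", "c"))
payments <- data.frame(id = c(1, 1, 2), amount = c(10, 20, 30))

# Inner join on a non-unique key: matching rows multiply
joined <- merge(orders, payments, by = "id")
nrow(joined)   # 2*2 + 1*1 = 5 rows, not 3

# Forgetting `by` gives a cross join: every row paired with every row
crossed <- merge(orders, payments, by = NULL)
nrow(crossed)  # 3 * 3 = 9 rows
```

Scaled up to two million-row tables, that same mistake asks for a trillion-row result, and the allocation error is then telling you the truth.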