As the title says, I've been running a regression and plotting in an R script with quite a lot of observations (3,600,000) and it was no problem. But all of a sudden it stopped working: even with a sample (360,000) the regression was slow and then R crashed.
Even though I've run this regression (a simple lm() on 3,600,000 observations) multiple times before, now it simply doesn't work.
When it doesn't crash, I get this error message: Error: cannot allocate vector of size 8.3 Mb.
My CPU and RAM both hit 100% usage.
Any thoughts on this issue? Could it be a software or hardware problem?
My computer seems to run fine in other regards, and I have 16 GB of RAM.
I'm just guessing, but maybe your working environment has become too crowded and you are running out of RAM, or it has become corrupted in some way. I would try clearing my environment and resetting my project's state.
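As a minimal sketch, clearing the global environment and forcing a garbage collection looks like this (the example object `x` is just a stand-in):

```r
x <- 1:1e6          # example object taking up memory

rm(list = ls())     # remove every object in the global environment

gc()                # run the garbage collector and report memory usage
```

After this, the session's memory footprint should drop back toward its baseline; restarting the R session from the IDE accomplishes much the same thing.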
I've cleared my environment and even tried my code in a new project. Is there anything else I can do to make sure it's not a problem with R itself?
Hard to say. Have you recently updated R or the libraries you are using? I don't think you are going to get any meaningful help beyond guesses without a reproducible example. I know that's not a trivial task in your case, but it would increase your chances of getting a solution, so it may well be worth the effort.
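For example, a reproducible example can often be built by sampling a small number of rows and sharing them with dput() (a sketch, using the built-in mtcars data set in place of the real data):

```r
set.seed(42)

# Take a small random slice of the data (mtcars stands in for the real data)
small <- mtcars[sample(nrow(mtcars), 10), ]

# dput() prints R code that exactly reconstructs the object,
# which can be pasted straight into a forum post
dput(small)
```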
> ChickWeight_models <- bytbil %>%
>   group_by(mon, segment) %>%
>   do(fit = lm(price^(1/3) ~ distance + model_year, data = bytbil))
The problem was that I had bytbil as the data argument inside do(), so every group re-fitted the model on the full data set instead of on that group's rows (it should have been data = .). You can't imagine how frustrating it was to find out.
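For reference, a minimal sketch of the corrected pattern, with a small made-up data frame standing in for bytbil (only the column names are taken from the original call). Inside do(), `.` refers to the current group's rows, whereas passing bytbil re-fits the whole 3,600,000-row model once per group:

```r
library(dplyr)

# Hypothetical stand-in for bytbil, with the column names from the original call
bytbil <- data.frame(
  mon        = rep(1:2, each = 50),
  segment    = rep(c("a", "b"), times = 50),
  price      = runif(100, 1000, 5000),
  distance   = runif(100, 0, 200000),
  model_year = sample(2000:2020, 100, replace = TRUE)
)

# data = . fits each model on the current group's rows only
models <- bytbil %>%
  group_by(mon, segment) %>%
  do(fit = lm(price^(1/3) ~ distance + model_year, data = .))

# One fitted model per (mon, segment) combination
models
```

With the full data set passed instead of `.`, each of the many group-wise fits operates on all 3.6 million rows, which explains both the slowness and the memory exhaustion.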
This topic was automatically closed 21 days after the last reply. New replies are no longer allowed.