To be honest, I'm not sure what the solution is, but I know lots of people have asked similar questions here, so in case you haven't read them, someone may already have solved the same problem. I'll link a few here:
> library(e1071)

Attaching package: ‘e1071’

The following object is masked from ‘package:mlr’:

    impute

Warning message:
package ‘e1071’ was built under R version 3.4.4

> svm_model <- svm(Price ~ ., data=data.over.svm)
Error: cannot allocate vector of size 76.4 Gb
> memory.limit()
[1] 8071
> memory.limit(size=56000)
[1] 56000
> svm_model <- svm(Price ~ ., data=data.over.svm)
Error: cannot allocate vector of size 76.4 Gb
How can I solve this vector allocation error? Any suggestions for overcoming it would be appreciated.
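For what it's worth, raising memory.limit() cannot help here: memory.limit() reports roughly 8 GB of physical RAM (8071 MB), while svm() is asking for a single 76.4 Gb vector, and the kernel matrix alone grows quadratically with the number of rows. One common workaround is to fit on a random subsample of the data; a minimal sketch, assuming data.over.svm is the data frame from the transcript above (the subsample size of 10000 is an illustrative assumption, not a recommendation from the original post):

# Train the SVM on a random subsample so the kernel matrix fits in RAM.
# The size 10000 is an arbitrary illustration, not a recommendation.
library(e1071)

set.seed(42)
idx <- sample(nrow(data.over.svm), min(10000, nrow(data.over.svm)))
svm_model <- svm(Price ~ ., data = data.over.svm[idx, ])

If a subsampled model is accurate enough, this sidesteps the allocation entirely; otherwise a linear SVM (for example via the LiblineaR package) scales much better with row count.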
Error: cannot allocate vector of size 196.1 Mb
In addition: Warning message:
In create2GDAL(dataset = dataset, drivername = drivername, type = type, :
mvFlag=NA unsuitable for type Byte
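A side note on the warning attached to that error: create2GDAL() cannot represent NA in a Byte band, so writing a Byte raster needs an explicit missing-value flag. A minimal sketch of passing one via rgdal::writeGDAL (the object name spatial_grid, the file name, and the no-data value 255 are illustrative assumptions, not from the original post):

# Byte bands cannot store NA, so supply an explicit no-data value.
library(rgdal)

writeGDAL(spatial_grid, "out.tif",
          drivername = "GTiff",
          type = "Byte",
          mvFlag = 255)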
Hi,
I am trying to perform an image classification using a random forest model. The image is a Landsat 7 ETM scene with bands 4, 3, and 2. I am running the code on RStudio Server. However, I get the following error.
library(randomForest)

rf <- randomForest(as.factor(class) ~ B2 + B3 + B4, data = training,
                   importance = TRUE,
                   ntree = 2000, na.action = na.omit)
Error: cannot allocate vector of size 25.5 Gb
What could be the problem?
Regards
Edward
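With pixel-level training data, a frame like training can easily run to millions of rows, and ntree = 2000 combined with importance = TRUE multiplies the memory the forest needs. One common mitigation is sketched below, under the assumption that a subsample of pixels is acceptable (the sample size of 50000 and ntree = 500 are illustrative, not values from Edward's post):

# Fit on a random sample of training pixels with fewer trees to cut memory use.
library(randomForest)

set.seed(1)
train_small <- training[sample(nrow(training), min(50000, nrow(training))), ]
rf <- randomForest(as.factor(class) ~ B2 + B3 + B4,
                   data = train_small,
                   importance = TRUE,
                   ntree = 500,
                   na.action = na.omit)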
(Continuing the discussion from Issues with packages after updating to R 3.5.0, as I thought it might be easier to start a new, more specific thread.)
I had to reinstall a lot of packages following the upgrade to R 3.5.0 released yesterday, and I am running into an unusual issue with the ddalpha package:
Error in system2(file.path(R.home("bin"), "R"), c(if (nzchar(arch)) paste0("--arch=", :
cannot popen ' '/usr/lib64/R/bin/R' --no-save --slave 2>&1 < '/tmp/RtmpU4dXOt/file3fb5b89c507'', probable reason 'Cannot allocate memory'
I have lots of RAM and am a little surprised by the error message. I've never had such an issue before. But maybe it has to do with R memory allocation? I have ne…
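For anyone hitting the same thing: that popen failure comes from system2() forking the running R process, and a fork can fail with "Cannot allocate memory" even on a machine with plenty of free RAM if the parent session is already large. A common suggestion (an assumption on my part, not a fix confirmed in this thread) is to retry the install from a fresh, minimal session:

# Run in a new session started with `R --vanilla` so the process that
# system2() forks during installation stays small. This is a sketch of
# a commonly suggested remedy, not a confirmed fix for this thread.
install.packages("ddalpha")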