When I try to use panelvar's pvargmm command, R runs out of memory even on a small dataset (two panel units with 100 time periods each). The data are self-generated, and the problem can be reproduced by running the code below:
library(panelvar)

# Simulate a bivariate VAR(1) process for each of two panel units
N <- 100
Y  <- rnorm(N, mean = 0, sd = 1)
X  <- rnorm(N, mean = 0, sd = 1)
Y2 <- rnorm(N, mean = 0, sd = 1)
X2 <- rnorm(N, mean = 0, sd = 1)
for (i in 2:N) {
  X[i]  <- X[i]  + 0.5 * Y[i - 1]  + 0.3 * X[i - 1]
  Y[i]  <- Y[i]  + 0.5 * Y[i - 1]  - 0.5 * X[i - 1]
  X2[i] <- X2[i] + 0.5 * Y2[i - 1] + 0.3 * X2[i - 1]
  Y2[i] <- Y2[i] + 0.5 * Y2[i - 1] - 0.5 * X2[i - 1]
}

# Stack the two units into a long-format panel: id 1 and id 2, years 1 to N
id   <- rep(1:2, each = N)
year <- rep(1:N, times = 2)
Var1 <- c(X, X2)
Var2 <- c(Y, Y2)
Data <- data.frame(id = id, year = year, v1 = Var1, v2 = Var2)
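For reference, Data is a balanced panel with 2 units and 100 years per unit (200 rows in total), which can be checked with base R before estimating:

# Quick sanity check of the panel layout: 2 ids, years 1 to 100, 200 rows
str(Data)
table(Data$id)
range(Data$year)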
pvargmm(
  dependent_vars = c("v1", "v2"),
  lags = 1,
  transformation = "fod",
  data = Data,
  panel_identifier = c("id", "year"),
  steps = c("twostep"),
  system_instruments = FALSE,
  max_instr_dependent_vars = 99,
  max_instr_predet_vars = 99,
  min_instr_dependent_vars = 2L,
  min_instr_predet_vars = 1L,
  collapse = FALSE
)
Running this produces the error "cholmod error 'out of memory' ...". What could be the issue?
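If the full call stack is useful for diagnosing this, it can be captured by calling traceback() immediately after the error:

traceback()  # run right after pvargmm() fails; prints the chain of R calls that raised the CHOLMOD error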