Looking at the PyTorch docs, I see that one can set different learning rates for different parameters. However, I don't see how to do this in R.
myModel <- torch::nn_module(
  classname = "MyModel",
  initialize = function(mean, sd) {
    private$mPar <- nn_parameter(torch_tensor(mean))
    private$sPar <- nn_parameter(torch_tensor(log(sd)))
  },
  params = function() {
    list(mean = private$mPar, sd = private$sPar)
  },
  ...
)
mod0 <- myModel(mean = c(-1, 1), sd = 0.5)
opt0 <- optim_adam(mod0$params(), lr = 0.1)
This code works, but it uses the same learning rate for both the mean and sd parameters. I would like to set them separately, but I can only find Python examples of how to do this, not R ones.
To do this you can create two optimizers, e.g.:
mod0 <- myModel(mean = c(-1, 1), sd = 0.5)
# The parameters are private, so go through the params() accessor;
# optim_adam() expects a list of parameters.
opt_mean <- optim_adam(list(mod0$params()$mean), lr = 0.1)
opt_sd   <- optim_adam(list(mod0$params()$sd), lr = 0.5)
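With two optimizers, each training step has to zero and step both of them. Here is a minimal self-contained sketch; the Gaussian negative log-likelihood loss and the random data are illustrative assumptions, not part of the original post, and the module uses `self$` rather than `private$` so the parameters are reachable from outside:

```r
library(torch)

myModel <- nn_module(
  classname = "MyModel",
  initialize = function(mean, sd) {
    # self$ makes the parameters visible as mod$mPar / mod$sPar
    self$mPar <- nn_parameter(torch_tensor(mean))
    self$sPar <- nn_parameter(torch_tensor(log(sd)))
  }
)

mod0 <- myModel(mean = c(-1, 1), sd = 0.5)

# One optimizer per parameter, each with its own learning rate
opt_mean <- optim_adam(list(mod0$mPar), lr = 0.1)
opt_sd   <- optim_adam(list(mod0$sPar), lr = 0.5)

x <- torch_randn(100, 2)  # hypothetical data

for (i in 1:10) {
  opt_mean$zero_grad()
  opt_sd$zero_grad()
  # Illustrative Gaussian negative log-likelihood (sd is stored on log scale)
  sd <- torch_exp(mod0$sPar)
  loss <- torch_mean((x - mod0$mPar)^2 / (2 * sd^2) + torch_log(sd))
  loss$backward()
  # Both optimizers must be stepped every iteration
  opt_mean$step()
  opt_sd$step()
}
```

The downside of this approach is the bookkeeping: every `zero_grad()`/`step()` pair has to be duplicated, which is exactly what the parameter-group feature below avoids.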
Or, if you want to use the parameter-group feature, you can have `params()` return one group per parameter, each with its own `lr`:
myModel <- torch::nn_module(
  classname = "MyModel",
  initialize = function(mean, sd) {
    private$mPar <- nn_parameter(torch_tensor(mean))
    private$sPar <- nn_parameter(torch_tensor(log(sd)))
  },
  params = function() {
    list(
      mean = list(params = private$mPar, lr = 0.1),
      sd   = list(params = private$sPar, lr = 0.001)
    )
  }
)
Closed November 16, 2025, 7:27pm
This topic was automatically closed 90 days after the last reply. New replies are no longer allowed. If you have a query related to it or one of the replies, start a new topic and refer back with a link.