Different learning rates per parameter

Looking at the PyTorch docs, I see that one can set different learning rates for different parameters. However, I don't see how to do this in R.

library(torch)

myModel <- torch::nn_module(
  classname = "MyModel",
  initialize = function(mean, sd) {
    private$mPar <- nn_parameter(torch_tensor(mean))
    private$sPar <- nn_parameter(torch_tensor(log(sd)))
  },
  params = function() {
    list(mean = private$mPar, sd = private$sPar)
  },
  ...
)

mod0 <- myModel(mean = c(-1, 1), sd = 0.5)
opt0 <- optim_adam(mod0$params(), lr = 0.1)

This code works, but it uses the same learning rate for both the mean and sd parameters. I would like to set them separately, but I can only find Python examples of this, not R ones.

To do this you can create two optimizers, one per parameter. Note that since mPar and sPar are private, you reach them through the public params() method, e.g.:

mod0 <- myModel(mean = c(-1, 1), sd = 0.5)
opt_mean <- optim_adam(list(mod0$params()$mean), lr = 0.1)
opt_sd   <- optim_adam(list(mod0$params()$sd), lr = 0.5)
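
With two optimizers, each training step runs a single backward pass and then steps both. A minimal sketch of one step, where loss_fn stands in for whatever loss your model computes (a hypothetical placeholder, not part of the code above):

opt_mean$zero_grad()
opt_sd$zero_grad()
loss <- loss_fn(mod0)   # loss_fn is a hypothetical placeholder
loss$backward()
opt_mean$step()   # updates the mean parameter with lr = 0.1
opt_sd$step()     # updates the sd parameter with lr = 0.5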

Or, if you want to use the parameter group feature, you can do:

myModel <- torch::nn_module(
  classname = "MyModel",
  initialize = function(mean, sd) {
    private$mPar <- nn_parameter(torch_tensor(mean))
    private$sPar <- nn_parameter(torch_tensor(log(sd)))
  },
  params = function() {
    list(
      mean = list(params = private$mPar, lr = 0.1),
      sd   = list(params = private$sPar, lr = 0.001)
    )
  }
)
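
You can then build a single optimizer from the groups; the lr given to the optimizer only acts as a default for groups that don't set their own. A sketch of the usage, assuming the module above (the param_groups field mirrors PyTorch's parameter groups):

mod0 <- myModel(mean = c(-1, 1), sd = 0.5)

# lr here is only the default; each group's own lr takes precedence
opt0 <- optim_adam(mod0$params(), lr = 0.001)

opt0$param_groups[[1]]$lr   # 0.1   (mean group)
opt0$param_groups[[2]]$lr   # 0.001 (sd group)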