I have an ML algorithm that I want to train. I am not yet satisfied with the results and would like to optimize it. For this I would like to build a loop around the algorithm and vary different parameters so that the error is minimized. Does anyone have an idea of what such a loop could look like? In principle I would like to "find the value x for which y is minimal".
Most ML packages come with functions that cover this need. It is usually referred to as grid search (see, for example: Model tuning via grid search — tune_grid • tune). All the big packages should have something like this, so you rarely need to write the loop yourself.
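To give a concrete idea, here is a minimal sketch using the tidymodels `tune_grid()` function mentioned above. It assumes a hypothetical data frame `dat` with a numeric outcome column `y`, and uses a random forest via the `ranger` engine purely as an illustration; swap in whatever model and tuning ranges fit your problem. `tune_grid()` does the "loop" for you: it evaluates every parameter combination in the grid with cross-validation and reports the error for each.

```r
# Sketch only: `dat`, the outcome column `y`, the model type, and the
# parameter ranges are placeholder assumptions, not from the original post.
library(tidymodels)

# Model specification with the parameters to be tuned
spec <- rand_forest(mtry = tune(), min_n = tune()) %>%
  set_engine("ranger") %>%
  set_mode("regression")

# Workflow: model + formula
wf <- workflow() %>%
  add_model(spec) %>%
  add_formula(y ~ .)

# 5-fold cross-validation for estimating the error of each candidate
set.seed(123)
folds <- vfold_cv(dat, v = 5)

# Regular grid: 3 levels per parameter = 9 candidate combinations
grid <- grid_regular(mtry(range = c(2, 8)),
                     min_n(range = c(2, 20)),
                     levels = 3)

# Evaluate every combination; RMSE is the error to minimize
res <- tune_grid(wf, resamples = folds, grid = grid,
                 metrics = metric_set(rmse))

show_best(res, metric = "rmse")    # ranked candidates
select_best(res, metric = "rmse")  # the combination with the lowest error
```

The same pattern exists elsewhere, e.g. `GridSearchCV` in scikit-learn, so whichever framework you use, look for its grid-search or hyperparameter-tuning utility before hand-rolling a loop.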