loop to optimize machine learning algorithms

Hey there,

I have an ML algorithm that I want to train. I am not yet satisfied with the results and would like to optimize it. To do this, I would like to build a loop around the algorithm that varies different parameters so that the error is minimal. Does anyone have an idea what such a loop could look like? In principle I would like to "find the value x for which y is minimal".

Thanks,
Paula

Hi there,

Typically, ML packages come with functions that cover this need. In many instances it is referred to as grid search (see here as an example: Model tuning via grid search — tune_grid • tune). All the big packages should have something like this.
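If you want to see what such a loop looks like under the hood, here is a minimal sketch in Python. It assumes your error can be computed as a function of the parameters; the names `grid_search`, `toy_error`, and the example grid are all illustrative, not from any particular library.

```python
from itertools import product

def grid_search(objective, grid):
    """Evaluate objective at every parameter combination and return the best.

    grid: dict mapping parameter name -> list of candidate values.
    Returns (best_params, best_score), where best_score is minimal.
    """
    names = list(grid)
    best_params, best_score = None, float("inf")
    # Try every combination of candidate values (the Cartesian product).
    for values in product(*(grid[n] for n in names)):
        params = dict(zip(names, values))
        score = objective(**params)
        if score < best_score:
            best_params, best_score = params, score
    return best_params, best_score

# Toy error function standing in for "train the model, return the error".
# It is minimal at x = 3, y = -1.
def toy_error(x, y):
    return (x - 3) ** 2 + (y + 1) ** 2

best, err = grid_search(toy_error, {"x": [1, 2, 3, 4], "y": [-2, -1, 0]})
```

In practice you would replace `toy_error` with a function that trains your model with the given parameters and returns a validation error, and you would use your package's built-in tuner (e.g. scikit-learn's `GridSearchCV` in Python, or `tune_grid` in R) rather than rolling your own, since those also handle cross-validation and parallelism.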
