The hyper_parameter_optimizer.py example script demonstrates hyperparameter optimization, automated with ClearML.
A search strategy is required for the optimization, as well as a search strategy optimizer class to implement that strategy.
The following search strategies can be used:
- Optuna hyperparameter optimization - automation.optuna.OptimizerOptuna
- BOHB - automation.hpbandster.OptimizerBOHB. BOHB performs robust and efficient hyperparameter optimization at scale by combining the speed of Hyperband searches with the guidance and guarantees of convergence of Bayesian Optimization.
- Random uniform sampling of hyperparameters - automation.optimization.RandomSearch
- Full grid sampling of every hyperparameter combination - automation.optimization.GridSearch
- Custom - Use a custom class that inherits from the ClearML automation base strategy class, automation.optimization.SearchStrategy
The chosen search strategy class is later passed to the automation.optimization.HyperParameterOptimizer object.
The example code attempts to import OptimizerOptuna for the search strategy. If clearml.automation.optuna is not installed, it attempts to import OptimizerBOHB. If clearml.automation.hpbandster is not installed, it uses RandomSearch for the search strategy.
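In code, the fallback chain looks roughly like this (a sketch following the example script; the warning text and the aSearchStrategy variable name are paraphrased):

```python
import logging

from clearml.automation import RandomSearch

# Try Optuna first, fall back to BOHB, and finally to RandomSearch
try:
    from clearml.automation.optuna import OptimizerOptuna
    aSearchStrategy = OptimizerOptuna
except ImportError:
    try:
        from clearml.automation.hpbandster import OptimizerBOHB
        aSearchStrategy = OptimizerBOHB
    except ImportError:
        logging.getLogger().warning(
            'optuna and hpbandster are not installed, '
            'falling back to the RandomSearch strategy')
        aSearchStrategy = RandomSearch
```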
When the optimization starts, a callback is provided that returns the best performing set of hyperparameters. In the script, the job_complete_callback function returns the ID of the top performing job.
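A sketch of such a callback, following the shape used in the example script (the parameter names and messages are illustrative):

```python
def job_complete_callback(
    job_id,                 # ID of the job (Task) that just completed
    objective_value,        # objective metric value reported by that job
    objective_iteration,    # iteration at which the objective was reported
    job_parameters,         # hyperparameters the job ran with
    top_performance_job_id  # ID of the best performing job so far
):
    print('Job completed!', job_id, objective_value, objective_iteration, job_parameters)
    if job_id == top_performance_job_id:
        print('We broke the record! Objective reached {}'.format(objective_value))
```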
We set the Task type to optimizer, and create a new experiment (and Task object) each time the optimizer runs (reuse_last_task_id=False).
When the code runs, it creates an experiment named Automatic Hyper-Parameter Optimization that is associated with the project Hyper-Parameter Optimization, which can be seen in the ClearML Web UI.
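A minimal sketch of that initialization, using the project and experiment names mentioned above:

```python
from clearml import Task

task = Task.init(
    project_name='Hyper-Parameter Optimization',
    task_name='Automatic Hyper-Parameter Optimization',
    task_type=Task.TaskTypes.optimizer,
    reuse_last_task_id=False,
)
```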
Create an arguments dictionary that contains the ID of the Task to optimize and a Boolean indicating whether the optimizer will run as a service (see Running as a service).
In this example, an experiment named Keras HP optimization base is being optimized. The experiment must have run at least once so that it is stored in ClearML Server, and, therefore, can be cloned.
Since the arguments dictionary is connected to the Task, after the code runs once, the template_task_id can be changed to optimize a different experiment (see tuning experiments).
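A sketch of that dictionary and its connection to the Task; looking the base experiment up under the 'examples' project is an assumption based on the standard ClearML examples layout:

```python
# the Task to optimize and the service flag; connected so they are editable in the UI
args = {
    'template_task_id': None,
    'run_as_service': False,
}
args = task.connect(args)

# if no template Task ID was given, look up the base experiment by name
# (assumed to live in the 'examples' project)
if not args['template_task_id']:
    args['template_task_id'] = Task.get_task(
        project_name='examples', task_name='Keras HP optimization base').id
```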
Initialize an automation.optimization.HyperParameterOptimizer object, setting the optimization parameters, beginning with the ID of the experiment to optimize (all of the following appear together in the sketch after this list):
- Set the hyperparameter ranges to sample, instantiating them as ClearML automation objects using automation.parameters.UniformIntegerParameterRange and automation.parameters.DiscreteParameterRange.
- Set the metric to optimize and the optimization objective.
- Set the number of concurrent Tasks.
- Set the optimization strategy (see Set the search strategy for optimization).
- Specify the queue to use for remote execution. This is overridden if the optimizer runs as a service.
- Specify the remaining parameters, including the time limit per Task (minutes), the period for checking the optimization (minutes), the maximum number of jobs to launch, and the minimum and maximum number of iterations for each Task.
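Putting the pieces together, the initialization looks roughly like the following sketch; the hyperparameter names ('layer_1', 'batch_size'), the metric names, the '1xGPU' queue name, and all numeric values are illustrative:

```python
from clearml.automation import (
    DiscreteParameterRange, HyperParameterOptimizer, UniformIntegerParameterRange)

an_optimizer = HyperParameterOptimizer(
    # the ID of the experiment (Task) to optimize
    base_task_id=args['template_task_id'],
    # hyperparameter ranges to sample
    hyper_parameters=[
        UniformIntegerParameterRange('layer_1', min_value=128, max_value=512, step_size=128),
        DiscreteParameterRange('batch_size', values=[96, 128, 160]),
    ],
    # the metric to optimize and the optimization objective (maximize it)
    objective_metric_title='epoch_accuracy',
    objective_metric_series='epoch_accuracy',
    objective_metric_sign='max',
    # number of concurrent Tasks
    max_number_of_concurrent_tasks=2,
    # the search strategy selected above
    optimizer_class=aSearchStrategy,
    # queue for remote execution (overridden when running as a service)
    execution_queue='1xGPU',
    # time limit per Task (minutes)
    time_limit_per_job=10.,
    # period for checking the optimization (minutes)
    pool_period_min=0.2,
    # maximum number of jobs to launch
    total_max_jobs=10,
    # minimum and maximum number of iterations per Task
    min_iteration_per_job=10,
    max_iteration_per_job=30,
)
```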
The optimization can run as a service, if the run_as_service argument is set to true. For more information about running as a service, see ClearML Agent services container on the "Concepts and Architecture" page.
The optimizer is ready. Set the report period and start it, providing the callback method to report the best performance.
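For example (the 2.2-minute report period is illustrative):

```python
# report progress to the Task every 2.2 minutes
an_optimizer.set_report_period(2.2)
# start the optimization, reporting the best jobs via the callback defined above
an_optimizer.start(job_complete_callback=job_complete_callback)
```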
Now that the optimization is running, the remaining steps (sketched below) are:
- Set a time limit for the optimization
- Get the best performance
- Print the best performance
- Stop the optimizer
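A sketch of those final steps; the two-hour limit and the top_k value are illustrative:

```python
# run the optimization for at most two hours
an_optimizer.set_time_limit(in_minutes=120.0)
# wait until the optimization is done
an_optimizer.wait()
# get and print the best performing experiments
top_exp = an_optimizer.get_top_experiments(top_k=3)
print([t.id for t in top_exp])
# make sure the background optimization has stopped
an_optimizer.stop()
```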