hyperparameter_search.ipynb demonstrates integrating ClearML into a Jupyter Notebook that performs automated hyperparameter optimization. This is an example of ClearML automation. It creates a ClearML HyperParameterOptimizer object, which is a search controller. The search controller's search strategy optimizer is OptimizerBOHB. The example maximizes total accuracy by finding an optimal batch size, base learning rate, and dropout. ClearML automatically logs the optimization's top-performing experiments.
The experiment whose hyperparameters are optimized is named
image_classification_CIFAR10. It is created by running another
ClearML example, image_classification_CIFAR10.ipynb, which must run before
hyperparameter_search.ipynb runs. The optimization creates an experiment named
Hyper-Parameter Optimization, which is associated with the
Hyper-Parameter Search project.
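The setup described above can be sketched roughly as follows. This is a hedged sketch, not the notebook's exact code: the parameter ranges, job counts, queue name, and the `General/` prefix on the hyperparameter names are assumptions (the prefix depends on how the base experiment logs its configuration), while `HyperParameterOptimizer`, the parameter-range classes, and `OptimizerBOHB` are real ClearML automation APIs.

```python
from clearml import Task
from clearml.automation import (
    HyperParameterOptimizer,
    DiscreteParameterRange,
    UniformParameterRange,
)
from clearml.automation.hpbandster import OptimizerBOHB  # requires hpbandster

# The optimizer (controller) Task that appears in the Web UI
task = Task.init(
    project_name='Hyper-Parameter Search',
    task_name='Hyper-Parameter Optimization',
    task_type=Task.TaskTypes.optimizer,
)

# The base experiment whose hyperparameters are optimized;
# it must already exist (created by image_classification_CIFAR10.ipynb)
base_task = Task.get_task(task_name='image_classification_CIFAR10')

optimizer = HyperParameterOptimizer(
    base_task_id=base_task.id,
    # Example ranges only -- the real ones live in the notebook
    hyper_parameters=[
        UniformParameterRange('General/base_lr', min_value=0.001, max_value=0.1),
        DiscreteParameterRange('General/batch_size', values=[4, 8, 12, 16]),
        UniformParameterRange('General/dropout', min_value=0.0, max_value=0.5),
    ],
    objective_metric_title='accuracy',
    objective_metric_series='total',
    objective_metric_sign='max',       # maximize total accuracy
    optimizer_class=OptimizerBOHB,     # the BOHB search strategy
    max_number_of_concurrent_tasks=2,  # assumed value
    total_max_jobs=10,                 # assumed value
)

optimizer.start()
optimizer.wait()
optimizer.stop()
```

Each job the controller launches is a clone of the base Task with one sampled set of hyperparameter values overridden.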
The optimizer Task,
Hyper-Parameter Optimization, and the experiments appear individually in the ClearML Web UI.
Scalars for total accuracy and remaining budget by iteration, and a plot of total accuracy by iteration, appear in RESULTS > SCALARS. Remaining budget is the percentage of the total iteration budget, across all jobs, that has not yet been consumed.
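As a rough illustration of the remaining-budget idea (the numbers and the formula are illustrative assumptions, not ClearML's exact implementation):

```python
# Hypothetical numbers -- the real values come from the optimizer's configuration
total_max_jobs = 10          # maximum number of jobs the optimizer may launch
iterations_per_job = 6       # iteration limit per job
total_iterations = total_max_jobs * iterations_per_job  # overall budget: 60

completed_iterations = 45    # iterations consumed so far across all jobs
remaining_budget = 100.0 * (total_iterations - completed_iterations) / total_iterations
print(remaining_budget)      # -> 25.0 (percent of the budget still unused)
```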
These scalars are reported automatically by ClearML from
HyperParameterOptimizer when it runs.
A plot for the optimization of total accuracy by job appears in RESULTS > PLOTS.
This is also reported automatically by ClearML when
HyperParameterOptimizer runs. Hyperparameters, including the optimizer parameters, appear in CONFIGURATIONS > HYPER PARAMETERS.
These hyperparameters are those in the optimizer Task, where the
HyperParameterOptimizer object is created.
All console output from
Hyper-Parameter Optimization appears in the RESULTS tab, CONSOLE sub-tab.
ClearML automatically logs each job, meaning each experiment that executes with a set of hyperparameters, separately. Each appears as an individual experiment in the ClearML Web UI, where the Task name is
image_classification_CIFAR10 with the hyperparameter values appended. For example:
image_classification_CIFAR10: base_lr=0.0075 batch_size=12 dropout=0.05 number_of_epochs=6
Use the ClearML Web UI experiment comparison to visualize the following:
- Side by side hyperparameter value comparison
- Metric comparison by hyperparameter
- Scalars by specific values and series
- Debug images
In the experiment comparison window, HYPER PARAMETERS tab, select Values in the list (to the right of + Add Experiment); hyperparameter differences appear with a different background color.
Select Parallel Coordinates in the list, click a Performance Metric, and then select the checkboxes of the hyperparameters.
In the SCALARS tab, select Last Values, Min Values, or Max Values. Value differences appear with a different background color.
Select Graph, and the scalar series for the jobs appear, where each scalar plot shows the series for all jobs.
In the DEBUG SAMPLES tab, debug images appear.