Tracking Experiments and Visualizing Results

While an experiment is running, and at any time after it finishes, you can track it and visualize its results in the ClearML Web UI, including:

  • Execution details - Code, the base Docker image used for ClearML Agent, output destination for artifacts, and the logging level.
  • Configuration - Hyperparameters, user properties, and configuration objects.
  • Artifacts - Input model, output model, model snapshot locations, other artifacts.
  • General information - Information about the experiment, for example: the experiment's start, create, and last update times and dates, the user who created it, and its description.
  • Console - stdout, stderr, output to the console from libraries, and ClearML explicit reporting.
  • Scalars - Metric plots.
  • Plots - Other plots and data, for example: Matplotlib, Plotly, and ClearML explicit reporting.
  • Debug samples - Images, audio, video, and HTML.

Viewing modes#

The ClearML Web UI provides two viewing modes for experiment details:

  • The info panel

  • Full screen details mode

Both modes contain all experiment details. When either view is open, switch to the other mode by clicking View in experiments table / full screen, or by clicking the menu > View in experiments table / full screen.

Info panel#

The info panel keeps the experiment table in view so that experiment actions can be performed from the table (as well as the menu in the info panel).

[Screenshot: Info panel]

Full screen details view#

The full screen details view allows for easier viewing and working with experiment tracking and results. The experiments table is not visible when the full screen details view is open. Perform experiment actions from the menu.

[Screenshot: Full screen view]

Execution details#

In the EXECUTION tab of an experiment's detail page, there are records of:

  • Source code
  • ClearML Agent configuration
  • Output details
  • Uncommitted changes
  • Installed Python packages

Source code, ClearML Agent configuration, and output details#

The source code details in the EXECUTION tab of an experiment include:

  • The experiment's repository
  • Commit ID
  • Script path
  • Working directory

Additionally, there is information about the ClearML Agent configuration. The ClearML Agent base image is a pre-configured Docker image that ClearML Agent will use to remotely execute this experiment (see Building Docker containers).

The output details include:

  • The output destination used for storing model checkpoints (snapshots) and artifacts (see also default_output_uri in the configuration file, output_uri in Task.init parameters, and the sketch below).

  • The logging level for the experiment, which uses the standard Python logging levels.
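
For example, the output destination can be set in code when the Task is initialized; a minimal sketch (the bucket URL is a hypothetical placeholder):

```python
from clearml import Task

# Model checkpoints (snapshots) and artifacts will be stored at the
# given output destination (a hypothetical S3 bucket; a local path or
# any other supported storage URL works as well).
task = Task.init(
    project_name="examples",
    task_name="training",
    output_uri="s3://my-bucket/clearml-outputs",
)
```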

[Screenshot: Execution details section]

Uncommitted changes#

[Screenshot: Uncommitted changes section]

Installed Python packages and their versions#

[Screenshot: Installed packages section]

Configuration#

All parameters and configuration objects appear in the CONFIGURATION tab.

Hyperparameters#

important

In older versions of ClearML Server, the CONFIGURATION tab was named HYPER PARAMETERS, and it contained all parameters. The renamed tab contains a HYPER PARAMETERS section, with subsections for hyperparameter groups.

Hyperparameters are grouped by their type and appear in CONFIGURATION > HYPER PARAMETERS.

Command line arguments#

The Args section shows automatically logged argparse arguments, as well as all parameters of experiments logged by older versions of ClearML, except TensorFlow Definitions. Hover over a parameter to see its type, description, and default value, if they were provided.
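
For instance, argparse arguments are captured automatically once a Task is initialized; a minimal sketch:

```python
import argparse

from clearml import Task

# Initializing the Task before parsing lets ClearML capture the argparse
# arguments automatically; they appear under
# CONFIGURATION > HYPER PARAMETERS > Args.
task = Task.init(project_name="examples", task_name="args-demo")

parser = argparse.ArgumentParser()
parser.add_argument("--batch-size", type=int, default=32, help="Training batch size")
parser.add_argument("--lr", type=float, default=0.001, help="Learning rate")
args = parser.parse_args()
```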

[Screenshot: Command line arguments configuration section]

Environment variables#

If the CLEARML_LOG_ENVIRONMENT environment variable was set, the Environment section shows environment variables (see this FAQ).
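
As a sketch, the variable can be set in the shell or at the top of the script, before the clearml package is imported (the wildcard value, which logs all environment variables, is an assumption based on the FAQ):

```python
import os

# Must be set before clearml is imported. "*" logs all environment
# variables; a comma-separated list of variable names logs only those.
os.environ["CLEARML_LOG_ENVIRONMENT"] = "*"

from clearml import Task

task = Task.init(project_name="examples", task_name="env-demo")
```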

[Screenshot: Environment variables configuration section]

Custom parameter groups#

Custom sections show parameter dictionaries that were connected to the Task using the Task.connect method with a name argument provided.
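
A minimal sketch (the section and parameter names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="params-demo")

preprocessing = {"crop_size": 224, "normalize": True}
# The name argument creates a custom parameter group under
# CONFIGURATION > HYPER PARAMETERS.
preprocessing = task.connect(preprocessing, name="Preprocessing")
```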

[Screenshot: Custom parameters section]

TensorFlow Definitions#

The TF_DEFINE section shows automatically logged TensorFlow definitions.

[Screenshot: TF_DEFINE parameter section]

Once an experiment is run and stored in ClearML Server, any of these hyperparameters can be modified.

User properties#

User properties store any descriptive information in a key-value pair format. They are editable in any experiment, except experiments whose status is Published (read-only).
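
User properties can also be set from code; a minimal sketch (the property names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="props-demo")

# Each keyword becomes an editable key-value user property in the UI.
task.set_user_properties(dataset_version="v2.1", owner="data-team")
```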

[Screenshot: User properties section]

Configuration objects#

ClearML tracks experiment (Task) model configuration objects, which appear in Configuration Objects > General. These objects include those that are automatically tracked, and those connected to a Task in code (see Task.connect_configuration). ClearML supports providing a name for a Task model configuration (see the name parameter in Task.connect_configuration).
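
A minimal sketch (the configuration contents and name are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="config-demo")

model_config = {"layers": 4, "dropout": 0.2}
# The name argument sets the object's section under
# CONFIGURATION > CONFIGURATION OBJECTS (it defaults to "General").
model_config = task.connect_configuration(configuration=model_config, name="Model")
```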

important

In older versions of ClearML Server, the Task model configuration appeared in the ARTIFACTS tab, MODEL CONFIGURATION section. Task model configurations now appear in the Configuration Objects section, in the CONFIGURATION tab.

[Screenshot: Configuration objects]

[Screenshot: Custom configuration objects]

Artifacts#

Artifacts tracked in an experiment appear in the ARTIFACTS tab, and include models and other artifacts.

For models and artifacts stored as local files (file://), copy their location to the clipboard. For models and artifacts in remote storage (for example, https:// or s3://), download them.

Models#

The input and output models appear in the ARTIFACTS tab. Models are associated with the experiment, but further model details, including design, label enumeration, and general information, are in the MODELS tab. Click the model name, which is a hyperlink to those details.

To retrieve a model:

  1. In the ARTIFACTS tab > MODELS > Input Model or Output Model, click the model name hyperlink.

  2. In the model details > GENERAL tab > MODEL URL, either:

    • Download the model, if it is stored in remote storage.
    • Copy its location to the clipboard, if it is a local file.

[Screenshot: Models in Artifacts tab]
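
Models can also be retrieved programmatically; a minimal sketch (the Task ID is hypothetical):

```python
from clearml import Task

# The ID comes from the UI (see "Locating the experiment (Task) ID").
task = Task.get_task(task_id="aabbccdd11223344")

# task.models holds the experiment's "input" and "output" model lists.
output_model = task.models["output"][-1]
print(output_model.url)                      # the MODEL URL shown in the UI
local_path = output_model.get_local_copy()   # download a local copy
```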

Other artifacts#

To retrieve another artifact:

  1. In the ARTIFACTS tab > DATA AUDIT or OTHER > Select an artifact > Either:

    • Download the artifact, if it is stored in remote storage.
    • Copy its location to the clipboard, if it is a local file.
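
Artifacts can also be retrieved programmatically; a minimal sketch (the Task ID and artifact name are hypothetical):

```python
from clearml import Task

task = Task.get_task(task_id="aabbccdd11223344")

# task.artifacts maps artifact names to Artifact objects.
artifact = task.artifacts["predictions"]
local_path = artifact.get_local_copy()  # download to a local cache
obj = artifact.get()                    # or deserialize into a Python object
```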

Data audit#

Artifacts which are uploaded and dynamically tracked by ClearML appear in the DATA AUDIT section. They include the file path, file size, hash, and metadata stored with the artifact.
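
As a sketch, dynamically tracked artifacts are produced with Task.register_artifact (the DataFrame contents and artifact name are illustrative):

```python
import pandas as pd

from clearml import Task

task = Task.init(project_name="examples", task_name="artifacts-demo")

df = pd.DataFrame({"id": [1, 2], "label": ["cat", "dog"]})
# register_artifact keeps monitoring the object; changes to the
# DataFrame are tracked and appear in the DATA AUDIT section.
task.register_artifact(name="training-data", artifact=df)
```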

[Screenshot: Data audit section]

Other#

Other artifacts, which are uploaded but not dynamically tracked after the upload, appear in the OTHER section. They include the file path, file size, and hash.
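
These one-time uploads are produced with Task.upload_artifact; a minimal sketch (the artifact name and contents are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="artifacts-demo")

# upload_artifact uploads a one-time snapshot; it is not tracked after
# the upload, so it appears in the OTHER section.
task.upload_artifact(name="eval-summary", artifact_object={"accuracy": 0.93})
```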

[Screenshot: Other artifacts section]

General information#

General experiment details appear in the INFO tab. This includes information describing the stored experiment:

  • The parent experiment
  • Project name
  • Creation, start, and last update dates and times
  • User who created the experiment
  • Experiment state (status)
  • Whether the experiment is archived
  • Runtime properties - Information about the machine running the experiment, including:
    • Operating system
    • CUDA driver version
    • Number of CPU cores
    • Number of GPUs
    • CPU / GPU type
    • Memory size
    • Host name
    • Processor
    • Python version

[Screenshot: Info tab]

Experiment results#

Console#

The complete experiment log, containing everything printed to stdout and stderr, appears in the CONSOLE tab. The full log is downloadable. To view the end of the log, click Jump to end.
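
Console output is captured automatically, and lines can also be reported explicitly; a minimal sketch:

```python
from clearml import Logger, Task

task = Task.init(project_name="examples", task_name="console-demo")

print("plain stdout lines are captured automatically")
# Explicitly reported text also lands in the CONSOLE tab.
Logger.current_logger().report_text("an explicitly reported log line")
```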

[Screenshot: Console tab]

Scalars#

All scalars that ClearML automatically logs, as well as those explicitly reported in code, appear in RESULTS > SCALARS.
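
A minimal sketch of explicit scalar reporting (the metric and variant names are illustrative):

```python
from clearml import Logger, Task

task = Task.init(project_name="examples", task_name="scalars-demo")
logger = Logger.current_logger()

for i in range(10):
    # Each (title, series) pair becomes a curve: "loss" is the metric,
    # "train" the variant.
    logger.report_scalar(title="loss", series="train", value=1.0 / (i + 1), iteration=i)
```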

Scalar plot tools#

Use the scalar tools to improve analysis of scalar metrics. In the info panel, the tools are accessed by clicking their icon; in the full screen details view, they are on the left side of the window. The tools include:

  • Group by - select one of the following:

    • Metric - all variants for a metric on the same plot

      [Screenshot: Plots grouped by metric]


    • None - Group by metric and variant (individual metric-variant plots).

      [Screenshot: Plots grouped by metric and variant]

  • Show / hide plots - Click HIDE ALL, and then click the eye icon on those you want to see.

  • Horizontal axis modes (scalars only) - Select one of the following:

    • ITERATIONS
    • RELATIVE - Time since the experiment began
    • WALL - Local clock time
  • Curve smoothing (scalars only) - In Smoothing, move the slider or type a smoothing factor between 0 and 0.999.

See additional plot controls below.

Plots#

Non-time-series plots appear in RESULTS > PLOTS. These include data reported by libraries, visualization tools, and ClearML explicit reporting. These may include 2D and 3D plots, tables (Pandas and CSV files), and Plotly plots. Individual plots can be shown / hidden or filtered by title.
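
A minimal sketch of explicit plot reporting (the plot title, series, and data are illustrative):

```python
import numpy as np

from clearml import Logger, Task

task = Task.init(project_name="examples", task_name="plots-demo")
logger = Logger.current_logger()

# A 2D scatter plot reported explicitly; Matplotlib and Plotly figures
# can also be reported, or are captured automatically when shown.
scatter = np.random.rand(50, 2)
logger.report_scatter2d(
    title="example scatter",
    series="points",
    scatter=scatter,
    iteration=0,
    xaxis="x",
    yaxis="y",
    mode="markers",
)
```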

View a screenshot

Plots tab

Plot controls#

The table below lists the plot controls which may be available for any plot (in the SCALARS and PLOTS tabs). These controls allow you to better analyze the results. Hover over a plot, and the controls appear.

| Icon | Description |
|---|---|
| Download PNG | Download plots as PNG files. |
| Pan | Pan around the plot. Click the icon, click the plot, and then drag. |
| Dotted box | To examine an area, draw a dotted box around it. Click the icon, and then drag. |
| Dotted lasso | To examine an area, draw a dotted lasso around it. Click the icon, and then drag. |
| Zoom | Zoom into a section of a plot. Click the icon, and then drag over a section of the plot. To reset to the original scale, click Reset autoscale. |
| Zoom-in | Zoom in. |
| Zoom-out | Zoom out. |
| Reset autoscale | Reset to autoscale after zooming (Zoom, Zoom-in, or Zoom-out). |
| Reset axes | Reset the axes after a zoom. |
| Spike lines | Show / hide spike lines. |
| Show closest | Show the closest data point on hover, including horizontal and vertical axis values. Click the icon, and then hover over a series on the plot. |
| Compare data | Compare data on hover. Click the icon, and then hover over the plot. |
| Logarithmic view | Switch to logarithmic view. |
| Legend | Hide / show the legend. |
| Download JSON | Download plot data to a JSON file for further analysis. |
| Maximize plot | Expand the plot to the entire window. |

3D plot controls#

| Icon | Description |
|---|---|
| Orbital rotation mode | Switch to orbital rotation mode: rotate the plot around its middle point. |
| Turntable rotation mode | Switch to turntable rotation mode: rotate the plot around its middle point while constraining one axis. |
| Reset axes | Reset the axes to their default position. |

Debug samples#

View debug samples by metric at any iteration. The most recent iteration appears first. Use the viewer / player to inspect image, audio, and video samples and do any of the following:

  • Move to the same sample in a different iteration (move the iteration slider).
  • Show the next or previous iteration's sample.
  • Download the file.
  • Zoom.
  • View the sample's iteration number, width, height, and coordinates.

[Screenshot: Debug Samples tab]

[Screenshot: Debug Samples image viewer]


To view debug samples:

  1. Click the DEBUG SAMPLES tab. The most recent iteration appears at the top.

  2. Locate debug samples by doing the following:

    • Filter by metric. In the Metric list, choose a metric.
    • Show other iterations. Click Older images, New images, or Newest images.

To view a debug sample in the viewer / player:

  1. Click the debug sample's thumbnail.

  2. Do any of the following:

    • Move to the same sample in another iteration - Move the slider, or click < (previous) or > (next).
    • Download the file.
    • Zoom.
    • For images, locate a position on the sample - Hover over the sample, and the X, Y coordinates appear in the legend below the sample.
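
Debug samples are reported from code, either automatically by supported frameworks or explicitly; a minimal sketch of explicit image reporting (the metric and series names are illustrative):

```python
import numpy as np

from clearml import Logger, Task

task = Task.init(project_name="examples", task_name="debug-demo")

# Reported images appear in the DEBUG SAMPLES tab, grouped by metric
# ("samples") and series ("random"), one entry per iteration.
image = (np.random.rand(64, 64, 3) * 255).astype("uint8")
Logger.current_logger().report_image(title="samples", series="random", iteration=0, image=image)
```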

Tagging experiments#

Tags are user-defined, color-coded labels that can be added to experiments (and models), making it easy to identify and group experiments. Tags can contain any text. For example, add tags for the type of remote machine experiments were executed on, label versions of experiments, or apply team names to organize experimentation.

  • To add tags and change tag colors:
    1. Click the experiment > Hover over the tag area > +ADD TAG or the menu
    2. Do one of the following:
      • Add a new tag - Type the new tag name > (Create New).
      • Add an existing tag - Click a tag.
      • Change a tag's colors - Click Tag Colors > Click the tag icon > Background or Foreground > Pick a color > OK > CLOSE.
  • To remove a tag - Hover over the tag > X.
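
Tags can also be added from code; a minimal sketch (the tag names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="tags-demo")

# The tags appear in the experiment's tag area in the UI.
task.add_tags(["gpu-machine", "dataset-v2"])
```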

Locating the experiment (Task) ID#

  • In the info panel, in the top area, to the right of the Task name, click ID. The Task ID appears.
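
The copied ID can then be used to access the stored Task programmatically; a minimal sketch (the ID is hypothetical):

```python
from clearml import Task

task = Task.get_task(task_id="aabbccdd11223344")
print(task.name, task.get_status())
```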