Tracking Experiments and Visualizing Results

While an experiment is running, and any time after it finishes, track it and visualize the results in the ClearML Web UI, including:

  • Execution details - Code, the base Docker image used for ClearML Agent, output destination for artifacts, and the logging level.
  • Configuration - Hyperparameters, user properties, and configuration objects.
  • Artifacts - Input model, output model, model snapshot locations, other artifacts.
  • General information - Information about the experiment, for example: the experiment's create, start, and last update times and dates, the user who created it, and its description.
  • Console - stdout, stderr, output to the console from libraries, and ClearML explicit reporting.
  • Scalars - Metric plots.
  • Plots - Other plots and data, for example: Matplotlib, Plotly, and ClearML explicit reporting.
  • Debug samples - Images, audio, video, and HTML.

Viewing Modes

The ClearML Web UI provides two viewing modes for experiment details:

  • The info panel

  • Full screen details mode.

Both modes contain all experiment details. When either view is open, switch to the other mode by clicking View in experiments table / full screen, or by opening the menu and selecting View in experiments table / full screen.

Info Panel

The info panel keeps the experiment table in view so that experiment actions can be performed from the table (as well as the menu in the info panel).

Info panel

Click Compressed view to hide details in the experiment table, so that only the experiment names and statuses are displayed.

Compressed info panel

Full Screen Details View

The full screen details view allows for easier viewing and working with experiment tracking and results. The experiments table is not visible when the full screen details view is open. Perform experiment actions from the menu.

Full screen view

Execution

An experiment's EXECUTION tab lists the following:

  • Source code
  • Uncommitted changes
  • Installed Python packages
  • Container details
  • Output details

In full-screen mode, the source code and output details are grouped in the DETAILS section.

Source Code

The Source Code section of an experiment's EXECUTION tab includes:

  • The experiment's repository
  • Commit ID
  • Script path
  • Working directory
  • Binary (Python executable)

Source code section

Uncommitted Changes

ClearML displays the git diff of the experiment in the Uncommitted Changes section.

Uncommitted changes section

Installed Packages

The Installed Packages section lists the experiment's installed Python packages and their versions.

Installed packages section

When a ClearML agent executing an experiment ends up using a set of Python packages different from the one originally specified, both the original specification (original pip or original conda) and the packages the agent actually used to set up the environment (pip or conda) are available. Select which requirements to view from the dropdown menu.

Packages used by agent
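
If you need to pin or override a package in the recorded requirements before the agent recreates the environment, Task.add_requirements can be called before Task.init. A minimal sketch (the package name and version are illustrative):

```python
from clearml import Task

# Must be called before Task.init(); adds/overrides an entry in the
# experiment's recorded requirements, which the agent will use when
# setting up the execution environment
Task.add_requirements("torch", "1.13.1")

task = Task.init(project_name="examples", task_name="requirements demo")
```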

Container

The Container section lists the following information:

  • Image - a pre-configured Docker image that ClearML Agent will use to remotely execute this experiment (see Building Docker containers)
  • Arguments - Docker command-line arguments
  • Setup shell script - a bash script to be executed inside the Docker container before setting up the experiment's environment

Container section

Output

The Output details include the output destination used for storing model checkpoints (snapshots) and artifacts (see also default_output_uri in the configuration file, and output_uri in Task.init parameters).

Execution details section
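
For example, the output destination can be set per task at initialization. A minimal sketch (the bucket URI is illustrative):

```python
from clearml import Task

# Model checkpoints and artifacts will be uploaded to this destination
task = Task.init(
    project_name="examples",
    task_name="output destination demo",
    output_uri="s3://my-bucket/clearml-output",  # illustrative URI
)
```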

Configuration

All parameters and configuration objects appear in the CONFIGURATION tab.

Hyperparameters

Hyperparameters are grouped by their type and appear in CONFIGURATION > HYPERPARAMETERS. Once an experiment is run and stored in ClearML Server, any of these hyperparameters can be modified.

Command Line Arguments

The Args group shows automatically logged argument parser parameters (e.g. argparse, click, hydra). Hover over Description (menu) on a parameter's line to see its type, description, and default value, if they were provided.

Command line arguments configuration group
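
For instance, argparse arguments are captured automatically once the task is initialized. A minimal sketch (project and argument names are illustrative):

```python
import argparse

from clearml import Task

# Initializing the task patches argparse, so parsed arguments
# appear under CONFIGURATION > HYPERPARAMETERS > Args
task = Task.init(project_name="examples", task_name="argparse demo")

parser = argparse.ArgumentParser()
parser.add_argument("--batch-size", type=int, default=32, help="Training batch size")
args = parser.parse_args()
```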

Environment Variables

If the CLEARML_LOG_ENVIRONMENT variable was set, the Environment group will show environment variables (see this FAQ).

Environment variables configuration group
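
A minimal sketch of enabling environment logging from within the script, assuming the variable is set before the task is initialized (it can equally be set in the shell or in clearml.conf):

```python
import os

# "*" logs all environment variables; a comma-separated list of names
# logs only those variables (assumption based on the ClearML FAQ)
os.environ["CLEARML_LOG_ENVIRONMENT"] = "*"

from clearml import Task

task = Task.init(project_name="examples", task_name="environment demo")
```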

Custom Parameter Groups

Custom parameter groups show parameter dictionaries that were connected to the Task using Task.connect() with a name argument. If no name is provided, the parameters appear in the default General section.

Custom parameters group
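
A minimal sketch of connecting a named parameter dictionary (the group name and values are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="custom params demo")

training_params = {"batch_size": 32, "learning_rate": 1e-3}
# Appears under CONFIGURATION > HYPERPARAMETERS in a "Training" group;
# omitting name would place the parameters in the General section
training_params = task.connect(training_params, name="Training")
```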

TensorFlow Definitions

The TF_DEFINE parameter group shows automatic TensorFlow logging.

TF_DEFINE parameter group
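
A minimal sketch, assuming absl-style flag definitions are picked up by ClearML's automatic TensorFlow logging (the flag name is illustrative):

```python
from absl import app, flags
from clearml import Task

flags.DEFINE_integer("hidden_units", 128, "Number of hidden units")
FLAGS = flags.FLAGS

def main(_):
    # Flag definitions appear under the TF_DEFINE parameter group
    task = Task.init(project_name="examples", task_name="tf flags demo")

if __name__ == "__main__":
    app.run(main)
```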

User Properties

User properties let you store any descriptive information as key-value pairs. They are editable in any experiment, except those whose status is Published (read-only).

User properties section
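
User properties can also be set from code. A minimal sketch (the keys and values are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="user properties demo")

# Each keyword becomes an editable key-value pair in the
# USER PROPERTIES section
task.set_user_properties(dataset_version="v2.1", owner="data-team")
```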

Configuration Objects

ClearML tracks experiment (Task) model configuration objects, which appear in Configuration Objects > General. These objects include those that are automatically tracked, and those connected to a Task in code (see Task.connect_configuration).

Configuration objects

ClearML supports providing a name for a Task model configuration object (see the name parameter in Task.connect_configuration).

Custom configuration objects
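
A minimal sketch of connecting a named configuration object (the name and contents are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="config objects demo")

model_config = {"layers": 4, "activation": "relu"}
# Appears under CONFIGURATION OBJECTS with the given name
model_config = task.connect_configuration(model_config, name="Model Architecture")
```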

Artifacts

Artifacts tracked in an experiment appear in the ARTIFACTS tab, and include models and other artifacts.

An artifact's location is stored in the FILE PATH field. The UI provides locally stored artifacts with a 'copy to clipboard' action (Clipboard) to facilitate local storage access (since web applications are prohibited from accessing the local disk for security reasons). The UI provides network-hosted artifacts (e.g. https://, s3:// URIs) with a download action (Download) to retrieve these files.

Models

The input and output models appear in the ARTIFACTS tab. Models are associated with the experiment; for further model details, including design, label enumeration, and general information, click the model name, which is a hyperlink to the model's page in the MODELS tab.

To retrieve a model:

  1. In the ARTIFACTS tab > MODELS > Input Model or Output Model, click the model name hyperlink.

  2. In the model details > GENERAL tab > MODEL URL, either:

    • Download the model, if it is stored in remote storage.
    • Copy its location to the clipboard (Copy Clipboard), if it is a local file.

Models in Artifacts tab
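
Models can also be retrieved programmatically. A minimal sketch, assuming the task has at least one output model (the task ID is a placeholder):

```python
from clearml import Task

task = Task.get_task(task_id="<task_id>")  # placeholder ID

# task.models holds the "input" and "output" model lists
output_model = task.models["output"][-1]
weights_path = output_model.get_local_copy()  # downloads the weights locally
```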

Other Artifacts

Other artifacts, which are uploaded but not dynamically tracked after the upload, appear in the OTHER section. They include the file path, file size, and hash.

To retrieve other artifacts:

  1. In the ARTIFACTS tab > OTHER, select an artifact.
  2. Either:
    • Download the artifact, if it is stored in remote storage.
    • Copy its location to the clipboard (Copy Clipboard), if it is a local file.

Other artifacts section
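
The same artifacts can be fetched in code. A minimal sketch, assuming an artifact named "data" was uploaded with task.upload_artifact() (the names are illustrative):

```python
from clearml import Task

task = Task.get_task(task_id="<task_id>")  # placeholder ID

# Artifacts are accessed by the name they were uploaded with
local_path = task.artifacts["data"].get_local_copy()
```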

General Information

General experiment details appear in the INFO tab. This includes information describing the stored experiment:

  • The parent experiment
  • Project name
  • Creation, start, and last update dates and times
  • User who created the experiment
  • Experiment state (status)
  • Whether the experiment is archived
  • Runtime properties - Information about the machine running the experiment, including:
    • Operating system
    • CUDA driver version
    • Number of CPU cores
    • Number of GPUs
    • CPU / GPU type
    • Memory size
    • Host name
    • Processor
    • Python version
  • Experiment Progress

Info tab

Experiment Results

Embedding ClearML Visualization

You can embed experiment plots and debug samples into ClearML Reports. These visualizations are updated live as the experiments update. The Enterprise Plan and Hosted Service support embedding resources in external tools (e.g. Notion). See Plot Controls.

Console

The complete experiment log containing everything printed to stdout and stderr appears in the CONSOLE tab. The full log is downloadable. To view the end of the log, click Jump to end.

Console tab

Scalars

All scalars that ClearML automatically logs, as well as those explicitly reported in code, appear in SCALARS.

Scalar series are presented in a line chart. To see the series for a metric in high resolution, view it in full screen mode by hovering over the graph and clicking Maximize plot icon.

Full Screen Refresh

Scalar graphs in full screen mode do not auto-refresh. Click Refresh to update the graph.

Reported single value scalars are aggregated into a table plot displaying scalar names and values (see Logger.report_single_value).
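
A minimal sketch of explicitly reporting both scalar series and single-value scalars (the metric names and values are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="scalars demo")
logger = task.get_logger()

for iteration in range(100):
    loss = 1.0 / (iteration + 1)
    # Plotted in SCALARS as the "train" series of the "loss" metric
    logger.report_scalar(title="loss", series="train", value=loss, iteration=iteration)

# Aggregated into the single-value scalars table
logger.report_single_value(name="best_accuracy", value=0.92)
```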

Scalar Plot Tools

Use the scalar tools to improve analysis of scalar metrics. In the info panel, click Settings gear to use the tools. In the full screen details view, the tools are on the left side of the window. The tools include:

  • Group by - Select one of the following:

    • Metric - All variants for a metric on the same plot

      Plots grouped by metric

    • None - Group by metric and variant (individual metric-variant plots).

      Plots grouped by metric and variant

  • Horizontal axis - Select the x-axis units:

    • Iterations
    • Time from start - Time since experiment began
    • Wall time - Local clock time
  • Curve smoothing - Choose which smoothing algorithm to use from the dropdown menu: Exponential moving average, Gaussian, or Running Average. Use the slider to configure the smoothing factor or specify a value manually.

  • Show / hide plots - Click HIDE ALL, and then click the Eye icon on the plots you want to see.

To embed scalar plots in your Reports, hover over a plot and click Embed code, which copies the embed code to the clipboard for pasting into your Reports. In contrast to static screenshots, embedded resources are retrieved when the report is displayed, allowing your reports to show the latest up-to-date data.

See additional plot controls below.

Plots

Non-time-series plots appear in PLOTS. These include data generated by libraries and visualization tools, as well as data explicitly reported using the ClearML Logger. These may include 2D and 3D plots, tables (Pandas DataFrames and CSV files), and Plotly plots. Individual plots can be shown / hidden or filtered by title.

Plots tab

For each metric, the latest reported plot is displayed.

When viewing a plot in full screen (Maximize plot icon), older iterations are available through the iteration slider (or using the up/down arrow keyboard shortcut). Go to the previous/next plot in the current iteration using the Previous / Next buttons (or using the left/right arrow keyboard shortcut).

Plots maximize tab
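
A minimal sketch of explicitly reporting a plot (the figure contents and names are illustrative):

```python
import matplotlib.pyplot as plt
import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="plots demo")
logger = task.get_logger()

fig = plt.figure()
plt.plot(np.cumsum(np.random.randn(100)))
# Appears in the PLOTS tab; Matplotlib figures shown with plt.show()
# are also captured automatically
logger.report_matplotlib_figure(title="random walk", series="v1", iteration=0, figure=fig)
```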

Plot Controls

The table below lists the plot controls which may be available for any plot (in the SCALARS and PLOTS tabs). These controls allow you to better analyze the results. Hover over a plot, and the controls appear.

  • Download PNG icon - Download plots as PNG files.
  • Pan icon - Pan around the plot. Click Pan icon, click the plot, and then drag.
  • Dotted box icon - To examine an area, draw a dotted box around it. Click Dotted box icon and then drag.
  • Dotted lasso icon - To examine an area, draw a dotted lasso around it. Click Dotted lasso icon and then drag.
  • Zoom icon - Zoom into a section of a plot. Click Zoom icon and drag over a section of the plot. To reset to the original scale, click Reset autoscale icon.
  • Zoom-in icon - Zoom in.
  • Zoom-out icon - Zoom out.
  • Reset autoscale icon - Reset to autoscale after zooming (Zoom icon, Zoom-in icon, or Zoom-out icon).
  • Reset axes icon - Reset axes after a zoom.
  • Spike lines icon - Show / hide spike lines.
  • Show closest icon / Compare data icon / X-united mode - Set the data hover mode:
    • Closest - Show the (X, Y) data point closest to the cursor, including horizontal and vertical axes values.
    • X - Show labels for points with the same x value as the cursor.
    • X unified - Show a single label for the points with the same x value as the cursor.
  • Logarithmic view icon - Switch to logarithmic view.
  • Graph legend icon - Hide / show the legend.
  • Plot layout setting - Switch between original and auto-fitted plot dimensions. The original layout is the plot's user-defined dimensions.
  • Download JSON icon - Download plot data as a JSON file.
  • Download CSV icon - Download table plot data as a CSV file.
  • Maximize plot icon - Expand the plot to the entire window. With scalar graphs, full screen mode displays plots with all data points, as opposed to an averaged plot.
  • Refresh - Refresh scalar graphs in full screen mode to update them.
  • Embed code - Copy the resource embed code to the clipboard. This opens the following options:
    • Embed in external tool (available in the ClearML Enterprise plan and Hosted Service) - Copy code to add to external tools (e.g. Notion).
    • Embed in ClearML report - Copy code to add to a report.
    In contrast to static screenshots, embedded resources are retrieved when the tool/report is displayed, allowing your tools/reports to show the latest up-to-date data.

3D Plot Controls

  • Orbital rotation mode icon - Switch to orbital rotation mode: rotate the plot around its middle point.
  • Turntable rotation mode icon - Switch to turntable rotation mode: rotate the plot around its middle point while constraining one axis.
  • Reset axes icon - Reset axes to their default position.

Debug Samples

Experiment outputs such as images, audio, and videos appear in DEBUG SAMPLES. These include data generated by libraries and visualization tools, and explicitly reported using the ClearML Logger.

You can view debug samples by metric across the reported iterations. To filter the samples, select a metric from the dropdown menu above the samples. The most recent iteration appears first.

Debug Samples tab

For each metric, the latest reported debug sample is displayed.

Click a sample to view it in full screen. If the sample is video or audio, the full screen mode includes a player.

When viewing a sample in full screen, older iterations are available through the iteration slider (or using the up/down arrow keyboard shortcut). Go to the previous/next sample in the current iteration using the Previous / Next buttons (or using the left/right arrow keyboard shortcut).

Debug Samples image viewer
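
A minimal sketch of explicitly reporting a debug sample image (the metric and series names are illustrative):

```python
import numpy as np
from clearml import Task

task = Task.init(project_name="examples", task_name="debug samples demo")
logger = task.get_logger()

# Appears in DEBUG SAMPLES under the "examples" metric
image = (np.random.rand(256, 256, 3) * 255).astype("uint8")
logger.report_image(title="examples", series="noise", iteration=0, image=image)
```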

Tagging Experiments


Tags are user-defined, color-coded labels that can be added to experiments (and pipelines, datasets, and models), making it easy to identify and group experiments. Tags can help in organizing, querying, and automating experiments. For example, tag experiments by the machine type used to execute them, label versions, team names, or any other category.

You can use tags to filter your experiments in your experiment table (see Filtering Columns) or when querying experiments in your code (see Tag Filters). You can trigger experiment execution according to their tags (see TriggerScheduler) or automatically deploy models according to their tags (see ClearML Serving).

To add tags:

  1. Click the experiment > Hover over the tag area > +ADD TAG or Bars menu (menu)
  2. Do one of the following:
    • Add a new tag - Type the new tag name > (Create New).
    • Add an existing tag - Click a tag.
    • Customize a tag's colors - Click Tag Colors > Click the tag icon > Background or Foreground > Pick a color > OK > CLOSE.

To remove a tag - Hover over the tag and click X.
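
Tags can also be added and queried from code. A minimal sketch, assuming a clearml version in which Task.get_tasks supports tag filtering (the tag names are illustrative):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="tags demo")
task.add_tags(["production", "resnet50"])

# Retrieve all experiments in the project carrying a given tag
tagged_tasks = Task.get_tasks(project_name="examples", tags=["production"])
```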

Locating the Experiment (Task) ID

The task ID appears in the experiment page's header.
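
The same ID is available from code. A minimal sketch:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="id demo")
print(task.id)  # the ID shown in the experiment page header
```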