Tracking Experiments and Visualizing Results

While an experiment is running, and any time after it finishes, track it and visualize the results in the ClearML Web UI, including:

  • Execution details - Code, the base Docker image used for ClearML Agent, the output destination for artifacts, and the logging level.

  • Configuration - Hyperparameters, user properties, and configuration objects.

  • Artifacts - Input model, output model, model snapshot locations, other artifacts.

  • General information - Information about the experiment, for example: the experiment start, create, and last update dates, the user who created the experiment, and its description.

  • Logs - stdout, stderr, output to the console from libraries, and ClearML explicit reporting.

  • Scalars - Metric plots.

  • Plots - Other plots and data, for example: Matplotlib, Plotly, and ClearML explicit reporting.

  • Debug samples - Images, audio, video, and HTML.

Viewing modes

The ClearML Web UI provides two viewing modes for experiment details: the info panel and the full screen details mode. Both contain all experiment details. When either view is open, switch to the other mode by clicking View in experiments table, or by selecting Results from the menu.

Info panel

The info panel keeps the experiments table in view so that you can perform experiment actions from the table (as well as from the menu in the info panel).


Full screen details view

The full screen details view allows you to more easily view and work with experiment tracking and results. The experiments table is not visible when the full screen details view is open. Perform experiment actions from the menu.


Execution details

Source code, ClearML Agent configuration, and output details

  • Execution details include the experiment's source code information (repository, commit ID, script path, and working directory), the ClearML Agent configuration, the artifacts output destination, and the logging level.

    The ClearML Agent base image is a pre-configured Docker image that ClearML Agent uses to remotely execute this experiment (see Building Docker containers).

    The output destination is used for storing model checkpoints (snapshots) and artifacts (see also default_output_uri in the configuration file, and the output_uri parameter of Task.init). The logging level for the experiment uses the standard Python logging levels. For a way to set the output destination from code, see the sketch after this list.

  • Uncommitted changes.

  • Installed Python packages and their versions.

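As referenced in the list above, the output destination can be set per experiment when the Task is initialized. A minimal sketch, assuming the ClearML Python SDK is installed; the project name, task name, and storage URI are placeholders:

```python
from clearml import Task

# Placeholder project/task names and storage URI.
# output_uri controls where model checkpoints (snapshots) and artifacts are uploaded;
# it can also be set globally via default_output_uri in clearml.conf.
task = Task.init(
    project_name="examples",
    task_name="execution details demo",
    output_uri="s3://my-bucket/clearml-output",
)
```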

Configuration

All parameters and configuration objects appear in the CONFIGURATION tab.

Hyperparameters

Important

In older versions of ClearML Server, the CONFIGURATION tab was named HYPER PARAMETERS, and it contained all parameters. The renamed tab contains a HYPER PARAMETERS section, with subsections for the hyperparameter groups.

Hyperparameters are grouped by their type and appear in CONFIGURATION > HYPER PARAMETERS.

Command line arguments

The Args section shows automatically logged argparse arguments, as well as all parameters of older experiments, except TensorFlow Definitions. Hover over a parameter to see its type, description, and default value, if they were provided.

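A minimal sketch of how argparse arguments end up in the Args section, assuming the ClearML Python SDK; the argument names, values, and help text are placeholders:

```python
from argparse import ArgumentParser

from clearml import Task

# Initializing the task enables automatic argparse logging.
task = Task.init(project_name="examples", task_name="argparse demo")

parser = ArgumentParser()
parser.add_argument("--lr", type=float, default=0.01, help="learning rate")
parser.add_argument("--epochs", type=int, default=10, help="number of training epochs")

# Once parsed, the arguments appear under CONFIGURATION > HYPER PARAMETERS > Args,
# including the type, help text, and default value shown on hover.
args = parser.parse_args()
```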

Environment variables

The Environment section shows environment variables, if the CLEARML_LOG_ENVIRONMENT environment variable is set (see this FAQ).

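A minimal sketch of enabling environment-variable logging from the script itself, assuming the variable is set before the Task is initialized; the variable names listed are placeholders (see the FAQ referenced above for the accepted values):

```python
import os

from clearml import Task

# Placeholder variable names; the value could also be set in the shell
# before launching the script.
os.environ["CLEARML_LOG_ENVIRONMENT"] = "MY_DATA_DIR,MY_EXPERIMENT_MODE"

task = Task.init(project_name="examples", task_name="environment demo")
```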

Custom parameter groups

Custom sections show parameter dictionaries, if they were connected to the Task and the Task.connect method's name parameter was provided.

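A minimal sketch of connecting a named parameter dictionary, assuming the ClearML Python SDK; the section name and parameter values are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="parameter groups demo")

# The name parameter creates a custom section under CONFIGURATION > HYPER PARAMETERS.
preprocessing = {"resize": 256, "crop": 224, "normalize": True}
preprocessing = task.connect(preprocessing, name="Preprocessing")
```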

TensorFlow Definitions

The TF_DEFINE section shows automatically logged TensorFlow definitions.

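A minimal sketch of TensorFlow definitions that the automatic logging described above is expected to capture, assuming TensorFlow is installed; the flag names and values are placeholders:

```python
import tensorflow as tf

from clearml import Task

task = Task.init(project_name="examples", task_name="tensorflow defines demo")

# Placeholder flag definitions; automatically captured flags appear under
# CONFIGURATION > HYPER PARAMETERS > TF_DEFINE.
flags = tf.compat.v1.flags
flags.DEFINE_integer("batch_size", 32, "training batch size")
flags.DEFINE_float("dropout", 0.5, "dropout rate")
FLAGS = flags.FLAGS
```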

Once an experiment has run and been stored in ClearML Server, any of these hyperparameters may also have been set by editing the experiment in the UI.

User properties

User properties allow you to store any descriptive information as key-value pairs. They can be edited in any experiment, except experiments whose status is Published (read-only).

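A minimal sketch of setting user properties from code, assuming the ClearML Python SDK; the property name, description, and value are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="user properties demo")

# Placeholder property; it appears in the USER PROPERTIES section and
# remains editable in the UI until the experiment is published.
task.set_user_properties(
    {"name": "dataset version", "description": "input data tag", "value": "v2.1"}
)
```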

Configuration objects

ClearML tracks experiment (Task) model configuration objects, which appear in Configuration Objects > General. This includes configurations tracked automatically and those connected to a Task in code (see Task.connect_configuration). ClearML supports providing a name for a Task model configuration (see the name parameter of Task.connect_configuration).

Important

In older versions of ClearML Server, the Task model configuration appeared in the ARTIFACTS tab, MODEL CONFIGURATION section. Task model configurations now appear in the Configuration Objects section, in the CONFIGURATION tab.


Custom configuration objects, connected with a name, appear in their own sections.

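A minimal sketch of connecting configuration objects, assuming the ClearML Python SDK; the configuration contents, section names, and file path are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="configuration objects demo")

# Without a name, the configuration appears under Configuration Objects > General;
# with a name, it appears in its own section.
model_config = {"layers": 4, "hidden_size": 512, "activation": "relu"}
model_config = task.connect_configuration(model_config, name="model architecture")

# A configuration file can be connected the same way (placeholder path).
task.connect_configuration("config/training.yaml", name="training config")
```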

Artifacts

Artifacts tracked with an experiment appear in the ARTIFACTS tab and include models and other artifacts.

Copy the location of models and artifacts stored in local files (file://) to the clipboard. Download models and artifacts stored in remote storage (for example, https:// or s3://).
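A minimal sketch of uploading artifacts so they appear in the ARTIFACTS tab, assuming the ClearML Python SDK; the artifact names, object, and file path are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="artifacts demo")

# Placeholder artifacts; whether they are stored locally (file://) or remotely
# depends on the configured output destination.
task.upload_artifact(name="run summary", artifact_object={"accuracy": 0.92, "loss": 0.31})
task.upload_artifact(name="results file", artifact_object="results.csv")  # placeholder path
```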

Models

The input and output models appear in the ARTIFACTS tab, which shows the model names, their model designs, and the name of the experiment that created each model (stored it in ClearML Server). In ClearML, models are associated with the experiment, but the model details, including the model design, label enumeration, and general information about the model, are in the MODELS tab. The model name is a hyperlink to those details.

To retrieve a model:

  1. In the ARTIFACTS tab > MODELS > Input Model or Output Model, click the model name hyperlink.

  2. In the model details > GENERAL tab > MODEL URL, either:

    • Download the model, if it is stored in remote storage.

    • Copy its location to the clipboard, if it is in a local file.

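The models associated with a stored experiment can also be accessed from code. A minimal sketch, assuming the ClearML Python SDK; the task ID is a placeholder:

```python
from clearml import Task

# Placeholder ID; see "Locating the experiment (Task) ID" below.
task = Task.get_task(task_id="<task id>")

# task.models holds the input and output models associated with the experiment.
for model in task.models["output"]:
    print(model.name, model.url)
```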

Other artifacts

To retrieve another artifact:

  1. In the ARTIFACTS tab > DATA AUDIT or OTHER > Select an artifact > Either:

    • Download the artifact, if it is stored in remote storage.

    • Copy its location to the clipboard, if it is in a local file.

Data audit

Artifacts which are uploaded and dynamically tracked by ClearML appear in the DATA AUDIT section. They include the file path, file size, hash, and metadata stored with the artifact.

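A minimal sketch of registering a dynamically tracked artifact, assuming the ClearML Python SDK and pandas; the DataFrame contents and metadata are placeholders:

```python
import pandas as pd

from clearml import Task

task = Task.init(project_name="examples", task_name="data audit demo")

# register_artifact dynamically tracks the DataFrame; it appears in the
# DATA AUDIT section together with its file path, size, hash, and metadata.
df = pd.DataFrame({"id": [1, 2, 3], "label": ["cat", "dog", "cat"]})
task.register_artifact(name="training data", artifact=df, metadata={"source": "v1 split"})
```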

Other

Other artifacts, those which are uploaded but not dynamically tracked after the upload, appear in the OTHER section. They include the file path, file size, and hash.


General information

General experiment information is in the INFO tab. This includes information describing the stored experiment: the parent experiment, project, creation, start, and last update dates and times, the user who created the experiment, the experiment state (status), and whether the experiment is archived.


Experiment results

Log

The complete experiment log, containing everything printed to stdout and stderr, is in the LOG tab. The full log can be downloaded. When viewing the log, you can click Jump to end.

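A minimal sketch of what ends up in the LOG tab, assuming the ClearML Python SDK; the messages are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="console log demo")

# stdout/stderr output is captured automatically.
print("this line is captured from stdout")

# Text can also be reported explicitly through the logger.
task.get_logger().report_text("explicitly reported log line")
```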

Scalars

All scalars that ClearML automatically logs, as well as those explicitly reported in code, appear in RESULTS > SCALARS.
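A minimal sketch of explicit scalar reporting, assuming the ClearML Python SDK; the metric and variant names and the values are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="scalars demo")
logger = task.get_logger()

# Each (title, series) pair becomes a metric/variant pair in RESULTS > SCALARS.
for iteration in range(10):
    logger.report_scalar(title="loss", series="train", value=1.0 / (iteration + 1), iteration=iteration)
    logger.report_scalar(title="loss", series="validation", value=1.2 / (iteration + 1), iteration=iteration)
```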

Scalar plot tools

Use the scalar tools to improve your analysis of scalar metrics. The plot tools allow you to:

  • Group by metric (all variants for a metric on the same plot)

  • Group by metric-variant combination (individual metric-variant plots).


    In the info panel, open the plot tools to use them. In the full screen details view, the tools are on the left side of the window.

    In Group by, select Metric (all variants on the same plot) or None (a separate plot for each metric-variant combination).

  • Show / hide plots - Click HIDE ALL, and then show the plots you want to see.

  • Horizontal axis modes (scalars only) - In Horizontal axis, select ITERATIONS, RELATIVE (time since the experiment began), or WALL (local clock time).

  • Curve smoothing (scalars only) - In Smoothing, move the slider or type a smoothing factor between 0 and 0.999.

Plot controls

Each plot supports plot controls that allow you to better analyze the results. Hover over a plot, and the controls appear. Depending on the plot, the available controls may include:

  • Download the plot as a PNG file.

  • Pan around the plot - Click the pan control, click the plot, and then drag.

  • Box select - To examine an area, click the box select control and drag a dotted box around the area.

  • Lasso select - To examine an area, click the lasso select control and drag a dotted lasso around the area.

  • Zoom into a section of the plot - Click the zoom control and drag over a section of the plot. Click the reset control to return to the original scale.

  • Zoom in.

  • Zoom out.

  • Reset to autoscale after zooming.

  • Reset the axes after a zoom.

  • Show / hide spike lines.

  • Show the closest data point on hover, including horizontal and vertical axis values - Click the control and then hover over a series on the plot.

  • Compare data on hover - Click the control and then hover over the plot.

  • Switch to logarithmic view.

  • Hide / show the legend.

  • Download the plot data to a JSON file for further analysis.

Other plots

Other plots include data reported by libraries, visualization tools, and ClearML explicit reporting. These may include 2D and 3D plots, tables (Pandas and CSV files), and Plotly plots. Other plots appear in RESULTS > PLOTS. You can show / hide individual plots and filter by title.

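A minimal sketch of producing plots that appear in RESULTS > PLOTS, assuming the ClearML Python SDK with Matplotlib and pandas installed; the data, titles, and series names are placeholders:

```python
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd

from clearml import Task

task = Task.init(project_name="examples", task_name="plots demo")
logger = task.get_logger()

# Matplotlib figures are captured automatically when shown.
x = np.arange(10)
plt.plot(x, x ** 2)
plt.title("example curve")
plt.show()

# Tables can be reported explicitly.
df = pd.DataFrame({"metric": ["accuracy", "loss"], "value": [0.92, 0.31]})
logger.report_table(title="results table", series="summary", iteration=0, table_plot=df)
```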

Debug samples

View debug samples by metric at any iteration. The most recent iteration appears first. Use the viewer / player to inspect image, audio, and video samples and do any of the following:

  • Move to the same sample in a different iteration (move the iteration slider).

  • Show the next or previous iteration’s sample.

  • Download the file.

  • Zoom.

  • View the sample’s iteration number, width, height, and coordinates.

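A minimal sketch of reporting debug samples, assuming the ClearML Python SDK with NumPy installed; the titles, series names, data, and file path are placeholders:

```python
import numpy as np

from clearml import Task

task = Task.init(project_name="examples", task_name="debug samples demo")
logger = task.get_logger()

# Reported media appear in the DEBUG SAMPLES tab, grouped by metric (title).
image = (np.random.rand(256, 256, 3) * 255).astype("uint8")
logger.report_image(title="samples", series="random noise", iteration=0, image=image)

# Audio, video, and HTML files can be reported as media (placeholder path).
logger.report_media(title="samples", series="report", iteration=0, local_path="report.html")
```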

To view debug samples:

  1. Click the DEBUG SAMPLES tab. The most recent iteration appears at the top.

  2. Locate debug samples by doing the following:

    • Filter by metric. In the Metric list, choose a metric.

    • Show other iterations. Click the navigation controls: Older images, New images, or Newest images.

To view a debug sample in the viewer / player:

  1. Click the debug sample's thumbnail.

  2. Do any of the following:

    • Move to the same sample in another iteration - Move the slider, or click < (previous) or > (next).

    • Download the file - Click the download button.

    • Zoom.

    • For images, locate a position on the sample - Hover over the sample and the X, Y coordinates appear in the legend below the sample.

Tagging experiments

Tags are user-defined, color-coded labels that you can add to experiments (and models), allowing you to easily identify and group experiments. A tag can show any text you choose, for any purpose. For example, add tags for the type of remote machine experiments execute on, label versions of experiments, or apply team names to organize your experimentation.
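Tags can also be added from code. A minimal sketch, assuming the ClearML Python SDK; the tag names are placeholders:

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="tags demo")

# Placeholder tags; they appear in the UI exactly like tags added manually.
task.add_tags(["gpu-machine", "baseline"])
```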

  • Adding tags and changing tag colors:

    1. Click the experiment > Hover over the tag area > +ADD TAG or the menu.

    2. Do one of the following:

      • Add a new tag - Type the new tag name > (Create New).

      • Add an existing tag - Click a tag.

      • Change a tag’s colors - Click Tag Colors > Click the tag icon > Background or Foreground > Pick a color > OK > CLOSE.

  • Remove a tag - Hover over the tag > X.

Locating the experiment (Task) ID

  • In the info panel, top area, click ID. The Task ID appears.
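The ID can then be used to access the stored experiment from code. A minimal sketch, assuming the ClearML Python SDK; the ID is a placeholder:

```python
from clearml import Task

# Placeholder ID, copied from the info panel.
task = Task.get_task(task_id="aabbccddeeff00112233445566778899")
print(task.name, task.get_status())
```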