Experiment Details

Overview

This page describes the experiment details panel in the Trains Web-App Experiments tab. The details panel contains all the information for an experiment, with easy access to model details, and is organized into the tabs described below: EXECUTION, HYPER PARAMETERS, ARTIFACTS, INFO, and RESULTS.

The remaining sections of this page highlight features and actions you can perform for an experiment.

Experiment details

EXECUTION

The EXECUTION tab shows source control and execution environment information from the most recent run of the experiment.

SOURCE CODE

  • In Draft status experiments, you can select a different repository, branch, commit, script, and / or working directory for the experiment.

UNCOMMITTED CHANGES and INSTALLED PACKAGES

  • In Draft status experiments, you can discard the git diff and select different Python packages and / or versions for the experiment.

AGENT CONFIGURATION and OUTPUT

  • In Draft status experiments, you can select a different Docker image (used when a worker runs the experiment in Docker mode), as well as a different output destination for model snapshots, artifacts, and uploaded images.
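
Both values can also be set from the experiment script. A minimal sketch using the trains SDK; the bucket URI and Docker image below are placeholders:

```python
from trains import Task

# output_uri controls where model snapshots and artifacts are uploaded;
# the bucket URI is a placeholder.
task = Task.init(project_name='examples', task_name='execution demo',
                 output_uri='s3://my-bucket/experiments')

# Base Docker image used when a worker runs this Task in Docker mode
# (the image name is a placeholder).
task.set_base_docker('nvidia/cuda:10.1-runtime-ubuntu18.04')
```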

HYPER PARAMETERS

The HYPER PARAMETERS tab shows the names and values of the experiment's hyperparameters.

  • In Draft status experiments, you can add, change, or delete hyperparameters.
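
Hyperparameters typically reach this tab when the experiment script connects them to its Task. A minimal sketch using the trains SDK; the parameter names and values are placeholders:

```python
from trains import Task

task = Task.init(project_name='examples', task_name='hyperparameter demo')

# Connecting a dict logs its keys and values to the HYPER PARAMETERS tab;
# when an agent re-runs a cloned Draft, edited values are fed back into the dict.
params = {'learning_rate': 0.001, 'batch_size': 64, 'epochs': 10}
params = task.connect(params)
```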

ARTIFACTS

MODELS section - Input Model

This section shows the input model MODEL NAME and CREATING EXPERIMENT.

  • In Draft status experiments, you can switch to a different input model, and edit the MODEL CONFIGURATION.
  • The model name and creating experiment are hyperlinks to view model details and experiment details, respectively.
  • The model is downloadable.
  • The model's path can be copied to the clipboard.

MODELS section - Output Model

This section shows the output model MODEL NAME and the output MODEL CONFIGURATION.

  • The output model name is a hyperlink to view the model details.
  • The model is downloadable.
  • The model's path can be copied to the clipboard.

DATA AUDIT

This section shows registered artifacts (dynamically synchronized with Trains).

  • Artifacts are shown with their file path, file size, hash, and metadata.
  • Artifacts are downloadable.
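
Artifacts are registered from the experiment script. A minimal sketch, assuming a Pandas DataFrame; the names and values are placeholders:

```python
import pandas as pd
from trains import Task

task = Task.init(project_name='examples', task_name='artifact demo')

# Registered artifacts are watched by Trains: changes to the DataFrame are
# re-uploaded automatically and appear in the DATA AUDIT section.
df = pd.DataFrame({'epoch': [1, 2], 'accuracy': [0.71, 0.78]})
task.register_artifact(name='train_stats', artifact=df)
```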

OTHER

This section shows uploaded (one-time, static) artifacts.

  • Artifacts are shown with their file path, file size, hash, and metadata.
  • Artifacts are downloadable.
  • HTML file artifacts can be opened in your browser.

These uploaded artifacts can include:

  • Numpy arrays, which are stored as NPZ files.
  • Static Pandas DataFrames, which are one-time, stored versions of a DataFrame (not dynamically synchronized).
  • Dictionaries, which are stored as JSON files.
  • Local files.
  • Local folders, which are stored as ZIP files.
  • Images, which are stored as PNG files.
  • Any other objects you store.
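
Uploads of these types go through the Task's upload_artifact call. A minimal sketch; the artifact names and the file path are placeholders:

```python
import numpy as np
from trains import Task

task = Task.init(project_name='examples', task_name='upload demo')

# One-time, static uploads appear in the OTHER section.
task.upload_artifact('config', artifact_object={'seed': 42})        # dict, stored as JSON
task.upload_artifact('scores', artifact_object=np.arange(10))       # numpy array, stored as NPZ
task.upload_artifact('report', artifact_object='reports/run.html')  # local file; path is a placeholder
```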

INFO

The INFO tab shows general information about the experiment, including the dates and times of the experiment activities (creation, start, last update, completion) and the last iteration, as well as other information.

RESULTS

This tab contains experiment results automagically captured by Trains and explicit reporting which you can add to your Python experiment scripts. To learn about explicit reporting, see our Explicit Reporting tutorial.
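
Explicit reporting goes through the Task's Logger. A minimal sketch; the reported text is a placeholder:

```python
from trains import Task

task = Task.init(project_name='examples', task_name='reporting demo')
logger = task.get_logger()

# Text reported this way appears in the LOG sub-tab, alongside the
# stdout and stderr that Trains captures on its own.
logger.report_text('finished preprocessing, starting training')
```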

LOG

This sub-tab shows the experiment log, including stdout, stderr, and explicit reporting you may add to your experiment script.

  • The full log is downloadable.
  • The log is searchable.

SCALARS

This sub-tab shows the scalar metrics plots that Trains automagically captures from metrics, resource monitoring, and visualization tools you may use such as TensorBoard/TensorBoardX, Matplotlib, and Seaborn, as well as explicit scalar reporting you may add to your experiment script.

Each scalar metrics plot provides controls allowing you to better analyze your results, including:

  • Switching the horizontal axis between iterations, time relative to the start of the experiment, and wall time.
  • Smoothing the curve.
  • Zooming, panning, and viewing the closest data point as you hover over the plot.
  • Switching the plot between linear and logarithmic views.
  • Downloading the plot as a PNG or JSON file.
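
Scalars can also be reported explicitly from the experiment script. A minimal sketch; the metric is a placeholder:

```python
from trains import Task

task = Task.init(project_name='examples', task_name='scalar demo')
logger = task.get_logger()

# Each (title, series) pair becomes one curve on a scalar plot.
for iteration in range(100):
    loss = 1.0 / (iteration + 1)  # placeholder metric
    logger.report_scalar(title='loss', series='train', value=loss, iteration=iteration)
```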

PLOTS

The PLOTS sub-tab shows plots of any data that Trains automagically captures from visualization tools, as well as explicit reporting you may add to your experiment script.

These plots provide the same controls as scalar metrics plots in the SCALARS sub-tab.
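
Explicitly reported plots also appear here, for example a histogram. A minimal sketch; the values are placeholders:

```python
import numpy as np
from trains import Task

task = Task.init(project_name='examples', task_name='plot demo')
logger = task.get_logger()

# An explicitly reported histogram shows up in the PLOTS sub-tab.
logger.report_histogram(title='activations', series='layer1',
                        values=np.random.randn(256), iteration=0)
```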

DEBUG IMAGES

This sub-tab shows thumbnail previews of the debug images stored by your experiment. Debug images allow you to do the following:

  • View all debug images for each iteration.
  • Download debug images.
  • View the ROIs in debug images.
  • Zoom, fit to window, and pan debug images.
  • View the height and width of an image.
  • Hover over an image to see its X, Y coordinates.
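
Debug images are reported from the experiment script. A minimal sketch, assuming a trains version whose report_image accepts an image array; the image itself is a placeholder:

```python
import numpy as np
from trains import Task

task = Task.init(project_name='examples', task_name='debug image demo')
logger = task.get_logger()

# Reported images appear as thumbnails in the DEBUG IMAGES sub-tab.
image = (np.random.rand(64, 64, 3) * 255).astype('uint8')  # placeholder image
logger.report_image(title='samples', series='random', iteration=0, image=image)
```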

Experiment views

The Trains Web-App provides two views of an experiment.

Experiment table view

  • Shows the experiments list and the details panel for one experiment.
  • Use to view all experiment information.
  • Includes all tabs.
  • Quick and easy to switch between experiments.
  • Drag the details panel to make it wider or narrower.
  • Click an experiment in the experiment table and its details panel slides open.

Full screen view

  • View only the experiment results.
  • The details panel fills the screen.
  • Use to more easily view results.

To switch views, do either of the following:

  • Click the View full screen / View in experiment table button (in the upper right corner of the details panel).
  • In the experiment table view, click the menu and then click Results.

View an experiment (Task) Id

  • In the top area of the details panel, click ID. The Task ID appears.
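
Once copied, the ID can be used to fetch the same Task from code. A minimal sketch; the ID is a placeholder:

```python
from trains import Task

# Fetch an existing experiment by the ID shown in the Web-App.
task = Task.get_task(task_id='<copied-task-id>')  # placeholder ID
print(task.id, task.name)
```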

Manage experiments

The Trains Web-App supports features allowing you to manage your experiments. These features are available in the details panel menu, and in the experiment context menu.

To manage an experiment (perform an action on it), do the following:

  • In the experiment's details panel, click the menu and do any of the actions described on the Experiments Table page.

Modify experiments for tuning

Modify Draft status experiments for tuning by editing experiment details such as hyperparameters, input model, configuration, class enumeration, Git repository, Python packages and versions, Docker image, output destination, and log level.

  • In the details panel, on the EXECUTION, HYPER PARAMETERS, or ARTIFACTS tab, hover over the area whose details you want to edit and click EDIT. Add, change, or delete the details, and then click SAVE. A programmatic sketch of the same kind of tuning appears below.
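
The same kind of tuning can be sketched programmatically with the trains SDK, assuming an agent is listening on a queue named 'default'; the ID, name, and parameter value below are placeholders:

```python
from trains import Task

# Clone an experiment into a Draft, override a hyperparameter,
# and enqueue it for a Trains Agent to execute.
base = Task.get_task(task_id='<source-task-id>')         # placeholder ID
draft = Task.clone(source_task=base, name='tuned copy')  # new Draft experiment
draft.set_parameters({'learning_rate': 0.0005})          # placeholder value
Task.enqueue(task=draft, queue_name='default')           # placeholder queue
```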