Tuning experiments

Tune experiments and execute the tuned experiments on local or remote machines by cloning an experiment (making an exact, editable copy), editing the clone, and then enqueuing the newly cloned experiment for execution by a worker.

Edit an experiment’s execution details, including: the repository, branch, and commit ID; Python packages and versions; the Docker image that ClearML Agent will use for remote execution; the log level; the configuration (network design and parameters - parameters originating in scripts as hyperparameters, command-line arguments, and TensorFlow Definitions, as well as parameters added later); and the model configuration.

Tuning experiments (and executing remotely)

To tune an experiment and execute it remotely:

  1. Locate the experiment. Open the Project page for the experiment from the Home page or the main Projects page.

    • On the Home page, click a recent experiment, project card, or VIEW ALL and then click a project card.

    • On the Projects page, click a project card, or the All projects card.

  2. Clone the experiment. In the experiments table:

    1. Click Clone.

    2. In the Clone experiment dialog, in the Project textbox, select or create a project. To search for another project, start typing the project name. To create a new project, type the new project name and click Create New.

    3. Enter an optional description.

    4. Click CLONE.

    The experiment’s status becomes Draft.

  3. Edit the experiment. See Modifying experiments below.

  4. Enqueue the experiment for execution. Right-click the experiment > Enqueue > select a queue > ENQUEUE.

    The experiment’s status becomes Pending. When the worker fetches the Task (experiment), the status becomes Running. You can now track your experiment and visualize the results.
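The clone-and-enqueue workflow can also be scripted with the ClearML SDK. A minimal sketch, assuming a reachable ClearML server (the project, experiment, and queue names are placeholders):

```python
from clearml import Task

# Locate the experiment (Task) to tune; names here are placeholders
source = Task.get_task(project_name="examples", task_name="baseline training")

# Clone it - the copy starts in Draft status and is fully editable
cloned = Task.clone(source_task=source, name="baseline training - tuned")

# Edit the draft, e.g. override one hyperparameter in the Args section
cloned.set_parameter("Args/learning_rate", 0.001)

# Enqueue it; a worker listening on this queue will execute it
Task.enqueue(cloned, queue_name="default")
```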

Modifying experiments

Experiments whose status is Draft are editable (see the user properties exception). In the ClearML Web UI, edit any of the following:

Execution details

Source code

Select source code by changing any of the following:

  • Repository, commit (select by ID, tag name, or choose the last commit in the branch), script, and/or working directory.

  • Installed Python packages and/or versions - Edit or clear (remove) them all.

  • Uncommitted changes - Edit or clear (remove) them all.

To select different source code:

  • In the EXECUTION tab, hover over a section > EDIT (or DISCARD DIFFS for UNCOMMITTED CHANGES) > edit > SAVE.
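These source code details can also be changed on a Draft task from code. A sketch, assuming a recent clearml SDK where Task.set_script and Task.set_packages are available (the task ID, repository URL, and package pins are placeholders):

```python
from clearml import Task

task = Task.get_task(task_id="<draft_task_id>")  # placeholder Draft task ID

# Point the task at a different repository, branch, and entry point
task.set_script(
    repository="https://github.com/example/repo.git",  # placeholder repo
    branch="main",
    working_dir=".",
    entry_point="train.py",
)

# Replace the recorded Python package requirements
task.set_packages(["torch==2.1.0", "numpy>=1.24"])  # placeholder pins
```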

Base Docker image

Select a pre-configured Docker image that ClearML Agent will use to remotely execute this experiment (see Building Docker containers).

To add, change, or delete a base Docker image:

  • In EXECUTION > AGENT CONFIGURATION > BASE DOCKER IMAGE > hover > EDIT > Enter the base Docker image.
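Equivalently, the base Docker image of a Draft task can be set from code with Task.set_base_docker; a minimal sketch (the task ID and image name are placeholders):

```python
from clearml import Task

task = Task.get_task(task_id="<draft_task_id>")  # placeholder Draft task ID

# The agent will execute this task inside the given container image
task.set_base_docker("nvidia/cuda:11.8.0-cudnn8-runtime-ubuntu22.04")
```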

Output destination

Set an output destination for model checkpoints (snapshots) and other artifacts. Examples of supported destination types and the formats for specifying their locations:

  • A shared folder: /mnt/share/folder

  • S3: s3://bucket/folder

  • Google Cloud Storage: gs://bucket-name/folder

  • Azure Storage: azure://company.blob.core.windows.net/folder/

To add, change, or delete an artifact output destination:

  • In EXECUTION > OUTPUT > DESTINATION > hover > EDIT > edit > SAVE.

Note

You can also set the output destination for artifacts in code (see the output_uri parameter of the Task.init method), or for all experiments in the ClearML configuration file (see default_output_uri on the ClearML Configuration Reference page).
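A minimal sketch of setting the destination per experiment with output_uri (the project name and bucket path are placeholders):

```python
from clearml import Task

# Model checkpoints and artifacts of this task are uploaded to output_uri;
# a default for all experiments can be set instead via the
# sdk.development.default_output_uri entry in clearml.conf
task = Task.init(
    project_name="examples",            # placeholder project
    task_name="training with uploads",  # placeholder experiment name
    output_uri="s3://bucket/folder",    # placeholder destination
)
```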

Log level

Set a logging level for the experiment (see the standard Python logging levels).

To add, change, or delete a log level:

  • In EXECUTION > OUTPUT > LOG LEVEL > hover > EDIT > Enter the log level.
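For reference, the standard Python logging levels and their filtering behavior:

```python
import logging

# Standard levels, lowest to highest severity:
# DEBUG (10), INFO (20), WARNING (30), ERROR (40), CRITICAL (50)
logging.basicConfig(level=logging.INFO)

logging.getLogger().info("emitted: at or above the configured level")
logging.getLogger().debug("suppressed: below the configured level")
```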

Configuration

Hyperparameters

Important

In older versions of ClearML Server, the CONFIGURATION tab was named HYPER PARAMETERS, and it contained all parameters. The renamed tab contains a HYPER PARAMETERS section, with subsections for hyperparameter groups.

Add, change, or delete hyperparameters, which in the ClearML Web UI are organized in the following groups:

  • Command-line arguments, and all parameters of older experiments except TensorFlow definitions - In the Args section (automatically logged from code argparse arguments).

  • TensorFlow definitions - In the TF_DEFINE section (automatically logged from code TF_DEFINEs).

  • Parameter dictionaries - In the General section (connected to the Task from code by calling the Task.connect method; see connecting a dict object, and the code sketch after the procedure below).

  • Environment variables - Tracked if you set the CLEARML_LOG_ENVIRONMENT environment variable (see this FAQ).

  • Custom named parameter groups (see the name parameter in Task.connect).

To add, change, or delete hyperparameters:

  • In the CONFIGURATION tab > HYPER PARAMETERS > General > hover > EDIT > add, change, or delete keys and/or values > SAVE.
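The parameter groups above mirror how parameters are logged from code. A short sketch of the common cases (project, task, and parameter names are placeholders):

```python
import argparse
from clearml import Task

task = Task.init(project_name="examples", task_name="hyperparameter demo")

# argparse arguments are logged automatically into the Args section
parser = argparse.ArgumentParser()
parser.add_argument("--batch_size", type=int, default=32)
args = parser.parse_args()

# A connected dict appears in the General section by default...
params = task.connect({"learning_rate": 0.01, "epochs": 10})

# ...or in a custom named section when `name` is given
aug = task.connect({"flip": True, "crop": 224}, name="Augmentation")
```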

User properties

User properties allow you to store any descriptive information in key-value pair format. They are editable in any experiment, except experiments whose status is Published (read-only).

To add, change, or delete user properties:

  • In CONFIGURATION > USER PROPERTIES > Properties > hover > EDIT > add, change, or delete keys and/or values > SAVE.
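User properties can also be attached from code with Task.set_user_properties; a minimal sketch (the keys and values are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="user properties demo")

# Each keyword becomes a key-value pair under USER PROPERTIES > Properties
task.set_user_properties(dataset_version="2024-06", owner="data-team")
```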

Configuration objects

Important

In older versions of ClearML Server, the Task model configuration appeared in the ARTIFACTS tab, MODEL CONFIGURATION section. Task model configurations now appear in CONFIGURATION > Configuration Objects.

Edit the experiment (Task) model configurations.

To add, change, or delete the Task model configurations:

  • In CONFIGURATION > CONFIGURATION OBJECTS > GENERAL > hover > EDIT, or CLEAR (if the configuration is not empty).
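Configuration objects are typically attached from code with Task.connect_configuration; a minimal sketch (the object name and contents are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="config objects demo")

# Stored as a named object under CONFIGURATION > CONFIGURATION OBJECTS;
# when executed remotely, the task reads back any values edited in the UI
model_cfg = task.connect_configuration(
    {"backbone": "resnet50", "dropout": 0.2},  # placeholder contents
    name="General",
)
```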

Artifacts

Initial weights input model

Edit your model configuration and label enumeration, choose a different initial input weight model from the same project or any other project, or remove the model.

Note

The models are editable in the MODELS tab, not the EXPERIMENTS tab. Clicking the model name hyperlink shows the model in the MODELS tab.

To select a different model:

  1. In ARTIFACTS > Input Model > Hover and click EDIT.

  2. If a model is already associated with the experiment, click the remove icon to detach it.

  3. In the SELECT MODEL dialog, select a model from the current project or any other project.
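Selecting an input model can also be done from code; a sketch, assuming the SDK's Task.set_input_model (the task and model IDs are placeholders):

```python
from clearml import Task

task = Task.get_task(task_id="<draft_task_id>")  # placeholder Draft task ID

# Attach a previously registered model as the task's initial weights
task.set_input_model(model_id="<model_id>")  # placeholder model ID
```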

To edit a model’s configuration or label enumeration:

  1. Click the model name hyperlink. The model details appear (in the MODELS tab).

  2. Edit the model configuration or label enumeration.

    • Model configuration - In the NETWORK tab > hover and click EDIT, or CLEAR (to remove the configuration).

      You can also search the configuration (hover over the configuration textbox and the search box appears) and copy the configuration to the clipboard (hover and click the copy icon).

    • Label enumeration - In the LABELS tab > Hover and click EDIT > Add, change, or delete label enumeration key-value pairs (a code sketch follows below).
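A label enumeration maps class names to integer values; from code it can be set on the task's output model with Task.set_model_label_enumeration (the labels are placeholders):

```python
from clearml import Task

task = Task.init(project_name="examples", task_name="label enum demo")

# Class-name -> integer mapping stored with the task's output model
task.set_model_label_enumeration({"background": 0, "cat": 1, "dog": 2})
```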

To remove a model from an experiment:

  • In ARTIFACTS > Input Model > hover and click EDIT, then click the remove icon.