Manual Random Parameter Search

The manual_random_param_search_example.py example script demonstrates a random parameter search by automating the execution of an experiment multiple times, each time with a different set of random hyperparameters. That base experiment must run first. It is named Keras HP optimization base, and is created by running another Trains example script, base_template_keras_simple.py.
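As a rough sketch of how that base experiment gets its name, the template script initializes a Trains Task along these lines; the examples project name is assumed here, and the actual script goes on to define and train the Keras model:

```python
# Minimal sketch of how base_template_keras_simple.py might register the template
# experiment. The "examples" project name is an assumption; the real script also
# builds and trains a Keras model after this call.
from trains import Task

task = Task.init(project_name='examples', task_name='Keras HP optimization base')
```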

This example accomplishes the automated random parameter search by doing the following (a code sketch of these steps appears after the list):

  1. Creating a parameter dictionary, which is connected to the Task by calling Task.connect so that the parameters are logged by Trains.
  2. Adding the random search hyperparameters and parameters defining the search (e.g., the experiment name, and the number of times to run the experiment).
  3. Creating a Task object referencing the template experiment.
  4. For each set of parameters:
    1. Cloning the Task object, see Task.clone.
    2. Getting the newly cloned Task's parameters, see Task.get_parameters.
    3. Setting the newly cloned Task's parameters to the search values in the parameter dictionary (Step 1), see Task.set_parameters.
    4. Enqueuing the newly cloned Task to execute, see Task.enqueue.
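The sketch below shows how those steps fit together, assuming the trains package is installed, the Keras HP optimization base experiment already exists in the examples project, and a worker is listening on a queue named default. The hyperparameter names and value ranges are illustrative, not the exact ones used by the script.

```python
# A minimal sketch of the manual random parameter search loop.
# Assumptions: the "Keras HP optimization base" experiment exists in the
# "examples" project, and an execution queue named "default" is available.
import random

from trains import Task

# Steps 1-2: create the controller Task and log the search parameters
task = Task.init(project_name='examples',
                 task_name='Random Hyper-Parameter Search Example')
search_params = {
    'template_task_name': 'Keras HP optimization base',
    'number_of_experiments': 3,
    'execution_queue': 'default',
}
search_params = task.connect(search_params)  # logged and editable in the Web UI

# Step 3: reference the template experiment
template_task = Task.get_task(project_name='examples',
                              task_name=search_params['template_task_name'])

# Step 4: clone the template, override its parameters, and enqueue each trial
for i in range(int(search_params['number_of_experiments'])):
    cloned_task = Task.clone(source_task=template_task,
                             name='{} {}'.format(template_task.name, i))
    trial_params = cloned_task.get_parameters()
    trial_params['batch_size'] = random.choice([64, 96, 128])    # illustrative
    trial_params['learning_rate'] = random.uniform(1e-4, 1e-2)   # illustrative
    cloned_task.set_parameters(trial_params)
    Task.enqueue(cloned_task, queue_name=search_params['execution_queue'])
```

Each cloned trial runs independently once a Trains agent pulls it from the queue, so the search scales simply by adding workers to that queue.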

When the example script runs, it creates an experiment named Random Hyper-Parameter Search Example, which is associated with the examples project. This experiment starts the parameter search and creates the following experiments:

  • Keras HP optimization base 0
  • Keras HP optimization base 1
  • Keras HP optimization base 2

When they complete, you can compare the experiment results.