PyTorch with TensorBoard

The pytorch_tensorboard.py example demonstrates the integration of Trains into code that uses PyTorch and TensorBoard. It trains a simple deep neural network on the PyTorch built-in MNIST dataset, creating a TensorBoard SummaryWriter object to log scalars during training, as well as scalars and debug samples during testing. The script also reports a test text message to the console to demonstrate Trains console capture. When the script runs, it creates an experiment named pytorch with tensorboard, which is associated with the examples project.
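The snippet below is a minimal sketch of how such a script is typically set up. The project and experiment names match those mentioned above, but the SummaryWriter log directory ('runs') and everything else are illustrative assumptions, not the exact contents of pytorch_tensorboard.py.

```python
from trains import Task
from torch.utils.tensorboard import SummaryWriter

# A single Task.init call registers the experiment under the examples project
task = Task.init(project_name='examples', task_name='pytorch with tensorboard')

# TensorBoard writer; Trains hooks it and mirrors everything it logs
# ('runs' is an assumed log directory)
writer = SummaryWriter('runs')
```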

Scalars

In the example script, the train and test functions call the TensorBoard SummaryWriter.add_scalar method to log loss. These scalars appear in RESULTS > SCALARS, along with the resource utilization plots, which are titled :monitor: machine.

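A rough sketch of the kind of add_scalar calls involved is shown below. The scalar titles ('Train/Loss', 'Test/Loss') and the synthetic loss values are assumptions for illustration, not the example script's actual code.

```python
import math
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('runs')

for iteration in range(100):
    # A decaying placeholder value stands in for the real training loss
    train_loss = math.exp(-iteration / 25.0)
    writer.add_scalar('Train/Loss', train_loss, global_step=iteration)

# One scalar per epoch for the test loss (placeholder value)
writer.add_scalar('Test/Loss', 0.1, global_step=0)
writer.close()
```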

Debug samples

Trains automatically tracks images and text reported to TensorBoard. They appear in RESULTS > DEBUG SAMPLES.

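For example, calls along the following lines produce image and text debug samples. The tags and the random tensor are illustrative assumptions standing in for real test data.

```python
import torch
from torch.utils.tensorboard import SummaryWriter

writer = SummaryWriter('runs')

# An image logged during testing shows up under RESULTS > DEBUG SAMPLES
sample = torch.rand(1, 28, 28)  # random (C, H, W) tensor standing in for an MNIST test image
writer.add_image('test/input', sample, global_step=0)

# Text logged through TensorBoard is also captured as a debug sample
writer.add_text('test/message', 'Trains captures this text automatically', global_step=0)
writer.close()
```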

Hyperparameters

Trains automatically logs command line options defined with argparse. They appear in CONFIGURATIONS > HYPER PARAMETERS > Args.

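A sketch of how argparse arguments get picked up is shown below; the specific argument names and default values are assumptions, not the example script's actual options.

```python
import argparse
from trains import Task

task = Task.init(project_name='examples', task_name='pytorch with tensorboard')

parser = argparse.ArgumentParser(description='PyTorch MNIST example')
parser.add_argument('--batch-size', type=int, default=64, help='training batch size')
parser.add_argument('--epochs', type=int, default=10, help='number of training epochs')
parser.add_argument('--lr', type=float, default=0.01, help='learning rate')

# Parsing after Task.init lets Trains record the argument names and values
args = parser.parse_args()
```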

Log

Text printed to the console for training progress, as well as all other console output, appears in RESULTS > LOG.

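Console output comes from ordinary print statements; the progress format below is only an illustration of the kind of text that gets captured.

```python
# Anything written to stdout or stderr is captured and shown in RESULTS > LOG
for epoch in range(1, 4):
    placeholder_loss = 1.0 / epoch  # stands in for the real epoch loss
    print('Train Epoch: {} \tLoss: {:.6f}'.format(epoch, placeholder_loss))
```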

Artifacts

Model artifacts associated with the experiment appear in the experiment info panel (in the EXPERIMENTS tab), and in the model info panel (in the MODELS tab).

The experiment info panel shows model tracking, including the model name and design (in this case, no design was stored).

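Model tracking happens because Trains hooks the framework's save calls. A minimal sketch, assuming a stand-in network and checkpoint file name ('mnist_cnn.pt') rather than the example's actual model:

```python
import torch
import torch.nn as nn
from trains import Task

task = Task.init(project_name='examples', task_name='pytorch with tensorboard')

model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))  # stand-in for the example's network

# Trains hooks torch.save, so the checkpoint is registered as the experiment's output model
torch.save(model.state_dict(), 'mnist_cnn.pt')
```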

The model info panel contains the model details, including the model URL, framework, and snapshot locations.
