HPO
TorchX integrates with Ax to provide hyperparameter optimization (HPO) support. Since the semantics of an HPO job is highly customizable, especially in the case of Bayesian optimization, an HPO application is hard to generalize as an executable. Therefore, HPO is offered as a runtime module rather than a builtin component. This means that TorchX provides you with the libraries and tools to simplify the building of your own HPO application and component.
HPO App
See HPO with Ax + TorchX to learn how to:

1. Author an HPO application
2. Build an image (typically a Docker image) that includes your HPO app, and author a component for it
3. Run it with the torchx CLI or torchx.pipelines
At a high level, the HPO app sets up the HPO experiment and search space. Each HPO trial is a job that is defined by the AppDef obtained by evaluating the TorchX component at a point in the parameter space. This point is determined by the Bayesian optimizer within the Ax platform.
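The trial-to-AppDef relationship can be sketched as follows. This is a hypothetical, simplified example rather than the actual TorchX or Ax API: the component name, its arguments, the search-space dict, and the plain dict standing in for a specs.AppDef are all assumptions made for illustration.

```python
import inspect

# Hypothetical sketch, not the real TorchX/Ax API: a component is a
# function from hyperparameters to a job definition. In TorchX the
# return value would be a specs.AppDef; a plain dict stands in here.
def trainer_component(lr: float, batch_size: int) -> dict:
    return {
        "name": "trainer",
        "args": ["--lr", str(lr), "--batch_size", str(batch_size)],
    }

# The search-space dimension names line up with the component's
# argument names, so a trial point can be passed straight through.
search_space = {"lr": [1e-4, 1e-1], "batch_size": [16, 32, 64]}
assert set(search_space) <= set(inspect.signature(trainer_component).parameters)

# One trial: evaluate the component at a point chosen by the optimizer.
trial_point = {"lr": 0.01, "batch_size": 32}
app_def = trainer_component(**trial_point)
print(app_def["args"])  # ['--lr', '0.01', '--batch_size', '32']
```

Each trial re-evaluates the component at a new point, producing a new job definition that the scheduler then runs.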
The search space dimensions have to line up with the arguments of the component that you will be running as trials. To launch the HPO app, you can either run its main module directly or invoke it remotely using TorchX (in this case you will need to author a component for the HPO app itself). The diagram below depicts how this works.

Example
The ax_test.py unittest is a great end-to-end example of how everything works. It demonstrates running an HPO experiment where each trial runs the TorchX builtin component torchx.components.utils.booth().
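For reference, the Booth function that this builtin component is named after is a standard two-dimensional optimization test function with a global minimum of 0 at (1, 3). A direct Python sketch of the function itself (independent of the TorchX component, which wraps it as a runnable job):

```python
def booth(x1: float, x2: float) -> float:
    # Booth test function: f(x1, x2) = (x1 + 2*x2 - 7)^2 + (2*x1 + x2 - 5)^2
    # Global minimum: f(1, 3) = 0
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2

print(booth(1.0, 3.0))  # 0.0
```

Because its optimum is known, the HPO experiment in the test can verify that the optimizer converges toward (1, 3).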