HPO

TorchX integrates with Ax to provide hyperparameter optimization (HPO) support. Since the semantics of an HPO job are highly customizable, especially in the case of Bayesian optimization, an HPO application is hard to generalize into a single executable. Therefore, HPO is offered as a runtime module rather than a builtin component: TorchX provides the libraries and tools to simplify building your own HPO application and component.

HPO App

  1. See HPO with Ax + TorchX to learn how to author an HPO application

  2. Build an image (typically a Docker image) that includes your HPO app, author a component

  3. Run it with the torchx CLI or torchx.pipelines

At a high level, the HPO app sets up the HPO experiment and search space. Each HPO trial is a job defined by the AppDef obtained by evaluating the TorchX component at a point in the parameter space; that point is chosen by the Bayesian optimizer in the Ax platform.

The search space dimensions have to line up with the arguments of the component that you will be running as trials. To launch the HPO app, you can either run its main directly or invoke it remotely with TorchX (in that case you will need to author a component for the HPO app itself). The diagram below depicts how this works.
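To make the trial mechanics concrete, here is a minimal, self-contained sketch in plain Python (no Ax or TorchX dependency). The names `search_space`, `trial_objective`, and `next_point` are hypothetical stand-ins: in a real HPO app the search space belongs to the Ax experiment, the point is proposed by Ax's Bayesian optimizer, and the objective is a TorchX component evaluated at that point and run as a trial job.

```python
import random

# Hypothetical search space; its dimension names must line up with the
# trial function's arguments, just as the Ax search-space dimensions must
# line up with the TorchX component's arguments.
search_space = {"lr": (1e-4, 1e-1), "momentum": (0.0, 0.99)}

def trial_objective(lr: float, momentum: float) -> float:
    # Toy objective standing in for the metric a real trial job reports.
    return (lr - 0.01) ** 2 + (momentum - 0.9) ** 2

def next_point(rng: random.Random) -> dict:
    # A Bayesian optimizer would propose this point; we sample uniformly
    # at random as a stand-in.
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in search_space.items()}

rng = random.Random(0)
trials = []
for _ in range(30):
    point = next_point(rng)
    # In TorchX, evaluating the component at `point` yields an AppDef that
    # is submitted as a job; here we simply call the function.
    trials.append((point, trial_objective(**point)))

best_point, best_val = min(trials, key=lambda t: t[1])
print(best_point, best_val)
```

The loop is intentionally simplistic; its only purpose is to show that each trial is "a point in the parameter space applied to the component's arguments".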

[Diagram: HPO app workflow (hpo_diagram.png)]

Example

The ax_test.py unit test is a good end-to-end example of how everything works. It runs an HPO experiment where each trial evaluates the TorchX builtin component torchx.components.utils.booth().
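For reference, the Booth function is a standard optimization test function with a global minimum of 0 at (1, 3), which makes it a convenient objective for exercising an HPO loop. A minimal sketch of the objective it computes (the real component runs this as a job and reports the value as the trial metric):

```python
def booth(x1: float, x2: float) -> float:
    # Booth function: (x1 + 2*x2 - 7)^2 + (2*x1 + x2 - 5)^2.
    # Global minimum of 0 at (x1, x2) = (1, 3).
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2

print(booth(1, 3))  # → 0
```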
