Roman Kazinnik

Recipe for Distributed Hyperparameter Search with an unlikely couple: Keras Tuner and Kubeflow

Kubeflow offers a built-in hyperparameter search (HS) tool, Katib, yet there are still a number of reasons one may prefer Keras Tuner for HS:

  • keep the codebase independent of Kubeflow

  • keep Kubeflow independent of the HS tool and the codebase

  • keep everything in TensorFlow

  • a preference for HS that lives in the codebase, with versioning and CI/CD

  • hesitance to run hundreds of HS experiments against a single Kubeflow endpoint

HS produces a list of model architectures sorted by model score. Asynchronous HS produces such a list progressively, updating it as new trials complete. Keras Tuner can distribute HS across a set of CPU or GPU workers without any code changes, and the progressive mode can be achieved by looping with an increasing number of trials while pointing at the same tuner directory.
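As a concrete illustration, here is a minimal sketch of that loop, assuming the keras_tuner package and using MNIST as a stand-in dataset; the search space, trial budgets, and directory names are illustrative placeholders, not the exact setup of this post. Because each pass recreates the tuner with the same directory and overwrite=False, previously completed trials are reloaded from disk and only the newly added trials are computed:

```python
import keras_tuner as kt
import tensorflow as tf

# Stand-in data: flattened MNIST digits.
(x_train, y_train), _ = tf.keras.datasets.mnist.load_data()
x_train = x_train.reshape(-1, 784).astype("float32") / 255.0

def build_model(hp):
    # Illustrative one-knob search space.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(hp.Int("units", 32, 512, step=32),
                              activation="relu"),
        tf.keras.layers.Dense(10, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Progressive HS: grow the trial budget while reusing the same tuner
# directory. Finished trials are reloaded, so each pass only runs the
# newly added trials.
for max_trials in (10, 20, 40):
    tuner = kt.RandomSearch(
        build_model,
        objective="val_accuracy",
        max_trials=max_trials,
        directory="hs_results",    # same directory on every pass
        project_name="mnist_hs",
        overwrite=False,           # resume rather than restart
    )
    tuner.search(x_train, y_train, epochs=3, validation_split=0.2)
    # The running "best architectures so far" list, sorted by score.
    tuner.results_summary(num_trials=5)
```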


I will show how to scale HS horizontally using both single- and multi-GPU nodes.
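As a preview, the sketch below shows the two pieces such a setup would involve, offered as an assumption about the eventual recipe rather than the finished one: Keras Tuner's documented chief/worker environment variables for horizontal scaling, plus a distribution_strategy so each trial uses all GPUs on its node. It reuses build_model and the data from the sketch above; the address, port, and bucket path are hypothetical placeholders:

```python
import tensorflow as tf
import keras_tuner as kt

# Horizontal scaling: every node runs this same script. One process is
# the "chief" (it runs the Oracle that hands out trials); the rest are
# workers. They differ only in environment variables, normally set by
# the launcher (e.g. a pod spec), not in code:
#   KERASTUNER_TUNER_ID     "chief" on the chief; "tuner0", "tuner1", ... on workers
#   KERASTUNER_ORACLE_IP    address of the chief       (hypothetical, e.g. "10.0.0.5")
#   KERASTUNER_ORACLE_PORT  port the Oracle listens on (hypothetical, e.g. "8000")

tuner = kt.RandomSearch(
    build_model,                  # same build function as above
    objective="val_accuracy",
    max_trials=40,
    # In distributed mode the directory must live on storage every
    # node can reach, e.g. NFS or a bucket (path is hypothetical).
    directory="gs://my-bucket/hs_results",
    project_name="mnist_hs",
    # Per-node multi-GPU: each trial trains with data parallelism
    # across the GPUs visible to that worker.
    distribution_strategy=tf.distribute.MirroredStrategy(),
)
tuner.search(x_train, y_train, epochs=3, validation_split=0.2)
```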


TO BE CONTINUED




