
Warm Starting CMA-ES for Hyperparameter Optimization

Conference Paper
Peer Reviewed

Association for the Advancement of Artificial Intelligence (AAAI 2021)


@inproceedings{nomura2021warmstarting,
  title={Warm Starting CMA-ES for Hyperparameter Optimization},
  author={Nomura, Masahiro and Watanabe, Shuhei and Akimoto, Youhei and Ozaki, Yoshihiko and Onishi, Masaki},
  booktitle={Association for the Advancement of Artificial Intelligence (AAAI 2021)},
  year={2021}
}

Authors

represents authors who contributed equally.

  • Masahiro Nomura
  • Shuhei Watanabe
  • Youhei Akimoto
  • Yoshihiko Ozaki
  • Masaki Onishi

Abstract

Hyperparameter optimization (HPO), formulated as black-box optimization (BBO), is recognized as essential for automation and high performance of machine learning approaches. The CMA-ES is a promising BBO approach with a high degree of parallelism, and has been applied to HPO tasks, often under parallel implementation, and shown superior performance to other approaches including Bayesian optimization (BO). However, if the budget of hyperparameter evaluations is severely limited, which is often the case for end users who do not have access to parallel computing, the CMA-ES exhausts the budget without improving the performance due to its long adaptation phase, resulting in being outperformed by BO approaches. To address this issue, we propose to transfer prior knowledge on similar HPO tasks through the initialization of the CMA-ES, leading to significantly shortening the adaptation time. The knowledge transfer is designed based on a novel definition of task similarity, with which the correlation of the performance of the proposed approach is confirmed on synthetic problems. The proposed warm starting CMA-ES, called WS-CMA-ES, is applied to different HPO tasks where some prior knowledge is available, showing its superior performance over the original CMA-ES as well as BO approaches with or without using the prior knowledge.
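
As a concrete illustration of the warm-starting idea described above, the sketch below follows the warm-start usage pattern of the open-source cmaes Python package, which provides a get_warm_start_mgd helper that estimates a promising multivariate Gaussian from evaluations of a similar (source) task and uses it to initialize CMA-ES on the target task. The toy objective functions, the source-task sampling budget, and the gamma/alpha values here are illustrative assumptions, not values taken from the paper; consult the paper and the released experimental code for the exact procedure.

```python
import numpy as np
from cmaes import CMA, get_warm_start_mgd


def source_task(x):
    # Hypothetical "similar" HPO task whose evaluations serve as prior knowledge.
    return (x[0] - 0.4) ** 2 + (x[1] - 0.4) ** 2


def target_task(x):
    # Target task to be optimized under a small evaluation budget.
    return (x[0] - 0.6) ** 2 + (x[1] - 0.6) ** 2


# (1) Collect (solution, objective value) pairs on the source task,
#     e.g. from a previous random search or HPO run.
rng = np.random.default_rng(0)
source_solutions = [(x, source_task(x)) for x in rng.random((500, 2))]

# (2) Estimate a promising distribution from the best source-task solutions
#     and use it as the initial mean, step-size, and covariance of CMA-ES.
ws_mean, ws_sigma, ws_cov = get_warm_start_mgd(source_solutions, gamma=0.1, alpha=0.1)
optimizer = CMA(mean=ws_mean, sigma=ws_sigma, cov=ws_cov)

# (3) Run CMA-ES on the target task starting from the warm-started initialization.
for generation in range(20):
    solutions = []
    for _ in range(optimizer.population_size):
        x = optimizer.ask()
        solutions.append((x, target_task(x)))
    optimizer.tell(solutions)
```

The key point mirrored here is that the search distribution starts near a region that was promising on the similar task, rather than from an uninformed initialization, so the long adaptation phase of CMA-ES is shortened when the evaluation budget is small.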