Warm Starting Method for CMA-ES

Workshop Paper · Peer Reviewed · Oral Presentation

NeurIPS Workshop on Meta-Learning (MetaLearn2019)

Download PDF · Experimental Code

@inproceedings{nomura2019warmstarting,
  title={Warm Starting Method for CMA-ES},
  author={Nomura, Masahiro and Watanabe, Shuhei and Ozaki, Yoshihiko and Onishi, Masaki},
  booktitle={NeurIPS Workshop on Meta-Learning (MetaLearn2019)},
  year={2019}
}

Authors

* represents authors who contributed equally.

  • Masahiro Nomura
  • Shuhei Watanabe
  • Yoshihiko Ozaki
  • Masaki Onishi

Abstract

The covariance matrix adaptation evolution strategy (CMA-ES) is one of the most promising black-box optimization methods. However, one issue with the CMA-ES is that it requires a substantial amount of time to fit its internal parameters before it can sample good solutions, even when results from previous similar tasks are available. To alleviate this problem, we propose a warm starting method for the CMA-ES. The method determines the initial internal parameters of the CMA-ES by minimizing the KL divergence between the initial multivariate Gaussian distribution of the CMA-ES and a Gaussian mixture model of good solutions from a previous similar task. The results show that the proposed method converges faster than the vanilla CMA-ES, Bayesian optimization, and multi-task Bayesian optimization.
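
As a rough sketch of the idea (not the authors' reference implementation), the snippet below fits a Gaussian mixture to the best solutions from a previous task and moment-matches it to a single Gaussian, which is the closed-form minimizer of the forward KL divergence from a mixture to a Gaussian; that Gaussian's mean and scale are then used to warm-start the CMA-ES via the pycma package. The source-task data, the top-gamma selection rule, and all variable names are illustrative assumptions.

# A minimal sketch of warm starting, assuming solutions from a previous
# similar task are available. Data, names, and the top-gamma selection
# rule are illustrative assumptions, not the authors' reference code.
import numpy as np
from sklearn.mixture import GaussianMixture
import cma  # pip install cma

rng = np.random.default_rng(0)

def objective(x):
    return float(np.sum((x - 0.5) ** 2))  # toy target task

# Hypothetical observations (solution, value) from a previous similar task.
source_X = rng.uniform(-1.0, 1.0, size=(300, 5))
source_f = np.array([np.sum((x - 0.4) ** 2) for x in source_X])

# Keep the best gamma-fraction of solutions as the promising set.
gamma = 0.1
top = source_X[np.argsort(source_f)[: int(gamma * len(source_X))]]

# Model the promising region with a Gaussian mixture.
gmm = GaussianMixture(n_components=3, covariance_type="full").fit(top)

# The single Gaussian q minimizing KL(p || q) for a mixture p is the
# moment-matched Gaussian: it has the mixture's mean and covariance.
w, mus, covs = gmm.weights_, gmm.means_, gmm.covariances_
mean = w @ mus
cov = sum(w[k] * (covs[k] + np.outer(mus[k] - mean, mus[k] - mean))
          for k in range(len(w)))

# Warm-start the CMA-ES from the moment-matched Gaussian. pycma exposes
# per-coordinate scales via the CMA_stds option, so this sketch uses
# only the diagonal of the matched covariance.
stds = np.sqrt(np.diag(cov))
sigma0 = float(stds.max())
es = cma.CMAEvolutionStrategy(mean, sigma0, {"CMA_stds": stds / sigma0})
es.optimize(objective)
print("best solution found:", es.result.xbest)

Because moment matching has a closed form, the warm-start distribution itself requires no iterative optimization; all of the search budget on the target task goes to the CMA-ES run.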