Speeding up of the Nelder-Mead Method by Data-driven Speculative Execution

Conference Paper
Peer Reviewed
Oral Presentation

Asian Conference on Pattern Recognition (ACPR2019)

@inproceedings{watanabe2019speeding,
  title={Speeding up of the Nelder-Mead Method by Data-driven Speculative Execution},
  author={Watanabe, Shuhei and Ozaki, Yoshihiko and Bando, Yoshiaki and Onishi, Masaki},
  booktitle={Asian Conference on Pattern Recognition (ACPR2019)},
  year={2019}
}

Authors

  • Shuhei Watanabe
  • Yoshihiko Ozaki
  • Yoshiaki Bando
  • Masaki Onishi

Abstract

The performance of machine learning algorithms depends considerably on their hyperparameter configurations. Previous studies reported that the Nelder-Mead (NM) method, a local search method, requires only a small number of evaluations to converge, and that this property has led to considerable success in the hyperparameter optimization (HPO) of machine learning algorithms for image recognition. However, most evaluations in the NM method must be performed sequentially, which requires a large amount of time. To alleviate the problem that the NM method cannot be parallelized, we propose a data-driven speculative execution method based on the statistical features of the NM method. We analyze the behavior of the NM method on several benchmark functions and experimentally demonstrate that it tends to take certain specific operations. The experimental results show that the proposed method reduces the elapsed time by approximately 50% and the number of evaluations by approximately 60% compared to naive speculative execution.
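The speculative idea can be illustrated with a minimal sketch. This is not the authors' data-driven method; it shows only the naive baseline: in one Nelder-Mead iteration, the reflection, expansion, and contraction candidates are all evaluated in parallel before the acceptance logic runs, so the sequential decision afterwards needs no further evaluations. The objective (a sphere function), the starting simplex, and the use of a thread pool are all illustrative assumptions.

```python
# Naive speculative execution in one Nelder-Mead step (illustrative sketch,
# not the authors' implementation): all candidate points are evaluated in
# parallel, then standard NM logic picks which one to accept.
from concurrent.futures import ThreadPoolExecutor
import numpy as np

def sphere(x):
    # Illustrative objective; in HPO this would be a model evaluation.
    return float(np.sum(x ** 2))

def nm_step_speculative(f, simplex):
    """One Nelder-Mead iteration with speculative candidate evaluation."""
    fs = [f(p) for p in simplex]          # vertex values (could be cached)
    order = np.argsort(fs)
    simplex = [simplex[i] for i in order]
    fs = [fs[i] for i in order]           # fs[0] best, fs[-1] worst
    centroid = np.mean(simplex[:-1], axis=0)
    worst = simplex[-1]
    # Candidate points: reflection, expansion, outside/inside contraction.
    xr = centroid + (centroid - worst)
    xe = centroid + 2.0 * (centroid - worst)
    xoc = centroid + 0.5 * (centroid - worst)
    xic = centroid - 0.5 * (centroid - worst)
    # Speculative execution: evaluate every candidate in parallel,
    # even though a purely sequential NM would need at most two of them.
    with ThreadPoolExecutor() as ex:
        fr, fe, foc, fic = ex.map(f, [xr, xe, xoc, xic])
    # Standard NM acceptance logic, now free of further evaluations.
    if fs[0] <= fr < fs[-2]:
        simplex[-1] = xr                        # accept reflection
    elif fr < fs[0]:
        simplex[-1] = xe if fe < fr else xr     # try expansion
    elif fs[-2] <= fr < fs[-1] and foc <= fr:
        simplex[-1] = xoc                       # outside contraction
    elif fr >= fs[-1] and fic < fs[-1]:
        simplex[-1] = xic                       # inside contraction
    else:
        # Shrink the whole simplex toward the best vertex.
        simplex = [simplex[0]] + [simplex[0] + 0.5 * (p - simplex[0])
                                  for p in simplex[1:]]
    return simplex

# Run a few iterations on the sphere function; best value approaches 0.
simplex = [np.array([1.0, 1.0]), np.array([1.5, 1.0]), np.array([1.0, 1.5])]
for _ in range(50):
    simplex = nm_step_speculative(sphere, simplex)
best = min(sphere(p) for p in simplex)
```

The paper's contribution goes beyond this baseline: instead of evaluating every candidate, the statistically likely next operations are prioritized, which is what yields the reported reductions in elapsed time and evaluations.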