SC21 Proceedings

The International Conference for High Performance Computing, Networking, Storage, and Analysis

AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data


Authors: Romain Egele (Ecole Polytechnique, France; Argonne National Laboratory (ANL)); Prasanna Balaprakash (Argonne National Laboratory (ANL)); Isabelle Guyon (National Institute for Research in Computer Science and Automation (Inria), France; University of Paris-Saclay); Venkatram Vishwanath, Fangfang Xia, and Rick Stevens (Argonne National Laboratory (ANL)); and Zhengying Liu (National Institute for Research in Computer Science and Automation (Inria), France)

Abstract: Developing high-performing predictive models for large tabular data sets is a challenging task. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural networks (NNs) with different architectures concurrently to automatically discover a high-performing model. A key issue in NAS, particularly for large data sets, is the long computation time required to evaluate each generated architecture. While data-parallel training has the potential to address this issue, a straightforward approach can result in a significant loss of accuracy. To that end, we develop AgEBO-Tabular, which combines aging evolution (AE) to search over neural architectures with asynchronous Bayesian optimization (BO) to search over the hyperparameters that adapt data-parallel training. We evaluate the efficacy of our approach on the ECP-Candle benchmarks.
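To make the combination of the two search loops concrete, the sketch below illustrates an aging-evolution outer loop over a toy architecture encoding. It is not the authors' implementation: the architecture space, the scoring function, and the `sample_hparams` step are hypothetical placeholders, and the hyperparameter suggestion uses plain random sampling where the paper employs asynchronous Bayesian optimization driving data-parallel training.

```python
# Illustrative sketch only (not the AgEBO-Tabular code): aging evolution over
# a toy architecture space, with a stand-in hyperparameter sampler in place of
# the asynchronous Bayesian optimizer described in the paper.
import random
from collections import deque

ARCH_CHOICES = [0, 1, 2, 3]      # hypothetical per-cell operation ids
ARCH_LEN = 8                     # hypothetical architecture encoding length
LR_RANGE = (1e-4, 1e-1)          # hyperparameters adapted per evaluation
BATCH_RANGE = (64, 4096)


def sample_hparams():
    """Stand-in for the asynchronous BO suggestion step."""
    return {"lr": random.uniform(*LR_RANGE),
            "batch_size": random.randint(*BATCH_RANGE)}


def evaluate(arch, hparams):
    """Synthetic score; a real run would do data-parallel training and return validation accuracy."""
    return -sum(arch) * hparams["lr"] + random.gauss(0, 0.01)


def mutate(arch):
    """Change one randomly chosen position of the architecture encoding."""
    child = list(arch)
    child[random.randrange(ARCH_LEN)] = random.choice(ARCH_CHOICES)
    return child


def aging_evolution(population_size=20, sample_size=5, iterations=100):
    population = deque(maxlen=population_size)   # oldest member ages out automatically
    history = []
    # Initialize the population with random architectures.
    for _ in range(population_size):
        arch = [random.choice(ARCH_CHOICES) for _ in range(ARCH_LEN)]
        score = evaluate(arch, sample_hparams())
        population.append((arch, score))
        history.append((arch, score))
    # Main loop: tournament selection of a parent, mutation, evaluation.
    for _ in range(iterations):
        parent = max(random.sample(list(population), sample_size), key=lambda x: x[1])
        child = mutate(parent[0])
        score = evaluate(child, sample_hparams())
        population.append((child, score))
        history.append((child, score))
    return max(history, key=lambda x: x[1])


if __name__ == "__main__":
    best_arch, best_score = aging_evolution()
    print("best architecture:", best_arch, "score:", round(best_score, 4))
```

In the full method, many such evaluations run concurrently and the Bayesian optimizer asynchronously updates its model of the hyperparameter space as results return, rather than sampling independently as above.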
