AgEBO-Tabular: Joint Neural Architecture and Hyperparameter Search with Autotuned Data-Parallel Training for Tabular Data
Event Type: Paper
Tags: Algorithms, Applications, Performance
Registration Categories: TP
Time: Tuesday, 16 November 2021, 2:30pm - 3pm CST
Location: 240-241-242
Description: Developing high-performing predictive models for large tabular data sets is a challenging task. Neural architecture search (NAS) is an AutoML approach that generates and evaluates multiple neural networks (NNs) with different architectures concurrently to automatically discover a high-performing model. A key issue in NAS, particularly for large data sets, is the large computation time required to evaluate each generated architecture. While data-parallel training has the potential to address this issue, a straightforward approach can result in a significant loss of accuracy. To that end, we develop AgEBO-Tabular, which combines Aging Evolution (AE) to search over neural architectures with asynchronous Bayesian optimization (BO) to search over the hyperparameters of data-parallel training. We evaluate the efficacy of our approach on the ECP-Candle Benchmarks.
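The description outlines how AgEBO couples Aging Evolution over neural architectures with Bayesian optimization over the hyperparameters that control data-parallel training. Below is a minimal, sequential Python sketch of that coupled loop; it is not the authors' implementation. The names search_space, evaluate, suggest_hparams, and update_bo are assumed placeholders, and the real method evaluates candidates asynchronously in parallel rather than one at a time.

import random
from collections import deque

def agebo_search(search_space, evaluate, suggest_hparams, update_bo,
                 population_size=32, sample_size=8, max_evals=200):
    # Minimal sketch: Aging Evolution over architectures, with data-parallel
    # training hyperparameters (e.g., learning rate, per-worker batch size)
    # proposed by a Bayesian optimizer. All four callables are placeholders.
    population = deque(maxlen=population_size)  # oldest member is evicted first

    # Warm-up: evaluate random architectures to seed the population.
    for _ in range(population_size):
        arch = search_space.random_architecture()
        hparams = suggest_hparams()        # BO proposes training hyperparameters
        acc = evaluate(arch, hparams)      # data-parallel training + validation accuracy
        update_bo(hparams, acc)            # feed the observation back to the BO model
        population.append((arch, hparams, acc))

    # Aging Evolution: sample a subset, mutate its best member, age out the oldest.
    for _ in range(max_evals - population_size):
        sample = random.sample(list(population), sample_size)
        parent_arch = max(sample, key=lambda m: m[2])[0]
        child_arch = search_space.mutate(parent_arch)   # perturb one architectural choice
        hparams = suggest_hparams()
        acc = evaluate(child_arch, hparams)
        update_bo(hparams, acc)
        population.append((child_arch, hparams, acc))   # deque drops its oldest member

    return max(population, key=lambda m: m[2])

The sequential loop above only illustrates the interplay of the two searches; in the paper's setting, architecture evaluations run concurrently across compute nodes and BO observations are incorporated asynchronously as they complete.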