SC21 Proceedings

The International Conference for High Performance Computing, Networking, Storage, and Analysis

Bootstrapping In-Situ Workflow Auto-Tuning via Combining Performance Models of Component Applications

Authors: Tong Shu (Southern Illinois University); Yanfei Guo and Justin Wozniak (Argonne National Laboratory (ANL)); Xiaoning Ding (New Jersey Institute of Technology); Ian Foster (Argonne National Laboratory (ANL), University of Chicago); and Tahsin Kurc (Stony Brook University)

Abstract: In an in-situ workflow, multiple components such as simulation and analysis applications are coupled with streaming data transfers. The multiplicity of possible configurations necessitates an auto-tuner for workflow optimization. Existing auto-tuning approaches are computationally expensive because many configurations must be sampled by running the whole workflow repeatedly, either to train the auto-tuner's surrogate model or to otherwise explore the configuration space. To reduce these costs, we instead combine the performance models of component applications by exploiting the workflow's analytical structure, selectively generating test configurations whose measurements guide the training of a machine learning surrogate model for the whole workflow. Because the training can focus on well-performing configurations, the resulting surrogate model achieves high prediction accuracy for good configurations despite being trained on fewer total configurations. Experiments with real applications demonstrate that our approach identifies significantly better configurations than other approaches for a fixed computer time budget.
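The composition idea in the abstract can be illustrated with a minimal sketch. All names, models, and numbers below are hypothetical assumptions, not the paper's actual method: two toy analytical models predict per-component step times, the workflow prediction composes them with max() (the slower component bounds a pipelined workflow's steady-state throughput), and only the top-ranked configurations would then be measured to train a surrogate.

```python
# Hypothetical sketch of pre-screening in-situ workflow configurations
# by composing per-component performance models. The models and the
# core-split configuration space are illustrative assumptions.

def sim_time(cores_sim):
    # Toy analytical model: 1000 work units with ideal strong scaling.
    return 1000.0 / cores_sim

def ana_time(cores_ana):
    # Toy analytical model for the coupled analysis component.
    return 600.0 / cores_ana

def predicted_step_time(cores_sim, cores_ana):
    # In a pipelined in-situ workflow, the slower component bounds
    # steady-state throughput, so compose the models with max().
    return max(sim_time(cores_sim), ana_time(cores_ana))

def top_configs(total_cores, k):
    # Enumerate core splits between the two components, rank them by
    # the composed analytical prediction, and keep only the k most
    # promising configurations; only these would be run and measured
    # to train the workflow surrogate model.
    configs = [(c, total_cores - c) for c in range(1, total_cores)]
    configs.sort(key=lambda cfg: predicted_step_time(*cfg))
    return configs[:k]

# With 64 cores, the models balance at a 40/24 split (25.0 s per step).
best = top_configs(64, 3)
```

In this toy setup, measuring only the few top-ranked splits replaces exhaustively running all 63 possible configurations, which is the cost reduction the abstract describes.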
