We are excited to announce that the SC21 Reproducibility Challenge Committee has selected the SC20 paper “A parallel framework for constraint-based Bayesian network learning via Markov blanket discovery” by Ankit Srivastava, Sriram P Chockalingam, and Srinivas Aluru to serve as the Student Cluster Competition (SCC) benchmark for this year’s Reproducibility Challenge. A team of reviewers selected the paper from 96 accepted SC20 papers, based on the paper’s Artifact Description (AD) and its suitability for the SCC. The authors and the Reproducibility Committee have been working to create a reproducible benchmark that builds on the paper’s results. At SC21, the SCC teams will be asked to run the benchmark, replicating the findings from the original paper under different settings and with different datasets.
What makes the work of the student teams particularly relevant is the replication of the paper’s work across the different clusters that will be fielded by the teams. In the era of heterogeneous computing, porting applications from one platform to another is not a simple task. The work of the student teams at SC21 is a fantastic way to dive into reproducibility challenges across various platforms and emerge with shareable, robust insights. It is the ensemble of each team’s implementation and execution of the challenge on sixteen different platforms that will earn this paper ACM’s “Results Replicated” badge in the ACM Digital Library. Sharing is at the core of the Reproducibility Challenge – so, the work of the SCC teams will be collected and published. We have already published three special issues in Parallel Computing from previous years.
Behind the Scenes: Putting Together the SC21 SCC Benchmark
Many volunteers participate in the selection of the paper, the creation of the benchmark for the SCC teams, the assessment of the students’ work, and the publication of the special journal issue.
During the first round of reviews to determine feasibility for the competition, the reviewers looked at whether the finalist papers had an application that could be run by the student teams on the broad range of hardware types and cluster configurations that are typically fielded by SCC teams. This initial review eliminated nearly 80% of the potential papers because, for example, they used proprietary compilers, ran only on specific hardware, or reproducing the results required a larger scale than the SCC clusters could provide.
A second round of reviews, with at least two reviewers per paper, looked for the application best suited to the SCC teams. The committee then ranked the submissions based on criteria such as the application’s real-world impact as understood by undergraduates, and the anticipated student experience while working with the benchmark. Discussions with the authors of the finalist papers focused on the feasibility of adapting their applications to the Student Cluster Competition and the authors’ availability to devote time to the success of the challenge. Following these interviews, the committee met to determine which application to invite.
The selection of the paper is only one step in a long process that ends with the preparation of the Reproducibility Challenge benchmark – one of many benchmarks that the students must complete during the competition. The reproducibility benchmark will be revealed at SC21. Following the conference, we will publish the students’ reports from the SC21 SCC Reproducibility Challenge, demonstrating the effectiveness of the SCC teams and their success in replicating the work on their platforms.
Mark Your Calendar
The Student Cluster Competition will be held November 15–17 during SC21 at the America’s Center in St. Louis, Missouri. Visit the SCC booth on the exhibit floor at SC21 and chat with students about the Reproducibility Challenge. We invite you to celebrate the student participants and the authors of the selected paper at the Awards Ceremony on Thursday of the conference. And don’t miss next year’s SCC reports! Join us in St. Louis to meet these amazing students and watch them race to reproduce this benchmark and other HPC applications.
Le Mai Weakley, SC21 Reproducibility Challenge Chair, Indiana University