No Travel? No Problem.

Remote Participation

Days Held

Monday–Wednesday, November 15–17, 2021

Track team progress on Grafana

The Student Cluster Competition (SCC) was developed in 2007 to provide an immersive high performance computing experience to undergraduate and high school students.

For SC21, the competition has moved to the cloud to accommodate remote participation. For the first time, students will compete on two cloud platforms, Microsoft Azure Cloud and Oracle Cloud. Teams must prioritize and manage a budget on Microsoft Azure Cloud, and they must understand how to optimize for the specific hardware provided by Oracle Cloud. Working within these two sets of constraints, student teams must learn scientific applications, apply optimization techniques, and compete from around the world in a 48-hour challenge to complete a set of benchmarks and real-world scientific workloads. The SCC gives teams the opportunity to show off their HPC knowledge to conference attendees and judges.

Reproducibility Challenge

One of the applications presented to the student teams is the Reproducibility Challenge, in which students attempt to reproduce results from an accepted paper from the prior year’s Technical Program.

Students have the opportunity to interact directly with the paper’s authors as they attempt to reproduce specific results and conclusions from the paper. As part of this challenge, each student team writes a reproducibility report detailing their experience in reproducing the results from the paper. Authors of the most highly rated reproducibility reports may be invited to submit their reports to a reproducibility special issue.

Teams & Process

Teams are composed of six students, an advisor, and vendor partners. The students provide their skills and enthusiasm, the advisor provides guidance, the vendor partners provide resources (e.g., software, expertise, travel funding), and Microsoft Azure and Oracle provide the cloud credits and a specified set of hardware, respectively. Students work with their advisors to craft a proposal that describes the team and their approach to the competition. The SCC committee reviews each proposal and provides comments for all submissions.

Support Provided

For the Student Cluster Competition, support is not provided for team travel, but on-site housing is provided to teams that choose to compete on-site at the SC Student Cluster Competition booth. Selected teams receive full conference registration for each team member and one advisor. As the competition is part of the Students@SC program, students can also participate in Mentor–Protégé Matching and the Job Fair.

Thank You, SCC Supporters

Microsoft Azure
Oracle Cloud

History

For more information about SCC in past years, including team profiles, photos, winners, and more: studentclustercompetition.us

SCC Mystery Application

The SCC is looking for scientific applications from the HPC community that could be used as the SCC Mystery Application. If you have a scientific application that you think would be a great fit for the competition, please complete the form via the button below.

The owner of the selected application receives complimentary SC21 registration.

Submit an App

Eligibility

  • Each submission must list an application owner who will:
    1. be responsible for answering questions from the SCC teams.
    2. prepare test and input decks for the competition.
    3. be available to serve as judge during SC21.
  • The application should not have export control restrictions.
  • The application must have updated documentation.
  • Submissions and selections must be kept confidential until the beginning of the SCC, when the selected mystery application will be revealed.

SCC Benchmarks & Applications

Three Benchmarks and Applications

Benchmarks

 

LINPACK Benchmark
http://top500.org/project/linpack/

The LINPACK benchmark is a measure of a computer’s floating-point rate of execution. It is determined by running a computer program that solves a dense system of linear equations. It is used by the TOP500 list as a tool to rank computing systems. The benchmark allows the user to scale the size of the problem and to optimize the software in order to achieve the best performance for a given machine. This performance does not reflect the overall performance of a given system, as no single number ever can. It does, however, reflect the performance of a dedicated system for solving a dense system of linear equations. Since the problem is very regular, the performance achieved is quite high, and the reported numbers reach a good fraction of peak performance.
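For a sense of how a LINPACK result is typically read, here is a minimal sketch (not the official HPL driver; the function name and the example values are illustrative assumptions) that converts a problem size N and a measured wall time into a GFLOP/s rate using HPL’s nominal operation count of 2/3·N³ + 2·N².

```python
# A minimal sketch, not the official HPL driver: how a LINPACK result is
# usually interpreted. The nominal operation count for the dense LU
# factorization and solve is 2/3*N^3 + 2*N^2 floating-point operations.

def hpl_gflops(n: int, seconds: float) -> float:
    """Return the GFLOP/s rate for a dense solve of size n taking `seconds`."""
    flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
    return flops / seconds / 1e9

# Illustrative values only (not a real measurement):
# a run with N = 100,000 that finishes in 1,800 seconds.
print(f"{hpl_gflops(100_000, 1_800.0):.1f} GFLOP/s")
```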

 

HPCG Benchmark
http://hpcg-benchmark.org/

The High Performance Conjugate Gradients (HPCG) Benchmark project is an effort to create a new metric for ranking HPC systems. HPCG is intended as a complement to the High Performance LINPACK (HPL) benchmark, currently used to rank the TOP500 computing systems. The computational and data access patterns of HPL are still representative of some important scalable applications, but not all. HPCG is designed to exercise computational and data access patterns that more closely match a different and broad set of important applications, and to give incentive to computer system designers to invest in capabilities that will have impact on the collective performance of these applications.
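To make the contrast with HPL concrete, the sketch below shows a bare, unpreconditioned conjugate gradient iteration; its kernels (sparse matrix–vector products, dot products, and vector updates) are the memory-bandwidth-bound patterns HPCG is built around. This is an illustration under simplifying assumptions, not the benchmark itself, which uses a 3D 27-point stencil problem, a multigrid preconditioner, and strict timing and verification rules.

```python
# A minimal, unpreconditioned conjugate gradient solver, illustrating the
# sparse matrix-vector products, dot products, and vector updates that
# dominate HPCG's workload. Not the benchmark code.
import numpy as np
import scipy.sparse as sp

def cg(A, b, tol=1e-8, max_iter=10_000):
    x = np.zeros_like(b)
    r = b - A @ x            # initial residual
    p = r.copy()             # initial search direction
    rr = r @ r
    for _ in range(max_iter):
        Ap = A @ p                     # sparse matrix-vector product
        alpha = rr / (p @ Ap)          # step length
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        if np.sqrt(rr_new) < tol:
            break
        p = r + (rr_new / rr) * p      # update search direction
        rr = rr_new
    return x

# Tiny symmetric positive definite example: a 1D Laplacian stencil.
n = 500
A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
b = np.ones(n)
x = cg(A, b)
print("residual norm:", np.linalg.norm(A @ x - b))
```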

 

IO500 Benchmark
http://io500.org

The IO500 benchmark is a benchmark suite for high-performance IO. It harnesses existing and trusted open-source benchmarks such as IOR and MDTest and bundles execution rules with multiple workloads in order to evaluate and analyze storage systems under various IO patterns. The IO500 benchmark is designed to establish performance bounds for HPC storage, covering both data and metadata operations, under what are commonly observed to be both easy and difficult IO patterns from multiple concurrent clients. In addition, a phase scans for previously created files that match certain conditions using a (possibly file-system-specific) parallel find utility, evaluating the speed of namespace traversal and file attribute retrieval. The final score used to rank submissions in the list is a combined score across all the executed benchmarks.
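As a rough illustration of how such a combined score is commonly described, the sketch below takes a geometric mean of a bandwidth score (GiB/s, from the IOR phases) and a metadata score (kIOPS, from the MDTest and find phases), each itself a geometric mean of its phase results. The phase names and numbers here are illustrative assumptions; io500.org has the authoritative execution and scoring rules.

```python
# A hedged sketch of an IO500-style combined score: geometric means of
# bandwidth and metadata phase results, then a geometric mean of the two.
# All phase names and values below are illustrative, not real measurements.
from math import prod

def geometric_mean(values):
    return prod(values) ** (1.0 / len(values))

bandwidth_gib_s = {"ior-easy-write": 12.0, "ior-easy-read": 15.0,
                   "ior-hard-write": 0.8,  "ior-hard-read": 1.1}
metadata_kiops  = {"mdtest-easy-write": 90.0, "mdtest-easy-stat": 220.0,
                   "mdtest-hard-write": 14.0, "find": 310.0}

bw_score = geometric_mean(list(bandwidth_gib_s.values()))
md_score = geometric_mean(list(metadata_kiops.values()))
total = geometric_mean([bw_score, md_score])
print(f"BW={bw_score:.2f} GiB/s  MD={md_score:.2f} kIOPS  score={total:.2f}")
```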

 

Applications

 

Cardioid

Cardioid is a cardiac simulation suite for simulating clinical cardiac phenomena. It can run both electrophysiological and mechanical organ-level simulations, and it includes processing tools for computing cardiac meshes, reconstructing torso ECGs, and generating realistic cardiac fiber orientations. The Cardioid electrophysiology solver was a Gordon Bell finalist and has strong-scaled to all of the Vulcan supercomputer for a clinically relevant problem. The code is parallelized using MPI and has separately optimized loops that take advantage of OpenMP and SIMD instruction sets on CPU architectures and CUDA on GPU architectures.

 

Quantum ESPRESSO

https://www.quantum-espresso.org/project/manifesto

Quantum ESPRESSO is a software package for first-principles electronic-structure calculations and materials modeling based on density-functional theory, plane wave basis sets, and pseudopotentials.

 

Mystery Application
At the start of the competition, teams will be given a mystery application and its datasets. Students will be expected to build, optimize, and run this application entirely during the competition.

 

Reproducibility Challenge
Once again, students in the cluster competition will be asked to replicate the results of a publication from the previous year’s SC conference. For this challenge, students will take on the role of reviewing an SC20 paper to see if its results are reproducible. The SC21 Reproducibility Committee has selected the paper “A Parallel Framework for Constraint-Based Bayesian Network Learning via Markov Blanket Discovery” by Ankit Srivastava, Sriram P. Chockalingam, and Srinivas Aluru to be the Student Cluster Competition (SCC) benchmark for the Reproducibility Challenge this year.

Last year, thanks to the adoption of automatically generated Artifact Descriptors (ADs) at submission time, all accepted papers from SC20 featured an AD in their appendix. A team of reviewers selected this paper based on its AD, author interviews, and its suitability for the SCC. The authors and the Reproducibility Committee have been working to create a reproducible benchmark that builds on the paper’s results. During the SCC, the student teams will be asked to run the benchmark, attempting to reproduce the findings from the original paper under different settings with different data sets.

SCC Schedule

Orientation Briefing & Competition Schedule

Safety Briefing

Saturday, November 13, 2021

This will be a virtual briefing and will be recorded and posted. Students must participate in the live briefing or view the recording before the competition begins.

 

Competition

Monday–Wednesday, November 15–17, 2021

The competition will run continuously Monday–Wednesday, November 15–17, 2021. Both benchmarks and applications will be released at the SCC Kickoff on Monday, November 15, 2021. Results can be submitted throughout the competition.

SCC Rules

Rules and System Software

Violation of any rule may result in a team’s disqualification from the competition or point penalization at the discretion of the SCC committee. Any unethical conduct not otherwise covered in these rules will also be penalized at the discretion of the SCC Committee.

The following violations will result in immediate disqualification:

  • Having anyone other than the 6 registered team members working on the team’s cloud resources during competition hours.
  • Any communication between your cloud resource and a network other than the approved cloud networks.

 

General Competition Rules

Safety First
Safety is always the first consideration in all SCC operations. If a task cannot be done safely, then it is unacceptable. When in doubt, ask an SCC committee member or your SCC team liaison.

No Assistance From Non-team Members
Once the competition starts, student teams will not be allowed to receive assistance from anyone, including their advisor.

Stay Under Budget
The Microsoft Azure Cloud budget allowed for the SC21 SCC will be given to teams at the start of the competition. Point penalties will be assessed if teams go over this budget.

No External Computational Assistance
All benchmark and application workloads must be run on Microsoft Azure Cloud and Oracle Cloud using the budget allotted and the hardware provided. Submitting results or using computational resources during the competition other than the provided Microsoft Azure Cloud and Oracle Cloud resources is not permitted and will result in disqualification.

Teams Must Conduct Themselves Professionally
Teams must conduct themselves professionally and adhere to the SC21 Code of Conduct. Students must compete fairly and ethically.

 

System Software

Run only on the specified cloud resources. Benchmarks and applications must be run on the specified cloud as follows:

Microsoft Azure Cloud

  • Benchmarking
  • Quantum ESPRESSO
  • Mystery Application

Oracle Cloud

  • Cardioid
  • Reproducibility Application

Teams may choose any operating system and software stack that will run the applications and display visualizations to conference attendees.

Teams may study and tune the open-source benchmarks and applications for their platforms. Any changes to application source code must be shared with the SCC committee.

Teams shall not modify or obstruct the collection of metrics from their cloud systems. Purposefully modifying or obstructing the metrics database can result in full disqualification.

Metrics collection software has been pre-installed and pre-configured on student systems; it must not be disabled or removed. This software includes InfluxDB and the WAA Agent.

SCC Webinars

Register for upcoming informational webinars tailored to help accepted teams understand various components of the SCC, watch recordings of past webinars, and download webinar slides.

Show me the webinars
