
Days Held

Monday–Wednesday, November 15–17, 2021

The Student Cluster Competition (SCC) was developed in 2007 to provide an immersive high performance computing experience to undergraduate and high school students.

With sponsorship from hardware and software vendor partners, student teams design and build small clusters, learn scientific applications, apply optimization techniques for their chosen architectures, and compete in a non-stop, 48-hour challenge at the SC conference to complete real-world scientific workloads, showing off their HPC knowledge for conference attendees and judges.

New for SC21: Power Component

Supercomputing continues to grow and is being applied to an ever-broader range of efforts. The SC21 tagline “Science & Beyond” highlights and celebrates these varied applications. The Student Cluster Competition has showcased a variety of applications in the past and continues to do so.
 
While the demand for and application of supercomputing continues to grow, the resources to power these systems may be limited. The cost of energy, for example, can vary for reasons such as time of day and weather. In past years, the competition used a static power cap. This year, the power cap will be changed at least once, for a set period of time, during the competition so that students can experience another type of real-world situation.

This may require teams to adjust their operating plans to account for the changed power cap (a sketch of one possible power-management loop follows the list):

  • Teams must adapt to power caps between 2500W and 4000W without turning off any hardware.
  • Teams will receive a minimum of 30 minutes’ notice before each new power limit takes effect, and points are lost if teams exceed it.
  • The default power cap will remain 3000W.
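
For illustration only, and not part of the official rules: a minimal Python sketch of the kind of feedback loop a team might run to stay under a changing cap. The read_power_watts() readout and the throttling knob are hypothetical stand-ins for whatever PDU, BMC, or RAPL interface a team’s hardware actually exposes.

    import random
    import time

    POWER_CAP_W = 3000   # default competition cap (per the rules above)
    MARGIN_W = 150       # hypothetical headroom so brief spikes stay under the cap

    def read_power_watts():
        # Hypothetical stand-in for a real readout from the team's PDU or BMC;
        # simulated here so the sketch runs on its own.
        return random.gauss(2900, 100)

    def control_step(limit_fraction):
        # One feedback step: tighten or relax a software power knob (for example
        # a RAPL limit or a CPU frequency cap) so total draw stays under budget,
        # rather than powering off hardware, which the rules forbid.
        budget = POWER_CAP_W - MARGIN_W
        draw = read_power_watts()
        if draw > budget:
            limit_fraction = max(0.5, limit_fraction - 0.05)   # throttle quickly
        elif draw < 0.9 * budget:
            limit_fraction = min(1.0, limit_fraction + 0.02)   # relax cautiously
        return limit_fraction

    limit = 1.0   # 1.0 means the hardware runs unthrottled
    for _ in range(10):   # in practice this would loop for the whole competition
        limit = control_step(limit)
        print(f"target power limit: {limit:.2f} of full power")
        time.sleep(1)

When a new cap is announced, a team would update POWER_CAP_W during the notice window and let the loop ease the system toward the new budget.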

Reproducibility Challenge

One of the applications presented to the student teams is the Reproducibility Challenge, in which students attempt to reproduce results from an accepted paper from the prior year’s Technical Program.

Students have the opportunity to interact directly with the paper’s authors as they attempt to reproduce specific results and conclusions from the paper. As part of this challenge, each student team writes a reproducibility report detailing their experience in reproducing the results from the paper. Authors of the most highly rated reproducibility reports may be invited to submit their reports to a reproducibility special issue.

Teams & Process

Teams are composed of six students, an advisor, and vendor partners. The advisor provides guidance and recommendations, the vendor provides the resources (hardware and software), and the students provide the skill and enthusiasm. Students work with their advisors to craft a proposal that describes the team, the proposed hardware, and their approach to the competition. The SCC committee reviews each proposal and provides comments on all submissions. The two hardware requirements for team clusters are that they can run the competition’s applications and exercises and that they draw less than 3,000 watts of power.

Support Provided

Selected teams receive full conference registration for each team member and one advisor. Each team is also provided with three hotel rooms (two beds per room) for the students, plus one single room for the advisor. Please note that this may change to seven single-occupancy rooms depending on COVID restrictions. As the competition is part of the Students@SC program, students can also participate in Mentor–Protégé Matching and the Job Fair. Travel to the conference and per diem are not provided.

Thank You, SCC Supporters

Azure
Oracle

History

For more information about SCC in past years, including team profiles, photos, winners, and more: studentclustercompetition.us

SCC Mystery Application

The SCC is looking for scientific applications from the HPC community that could be used as the SCC Mystery Application. If you have a scientific application that you think would be a great fit for the competition, please complete the form via the button below.

The owner of the selected application receives complimentary SC21 registration.

Submit an App

Eligibility

  • Each submission must list an application owner who will:
    1. be responsible for answering questions from the SCC teams.
    2. prepare test and input decks for the competition.
    3. be available to serve as judge during SC21.
  • The application should not have export control restrictions.
  • The application must have up-to-date documentation.
  • Submissions and selections must be kept confidential until the beginning of the SCC, when the selected mystery application will be revealed.

SCC Benchmarks & Applications

Three Benchmarks and Applications

Benchmarks

 

LINPACK Benchmark
http://top500.org/project/linpack/

The Linpack Benchmark is a measure of a computer’s floating-point rate of execution. It is determined by running a computer program that solves a dense system of linear equations. It is used by the TOP500 list as a tool to rank the world’s most powerful computer systems. The benchmark allows the user to scale the size of the problem and to optimize the software in order to achieve the best performance for a given machine. This performance does not reflect the overall performance of a given system, as no single number ever can. It does, however, reflect the performance of a dedicated system for solving a dense system of linear equations. Since the problem is very regular, the performance achieved is quite high, and the performance numbers give a good indication of peak performance.
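
As a concrete, single-node illustration of what the benchmark measures, the following Python/NumPy sketch times a dense solve and converts it to Gflop/s using the standard LU operation count of (2/3)n³ + 2n². The real HPL is a tuned, distributed-memory code, and the problem size here is arbitrary.

    import time
    import numpy as np

    def linpack_gflops(n=4096, seed=0):
        # Time a dense solve Ax = b and convert to Gflop/s using the standard
        # LU operation count of (2/3)*n**3 + 2*n**2 that HPL reports against.
        rng = np.random.default_rng(seed)
        A = rng.standard_normal((n, n))
        b = rng.standard_normal(n)
        t0 = time.perf_counter()
        x = np.linalg.solve(A, b)   # LU factorization plus triangular solves
        elapsed = time.perf_counter() - t0
        flops = (2.0 / 3.0) * n**3 + 2.0 * n**2
        # Sanity check on the answer, in the spirit of HPL's residual test
        residual = np.linalg.norm(A @ x - b) / (np.linalg.norm(A) * np.linalg.norm(x))
        return flops / elapsed / 1e9, residual

    gflops, res = linpack_gflops()
    print(f"{gflops:.1f} Gflop/s, scaled residual {res:.2e}")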

 

HPCG Benchmark
http://hpcg-benchmark.org/

The High Performance Conjugate Gradients (HPCG) Benchmark project is an effort to create a new metric for ranking HPC systems. HPCG is intended as a complement to the High Performance LINPACK (HPL) benchmark, currently used to rank the TOP500 computing systems. The computational and data access patterns of HPL are still representative of some important scalable applications, but not all. HPCG is designed to exercise computational and data access patterns that more closely match a different and broad set of important applications, and to give incentive to computer system designers to invest in capabilities that will have impact on the collective performance of these applications.
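
For intuition, here is a textbook, unpreconditioned conjugate gradient loop in Python on a toy one-dimensional problem. The real HPCG solves a multigrid-preconditioned 3D stencil problem, but the kernel mix shown here (sparse matrix-vector products, dot products, vector updates) is the kind of work HPCG stresses.

    import numpy as np
    import scipy.sparse as sp

    def conjugate_gradient(A, b, tol=1e-8, max_iter=5000):
        # Textbook CG iteration: sparse matrix-vector products, dot products,
        # and vector updates, here without HPCG's multigrid preconditioner.
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs = r @ r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if np.sqrt(rs_new) < tol:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x

    # Toy problem: a 1D Laplacian (tridiagonal SPD matrix), far simpler than
    # HPCG's 3D stencil but exercising the same operations.
    n = 200
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.ones(n)
    x = conjugate_gradient(A, b)
    print("residual norm:", np.linalg.norm(A @ x - b))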

 

IO500 Benchmark
http://io500.org

The IO500 benchmark is a benchmark suite for high-performance IO. It harnesses existing and trusted open-source benchmarks such as IOR and MDTest and bundles execution rules and multiple workloads with the purpose of evaluating and analyzing storage systems for various IO patterns. The IO500 benchmark is designed to probe the performance boundaries of storage for HPC applications with respect to data and metadata operations, under what are commonly observed to be both easy and difficult IO patterns from multiple concurrent clients. Moreover, there is a phase that scans for previously created files matching certain conditions using a (possibly file-system-specific) parallel find utility, to evaluate the speed of namespace traversal and file-attribute retrieval. The final score used to rank submissions in the list is a combined score across all the executed benchmarks.
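
As a rough, single-client illustration of the kind of measurement IOR performs in the bandwidth phases, the following Python sketch streams fixed-size blocks to a file and reports MiB/s. Real IO500 runs drive many concurrent MPI clients against a parallel file system, so treat this only as a sketch.

    import os
    import time

    def sequential_write_mib_per_s(path, total_mib=256, block_mib=4):
        # Crude single-client analogue of IOR's sequential-write pattern:
        # stream fixed-size blocks to one file and report MiB/s.
        block = os.urandom(block_mib * 1024 * 1024)
        n_blocks = total_mib // block_mib
        t0 = time.perf_counter()
        with open(path, "wb") as f:
            for _ in range(n_blocks):
                f.write(block)
            f.flush()
            os.fsync(f.fileno())   # force data to storage before stopping the clock
        elapsed = time.perf_counter() - t0
        os.remove(path)
        return (n_blocks * block_mib) / elapsed

    # Any writable path works; a parallel file system mount would be the
    # realistic target in a competition setting.
    print(f"{sequential_write_mib_per_s('ior_easy_demo.bin'):.1f} MiB/s")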

 

Applications

 

Cardioid

Cardioid is a simulation suite for clinical cardiac phenomena. It is capable of running both electrophysiological and mechanical organ-level simulations, and it includes processing tools for computing cardiac meshes, reconstructing torso ECGs, and generating realistic cardiac fiber orientations. The Cardioid electrophysiology solver was a Gordon Bell finalist and has strong-scaled to the full Vulcan supercomputer for a clinically relevant problem. The code is parallelized using MPI and has separate optimized loops that take advantage of OpenMP, SIMD instruction sets on CPU architectures, and CUDA on GPU architectures.

 

Quantum ESPRESSO

https://www.quantum-espresso.org/project/manifesto

Quantum ESPRESSO is a software package for first-principles electronic-structure calculations and materials modeling based on density-functional theory, plane wave basis sets, and pseudopotentials.

 

Mystery Application
At the start of the competition, teams will be given a mystery application and its datasets. Students will be expected to build, optimize, and run this mystery application entirely at the competition.

 

Reproducibility Challenge
Once again, students in the cluster competition will be asked to replicate the results of a publication from the previous year’s SC conference. For this challenge, students will take on the role of reviewing an SC20 paper to see if its results are reproducible. The SC21 Reproducibility Committee has selected the paper “A Parallel Framework for Constraint-Based Bayesian Network Learning via Markov Blanket Discovery” by Ankit Srivastava, Sriram P. Chockalingam, and Srinivas Aluru to be the Student Cluster Competition (SCC) benchmark for the Reproducibility Challenge this year.

Last year, thanks to the adoption of an automatically generated Artifact Description (AD) at submission time, all accepted SC20 papers featured an AD in their appendix. A team of reviewers selected the paper from this pool based on the AD, interviews with the authors, and its suitability for the SCC. The authors and the Reproducibility Committee have been working to create a reproducible benchmark that builds on the paper’s results. During the SCC, the student teams will be asked to run the benchmark, attempting to reproduce the findings from the original paper under different settings with different data sets.

SCC Logistics

Orientation Briefing & Competition Stages

Full information regarding SCC logistics will be announced at a later date.

SCC Webinars

Register for upcoming informational webinars tailored to help accepted teams understand various components of the SCC, watch recordings of past webinars, and download webinar slides.

Show me the webinars
