The stage is set: Prepare for Hamburg cluster carnage

Sprints, mystery apps and fewer amps in student contest

By Dan Olds, Gabriel Consulting, 29 May 2012

ISC 2012 Now that China has settled on its two entrants to the ISC’12 Student Cluster Challenge (I’m dubbing it iSCC for short), it’s time to get a feel for the number-cruncher design competition and what the entrants will face.

Airbus is the big-name sponsor of the iSCC this year and has played a vital role in getting this fledgling competition off the ground – so to speak. (As a gesture of gratitude, I’ll be flying in an Airbus-manufactured plane to Hamburg this year. I really hope the back-of-the-seat power works – it’s a damned long flight.)

We now have two large HPC organisations – the SC and the aforementioned ISC – running separate competitions in which university teams build and benchmark clusters of their own design against their peers' computers. The two contests are similar in that the teams get their hardware from vendor sponsors, they all have to run HPL and submit a LINPACK score on the first day, and they receive awards for both the highest LINPACK and the highest overall performance on the assorted benchmarks.

Teams can’t go nuts and add hardware willy-nilly: there are hard caps on power consumption. The SC competition (or SCC) has a ceiling of 26 amps, but the iSCC allows only half as much – 13 amps. Energy-efficient designs are key in both competitions, but students participating in the iSCC will have much less room in which to wriggle when it comes to power. Plus you have to run what you’ve got; you can’t swap out hardware mid-stream.

Both competitions ask students to run the HPC Challenge benchmark on the first day, and both give an award for the highest LINPACK score. At SC10 in New Orleans, three teams broke through the teraflop barrier. The bout in Seattle a year later saw six of the eight teams beat the 1.0 teraflop mark; Team Russia topped all competitors with their 1.926 teraflop score. It’ll be interesting to see what the iSCC teams can do in June. They’ll probably have better hardware, but only half the power budget to work with.
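For the armchair-cluster set who like to run the numbers themselves, here's a back-of-the-envelope sketch of what that halved amp cap implies. The mains voltage figure is an assumption (European 230V supply), not something specified by the contest rules, and the "matching Team Russia" target is purely illustrative:

```python
def power_budget_watts(amps, volts=230):
    """Rough peak power available under an amp cap.

    volts=230 is an assumed European mains voltage; the contest
    rules cap current (amps), not watts.
    """
    return amps * volts

scc_cap = power_budget_watts(26)   # SCC ceiling: 26 A -> 5980 W at 230 V
iscc_cap = power_budget_watts(13)  # iSCC ceiling: 13 A -> 2990 W at 230 V

# Team Russia's SC11-winning LINPACK score, in gigaflops
russia_gflops = 1926

# Gigaflops per watt an iSCC team would need to post the same
# score inside the Hamburg power envelope
needed_efficiency = russia_gflops / iscc_cap
print(f"~{needed_efficiency:.2f} GF/W to match Team Russia at 13 amps")
```

In other words, matching the Seattle-winning score on half the amperage roughly doubles the energy-efficiency bar – which is exactly why the iSCC teams will be sweating their component choices.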

Sprint versus marathon

One of the biggest differences between the SCC and iSCC is the timeframe in which hopefuls compete. The SCC is a 46-hour non-stop marathon that begins when the students get their data sets on Monday evening and ends when they turn in their final results on Wednesday afternoon.

In the iSCC, students will have a limited amount of time to complete their runs and submit their results. On Monday, they have from 3pm to 8.30pm to run HPCC and a separate LINPACK, submitting their scores at the end of the evening. The winner of the highest LINPACK award will be announced that evening.

Tuesday and Wednesday are devoted to application runs. Teams are required to run six applications, including:

  • OpenFOAM – computational fluid dynamics
  • CP2K – atomistic and molecular simulation
  • CPMD – molecular dynamics
  • NEMO – oceanographic research modelling

Sharp-eyed readers may notice that there are only four apps on the above list. That’s because each day the students are going to be confronted with a "surprise application" that they’ll have to complete along with the other applications.

In the real world, we don’t know exactly what we’ll be doing from day to day, so why should the iSCC be any different? This will add a bit more pressure on the teams as they struggle to fit the mystery app into their already busy system schedule.

At the end of the competition, team scores for LINPACK, application runs, and an interview with iSCC officials will be compiled and an overall winning team announced. There is also a "fan favourite" award for the team that captures the hearts of ISC attendees during the event.

We’ll preview the teams in upcoming articles so that you armchair cluster types can get a feel for the field and start to fill in your office betting pool brackets. ®