
The Little Green500 List - November 2013

The November 2013 release of the Green500 list was announced today at the SC|13 conference in Denver, Colorado, USA. Continuing the trend from previous years, heterogeneous supercomputing systems dominate all of the top 10 spots of the Green500. A heterogeneous system uses computational building blocks that consist of two or more types of "computing brains," such as traditional processors (CPUs), graphics processing units (GPUs), and co-processors. In this edition of the Green500, one system breaks through the barrier of four billion floating-point operations per second (4 gigaflops) per watt.

TSUBAME-KFC, a heterogeneous supercomputing system developed at the Tokyo Institute of Technology (TITech) in Japan, tops the list with an efficiency of 4.5 gigaflops/watt. Each computational node within TSUBAME-KFC consists of two Intel Ivy Bridge processors and four NVIDIA Kepler GPUs. In fact, all systems in the top ten of the Green500 use a similar architecture, i.e., Intel CPUs combined with NVIDIA GPUs. Wilkes, a supercomputer housed at Cambridge University, takes the second spot. The third position is filled by the HA-PACS TCA system at the University of Tsukuba. Of particular note, this list also sees two petaflop systems, each capable of computing over one quadrillion operations per second, achieve an efficiency of over 3 gigaflops/watt, namely Piz Daint at Swiss National Supercomputing Center and TSUBAME 2.5 at Tokyo Institute of Technology. Thus, Piz Daint is the greenest petaflop supercomputer on the Green500. As a point of reference, Tianhe-2, the fastest supercomputer in the world according to the Top500 list, achieves an efficiency of 1.9 gigaflops/watt.
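The efficiency figures above are simply sustained Linpack performance divided by measured power. As a sketch (not part of the official Green500 tooling), the Tianhe-2 number quoted above can be reproduced from its publicly reported Top500 figures, roughly 33,862.7 teraflops Rmax at 17,808 kW:

```python
def gflops_per_watt(rmax_tflops, power_kw):
    """Energy efficiency in gigaflops per watt.

    rmax_tflops: sustained Linpack (Rmax) performance in teraflops
    power_kw:    measured system power in kilowatts
    """
    gflops = rmax_tflops * 1000.0  # 1 TFLOPS = 1,000 GFLOPS
    watts = power_kw * 1000.0      # 1 kW = 1,000 W
    return gflops / watts

# Tianhe-2, approximate Top500 figures: 33,862.7 TFLOPS at 17,808 kW
print(round(gflops_per_watt(33862.7, 17808.0), 1))  # -> 1.9
```

The same division applied to TSUBAME-KFC's entry in the list below (its power of 27.78 kW and its 4,503.17 MFLOPS/W rating) recovers its roughly 125-teraflop Linpack run.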

This list marks a number of "firsts" for the Green500. First, it is the first time that a supercomputer has broken through the 4 gigaflops/watt barrier. Second, it is the first time that all of the top 10 systems on the Green500 are heterogeneous systems. Third, it is the first time that the average measured power consumed by the systems on the Green500 dropped with respect to the previous edition of the list. "A decrease in the average measured power coupled with an overall increase in performance is an encouraging step along the trail to exascale," noted Wu Feng of the Green500. Fourth, assuming that TSUBAME-KFC's energy efficiency could be maintained for an exaflop system, it is the first time that an extrapolation to an exaflop supercomputer has dropped below 300 megawatts (MW), specifically 222 MW. "This 222-MW power envelope is still a long way away from DARPA's target of an exaflop system in the 20-MW power envelope," says Feng.
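The 222-MW extrapolation is straightforward arithmetic, shown here as a minimal sketch under the article's stated (and optimistic) assumption that TSUBAME-KFC's efficiency would hold at exascale:

```python
# One exaflop expressed in gigaflops: 10^18 FLOPS = 10^9 GFLOPS.
EXAFLOP_IN_GFLOPS = 1e9

# TSUBAME-KFC's efficiency from the November 2013 list, in GFLOPS/W.
efficiency_gflops_per_watt = 4503.17 / 1000.0  # 4,503.17 MFLOPS/W

# Power needed for an exaflop at that efficiency.
power_watts = EXAFLOP_IN_GFLOPS / efficiency_gflops_per_watt
power_megawatts = power_watts / 1e6

print(round(power_megawatts))  # -> 222
```

DARPA's 20-MW target would instead require roughly 50 gigaflops/watt, about an order of magnitude beyond TSUBAME-KFC.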

Starting with this release, the Little Green500 list only includes machines with power values submitted directly to the Green500. In fact, there are more than 400 systems that have submitted directly to the Green500 over the past few years. As in previous years, the Little Green500 list has better overall efficiency than the Green500 list on average.

Earlier this year, the Green500 adopted new methodologies for measuring the power of supercomputing systems, providing a more accurate representation of the energy efficiency of large-scale systems. In June 2013, the Green500 formally adopted measurement rules (a.k.a. "Level 1" measurements), developed in cooperation with the Energy-Efficient High-Performance Computing Working Group (EE HPC WG). Power-measurement methodologies with higher precision and accuracy were also developed as part of this effort (a.k.a. "Level 2" and "Level 3" measurements). With growing support for and interest in the energy efficiency of large-scale computing systems, the Green500 received two more Level 2 and Level 3 submissions than in the previous edition of the list. Of particular note, Piz Daint, the greenest petaflop supercomputer in the world, submitted the highest-quality Level 3 measurement.

The Little Green500 List

Listed below are the November 2013 Little Green500's energy-efficient supercomputers, ranked from 1 to 100.

Green500 Rank | MFLOPS/W | Site* | Computer* | Total Power (kW)
1 4,503.17 GSIC Center, Tokyo Institute of Technology TSUBAME-KFC - LX 1U-4GPU/104Re-1G Cluster, Intel Xeon E5-2620v2 6C 2.100GHz, Infiniband FDR, NVIDIA K20x 27.78
2 3,631.86 Cambridge University Wilkes - Dell T620 Cluster, Intel Xeon E5-2630v2 6C 2.600GHz, Infiniband FDR, NVIDIA K20 52.62
3 3,517.84 Center for Computational Sciences, University of Tsukuba HA-PACS TCA - Cray 3623G4-SM Cluster, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband QDR, NVIDIA K20x 78.77
4 3,208.83 CINECA Eurora - Eurotech Aurora HPC 10-20, Xeon E5-2687W 8C 3.100GHz, Infiniband QDR, NVIDIA K20 30.70
5 3,185.91 Swiss National Supercomputing Centre (CSCS) Piz Daint - Cray XC30, Xeon E5-2670 8C 2.600GHz, Aries interconnect, NVIDIA K20x (Level 3 measurement data available) 1,753.66
6 3,130.95 ROMEO HPC Center - Champagne-Ardenne romeo - Bull R421-E3 Cluster, Intel Xeon E5-2650v2 8C 2.600GHz, Infiniband FDR, NVIDIA K20x 81.41
7 3,068.71 GSIC Center, Tokyo Institute of Technology TSUBAME 2.5 - Cluster Platform SL390s G7, Xeon X5670 6C 2.930GHz, Infiniband QDR, NVIDIA K20x 922.54
8 2,702.16 University of Arizona iDataPlex DX360M4, Intel Xeon E5-2650v2 8C 2.600GHz, Infiniband FDR14, NVIDIA K20x 53.62
9 2,629.10 Max-Planck-Gesellschaft MPI/IPP iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband, NVIDIA K20x 269.94
10 2,629.10 Financial Institution iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband, NVIDIA K20x 55.62
11 2,449.57 National Institute for Computational Sciences/University of Tennessee Beacon - Appro GreenBlade GB824M, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 5110P 45.11
12 2,358.69 CSIRO CSIRO GPU Cluster - Nitro G16 3GPU, Xeon E5-2650 8C 2.000GHz, Infiniband FDR, NVIDIA K20m 71.01
13 2,351.10 King Abdullah University of Science and Technology SANAM - Adtech ESC4000/FDR G2, Xeon E5-2650 8C 2.000GHz, Infiniband FDR, AMD FirePro series 179.15
14 2,299.15 IBM Thomas J. Watson Research Center BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 82.19
15 2,299.15 DOE/SC/Argonne National Laboratory Cetus - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 82.19
16 2,299.15 Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 82.19
17 2,299.15 IBM Rochester BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 82.19
18 2,299.15 DOE/SC/Argonne National Laboratory Vesta - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
19 2,299.15 University of Rochester BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
20 2,177.13 DOE/NNSA/LLNL Vulcan - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 1,972.00
21 2,176.82 Forschungszentrum Juelich (FZJ) JUQUEEN - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 2,301.00
22 2,176.78 Rensselaer Polytechnic Institute BlueGene/Q, Power BQC 16C 1.60GHz, Custom 410.90
23 2,176.60 University of Edinburgh DiRAC - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 493.12
24 2,176.60 High Energy Accelerator Research Organization /KEK HIMAWARI - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 246.56
25 2,176.60 High Energy Accelerator Research Organization /KEK SAKURA - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 246.56
26 2,176.59 Science and Technology Facilities Council - Daresbury Laboratory Blue Joule - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 575.31
27 2,176.58 DOE/NNSA/LLNL Sequoia - BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 7,890.00
28 2,176.58 DOE/SC/Argonne National Laboratory Mira - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 3,945.00
29 2,176.58 Victorian Life Sciences Computation Initiative Avoca - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
30 2,176.58 EDF R&D Zumbrota - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
31 2,176.58 CNRS/IDRIS-GENCI BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
32 2,176.57 CINECA Fermi - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 821.88
33 2,176.52 IBM Rochester BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 164.38
34 2,176.52 IBM Rochester BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 164.38
35 2,176.52 Southern Ontario Smart Computing Innovation Consortium/University of Toronto BGQ - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 164.38
36 2,143.03 DOE/SC/Oak Ridge National Laboratory Titan - Cray XK7 , Opteron 6274 16C 2.200GHz, Cray Gemini interconnect, NVIDIA K20x 8,208.00
37 2,121.71 Swiss National Supercomputing Centre (CSCS) Todi - Cray XK7 , Opteron 6272 16C 2.100GHz, Cray Gemini interconnect, NVIDIA Tesla K20 Kepler 129.00
38 2,101.39 Southern Ontario Smart Computing Innovation Consortium/University of Toronto BGQdev - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 41.09
39 2,101.39 DOE/NNSA/LLNL rzuseq - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 41.09
40 2,101.39 IBM Thomas J. Watson Research Center BlueGene/Q, Power BQC 16C 1.60GHz, Custom 41.09
41 1,935.32 NASA Center for Climate Simulation Discover - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband QDR, Intel Xeon Phi 5110P 215.60
42 1,801.36 Intel Data Center Demos c/o ViaWest DataCenter Cherry Creek - Supermicro F627G3-F Cluster, Intel Xeon E5-2697v2 12C 2.700GHz, Intel True Scale, Intel Xeon Phi 7120P 73.00
43 1,793.96 DOE/SC/Pacific Northwest National Laboratory cascade - Atipa Visione IF442 Blade Server, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 5110P 1,307.60
44 1,760.20 Center for Development of Advanced Computing (C-DAC) PARAM Yuva - II - R2208GZ Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 5110P (Level 3 measurement data available) 220.68
45 1,379.79 Nagasaki University DEGIMA - DEGIMA Cluster, Intel i5, ATI Radeon GPU, Infiniband QDR 47.00
46 1,247.57 Météo France Bullx DLC B710 Blades, Intel Xeon E5 v2 12C 2.700GHz, Infiniband FDR 401.00
47 1,036.64 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 109.21
48 1,036.64 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 109.21
49 1,036.64 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 109.21
50 1,036.62 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 127.90
51 1,036.62 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 127.90
52 1,036.62 Electronics BladeCenter HS23 Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 127.90
53 1,015.55 EDF R&D Athos - iDataPlex DX360M4, Intel Xeon E5-2697v2 12C 2.700GHz, Infiniband FDR14 347.27
54 1,010.11 CEA/TGCC-GENCI Curie hybrid - Bullx B505, Xeon E5640 2.67 GHz, Infiniband QDR 108.80
55 991.64 Aerospace Company BladeCenter HS23 Cluster, Xeon E5-2680 8C 2.700GHz, Infiniband QDR 151.84
56 991.61 Classified BladeCenter HS23 Cluster, Xeon E5-2680 8C 2.700GHz, Infiniband QDR 177.84
57 974.64 Navy DoD Supercomputing Resource Center (Navy DSRC) Cernan - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 80.14
58 935.82 IBM Development Engineering iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband FDR14 139.10
59 935.82 IBM Development Engineering iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband FDR14 139.10
60 935.82 IBM Development Engineering iDataPlex DX360M4, Intel Xeon E5-2680v2 10C 2.800GHz, Infiniband FDR14 139.10
61 932.83 University of Chicago iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 73.10
62 932.70 Automotive Company iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 91.72
63 932.63 Science and Technology Facilities Council - Daresbury Laboratory Blue Wonder - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 170.16
64 932.62 Automotive Company iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 67.30
65 932.62 University of Chicago iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 86.41
66 932.62 Indian Institute of Technology Madras Virgo - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 97.71
67 932.60 NASA/Goddard Space Flight Center iDataPlex DX360M3, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 159.53
68 932.60 Centro Euro-Mediterraneo per i Cambiamenti Climatici iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 159.53
69 932.59 Durham University iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 139.59
70 932.19 Universidad de Cantabria - SSC ALTAMIRA - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 79.80
71 922.52 University of Miami iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 119.25
72 919.44 Institute of Process Engineering, Chinese Academy of Sciences Mole-8.5 - Mole-8.5 Cluster, Xeon X5520 4C 2.27 GHz, Infiniband QDR, NVIDIA 2050 540.00
73 910.88 National Center for Medium Range Weather Forecast iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 349.60
74 910.85 Barcelona Supercomputing Center MareNostrum - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 1,015.60
75 910.82 Max-Planck-Gesellschaft MPI/IPP iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 205.72
76 910.80 University of Southampton IRIDIS 4 - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 249.26
77 910.80 Indian Institute of Tropical Meteorology iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 789.66
78 910.80 University of Chicago Midway - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 142.91
79 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
80 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
81 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
82 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
83 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
84 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
85 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
86 910.79 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 138.59
87 910.78 Exploration & Production - Eni S.p.A. HPCC1 - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR14 498.53
88 910.78 Geoscience iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 251.26
89 910.78 Financial Institution (P) iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 153.88
90 910.78 Financial Institution (P) iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 153.88
91 910.78 Financial Institution (P) iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband 153.88
92 910.78 Bombardier Aerospace Argus - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR14 149.56
93 910.76 CLUMEQ - McGill University Guillimin Phase 2 - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 154.88
94 908.83 Leibniz Rechenzentrum SuperMUC - iDataPlex DX360M4, Xeon E5-2680 8C 2.70GHz, Infiniband FDR (Level 3 measurement data available) 2,841.00
95 891.87 Forschungszentrum Juelich (FZJ) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband QDR, NVIDIA 2070 128.75
96 891.88 CINECA / SCS - SuperComputing Solution iDataPlex DX360M3, Xeon E5645 6C 2.40 GHz, Infiniband QDR, NVIDIA 2070 160.00
97 875.34 NCAR (National Center for Atmospheric Research) Yellowstone - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 1,436.72
98 875.33 Army Research Laboratory DoD Supercomputing Resource Center (ARL DSRC) Pershing - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 400.68
99 875.33 National Centers for Environment Prediction Gyre - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 203.52
100 875.33 National Centers for Environment Prediction Tide - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 203.52

* Only systems with power data submitted directly to the Green500 are included in the Little list. Performance data may be obtained from publicly available sources, including the TOP500 list.