The Green500 List - November 2012

Heterogeneous Systems Re-Claim Green500 List Dominance

SALT LAKE CITY, UT - November 14, 2012 - The latest Green500 List was released today (http://www.green500.org/lists/green201211) and the top spots on the list have been taken over by machines that combine commodity processors with coprocessors or graphics processing units (GPUs) to form heterogeneous high-performance computing systems.

With all eyes on the new TOP500 number one system, Oak Ridge National Laboratory's Titan, it was a system belonging to a neighbor at the University of Tennessee that debuted at the top of the November Green500 List. The National Institute for Computational Sciences' Beacon system has set the new energy-efficiency bar at nearly two-and-a-half billion floating-point operations per second (gigaflops) per watt. Employing Intel's Sandy Bridge series of Xeon central processing units (CPUs) and four of Intel's Xeon Phi coprocessors per node, Beacon achieved a peak of 112,200 gigaflops of performance running the LINPACK benchmark while consuming only 44.89 kW of power.
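As a quick sanity check, the efficiency figure the Green500 ranks by follows directly from the two numbers quoted above: measured LINPACK performance divided by measured power. A minimal Python sketch, using only the figures reported for Beacon:

    # Reproduce Beacon's Green500 efficiency from the reported numbers.
    linpack_gflops = 112_200.0   # LINPACK performance reported for Beacon
    power_kw = 44.89             # power draw reported for Beacon

    mflops = linpack_gflops * 1_000.0   # gigaflops -> megaflops
    watts = power_kw * 1_000.0          # kilowatts -> watts

    print(f"{mflops / watts:.2f} MFLOPS/W")   # ~2499.44, matching rank 1 in the list below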

The Intel Xeon Phi, code-named "Knights Corner," is a 22nm multicore coprocessor featuring the world's first 3D Tri-Gate transistors. Like its GPU counterparts, the Intel Xeon Phi resides on a PCI Express board that plugs into a machine's expansion slots.

Three of the remaining top-five systems combine GPU accelerators with traditional AMD or Intel CPUs. "Metaphorically, think of CPU-GPU systems operating like the human brain, where the CPU could be viewed as the left brain and the GPU as the right brain," says Dr. Wu Feng, founder of the Green500 List. "Each side of the brain is suited to process different types of tasks." In second place is the SANAM system from the King Abdulaziz City for Science and Technology, an Intel CPU system that uses AMD's new FirePro S10000 GPU accelerators; in third and fourth are Cray's Titan and Todi systems, which employ AMD Opteron CPUs and NVIDIA Tesla K20-family GPU accelerators. In fifth place is the previous Green500 List number one system, an IBM Blue Gene/Q that uses PowerPC BQC CPUs. All five of these systems deliver more than two gigaflops per watt.

Overall, the performance of machines in the Green500 List has increased at a higher rate than their power consumption. "That's why the machines' efficiencies are going up," says Feng. "We are getting more performance for the same amount of power." For machines built from commodity, off-the-shelf components, coprocessors and GPUs are contributing a great deal to the efficiency gains, so much so that they are keeping pace with, and in the latest list even outpacing, purpose-built systems like IBM's Blue Gene/Q. "Power consumption is still going up," says Feng, and that is still a concern.
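To put rough numbers on that point, consider the gap between the new list leader and the Blue Gene/Q that now sits at number five, using only figures from the list below. This is an illustrative back-of-the-envelope comparison, not official Green500 methodology:

    # Efficiency figures taken directly from the table below (MFLOPS/W).
    beacon_eff = 2_499.44      # rank 1: Beacon (Xeon CPUs + Xeon Phi coprocessors)
    bluegene_q_eff = 2_102.12  # rank 5: JUQUEEN, a Blue Gene/Q

    gain = beacon_eff / bluegene_q_eff - 1.0
    print(f"efficiency gain at the top of the list: {gain:.1%}")   # about 18.9%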

The Green500 List has provided a ranking of the most energy-efficient supercomputers in the world since November 2007. For decades, the notion of "performance" has been synonymous with "speed" as measured in FLOPS. This particular focus has led to the emergence of supercomputers that consume egregious amounts of electrical power and produce so much heat that extravagant cooling facilities must be constructed to ensure proper operation. In addition, the emphasis on speed as the ultimate metric has caused other metrics such as reliability, availability, and usability to be largely ignored. As a result, there has been an extraordinary increase in the total cost of ownership (TCO) of a supercomputer.

The Green500 List

Listed below are the November 2012 Green500's energy-efficient supercomputers, ranked from 1 to 100.

Green500 Rank    MFLOPS/W    Site*    Computer*    Total Power (kW)
1 2,499.44 National Institute for Computational Sciences/University of Tennessee Beacon - Appro GreenBlade GB824M, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 5110P 44.89
2 2,351.10 King Abdulaziz City for Science and Technology SANAM - Adtech ESC4000/FDR G2, Xeon E5-2650 8C 2.000GHz, Infiniband FDR, AMD FirePro S10000 179.15
3 2,142.77 DOE/SC/Oak Ridge National Laboratory Titan - Cray XK7, Opteron 6274 16C 2.200GHz, Cray Gemini interconnect, NVIDIA K20x 8,209.00
4 2,121.71 Swiss Scientific Computing Center (CSCS) Todi - Cray XK7, Opteron 6272 16C 2.100GHz, Cray Gemini interconnect, NVIDIA Tesla K20 Kepler 129.00
5 2,102.12 Forschungszentrum Juelich (FZJ) JUQUEEN - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 1,970.00
6 2,101.39 Southern Ontario Smart Computing Innovation Consortium/University of Toronto BGQdev - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 41.09
7 2,101.39 DOE/NNSA/LLNL rzuseq - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 41.09
8 2,101.39 IBM Thomas J. Watson Research Center BlueGene/Q, Power BQC 16C 1.60GHz, Custom 41.09
9 2,101.12 IBM Thomas J. Watson Research Center BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 82.19
10 2,101.12 Ecole Polytechnique Federale de Lausanne CADMOS BG/Q - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 82.19
11 2,101.12 Interdisciplinary Centre for Mathematical and Computational Modelling, University of Warsaw BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 82.19
12 2,101.12 DOE/SC/Argonne National Laboratory Cetus - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
13 2,101.12 DOE/SC/Argonne National Laboratory Vesta - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
14 2,101.12 Rensselaer Polytechnic Institute BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
15 2,101.12 University of Rochester BlueGene/Q, Power BQC 16C 1.60GHz, Custom 82.19
16 2,099.48 High Energy Accelerator Research Organization /KEK HIMAWARI - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 246.56
17 2,099.48 High Energy Accelerator Research Organization /KEK SAKURA - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 246.56
18 2,099.48 University of Edinburgh DiRAC - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 493.12
19 2,099.47 Science and Technology Facilities Council - Daresbury Laboratory Blue Joule - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 575.31
20 2,099.46 CNRS/IDRIS-GENCI BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
21 2,099.46 EDF R&D Zumbrota - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
22 2,099.46 Victorian Life Sciences Computation Initiative Avoca - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 328.75
23 2,099.45 CINECA Fermi - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 821.88
24 2,099.39 IBM - Rochester BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 164.38
25 2,099.39 IBM - Rochester BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 164.38
26 2,099.39 Southern Ontario Smart Computing Innovation Consortium/University of Toronto BGQ - BlueGene/Q, Power BQC 16C 1.600GHz, Custom Interconnect 164.38
27 2,099.39 DOE/NNSA/LLNL Vulcan - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 164.38
28 2,069.04 DOE/SC/Argonne National Laboratory Mira - BlueGene/Q, Power BQC 16C 1.60GHz, Custom 3,945.00
29 2,069.04 DOE/NNSA/LLNL Sequoia - BlueGene/Q, Power BQC 16C 1.60 GHz, Custom 7,890.00
30 1,949.37 Joint Supercomputer Center MVS-10P - RSC Tornado, Xeon E5-2690 8C 2.900GHz, Infiniband FDR, Intel Xeon Phi 181.70
31 1,935.32 NASA Center for Climate Simulation Discover - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband QDR, Intel Xeon Phi 5110P 215.60
32 1,870.12 Seoul National University Chundoong - Chundoong Cluster, Xeon E5-2650 8C 2.000GHz, Infiniband QDR, AMD Radeon HD 7970 56.25
33 1,612.97 NASA/Ames Research Center/NAS Maia - SGI Rackable C1104G-RP5, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi 132.00
34 1,427.73 Intel Endeavor - Intel Cluster, Xeon E5-2670 8C 2.600GHz, Infiniband FDR, Intel Xeon Phi SE10 257.71
35 1,266.26 Barcelona Supercomputing Center Bullx B505, Xeon E5649 6C 2.53GHz, Infiniband QDR, NVIDIA 2090 81.50
36 1,050.26 Los Alamos National Laboratory Moonlight - Xtreme-X, Xeon E5-2670 8C 2.600GHz, Infiniband QDR, NVIDIA 2090 226.80
37 1,038.29 CSIRO CSIRO GPU Cluster - Nitro G16 3GPU, Xeon E5-2650 8C 2.000GHz, Infiniband FDR, NVIDIA 2050 128.77
38 1,035.13 Center for Computational Sciences, University of Tsukuba HA-PACS - Xtream-X GreenBlade 8204, Xeon E5-2670 8C 2.600GHz, Infiniband QDR, NVIDIA 2090 407.29
39 1,010.11 CEA/TGCC-GENCI Curie hybrid - Bullx B505, Xeon E5640 2.67 GHz, Infiniband QDR 108.80
40 995.52 South Ural State University RSC Tornado SUSU - RSC Tornado, Xeon X5680 6C 3.330GHz, Infiniband QDR, Intel Xeon Phi 147.46
41 974.69 Center for Development of Advanced Computing (C-DAC) iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 127.20
42 974.64 Navy DoD Supercomputing Resource Center (Navy DSRC) Cernan - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 80.14
43 953.51 Virginia Tech HokieSpeed - SuperServer 2026GT-TRF, Xeon E5645 6C 2.40GHz, Infiniband QDR, NVIDIA 2050 126.27
44 932.70 Automotive Company iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 91.72
45 932.63 Science and Technology Facilities Council - Daresbury Laboratory Blue Wonder - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 170.16
46 932.62 University of Chicago iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 86.41
47 932.62 Indian Institute of Technology Madras Virgo - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 97.71
48 932.60 Centro Euro-Mediterraneo per i Cambiamenti Climatici iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 159.53
49 932.59 Durham University iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 139.59
50 919.44 Institute of Process Engineering, Chinese Academy of Sciences Mole-8.5 - Mole-8.5 Cluster, Xeon X5520 4C 2.27 GHz, Infiniband QDR, NVIDIA 2050 540.00
51 910.82 Max-Planck-Gesellschaft MPI/IPP iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 205.72
52 910.80 Barcelona Supercomputing Center MareNostrum - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 699.26
53 908.83 Leibniz Rechenzentrum SuperMUC - iDataPlex DX360M4, Xeon E5-2680 8C 2.70GHz, Infiniband FDR (Level 3 measurement data available) 2,841.00
54 891.87 Forschungszentrum Juelich (FZJ) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband QDR, NVIDIA 2070 128.75
55 891.88 CINECA / SCS - SuperComputing Solution iDataPlex DX360M3, Xeon E5645 6C 2.40 GHz, Infiniband QDR, NVIDIA 2070 160.00
56 886.30 Information Technology Center, The University of Tokyo Oakleaf-FX - PRIMEHPC FX10, SPARC64 IXfx 16C 1.848GHz, Tofu interconnect 1,176.80
57 880.51 National Computational Infrastructure, Australian National University Fujitsu PRIMERGY CX250 S1, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 1,111.40
58 875.34 NCAR (National Center for Atmospheric Research) Yellowstone - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 1,436.72
59 875.33 Army Research Laboratory DoD Supercomputing Resource Center (ARL DSRC) Pershing - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 400.68
60 875.33 National Centers for Environment Prediction Gyre - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 203.52
61 875.33 National Centers for Environment Prediction Tide - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 203.52
62 875.33 Navy DoD Supercomputing Resource Center (Navy DSRC) Haise - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 373.97
63 875.33 Navy DoD Supercomputing Resource Center (Navy DSRC) Kilrain - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 373.97
64 875.32 Army Research Laboratory DoD Supercomputing Resource Center (ARL DSRC) Hercules - iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 347.26
65 875.32 Electronics iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 148.19
66 875.32 University of Miami iDataPlex DX360M4, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 100.49
67 865.90 Geoscience iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 108.15
68 865.90 Geoscience iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 108.15
69 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
70 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
71 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
72 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
73 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
74 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
75 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
76 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
77 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
78 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
79 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
80 865.88 Geoscience (P) iDataPlex DX360M3, Xeon X5650 6C 2.66 GHz, Infiniband, NVIDIA 2090 99.14
81 852.27 GSIC Center, Tokyo Institute of Technology TSUBAME 2.0 - HP ProLiant SL390s G7 Xeon 6C X5670, Nvidia GPU, Linux/Windows 1,398.61
82 848.69 Sandia National Laboratories Dark Bridge - Appro Xtreme-X Supercomputer, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 315.90
83 839.72 Research Institute for Information Technology, Kyushu University Fujitsu PRIMERGY CX400, Xeon E5-2680 8C 2.700GHz, Infiniband FDR 548.16
84 837.19 Lawrence Livermore National Laboratory Zin - Xtreme-X GreenBlade GB512X, Xeon E5 (Sandy Bridge - EP) 8C 2.60GHz, Infiniband QDR 924.16
85 830.18 RIKEN Advanced Institute for Computational Science (AICS) K computer, SPARC64 VIIIfx 2.0GHz, Tofu interconnect 12,659.89
86 824.79 Lawrence Livermore National Laboratory Cab - Xtreme-X, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 421.20
87 809.54 National Center for High Performance Computing Formosa 5 - Hybrid Cluster, Xeon X5670 6C 2.930GHz, Infiniband QDR, NVIDIA 2070 111.10
88 809.36 Bull Bull Benchmarks SuperComputer I - Bullx B510, Xeon E5 (Sandy Bridge - EP) 8C 2.70GHz, Infiniband QDR 132.20
89 808.47 Central Research Institute of Electric Power Industry/CRIEPI SGI Altix X, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 720.00
90 808.47 Wright-Patterson AFB SGI Altix X, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 419.93
91 803.95 CEA/TGCC-GENCI Curie fat nodes - Bullx S6010 Cluster, Xeon 2.26 GHz 8-core, QDR Infiniband 108.80
92 801.93 Bull Bull Benchmarks SuperComputer II - Bullx B510, Xeon E5 (Sandy Bridge - EP) 8C 2.70GHz, Infiniband QDR 450.00
93 799.62 Sandia National Laboratories Pecos - Xtreme-X, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 421.20
94 799.31 National Supercomputer Centre (NSC) Triolith - Cluster Platform SL230s Gen8, Xeon E5-2660 8C 2.200GHz, Infiniband FDR 380.00
95 797.43 UCSD/San Diego Supercomputer Center Gordon - Xtreme-X GreenBlade GB512X, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 358.40
96 791.72 Commissariat a l'Energie Atomique (CEA) Tera-100 Hybrid - Bullx B505, Xeon E56xx (Westmere-EP) 2.40 GHz, Infiniband QDR 194.51
97 787.63 CNRS/IDRIS-GENCI Ada - xSeries x3750 Cluster, Xeon E5-2680 8C 2.700GHz, Infiniband FDR 243.69
98 786.78 University of Oslo Abel - MEGWARE MiriQuid, Xeon E5-2670 8C 2.600GHz, Infiniband FDR 227.00
99 786.17 CSIR Centre for Mathematical Modelling and Computer Simulation Cluster Platform 3000 BL460c Gen8, Xeon E5-2670 8C 2.60GHz, Infiniband FDR 386.56
100 775.45 Los Alamos National Laboratory Luna - Xtreme-X GreenBlade GB512X, Xeon E5-2670 8C 2.600GHz, Infiniband QDR 448.00

* Performance data obtained from publicly available sources including TOP500
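For readers who want to work with the rows above programmatically, the following is a minimal, hedged Python sketch. The two sample strings are copied from ranks 1 and 2; the parsing simply assumes the layout used above, with the rank first, MFLOPS/W second, and total power (kW) last:

    rows = [
        "1 2,499.44 National Institute for Computational Sciences/University of "
        "Tennessee Beacon - Appro GreenBlade GB824M, Xeon E5-2670 8C 2.600GHz, "
        "Infiniband FDR, Intel Xeon Phi 5110P 44.89",
        "2 2,351.10 King Abdulaziz City for Science and Technology SANAM - Adtech "
        "ESC4000/FDR G2, Xeon E5-2650 8C 2.000GHz, Infiniband FDR, AMD FirePro "
        "S10000 179.15",
    ]

    def parse_row(row):
        """Split one list row into (rank, MFLOPS/W, description, power in kW)."""
        tokens = row.split()
        rank = int(tokens[0])
        mflops_per_watt = float(tokens[1].replace(",", ""))
        power_kw = float(tokens[-1].replace(",", ""))
        description = " ".join(tokens[2:-1])
        return rank, mflops_per_watt, description, power_kw

    for rank, eff, desc, power in (parse_row(r) for r in rows):
        print(rank, eff, power)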