Science & Technology

Titan Crowned New King of Supercomputers

Note: This article is hosted here for archival purposes only. It does not necessarily represent the values of the Iron Warrior or Waterloo Engineering Society in the present day.

On November 12th, 2012, it was officially announced that the latest supercomputer developed by Cray Inc. at the Oak Ridge National Laboratory is now the fastest in existence. Dubbed ‘Titan’, this supercomputer has ousted IBM’s Sequoia from the top spot on the TOP500 list, beating Sequoia’s performance of 16.32 petaflops with an impressive 17.59 petaflops of its own. The United States now occupies the top two positions on the list for the first time in three years.
Supercomputing has come a long way from its humble origins in the 1960s, when the machines of the day delivered roughly 0.01% of the performance of a modern-day personal computer running an Intel Core i5 processor. Supercomputers were designed mainly to carry out simulations of complicated physical phenomena that would otherwise take enormous effort and money to replicate in reality. Their applications range from weather prediction to quantum physics, fluid dynamics and molecular modelling, along with climate change, nuclear phenomena, microbiology, cosmological events, natural disasters and neuron simulation. Most supercomputers are funded and built by national government laboratories for defence purposes; however, some are privately owned and are primarily used by interested customers who ‘rent’ the machine for a certain period of time to carry out simulations.
The architecture of supercomputers has evolved significantly since the 1960s. The earliest designs used only a few processors; current architectures interconnect over 100,000 processors for high performance and efficiency. Parallel processing was introduced early in supercomputer design, and as processor counts grew, distributing memory and work across the machine, along with communicating efficiently between so many processors, became central design challenges. Alongside advances in network architecture, designers have also had to improve cooling mechanisms, since running so many processors simultaneously generates a great deal of heat. Cooling methods have shifted from liquid coolants such as Fluorinert to efficient air-cooling mechanisms. More recently, designs have focused on energy efficiency, achieving greater processor speeds at lower power consumption.
The net development cost of Titan amounted to $97 million. Titan was not built from scratch; it is, in fact, an upgrade of Jaguar, Oak Ridge’s previous supercomputer, a decision made mainly to save the $20 million cost of building another set of power and cooling systems. The entire system covers about 404 square metres and draws a total of 8.2 MW of power. Titan is cooled by chilled air that is pumped through the cabinets and re-chilled before recirculation.
The uniqueness of Titan’s architecture lies in its use of central processing units (CPUs) and graphics processing units (GPUs) in tandem: each of its nodes pairs a 16-core AMD Opteron CPU with an NVIDIA Tesla K20X GPU. While a CPU core is designed to handle one calculation at a time efficiently, a GPU carries out many instructions in parallel, allowing a job to be broken into pieces that run simultaneously. This permits a division of labour in which each part of a job runs on whichever kind of core suits it: serial control logic on the CPU, and heavily parallel number-crunching on the GPU.
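To make that division of labour concrete, here is a minimal CUDA sketch, not Titan’s actual code: the CPU performs the serial setup and verification, while the GPU launches thousands of lightweight threads that each compute one element of the result in parallel. The saxpy kernel name, array size and constants are illustrative assumptions.

#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// GPU kernel: each thread handles exactly one array element, so the whole
// array is processed in parallel rather than one element at a time.
__global__ void saxpy(int n, float a, const float *x, float *y) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i] + y[i];
}

int main() {
    const int n = 1 << 20;                  // one million elements (illustrative)
    const size_t bytes = n * sizeof(float);

    // Serial setup on the CPU: allocate and initialize host data.
    float *hx = (float *)malloc(bytes);
    float *hy = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) { hx[i] = 1.0f; hy[i] = 2.0f; }

    // Copy the inputs into GPU memory.
    float *dx, *dy;
    cudaMalloc(&dx, bytes);
    cudaMalloc(&dy, bytes);
    cudaMemcpy(dx, hx, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(dy, hy, bytes, cudaMemcpyHostToDevice);

    // Parallel portion offloaded to the GPU: one thread per element.
    const int threads = 256;
    const int blocks = (n + threads - 1) / threads;
    saxpy<<<blocks, threads>>>(n, 3.0f, dx, dy);
    cudaDeviceSynchronize();

    // Back to serial work on the CPU: fetch the result and spot-check it.
    cudaMemcpy(hy, dy, bytes, cudaMemcpyDeviceToHost);
    printf("y[0] = %f (expected 5.0)\n", hy[0]);

    cudaFree(dx); cudaFree(dy); free(hx); free(hy);
    return 0;
}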
The field of high-performance computing is a highly competitive one. Bigger and greater advancements appear every year, and no machine retains the top position for long. In the future, exaflop supercomputers may become a reality: the Indian Space Research Organization, in collaboration with the Indian Institute of Science, both located in the south of India, has stated that a 132.8 exaflop machine is in the works for completion by 2017. This computer would be faster than any ever planned and is estimated to cost $2 billion. It is hoped that with better computing power, humankind will gain a better understanding of life, the universe and everything.
