Supercomputers are the aptly named heroes of Big Data Science. Here at Brookhaven, we have three generations of IBM’s Blue Gene supercomputers, which make it possible to do immense amounts of number-crunching and data-sifting that would otherwise be impossible (or take a very, very, very long time). Our Supercomputing Center houses 23 racks holding thousands of processors that give our scientists the computing power they need.
When scientists use our National Synchrotron Light Source to beam high-energy x-rays into a protein to determine its structure, they’re looking for a shape that a drug can fit into to treat various diseases. It’s a bit like a lock and key. But without supercomputers to help us try out all the possibilities, figuring out which drug fits into which protein would be like trying to unlock a door with a box of hundreds or even thousands of keys, with no way of knowing which one works best without trying them all.
Supercomputers also come in handy when scientists are trying to create catalysts for zero-emission fuel cells. Nanoscale engineering requires cutting-edge facilities and expertise in the dizzying laws of quantum mechanics that govern the interactions between atoms. The sheer number of variables at play in any chemical reaction would make a complete understanding impossible without a powerful computer to connect the atomic dots.
Our supercomputers also help us dig through the massive amounts of data produced by particle collisions in our atom smasher. At the Relativistic Heavy Ion Collider (RHIC), physicists use supercomputers capable of performing 10 trillion arithmetic calculations per second to interpret the data from those collisions. Without these computing workhorses, we wouldn’t be able to sift through the billions of data-dense “pictures” taken at RHIC that reveal details about the early universe.