
The Large Hadron Collider (LHC) project at CERN, the European Organization for
Nuclear Research, is the largest particle physics project in the world, and it
collects enormous amounts of data. To give you a sense of the scale of data we
are talking about, let me quote this article from Yahoo News.
The data flow will be about 700 megabytes per second or 15 million gigabytes
a year for 10 to 15 years — enough to fill three million DVDs a year or create
a tower of CDs more than twice as high as Mount Everest.
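As a quick back-of-the-envelope check, the figures in that quote hold together. Assuming standard 4.7 GB single-layer DVDs and 700 MB CDs about 1.2 mm thick (typical media specs, my assumptions rather than numbers from the article), a few lines of Python reproduce the comparisons:

    # Back-of-the-envelope check of the Yahoo News figures.
    # Assumed media specs (not from the article): 4.7 GB per DVD,
    # 700 MB per CD, 1.2 mm per CD, Everest at 8,849 m.
    yearly_data_gb = 15_000_000          # 15 million gigabytes a year

    dvds = yearly_data_gb / 4.7          # roughly 3.2 million DVDs
    cds = yearly_data_gb * 1000 / 700    # roughly 21 million CDs
    stack_m = cds * 1.2 / 1000           # CD tower height in metres
    everest_m = 8_849

    print(f"DVDs per year: {dvds / 1e6:.1f} million")
    print(f"CD tower: {stack_m / 1000:.1f} km, "
          f"about {stack_m / everest_m:.1f}x Everest")

That works out to roughly 3.2 million DVDs a year and a CD stack around 26 km tall, a little under three Everests, so "more than twice as high" is a fair description.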
CERN’s own computing power can handle only 20% of this data. To process this
massive amount of data, and to offer easy access to more than 7,000 scientists
from 33 countries, CERN launched the LHC Computing Grid project on Friday. It
is the world’s largest computing grid, connecting more than 100,000 CPUs
located at hundreds of universities and research institutions.
Even though the initial reason for CERN to use distributed computing was the
budget constraint it faced, the move proved much more advantageous than
originally envisioned. Distributed computing shares many of the advantages of
cloud computing, which we advocate in this blog: redundant copies of data
across geographically distributed locations, optimal use of storage and
computing capacity, scalability, easy manageability, energy efficiency, and so on.
The LHC Computing Grid uses a three-tiered approach to ensure the reliability
of the system. At Tier-0, CERN’s on-site data center, all data is backed up to
a tape-based archive after initial processing. It is then passed on to Tier-1,
which consists of large, high-powered computer centers at institutions in
Canada, France, Germany, Italy, the Netherlands, the Nordic countries, Spain,
Taipei, and the UK, with two sites in the USA. These Tier-1 centers have
sufficient storage for a large fraction of the experiment’s data, which is
then passed on to around 140 Tier-2 sites worldwide. From these sites, data
flows to clusters in physics research institutes and to the desktops of
individual scientists for specific processing tasks, based on the focus of
each research group.
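To make the fan-out concrete, here is a minimal sketch of the tiered replication idea in Python. It is a toy model only; the real grid runs dedicated middleware, and the site names and the receive method below are invented for illustration:

    # Toy model of the grid's tiered fan-out; not the actual grid middleware.
    from dataclasses import dataclass, field

    @dataclass
    class Site:
        name: str
        tier: int
        datasets: list = field(default_factory=list)
        children: list = field(default_factory=list)

        def receive(self, dataset: str) -> None:
            # Store the dataset locally, then replicate one tier down.
            # (In reality, lower tiers hold only the subsets they need.)
            self.datasets.append(dataset)
            for child in self.children:
                child.receive(dataset)

    # Tier-2 sites hang off a Tier-1 center, which hangs off CERN (Tier-0).
    tier2_sites = [Site(f"tier2-{i}", tier=2) for i in range(3)]
    tier1_center = Site("tier1-example", tier=1, children=tier2_sites)
    cern_tier0 = Site("CERN-tier0", tier=0, children=[tier1_center])

    cern_tier0.receive("run-001-raw")
    print([s.name for s in tier2_sites if "run-001-raw" in s.datasets])

Running this prints the three Tier-2 site names, showing how a single dataset archived at Tier-0 ends up replicated down the chain.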
Even though this grid is a boon for the LHC project, it has also opened up a
new era in scientific computing. Many scientific calculations require very
powerful computing resources, and the lack of such resources has been a big
handicap for scientists around the world. From simulations in statistical and
condensed matter physics to structure prediction of complex biological
molecules and drug discovery, scientists need serious computing power. This
grid itself will help scientists working on various other projects. More
importantly, it will be a forerunner of other computing grids at research
institutions and biotech companies working on problems like drug discovery.
I strongly believe that cloud computing will be a big enabler of scientific computing. It will put tremendous processing power at the doorstep of scientists while saving a lot of money, thanks to its utility nature. In the United States, scientific funding has been in bad shape for the last few years. The cost-effectiveness of cloud computing will come in handy for cash-strapped scientists in need of heavy computing power.
I am trying to get scientists and researchers with experience in
high-performance scientific computation to write about how cloud computing
helps them in their research. I understand how cloud computing could be useful
for scientists, but I would like to hear from them about how they are tapping
the clouds in their research work. If you are a scientist using cloud
computing in your work, we would like to publish your experience on this blog.
Please feel free to contact us so that we can help you share how your research
group uses cloud computing in its scientific research.
Important Links to this Grid Project: