presented by Md Moshiur Rahman
When the LHC is fully operational, it will produce roughly 1 billion proton-proton collision events per second in the detectors (40 million bunch crossings per second). These data will be heavily filtered so that only about 100 events of interest per second are recorded permanently. Each event represents a few megabytes of data, so the total data rate from the experiments will be of the order of 1 gigabyte per second.
Including raw, processed and simulated data, the LHC will produce about 15 petabytes (15 million gigabytes) of data each year, the equivalent of about 20 million CDs! Copies of the data from one or more experiments will be stored at a dozen major computing centres, the so-called Tier-1 centres, and the analysis will be carried out by a Grid of over 100 computer centres in universities and research labs around the world, the Tier-2 centres. This computing Grid will allow thousands of scientists to access and analyse the LHC data, a task requiring a total computing power equivalent to roughly 100,000 of today's standard PC processors.
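The figures above can be checked with a quick back-of-the-envelope calculation. The sketch below uses only the numbers quoted in the text; the assumed per-event size (a few megabytes is taken here as ~3 MB) and the ~700 MB CD capacity are illustrative assumptions, so the results are order-of-magnitude estimates only.

```python
# Back-of-the-envelope check of the LHC data-rate figures quoted above.
# The per-event size and CD capacity are assumptions, not official numbers.

recorded_events_per_s = 100     # events kept per second after filtering
event_size_bytes = 3e6          # "a few megabytes" per event (assume ~3 MB)

# Recorded data rate: a few hundred MB/s, i.e. of the order of 1 GB/s
# once all experiments are combined.
rate_bytes_per_s = recorded_events_per_s * event_size_bytes
print(f"Recorded data rate: ~{rate_bytes_per_s / 1e9:.1f} GB/s")

# Yearly volume if recorded continuously. The quoted 15 PB/year also
# includes processed and simulated data, and the machine does not run
# all year, so this is only a consistency check.
seconds_per_year = 365 * 24 * 3600
yearly_pb = rate_bytes_per_s * seconds_per_year / 1e15
print(f"If recorded continuously: ~{yearly_pb:.0f} PB/year")

# The CD comparison: 15 PB at ~700 MB per CD.
cd_capacity_bytes = 700e6
cds = 15e15 / cd_capacity_bytes
print(f"15 PB is ~{cds / 1e6:.0f} million CDs")
```

With these assumptions the recorded rate comes out at a few tenths of a gigabyte per second per experiment and the CD count at roughly 20 million, consistent with the figures in the text.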
Several projects are involved in providing the necessary computing Grid infrastructure and associated middleware for LHC computing. The LHC Computing Grid project (LCG) currently operates the world's largest scientific Grid, with over 130 sites in 31 countries contributing resources, including more than 10,000 CPUs and several petabytes of storage.
Other Grid projects also contribute essential resources and know-how for LHC computing.
For more information about Grid technology, visit the GridCafé.