The LHC produces about 15 petabytes (15 million gigabytes) of data per year. This data is not processed at a single site but at roughly 150 computing centres around the globe, which together provide 250,000 CPU cores, 160 petabytes of disk storage, and 90 petabytes of tape storage at the time of writing. These distributed resources are combined in a ‘Grid’ infrastructure that allows more than 8,000 physicists around the world to access and process the distributed data in a transparent way.
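To get a feel for these figures, a quick back-of-the-envelope calculation helps: the 15 PB/year and 150 centres are from the text above, while the use of decimal units (1 PB = 1000 TB) is my assumption.

```python
# Back-of-the-envelope arithmetic for the LHC figures quoted above.
# 15 PB/year and 150 centres come from the text; decimal units
# (1 PB = 1000 TB = 10**9 MB) are an assumption.

PB_PER_YEAR = 15
CENTRES = 150
SECONDS_PER_YEAR = 365 * 24 * 3600

# Average share of the yearly data volume per computing centre, in TB.
tb_per_centre = PB_PER_YEAR * 1000 / CENTRES

# Sustained average data rate across the whole Grid, in MB/s.
mb_per_second = PB_PER_YEAR * 10**9 / SECONDS_PER_YEAR

print(f"~{tb_per_centre:.0f} TB per centre, ~{mb_per_second:.0f} MB/s sustained")
```

On average, then, each centre handles on the order of 100 TB per year, and the Grid as a whole absorbs a sustained rate of several hundred megabytes per second.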
How does Google do it?
During a recent renovation project, Strukton embedded in and attached to the Hollandse Brug a total of 145 sensors, including strain sensors, vibration sensors, temperature sensors, and a video camera. This large array of sensors continuously monitors all events on the bridge and produces a flood of data: about 5 GB of non-video data is gathered every day, more than one DVD a day!
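A small sketch makes the DVD comparison and the yearly total concrete; the 5 GB/day figure is from the text, while the 4.7 GB single-layer DVD capacity is an assumption on my part.

```python
# Quick check of the "more than one DVD a day" claim for the bridge data.
# 5 GB/day is from the text; 4.7 GB is the assumed capacity of a
# single-layer (DVD-5) disc.

GB_PER_DAY = 5
DVD_CAPACITY_GB = 4.7

dvds_per_day = GB_PER_DAY / DVD_CAPACITY_GB
gb_per_year = GB_PER_DAY * 365

print(f"{dvds_per_day:.2f} DVDs/day, {gb_per_year} GB (~{gb_per_year / 1000:.1f} TB) per year")
```

So even without the video stream, the sensor network accumulates nearly two terabytes of data per year from a single bridge.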