Quote:
Originally Posted by Tru2Chevy
I'm a supercomputer operator for a federal government weather research lab: http://gfdl.noaa.gov
We have an 8,000-core SGI Altix HPCS and about 20 petabytes of archive storage for the scientists' data.
- Justin
- The Internet Archive contains almost 2 petabytes of data.
- Google processes about 20 petabytes of data a day.
- The four experiments at the Large Hadron Collider will produce about 15 petabytes of data per year, distributed over the LHC Computing Grid.
- Facebook stores just over 1 petabyte of users' photos, roughly 10 billion photos in all (rough math on that below).
- Isohunt indexes torrents covering about 1 petabyte of files worldwide.
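
For a sense of scale, here's a quick back-of-the-envelope in Python. It assumes decimal petabytes (10^15 bytes); the per-photo average is derived from the two Facebook figures above, not something stated in the thread:

    PETABYTE = 10**15                  # decimal petabyte: 10^15 bytes
    photos = 10 * 10**9                # ~10 billion photos (figure above)
    avg_photo = PETABYTE / photos      # implied average photo size
    print(f"avg photo size: {avg_photo / 1000:.0f} kB")      # ~100 kB
    print(f"GFDL archive: {20 * PETABYTE / 10**9:,.0f} GB")  # 20,000,000 GB

So the 20 PB archive Justin mentions works out to twenty million gigabytes, about ten Internet Archives' worth.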
Do you really need all that room?