By John Timmer, Ars Technica

How much information can the world transmit, process, and store? Estimating this sort of thing can be a nightmare, but the task can provide valuable information on trends that are changing our computing and broadcast infrastructure. So a pair of researchers have taken the job upon themselves, tracking changes in 60 different analog and digital technologies, from newsprint to cellular data, over a period of more than 20 years.
The trends they spot range from the expected—Internet access has pushed both analog and digital phones into a tiny niche—to the surprising, such as the fact that, in aggregate, gaming hardware has always had more computing power than the world’s supercomputers.

The authors were remarkably thorough. For storage media, they considered things like paper, film, and vinyl records, as well as such modern innovations as Blu-ray discs and memory cards. To standardize their measurements across media, they used Shannon's information theory to express data storage in terms of optimally compressed bits. They also tracked changes in compression technology, noting that in the year 2000, bits of video were compressed using Cinepak, which was far less efficient than the current MPEG-4 format; calculations were adjusted accordingly.
Even so, the estimates involve significant approximations. “For example,” the authors note, “after normalization on optimally compressed bits we can say things like ‘a 6 square-cm newspaper image is worth a 1,000 words.’”
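The idea behind this normalization can be sketched in a few lines. Shannon entropy gives a lower bound on how many bits an optimal compressor needs per symbol, so multiplying the per-symbol entropy by the number of symbols approximates the "optimally compressed" size of a message. This is a minimal illustration of the concept, not the researchers' actual methodology:

```python
import math
from collections import Counter

def shannon_entropy_bits(data: bytes) -> float:
    """Shannon entropy in bits per byte: the theoretical lower
    bound on how many bits an optimal compressor needs per symbol."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def optimal_size_bits(data: bytes) -> float:
    """Approximate optimally compressed size: entropy per symbol
    times the number of symbols."""
    return shannon_entropy_bits(data) * len(data)

# A completely repetitive message carries no information...
print(shannon_entropy_bits(b"aaaaaaaa"))   # 0.0 bits per symbol
# ...while a uniform mix of two symbols needs one bit each.
print(shannon_entropy_bits(b"abababab"))   # 1.0 bit per symbol
```

Under this measure, a gigabyte of redundant video and a gigabyte of dense text are not "the same amount" of information, which is exactly why the authors normalize before comparing media.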

Similar sorts of estimates are required for things like broadcast capability and two-way communications, both of which are compiled as bits-per-second figures. The researchers estimate typical consumption of broadcast media to figure out how much of the existing capacity is used, and they reason that, since telecom equipment is run to maximize the use of its capacity, it usually operates close to its limit.
Computing capacity is converted into MIPS (millions of instructions per second), based on estimates of the total number and class of chips in use. The big question mark here is mostly in embedded controllers; it’s hard to estimate both their computational capacity and how many are out there.
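The aggregation itself is simple arithmetic: for each class of device, multiply an estimated installed base by an estimated per-unit MIPS figure, then sum. A sketch with entirely hypothetical numbers (not the study's data) shows why the embedded category dominates the uncertainty:

```python
# Hypothetical installed-base figures, for illustration only —
# these are NOT the researchers' estimates.
device_classes = {
    # class: (estimated units in use, estimated MIPS per unit)
    "pcs":           (1.0e9, 2_000),
    "game_consoles": (3.0e8, 10_000),
    "embedded":      (5.0e9, 10),  # both figures here are very rough
}

total_mips = sum(units * mips for units, mips in device_classes.values())
print(f"aggregate capacity: {total_mips:.3g} MIPS")
```

Because the embedded count and its per-unit capacity are both guesses, a small error in either multiplies through the total, which is why the authors flag it as the weakest part of the computing estimate.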
So these are pretty rough estimates, but similar assumptions are made at all four time points examined between 1986 and 2007. That should allow comparisons of trends across the time period, even if the absolute values of the estimates are a bit off.

Read the rest of the story:

http://m.wired.com/wiredscience/2011/02/world-computer-data/