James Gleick’s History of Information

From The New York Times:

The universe, the 18th-century mathematician and philosopher Jean Le Rond d’Alembert said, “would only be one fact and one great truth for whoever knew how to embrace it from a single point of view.” James Gleick has such a perspective, and signals it in the first word of the title of his new book, “The Information,” using the definite article we usually reserve for totalities like the universe, the ether — and the Internet. Information, he argues, is more than just the contents of our overflowing libraries and Web servers. It is “the blood and the fuel, the vital principle” of the world. Human consciousness, society, life on earth, the cosmos — it’s bits all the way down.

Gleick makes his case in a sweeping survey that covers the five millenniums of humanity’s engagement with information, from the invention of writing in Sumer to the elevation of information to a first principle in the sciences over the last half-century or so. It’s a grand narrative if ever there was one, but its key moment can be pinpointed to 1948, when Claude Shannon, a young mathematician with a background in cryptography and telephony, published a paper called “A Mathematical Theory of Communication” in a Bell Labs technical journal. For Shannon, communication was purely a matter of sending a message over a noisy channel so that someone else could recover it. Whether the message was meaningful, he said, was “irrelevant to the engineering problem.” Think of a game of Wheel of Fortune, where each card that’s turned over narrows the set of possible answers, except that here the answer could be anything: a common English phrase, a Polish surname, or just a set of license plate numbers. Whatever the message, the contribution made by each signal — what he called, somewhat provocatively, its “information” — could be quantified in binary digits (i.e., 1s and 0s), a term that conveniently condensed to “bits.”
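Shannon’s quantification can be sketched in a few lines. The snippet below is an illustration, not anything from Gleick’s book: it shows only the simplest special case of Shannon’s measure, where all possible messages are assumed equally likely, so that identifying one answer among N candidates conveys log2(N) bits. The function name `bits_of_information` is invented for this example.

```python
import math

def bits_of_information(num_possibilities: int) -> float:
    """Bits gained by identifying one outcome among equally
    likely possibilities (the uniform special case of Shannon's measure)."""
    return math.log2(num_possibilities)

# Each revealed card in a Wheel of Fortune-style game narrows the set of
# candidate answers; cutting the candidates in half yields exactly one bit.
print(bits_of_information(2))   # a single coin flip: 1.0 bit
print(bits_of_information(8))   # 8 equally likely answers: 3.0 bits
```

In the general case Shannon weighted each message by its probability, which is why common English phrases carry less information per symbol than arbitrary license plate numbers.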

More here.