“The fundamental problem of communication is that of reproducing at
one point either exactly or approximately a message selected at another point.
Frequently the messages have meaning.”
— Claude Shannon (1948)
“When information is cheap, attention becomes expensive.” Information is something we are all curious about, but how accurately can we predict the future if we fail to understand the past? That question is part of what noted science writer James Gleick explores in The Information: A History, a Theory, a Flood.
It is not the amount of knowledge that makes a brain. It is not even the distribution of knowledge. It is the interconnectedness.
The “history” explores African drum languages, writing and lexicography, the story of Morse code, and the telegraph and telephone, and brings us into computing through our desire to communicate language ever more efficiently. The “theory” turns to Claude Shannon, Norbert Wiener, and Alan Turing, among others who laid the foundation. The “flood” explains how biology uses genetics as a mechanism for information exchange, and how ideas propagate as self-replicating memes.
For the purposes of science, information had to mean something special. Three centuries earlier, the new discipline of physics could not proceed until Isaac Newton appropriated words that were ancient and vague — force, mass, motion, and even time — and gave them new meanings. Newton made these terms into quantities, suitable for use in mathematical formulas. Until then, motion (for example) had been just as soft and inclusive a term as information. For Aristotelians, motion covered a far-flung family of phenomena: a peach ripening, a stone falling, a child growing, a body decaying. That was too rich. Most varieties of motion had to be tossed out before Newton’s laws could apply and the Scientific Revolution could succeed. In the nineteenth century, energy began to undergo a similar transformation: natural philosophers adapted a word meaning vigor or intensity. They mathematicized it, giving energy its fundamental place in the physicists’ view of nature.
It was the same with information. A rite of purification became necessary.
And then, when it was made simple, distilled, counted in bits, information was found to be everywhere. Shannon’s theory made a bridge between information and uncertainty; between information and entropy; and between information and chaos. It led to compact discs and fax machines, computers and cyberspace, Moore’s law and all the world’s Silicon Alleys. Information processing was born, along with information storage and information retrieval. People began to name a successor to the Iron Age and the Steam Age. “Man the food-gatherer reappears incongruously as information-gatherer,” remarked Marshall McLuhan in 1967. He wrote this an instant too soon, in the first dawn of computation and cyberspace.
We can see now that information is what our world runs on: the blood and the fuel, the vital principle. It pervades the sciences from top to bottom, transforming every branch of knowledge. Information theory began as a bridge from mathematics to electrical engineering and from there to computing. What English speakers call “computer science” Europeans have known as informatique, informatica, and Informatik. Now even biology has become an information science, a subject of messages, instructions, and code. Genes encapsulate information and enable procedures for reading it in and writing it out. Life spreads by networking. The body itself is an information processor. Memory resides not just in brains but in every cell. No wonder genetics bloomed along with information theory. DNA is the quintessential information molecule, the most advanced message processor at the cellular level — an alphabet and a code, 6 billion bits to form a human being. “What lies at the heart of every living thing is not a fire, not warm breath, not a ‘spark of life,’ ” declares the evolutionary theorist Richard Dawkins. “It is information, words, instructions. . . . If you want to understand life, don’t think about vibrant, throbbing gels and oozes, think about information technology.” The cells of an organism are nodes in a richly interwoven communications network, transmitting and receiving, coding and decoding. Evolution itself embodies an ongoing exchange of information between organism and environment.
In an interview with PW, Gleick answers the deceptively simple question: what is information?
My first inclination is to define information by listing all the forms it takes—words, music, visual images, and all the ways we store and transmit our knowledge of the world. But in 1948 engineers came up with a more technical definition. At its most fundamental, information is a binary choice. In other words, a single bit of information is one yes-or-no choice. This is a very powerful concept that has made a lot of modern technology possible. But as empowering as this definition is, it is also desiccating, because it strips away any notion of meaning, usefulness, knowledge, or wisdom. By the technical definition, all information has a certain value, regardless of whether the message it conveys is true or false. A message could be complete nonsense, for example, and still take 1,000 bits. So while the technical definition has helped us become powerful users of information, it also instantly put us on thin ice, because everything we care about involves meaning, truth, and, ultimately, something like wisdom. And as we now flood the world with information, it becomes harder and harder to find meaning. That paradox is the final tension in my book.
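Gleick's “binary choice” definition can be made concrete. In Shannon's framework, one choice among N equally likely options carries log2(N) bits, and the entropy of a distribution measures the average information per symbol, regardless of whether the message means anything. A minimal sketch (the function names are my own, not Gleick's or Shannon's notation):

```python
import math

def bits_for_choices(n: int) -> float:
    """Information in one choice among n equally likely options, in bits."""
    return math.log2(n)

def entropy(probabilities) -> float:
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# One yes-or-no choice carries exactly one bit.
print(bits_for_choices(2))   # 1.0

# A fair coin flip is maximally uncertain: 1 bit per flip.
print(entropy([0.5, 0.5]))   # 1.0

# A heavily biased coin is more predictable, so each flip carries less information.
print(round(entropy([0.9, 0.1]), 3))

# The measure is indifferent to meaning: 1,000 fair coin flips of pure
# nonsense still take 1,000 bits to transmit.
print(1000 * entropy([0.5, 0.5]))  # 1000.0
```

This is exactly the desiccation Gleick describes: the numbers quantify surprise, not truth or wisdom.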
In the age of print, scarcity was the issue. In the digital age, it is abundance. What are the implications of that shift?
There are two keys to coping with the information flood: searching and filtering. Think about how many times you're having a conversation with a group of people, and the most interesting feature of the conversation is some dispute over something you can't quite remember. Now, any one of us has the power to pull out an iPhone and do a Google search—it's just a matter of who is going to be rude enough to do it first [laughs]. We are now like gods in our ability to search for and find information.
But where we remain all too mortal is in our ability to process it, to make sense of it, and to filter and find the information we want. That's where the real challenges lie. Take, for example, writing a nonfiction book. The tools at my disposal now compared to just 10 years ago are extraordinary. A sentence that once might have required a day of library work now might require no more than a few minutes on the Internet. That is a good thing. Information is everywhere, and facts are astoundingly accessible. But it's also a challenge, because authors today must pay more attention than ever to where we add value. And I can tell you this, the value we add is not in the few minutes of work it takes to dig up some factoid, because any reader can now dig up the same factoid in the same few minutes.
In The Information, Gleick neatly captures today's reality. “We know about streaming information, parsing it, sorting it, matching it, and filtering it. Our furniture includes iPods and plasma screens, our skills include texting and Googling, we are endowed, we are expert, so we see information in the foreground,” he writes. “But it has always been there.”
We have met the Devil of Information Overload and his impish underlings, the computer virus, the busy signal, the dead link, and the PowerPoint presentation.
Still curious? See The Filter Bubble.