April 20, 2009
The Next Great Discontinuity
Part Two: The Data Deluge
(Link to Part One)
Speed is the elegance of thought, which mocks stupidity, heavy and slow. Intelligence thinks and says the unexpected; it moves with the fly, with its flight. A fool is defined by predictability...
But if life is brief, luckily, thought travels as fast as the speed of light. In earlier times philosophers used the metaphor of light to express the clarity of thought; I would like to use it to express not only brilliance and purity but also speed. In this sense we are inventing right now a new Age of Enlightenment...
A lot of... incomprehension... comes simply from this speed. I am fairly glad to be living in the information age, since in it speed becomes once again a fundamental category of intelligence.
Michel Serres, Conversations on Science, Culture and Time
Human beings are often described as the great imitators:
We perceive the ant and the termite as part of nature. Their nests and mounds grow out of the Earth. Their actions are indicative of a hidden pattern being woven by natural forces from which we are separated. The termite mound is natural, and we, the eternal outsiders, sitting in our cottages, our apartments and our skyscrapers, are somehow not. Through religion, poetry, or the swift skill of the craftsman smearing pigment onto canvas, humans aim to encapsulate that quality of existence that defies simple description. The best art, or so it is said, brings us closer to attaining a higher truth about the world that remains elusive from language, that perhaps the termite itself embodies as part of its nature. Termite mounds are beautiful, but were built without a concept of beauty. Termite mounds are mathematically precise, yet crawling through their intricate catacombs cannot be found one termite in comprehension of even the simplest mathematical constituent. In short, humans imitate and termites merely are.
This extraordinary idea is partly responsible for what I referred to in Part One of this article as The Fallacy of Misplaced Concreteness. It leads us to consider not only the human organism as distinct from its surroundings, but it also forces us to separate human nature from its material artefacts. We understand the termite mound as integral to termite nature, but are quick to distinguish the axe, the wheel, the book, the skyscraper and the computer network from the human nature that bore them.
When we act, through art, religion or the rational structures of science, to interface with the world, our imitative (mimetic) capacity has both subjective and objective consequence. Our revelations, our ideas, stories and models have life only insofar as they have a material through which to become invested. The religion of the dance, the stone circle and the summer solstice is mimetically different to the religion of the sermon and the scripture because the way it interfaces with the world is different.
Likewise, it is only with the consistency of written and printed language that the technical arts could become science, and through which our ‘modern’ era could be built. Dances and stone circles relayed mythic thinking structures, singular, immanent and ethereal in their explanatory capacities. The truth revealed by the stone circle was present at the interface between participant, ceremony and summer solstice: a synchronic truth of absolute presence in the moment. Anyone reading this will find truth and meaning through grapholectic interface. Our thinking is linear, reductive and bound to the page. It is reliant on a diachronic temporality that the pen, the page and the book hold in stasis for us. Imitation alters the material world, which in turn affects the texture of further imitation. If we remove the process from its material interface we lose our objectivity. In doing so we isolate the single termite from its mound and, after much careful study, announce that we have reduced termite nature to its simplest constituent.
The reason for the tantalizing involutions here is obviously that intelligence is relentlessly reflexive, so that even the external tools that it uses to implement its workings become ‘internalized’, that is, part of its own reflexive process...
To say writing is artificial is not to condemn it but to praise it. Like other artificial creations and indeed more than any other, it is utterly invaluable and indeed essential for the realisation of fuller, interior, human potentials. Technologies are not mere exterior aids but also interior transformations of consciousness, and never more than when they affect the word.
Walter J. Ong, Orality and Literacy
Anyone reading this article cannot fail to be aware of the changing interface between eye and text that has taken place over the past two decades or so. New Media – everything from the internet database to the Blackberry – has fundamentally changed the way we connect with each other, but it has also altered the way we connect with information itself. The linear, diachronic substance of the page and the book has given way to a dynamic textuality blurring the divide between authorship and readership, between expert testimony and the simple accumulation of experience.
The main difference between traditional text-based systems and newer, data-driven ones is quite simple: it is the interface. Eyes and fingers manipulate the book, turning over pages in a linear sequence in order to access the information stored in its printed figures. For New Media, for the digital archive and the computer storage network, the same information is stored in databases which are themselves hidden from the eye. To access them one must submit a search or otherwise run an algorithm that mediates the stored data for us. The most important distinction should be made at the level of the interface, because, although the database as a form has changed little over the past 50 years of computing, the Human-Computer Interfaces (HCIs) we access and manipulate that data through are always passing from one iteration to another. Stone circles interfacing the seasons stayed the same, perhaps being used in similar rituals over the course of a thousand years of human cultural accumulation. Books, interfacing text, language and thought, stay the same in themselves from one print edition to the next, and as a format books have changed very little in the few hundred years since the printing press. The computer HCI is most different from the book in that change is integral to its structure. To touch a database through a computer terminal, through a Blackberry or iPhone, is to play with data at incredible speed:
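The two interfaces can be caricatured in a few lines of Python – a toy sketch of my own, not a description of any real system. The "book" is reached by turning pages in order; the "database" hides the same information behind an index and is touched only through a query:

```python
# The "book": information reached by walking pages in linear sequence.
pages = ["preface", "chapter one", "chapter two", "index"]

def read_linearly(pages, target):
    """Turn the pages one by one until the target is found."""
    for turn, page in enumerate(pages, start=1):
        if page == target:
            return turn  # how many pages we had to turn
    return None

# The "database": the same information held behind an index, hidden
# from the eye, reached by submitting a query rather than turning pages.
index = {page: position for position, page in enumerate(pages, start=1)}

def query(index, target):
    """A single lookup mediates the stored data for us."""
    return index.get(target)

print(read_linearly(pages, "chapter two"))  # 3: three page-turns
print(query(index, "chapter two"))          # 3: one lookup, no turning
```

The information is identical in both cases; only the interface between reader and data has changed, which is precisely the distinction at stake here.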
Sixty years ago, digital computers made information readable. Twenty years ago, the Internet made it reachable. Ten years ago, the first search engine crawlers made it a single database. Now Google and like-minded companies are sifting through the most measured age in history, treating this massive corpus as a laboratory of the human condition...
Kilobytes were stored on floppy disks. Megabytes were stored on hard disks. Terabytes were stored in disk arrays. Petabytes are stored in the cloud. As we moved along that progression, we went from the folder analogy to the file cabinet analogy to the library analogy to — well, at petabytes we ran out of organizational analogies.
At the petabyte scale, information is not a matter of simple three- and four-dimensional taxonomy and order but of dimensionally agnostic statistics...
This is a world where massive amounts of data and applied mathematics replace every other tool that might be brought to bear. Out with every theory of human behavior, from linguistics to sociology. Forget taxonomy, ontology, and psychology. Who knows why people do what they do? The point is they do it, and we can track and measure it with unprecedented fidelity. With enough data, the numbers speak for themselves.
Wired Magazine, The End of Theory, June 2008
And as the amount of data has expanded exponentially, so have the interfaces we use to access that data and the models we build to understand it. On the day that Senator John McCain announced his Vice-Presidential candidate, the best place to go for an accurate profile of Sarah Palin was not the traditional media: it was Wikipedia. In an age of instant, global news, no newspaper could keep up with the knowledge of the cloud. The Wikipedia interface allowed knowledge about Sarah Palin from all levels of society to be filtered quickly and efficiently in real-time. Wikipedia acted as encyclopaedia, newspaper, discussion group and expert all at the same time, and it did so completely democratically, in the absence of a traditional management pyramid. The interface itself became the thinking mechanism of the day, as if the notes every reader scribbled in the margins had been instantly cross-checked and added to the content.
In only a handful of years the human has gone from merely dipping into the database to becoming an active component in a human-cloud of data. The interface has begun to reflect back upon us, turning each of us into a node in a vast database bigger than any previous material object. Gone are the days when clusters of galaxies had to be catalogued by an expert and entered into a linear taxonomy. Now, the same job is done by the crowd and the interface, allowing a million galaxies to be catalogued by amateurs in the time it would have taken a team of experts to classify a tiny fraction of that number.
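The mechanism behind such crowd cataloguing is disarmingly simple, and worth sketching. In this toy Python example (the galaxy names and votes are invented for illustration) many amateur judgements per object are reduced to a consensus label; no single participant need understand the whole catalogue for the catalogue to emerge:

```python
from collections import Counter

# Invented crowd judgements: several amateurs classify each galaxy.
votes = {
    "galaxy-001": ["spiral", "spiral", "elliptical", "spiral"],
    "galaxy-002": ["elliptical", "elliptical", "spiral"],
}

def consensus(labels):
    """Return the majority label and the share of the crowd behind it."""
    tally = Counter(labels)
    label, count = tally.most_common(1)[0]
    return label, count / len(labels)

# The catalogue is a property of the crowd and the interface,
# not of any individual classifier.
catalogue = {name: consensus(labels) for name, labels in votes.items()}
print(catalogue["galaxy-001"])  # ('spiral', 0.75)
```

Real projects weight and cross-check their volunteers far more carefully, but the principle is the same: the interface aggregates, and understanding emerges at the level of the whole.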
This method of data mining is called ‘crowdsourcing’ and it represents one of the dominant ways in which raw data will be turned into information (and then knowledge) over the coming decades. Here the cloud serves as more than a metaphor for the group-driven interface, becoming a telling analogy for the trans-grapholectic culture we now find ourselves in. To grasp the topological shift in our thought patterns it pays to move beyond the interface and look at a few of the linear, grapholectic models that have undergone change as a consequence of the information age. One of these models is evolution, a biological theory the significance of which we are still in the process of discerning:
If anyone now thinks that biology is sorted, they are going to be proved wrong too. The more that genomics, bioinformatics and many other newer disciplines reveal about life, the more obvious it becomes that our present understanding is not up to the job. We now gaze on a biological world of mind-boggling complexity that exposes the shortcomings of familiar, tidy concepts such as species, gene and organism.
A particularly pertinent example [was recently provided in New Scientist] - the uprooting of the tree of life which Darwin used as an organising principle and which has been a central tenet of biology ever since. Most biologists now accept that the tree is not a fact of nature - it is something we impose on nature in an attempt to make the task of understanding it more tractable. Other important bits of biology - notably development, ageing and sex - are similarly turning out to be much more involved than we ever imagined. As evolutionary biologist Michael Rose at the University of California, Irvine, told us: "The complexity of biology is comparable to quantum mechanics."
New Scientist, Editorial, January 2009
As our technologies became capable of gathering more data than we were capable of comprehending, a new topology of thought, reminiscent of the computer network, began to emerge. For the mindset of the page and the book, science could afford to be linear and diachronic. In the era of The Data Deluge science has become more cloud-like, as theories of everything from genetics to neuroscience, particle physics to cosmology have shed their linear constraints. Instead of seeing life as a branching tree, biologists now speak of webs of life, where lineages can intersect and interact, where entire species are ecological systems in themselves. As well as seeing the mind as an emergent property of the material brain, neuroscience and philosophy have started to consider the mind as manifest in our extended, material environment. Science has exploded, and picking up the pieces will do no good.
Through the topology of the network we have begun to perceive what Michel Serres calls ‘The World Object’, an ecology of interconnections and interactions that transcends and subsumes the causal links propounded by grapholectic culture. At the limits of science a new methodology is emerging at the level of the interface, where masses of data are mined and modelled by systems and/or crowds which themselves require no individual understanding to function efficiently. Where once we studied events and ideas in isolation, we now devise ever more complex, multi-dimensional ways for those events and ideas to interconnect; for data sources to swap inputs and outputs; for outsiders to become insiders. Our interfaces are in constant motion, on trajectories that curve around to meet themselves, diverge and cross-pollinate. Thought has finally been freed from temporal constraint, allowing us to see the physical world, life, language and culture as multi-dimensional, fractal patterns, winding the great yarn of (human) reality:
The advantage that results from it is a new organisation of knowledge; the whole landscape is changed. In philosophy, in which elements are even more distanced from one another, this method at first appears strange, for it brings together the most disparate things.
People quickly criticize me for this... But these critics and I no longer have the same landscape in view, the same overview of proximities and distances. With each profound transformation of knowledge come these upheavals in perception.
Michel Serres, Conversations on Science, Culture and Time
Posted by Daniel Rourke at 12:08 AM | Permalink