Google It! The Internet and the Nature of Knowing

by Ali Minai

“I’ll just google it again,” said my daughter when I asked her to remember something. It was a very reasonable suggestion, but it led me down an interesting line of thought about the nature of knowing and its recent transformation. Much has been said and written about how the Internet has changed human knowledge, in both positive and negative ways. The positives are obvious. The magic of the Internet, the World-Wide Web, and utilities such as Google and Wikipedia has put enormous knowledge at our disposal. Now any teenager with a smartphone has effortless access to far more information than the greatest minds of a century ago. Even more importantly, the Internet has opened up vast new possibilities of learning from others, and allowed people to share ideas in ways that were unimaginable until recently. Not surprisingly, all this has led to a great flowering of knowledge and creativity – though, unfortunately, not without an equally great multiplication of error and confusion.

In terms of negative consequences, perhaps the most widely explored idea is that of epistemic bubbles – closed informational ecosystems enabled by social media connecting largely like-minded individuals. With little external input and minimal internal dissent, people in such bubbles can quickly fall prey to outlandish beliefs without the possibility of correction. The hazard of epistemic bubbles is compounded by the Internet’s facilitation of false information – fake news, so to speak. Of course, neither epistemic bubbles nor false information is new, but the Internet has supercharged both by lowering the barriers to entry so far that random TikTok personalities and Twitter ideologues can have a degree of influence that was previously reserved for a small, better-informed elite. On the positive side, this democratization of information has broken through the systematic indoctrination that elites have imposed on societies throughout history. At the same time, it has also led to an unprecedented undermining of facts and the disintegration of a shared experience of reality. But my concern in this article is with a more subtle, well-known – but perhaps insufficiently understood – effect of the Internet on human knowing: The externalization of knowledge and its consequences for creative thought.

The great Urdu poet Ghalib famously wrote: “These ideas come to my mind from the world beyond: The sound of pen on paper is a seraph’s voice.” The notion that new ideas come from somewhere beyond human reach is, of course, as old as the ages. The concept of “inspiration”, after all, entails the earthly body being imbued by a sublime breath. But in a modern, non-dualistic scientific framework, it is understood that new ideas must emerge – like all aspects of the mind – from the physical body, and in particular, the brain. Somehow, the same system of neurons and synapses that allows us to sense, perceive, remember, recall, plan, and behave also has the capacity to generate thoughts that are original. Clearly, they don’t come from “nowhere.” They must arise from the resources available within the brain.

In the last few decades, psychologists have put on a firm theoretical footing a notion that people have always understood intuitively: New ideas arise from the mixing, breaking, recombination, resonance, and transformation of existing knowledge. One way to encourage this process is to do it collectively, as in group brainstorming. In this case, a diversity of ideas from multiple minds can mix and breed to produce new ones. This approach has proved very successful for certain types of creativity, such as that involved in problem-solving, decision-making, engineering design, and even some types of scientific research. The Internet is an immense booster for this type of creativity because it allows interactions among individuals and knowledge sources that would otherwise be impossible. However, the quintessential form of creativity – the kind that has always been recognized as such – is an individual process. The creativity of poets, artists, writers, philosophers, mathematicians, scientists, and visionaries is always the emanation of one mind, and depends entirely on the capacity of that mind. But even when groups undertake a creative task, each idea must still originate in the mind of one person in the group – and then be shared with others to elicit more. Thus, ultimately, all creative thought comes from individual minds. That is where the aforementioned recombination and transformation of ideas occurs. If we wish to understand the essence of creativity, we must look not at committees of problem-solvers but at the multitudes contained within each individual. To quote Ghalib again: “Each person is, in themselves, a tumult of thought; what seems to be solitude is, in fact, a gathering.” It is from this “tumult of thought” that all ideas emerge. But how?

The capacity for creativity and innovation is among the deepest mysteries of the mind. Despite attempts to capture it through controlled experiments, its biological basis has mostly eluded scientific study because it is inherently unpredictable, and resists the rigor of the experimental method. After all, how can moments of inspiration be produced in a replicable way in a lab? When experiment fails, theory must fill the void, and this has been the case with creativity.

One useful way to think of the mind is to consider it as pure memory – a vast reservoir of associations and patterns instilled by instinct or abstracted from experience. To be sure, it is stirred continuously by sensations from the outside world, but the nature of the mental response is always one of recollection – the triggering of a pattern of activity across millions of neurons, corresponding to a percept, a thought, a memory, and occasionally an action. The brain cannot bring to mind that which it has not already structured within the pattern of its synaptic connections in some way – either explicitly or implicitly. External stimulus may generate a truly new pattern of neural activity, but it makes sense only when it relaxes to something already present, and thus recognized. What then of new thoughts and ideas?

Models of memory formation suggest that the process of storing true experiences in the brain may implicitly but inevitably create false ones as a side-effect. What’s more, these false memories tend to be built by recombining pieces of the true memories. It has even been speculated by Francis Crick and Graeme Mitchison – without any firm evidence – that dreams may actually be the brain replaying these false memories to erase them – a kind of mental garbage collection. It is these false memories – “recollections” of things never encountered through experience but formed by accident in the roiling cauldron of the mind – that bubble up occasionally as new ideas. To build upon the notion that “all cognition is recognition,” one might say that “all imagination is confabulation.” (In a limited form, this approach to the dynamics of thought and creativity has been at the core of my own research in computational cognitive science.)

Some of the churn in the mental cauldron is created by the inner urges of the mind, and some is elicited by the constant barrage of stimulus in the form of sights, sounds, sensations, tastes, smells, words, music, emotion, and all the things that comprise experience. A chance remark, a piece of song or music, a picture on the wall, a line of poetry, a statement of scientific fact, an elegant mathematical equation – any of these can send a mind on a voyage of discovery, usually futile, but occasionally leading to the treasure of something new. Or in the reverie before falling asleep, or on a mindless drive to work, or in the shower when all other activity is impossible, the mind may wander and new thoughts may emerge. And sometimes, though rarely, it is the stress of an urgent situation – a deadline, a problem that must be solved, a temporary obsession requiring consummation – that leads to epiphany. But whatever drives them, inspiration and innovation always require prior raw material within the mind. The cauldron cannot boil and bubble if there’s nothing in it!

As we increasingly outsource knowledge to Google, Wikipedia, and other online repositories of information, we exclude it from presence in our minds. Knowledge in cyberspace is inert – accessible, but inert, not known by anyone. It does not interact with other knowledge. It does not break apart and recombine into the false memories that are new ideas. It does not produce dreams or nightmares. It does not drive the anxieties and desires that are the source of great art and great science. It just sits there in a passive medium, waiting for a query from a mind that needs to borrow it for a brief time, and then forget it, confident that, if needed, it can be found again through the magic of Google. Such freedom from the need to remember is liberating, but perhaps impoverishing too. While increasing our access to knowledge in countless ways, the Internet also poses a real danger of diminishing the innovative capacities of our minds.

Before the Internet age, people seeking to cultivate their minds put a premium on enriching their own store of memory with large amounts of knowledge: Historical and geographical facts, scientific ideas, tracts of poetry, mathematical concepts, pieces of music, patterns, genealogies, maps, routes, and a thousand other things. And these things did not sit inertly in their minds as they do on Google’s servers; they interbred and produced a progeny of ideas. And that is still how those who wish to be creative in any endeavor succeed today. They spend lifetimes acquiring the knowledge that enables them to create new things, be they works of art or mathematical theorems. But even in this, there is a difference. The greatest mathematician in the world no longer needs to learn the way to a friend’s house if she has GPS. But that means that she no longer has a map of her city – or anywhere else – in her mind. She needs to remember only what her work or other personal interests require; all other knowledge has become discretionary. And that makes even her mind a poorer incubator of new ideas.

It might be argued, of course, that what is lost in the internal content of the mind is more than made up in the exponentially greater availability of knowledge from external sources, which can inspire in ways that were never possible before. That is true, but availability is not the same thing as presence. Knowledge must be internalized to participate in the largely involuntary process of creativity. Knowledge accessed voluntarily for a temporary purpose is too often not internalized. In some cases, this is a blessing. Not needing to remember a lot of phone numbers or street addresses certainly frees up the mind to do better things, but there is still collateral damage as the capacity to remember things and let them sit and germinate in the mind gradually dissipates. And, in particular, it is the need for breadth of knowledge that is most at risk. People still acquire deep knowledge in their professional fields and areas of interest – often through the Internet – but the premium on knowing many different types of things, from the sublime to the trivial, is being lost, and the price of that loss is incalculable.

Some may also argue that most knowledge has always lain inert in books – far harder to access than it is on the Internet. That too is true, but, paradoxically, this works in favor of the point being made in this article. It is precisely the ease of access on the Internet that has made knowledge externalizable. Books – expensive and often unavailable – tantalize with the possibility of knowledge but relinquish it only after considerable effort. Indexes, index cards, and databases have eroded this to some degree in the modern era, but, on balance, books have only enhanced the value of knowing. The Internet is trading off this value against the promise of access. Knowing is now less valuable because information can always be found when needed. But ideas are not synthesized with “just in time” knowledge or knowledge requested explicitly. They do not arise on schedule or on demand, but emerge organically through rumination within the mind, through the mix and churn of knowledge that is already present in the mind for no particular reason. Increasing dependence on externalized and on-demand knowledge changes the entire process of thinking from a deep but subjective one to a superficial though perhaps more objective one. In some cases, the latter is a plus. For example, decision-making based on more information can be better than that based on instinct or “gut feeling”. In this situation, the incomparably larger store of externalized knowledge can be a boon, though still not without some hazard: Conclusions based on external knowledge are necessarily biased by what was searched for and accessed. Even in the outsourced mind, one needs to know what to look for and what can be found. In addition to the risk of biased decisions, this also greatly increases the danger of misuse. The information that resides in an individual’s mind is situated within and interwoven into the entire fabric of their knowledge – their episteme, their worldview.
When it is used, that occurs within the context of this well-developed, complex inner structure, which may be wrong and misguided but is also well-rooted and thus somewhat coherent because it has been shaped by experience. The quality of this coherence is an important feature of what we recognize as wisdom in some individuals (and as delusion in others.) In contrast, information accessed only at the point of use is often poorly understood, is not rooted in the individual’s experience, and can easily be misapplied by careless minds. The person trying to self-medicate by looking up their symptoms on WebMD may be slightly better off than their grandparents who relied on folk remedies, but is probably much worse off than if they had left the treatment to an actual physician.

To be sure, there are types of knowledge, and a lot of knowledge is “just information”. It might be argued that outsourcing inert facts like the capital of Benin or the date of Alexander the Great’s death is no great loss. Given the finite capacity of the human brain, this is a reasonable argument, but, while outsourcing factual information may be fine, the habit of doing so is potentially toxic. Ultimately, facts are a very important component of deeper knowledge. For example, it is only by knowing facts such as the relative locations of countries, their histories, the nature of their terrain, the locations of their ports, etc., that we can think about geopolitics, the possibility of conflict, and strategies for addressing global problems. A person who does not know the basic facts of geography or history without looking them up on Google or relying totally on advisors is incapable of truly understanding the world, let alone thinking productively of new ideas about it. America’s ongoing debacles in Afghanistan and Iraq are stark examples of historical and geographical ignorance breeding failure. But even beyond this limitation, facts play a crucial role in enriching thought. Facts are what allow abstract ideas to become concrete, and for concrete ideas to emerge in the first place. Knowing the facts of history – and even the “facts” of mythology – enriches all literature, art, and culture in general. We often speak of the “universals” underlying great art – and they do underlie it – but so do the “particulars”, and those derive mainly from fact – from the stories of the Bible, the landscape of the Lake District, the history of England and France, the French and Russian revolutions, the issues underlying the American Civil War, the minutiae of Roman history, the horrifying statistics of the Holocaust. And this is just for Western literature!
Outsourcing knowledge of facts leads to superficial understanding and superficial thinking, which inevitably lead to a superficial culture.

This is not at all to say that the Internet is making us more stupid. Quite the contrary!  The Internet is opening up to all of us a wealth of information and potential knowledge that was unimaginable for previous generations. But, ultimately, the Internet and the resources it provides are tools, and, as with all tools, it is up to the user to derive benefit or harm from them. Knowledge implies an act of knowing, and a knower. Lacking that, what we have is not knowledge but mere information. We can choose to use the vast amount of information offered by the Internet to enrich our minds and make them more creative, or to access and discard it in a purely utilitarian way. The former is a great boon, but the latter is often a greater temptation. Too often, we prefer to rent knowledge rather than to possess it. In doing so, I fear that we are in the process of replacing the deeply-rooted knowledge and deeply-rooted ignorance that have run human affairs until now with a much larger, externalized – and thus superficial – store of disposable information in cyberspace which is increasingly shaping our choices and behaviors. And, just as the knowledge in our minds has always competed with ignorance and error, so it is in the cybermind – except that the latter is much easier to hack.

This article is not a Luddite harangue against modern technology or an elitist argument against the democratization of knowledge. It is, rather, a cautionary analysis directed towards the beneficiaries of this technological democratization: Don’t take knowledge for granted because it is so easily available! The Internet is the greatest engine ever invented for enriching the mind – but only if it is used mindfully for this purpose. Personally, I have found that I learn most online when I let serendipity take me to unexpected places, when I connect with people who value knowledge, and when I choose to dwell on and understand the information I find rather than use it and move on. For all the advances in artificial intelligence, wisdom and creativity are still products of the living mind, and need that mind to be rich in knowledge. Access to a smartphone is not enough – yet.
