Killing Mnemosyne

It was, in retrospect, inevitable that once we began referring to the data stores of computers as “memory,” we would begin to confuse machine memory with the biological memory inside our minds. At the moment, though, there seems to be a renewed interest in the remarkable, and not at all machinelike, workings of biological memory, due at least in part to the popularity of Joshua Foer’s new book Moonwalking with Einstein. When I was writing The Shallows, the research that was most fascinating and enlightening to me came when I looked into what we know (and don’t know) about human memory and its role in our thinking and the development of our sense of self. (An excellent book on the science of memory is Eric Kandel’s In Search of Memory.) Here’s an excerpt from the start of “Search, Memory,” the chapter of The Shallows devoted to this subject.

* * *

Socrates was right. As people grew accustomed to writing down their thoughts and reading the thoughts others had written down, they became less dependent on the contents of their own memory. What once had to be stored in the head could instead be stored on tablets and scrolls or between the covers of codices. People began, as the philosopher had predicted, to call things to mind not “from within themselves, but by means of external marks.” The reliance on personal memory diminished further with the spread of the letterpress and the attendant expansion of publishing and literacy. Books and journals, at hand in libraries or on the shelves in private homes, became supplements to the brain’s biological storehouse. People didn’t have to memorize everything anymore. They could look it up.

But that wasn’t the whole story. The proliferation of printed pages had another effect, which Socrates didn’t foresee but may well have welcomed. Books provided people with a far greater and more diverse supply of facts, opinions, ideas, and stories than had been available before, and both the method and the culture of deep reading encouraged the commitment of printed information to memory. In the seventh century, Isidore, the bishop of Seville, remarked how reading “the sayings” of thinkers in books “render[ed] their escape from memory less easy.” Because every person was free to chart his own course of reading, to define his own syllabus, individual memory became less of a socially determined construct and more the foundation of a distinctive perspective and personality. Inspired by the book, people began to see themselves as the authors of their own memories. Shakespeare has Hamlet call his memory “the book and volume of my brain.”

In worrying that writing would enfeeble memory, Socrates was, as the Italian novelist and scholar Umberto Eco says, expressing “an eternal fear: the fear that a new technological achievement could abolish or destroy something that we consider precious, fruitful, something that represents for us a value in itself, and a deeply spiritual one.” The fear in this case turned out to be misplaced. Books provide a supplement to memory, but they also, as Eco puts it, “challenge and improve memory; they do not narcotize it.”

The Dutch humanist Desiderius Erasmus, in his 1512 textbook De Copia, stressed the connection between memory and reading. He urged students to annotate their books, using “an appropriate little sign” to mark “occurrences of striking words, archaic or novel diction, brilliant flashes of style, adages, examples, and pithy remarks worth memorizing.” He also suggested that every student and teacher keep a notebook, organized by subject, “so that whenever he lights on anything worth noting down, he may write it in the appropriate section.” Transcribing the excerpts in longhand, and rehearsing them regularly, would help ensure that they remained fixed in the mind. The passages were to be viewed as “kinds of flowers,” which, plucked from the pages of books, could be preserved in the pages of memory.

Erasmus, who as a schoolboy had memorized great swathes of classical literature, including the complete works of the poet Horace and the playwright Terence, was not recommending memorization for memorization’s sake or as a rote exercise for retaining facts. To him, memorizing was far more than a means of storage. It was the first step in a process of synthesis, a process that led to a deeper and more personal understanding of one’s reading. He believed, as the classical historian Erika Rummel explains, that a person should “digest or internalize what he learns and reflect rather than slavishly reproduce the desirable qualities of the model author.” Far from being a mechanical, mindless process, Erasmus’s brand of memorization engaged the mind fully. It required, Rummel writes, “creativeness and judgment.”

Erasmus’s advice echoed that of the Roman Seneca, who also used a botanical metaphor to describe the essential role that memory plays in reading and in thinking. “We should imitate bees,” Seneca wrote, “and we should keep in separate compartments whatever we have collected from our diverse reading, for things conserved separately keep better. Then, diligently applying all the resources of our native talent, we should mingle all the various nectars we have tasted, and then turn them into a single sweet substance, in such a way that, even if it is apparent where it originated, it appears quite different from what it was in its original state.” Memory, for Seneca as for Erasmus, was as much a crucible as a container. It was more than the sum of things remembered. It was something newly made, the essence of a unique self.

Erasmus’s recommendation that every reader keep a notebook of memorable quotations was widely and enthusiastically followed. Such notebooks, which came to be called “commonplace books,” or just “commonplaces,” became fixtures of Renaissance schooling. Every student kept one. By the seventeenth century, their use had spread beyond the schoolhouse. Commonplaces were viewed as necessary tools for the cultivation of an educated mind. In 1623, Francis Bacon observed that “there can hardly be anything more useful” as “a sound help for the memory” than “a good and learned Digest of Common Places.” By aiding the recording of written works in memory, he wrote, a well-maintained commonplace “supplies matter to invention.” Through the eighteenth century, according to American University linguistics professor Naomi Baron, “a gentleman’s commonplace book” served “both as a vehicle for and a chronicle of his intellectual development.”

The popularity of commonplace books ebbed as the pace of life quickened in the nineteenth century, and by the middle of the twentieth century memorization itself had begun to fall from favor. Progressive educators banished the practice from classrooms, dismissing it as a vestige of a less enlightened time. What had long been viewed as a stimulus for personal insight and creativity came to be seen as a barrier to imagination and then simply as a waste of mental energy. The introduction of new storage and recording media throughout the last century—audiotapes, videotapes, microfilm and microfiche, photocopiers, calculators, computer drives—greatly expanded the scope and availability of “artificial memory.” Committing information to one’s own mind seemed ever less essential. The arrival of the limitless and easily searchable data banks of the Internet brought a further shift, not just in the way we view memorization but in the way we view memory itself. The Net quickly came to be seen as a replacement for, rather than just a supplement to, personal memory. Today, people routinely talk about artificial memory as though it’s indistinguishable from biological memory.

Clive Thompson, the Wired writer, refers to the Net as an “outboard brain” that is taking over the role previously played by inner memory. “I’ve almost given up making an effort to remember anything,” he says, “because I can instantly retrieve the information online.” He suggests that “by offloading data onto silicon, we free our own gray matter for more germanely ‘human’ tasks like brainstorming and daydreaming.” David Brooks, the popular New York Times columnist, makes a similar point. “I had thought that the magic of the information age was that it allowed us to know more,” he writes, “but then I realized the magic of the information age is that it allows us to know less. It provides us with external cognitive servants—silicon memory systems, collaborative online filters, consumer preference algorithms and networked knowledge. We can burden these servants and liberate ourselves.”

Peter Suderman, who writes for the American Scene, argues that, with our more or less permanent connections to the Internet, “it’s no longer terribly efficient to use our brains to store information.” Memory, he says, should now function like a simple index, pointing us to places on the Web where we can locate the information we need at the moment we need it: “Why memorize the content of a single book when you could be using your brain to hold a quick guide to an entire library? Rather than memorize information, we now store it digitally and just remember what we stored.” As the Web “teaches us to think like it does,” he says, we’ll end up keeping “rather little deep knowledge” in our own heads. Don Tapscott, the technology writer, puts it more bluntly. Now that we can look up anything “with a click on Google,” he says, “memorizing long passages or historical facts” is obsolete. Memorization is “a waste of time.”

Our embrace of the idea that computer databases provide an effective and even superior substitute for personal memory is not particularly surprising. It culminates a century-long shift in the popular view of the mind. As the machines we use to store data have become more voluminous, flexible, and responsive, we’ve grown accustomed to the blurring of artificial and biological memory. But it’s an extraordinary development nonetheless. The notion that memory can be “outsourced,” as Brooks puts it, would have been unthinkable at any earlier moment in our history. For the Ancient Greeks, memory was a goddess: Mnemosyne, mother of the Muses. To Augustine, it was “a vast and infinite profundity,” a reflection of the power of God in man. The classical view remained the common view through the Middle Ages, the Renaissance, and the Enlightenment—up to, in fact, the close of the nineteenth century. When, in an 1892 lecture before a group of teachers, William James declared that “the art of remembering is the art of thinking,” he was stating the obvious. Now, his words seem old-fashioned. Not only has memory lost its divinity; it’s well on its way to losing its humanness. Mnemosyne has become a machine.

The shift in our view of memory is yet another manifestation of our acceptance of the metaphor that portrays the brain as a computer. If biological memory functions like a hard drive, storing bits of data in fixed locations and serving them up as inputs to the brain’s calculations, then offloading that storage capacity to the Web is not just possible but, as Thompson and Brooks argue, liberating. It provides us with a much more capacious memory while clearing out space in our brains for more valuable and even “more human” computations. The analogy has a simplicity that makes it compelling, and it certainly seems more “scientific” than the suggestion that our memory is like a book of pressed flowers or the honey in a beehive’s comb. But there’s a problem with our new, post-Internet conception of human memory. It’s wrong.
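
A side note that isn’t in the book, for readers who like to see the metaphor spelled out: the difference between the “hard drive” picture and an associative one can be made concrete with a toy sketch. In the first model, retrieval is an exact lookup at a fixed address; in the second, here a bare-bones Hopfield-style network standing in for content-addressable memory, a partial or corrupted cue settles back toward a whole stored pattern. The sketch is purely illustrative, and every name and pattern in it is invented for the example.

```python
# Purely illustrative sketch (not from The Shallows): two toy models of "memory."
import numpy as np

# Model 1: the "hard drive" picture. Data sits at a fixed address and is
# returned verbatim, or not at all.
address_store = {"seneca": "We should imitate bees..."}
print(address_store["seneca"])       # exact key -> exact stored value
print(address_store.get("senica"))   # a slightly wrong cue retrieves nothing: None

# Model 2: a content-addressable, associative picture (toy Hopfield network).
# Patterns are stored together in one weight matrix; recall means letting a
# partial cue settle toward the nearest stored pattern.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1,  1, -1],
    [ 1,  1,  1,  1, -1, -1, -1, -1],
])
n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:                   # Hebbian storage: strengthen co-active pairs
    W += np.outer(p, p)
np.fill_diagonal(W, 0)

cue = np.array([1, -1, 1, -1, 1, -1, -1, -1])  # first pattern with one element flipped
state = cue.astype(float)
for _ in range(5):                   # repeated updates complete the pattern
    state = np.sign(W @ state)
    state[state == 0] = 1.0

print(state.astype(int))             # recovers the full first pattern
```

The point of the toy is only that the two models behave differently when the cue is partial or noisy; it makes no claim about what neurons actually do, which is where the rest of the chapter goes.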

6 thoughts on “Killing Mnemosyne”

  1. Thefrailestthing

    In his recent review of The Shallows in the London Review of Books, Jim Holt highlighted just this issue, with an emphasis on creativity, via an anecdote from the life of the French mathematician Henri Poincaré. Very helpful.

    From a graduate student fascinated by memory and beginning to research the complex relationship between technology and memory, thanks for pointing to these sources.

    Mike

  2. Charles

    You remind me of what my old pal Timothy Leary said when he had just gotten on his “cybernetic mind-expansion” kick: “Why should I memorize stuff when I can just put it in a computer and look it up?” I told him his Apple ][ wasn’t quite up to that task in general (maybe storing your address book and some simple accounting), and besides, his computer was frequently broken and in my repair shop, and what do you do then? But in general, I agreed. Soon enough, computers would be up to the task of remembering a much wider range of our personal data.

    Lately, if I need to recall what I was doing at a certain time, I go back and check my blog. I don’t really write a diary of what I’m doing, like some bloggers. But if I can read my own article describing what I was thinking about at that time, I can usually remember what I was doing at that time. But sometimes I just draw a blank, or my blog was blank at that time. Our computers are not yet a “Life-Box” where everything we do is preserved.

  3. Aacurtis

    Admittedly, I’ve caught myself using the same reasoning. Two things come to mind from my studies of memory and cognition. 1) So far as we understand, our brains don’t store memories as bit patterns to be read back by a single program (like video replay). A memory is a complex association of activations of senses that occurred together. This is why a smell or a taste can bring back a flood of memories: the activation of one part of the memory brings the others back. Our brains aren’t built to function as instant-replay devices, although we can memorize sequences of events with enough work.

    2) I agree there’s tremendous cognitive value in writing things down. Just as reciting oral traditions embedded them in memory and allowed them to be folded and adapted into an individual’s own set of stories and meanings, writing things down as we learn, on paper or digitally, produces a different kind of memory than a video of an external event does. When we force ourselves to retell the story, we force ourselves to think about it. Automated retelling (i.e., retweeting) does not force this kind of thought.

  4. RPIZARRO

    Nick: you will strongly enjoy reading Martin Heidegger (1889–1976). The memory thing, the thinking thing, and the whole modern-epoch thing were all deeply, deeply thought through and questioned by this big thinker (the biggest of all, a lot of people say) as the twentieth century went on :-)

  5. Michael Piazza

    For most of my adult life I subscribed to the idea that knowledge easily accessible is not worth memorizing. Filling up space in my brain and wasting energy by memorizing was not wise. I have reversed this position.

    There are processes that occur in our brains when we work with memorized material, processes that increase both our right-brain and left-brain intelligence. Death, disease, and time may be the only limits to our capacity to contain information. Devoting my capacity for memory to links is atrophying my brain and my life.

    I learned to drive recently and have never driven without a GPS. When my GPS malfunctions, I am lost. Filling up my brain with lists of links is not satisfying. Just as re-reading books of interest brings us deeper understanding, memorizing relevant information is rewarding. “The Art of Memory” by Frances A. Yates reviews the techniques people used to commit vast amounts of material to memory back when memory was valued.

  6. Marty Besant

    I don’t have a degree or even an education in information technology. I taught high school chemistry through the ’80s and ’90s and used to teach my students about the Law of Conservation of Memory. Like matter or energy, memory cannot be created nor destroyed; it can only be reorganized. As computers increased the amount of RAM, that memory had to come from somewhere. Where else but from the students’ brains? I used to joke that every time the scanner at the grocery store beeped, a small bit of their memory was being sucked into the store’s computer network, to be accumulated and sold to the chip manufacturers. Perhaps this was a premonition of your far more scholarly text, The Shallows.