Between December 2012 and March 2015, Facebook introduced the Year in Review, A Look Back, and On This Day—memory features that dredge up old photos of you and your friends on the network.
But the effect hasn’t always been positive. Facebook received such bad publicity for its 2014 edition of Year in Review—its algorithm cheerfully displaying photos of dead pets and burned-down houses—that it was forced to apologize to offended parties and tweak its code.
As Eric Meyer, who was presented with a photo of his recently deceased daughter, wrote on his blog, “for those of us who lived through the death of loved ones, or spent extended time in the hospital, or were hit by divorce or losing a job or any one of a hundred crises, we might not want another look at this past year.”
If Facebook’s ham-fisted sentimentality has taught us anything, it’s that when we outsource our memories to the internet, the internet gets to decide when we remember them. The social network’s engineered nostalgia presents us with a frightful if far-fetched vision of the past—one controlled, not by a government, as Orwell foretold, but by a company.
Forced acts of remembrance aside, a more far-reaching problem lies at the core of our digital lives. As we capture every moment on our high-res cameras; as we type and text rather than speak with our friends; as we buy, and watch, and listen, and read online, we’re inadvertently committing a great percentage of our experience to the permanent record that (with an increasingly unignorable irony) we call the “cloud.” But as we digitally embalm everything we say and do, bringing more and more of the past along with us, we begin to lose something essential: the future.
In his book Present Shock, Douglas Rushkoff warns that technology’s preservation of the past has trapped us in a permanent state of now, collapsing our sense of going anywhere new.
“Fifteen minutes spent on Facebook, for example, mashes together our friendships from elementary school with new requests for future relationships. Everything we have lived, and everyone we have met, is compressed into a virtual now… We live all of our ages at once.”
In this all-inclusive present, rather than move forward, we seem to go in circles. Reboots of Star Wars, Star Trek, Full House, Twin Peaks, X-Files, Gilmore Girls (the list goes on) suggest that our sense of the new has atrophied. Our social lives bulge with people from whom we would otherwise have grown apart. Blogs like Too Hard to Keep pay homage to the pain of memories that won’t go away, caused by images we can’t destroy.
Caught in the web of our own pasts, are we ever able to grow, or grow up? Considering the surgically inclined cast of Real Housewives of Beverly Hills, Rushkoff notes that, “In attempting to stop the passage of time and extend the duration of youth, they have succeeded only in distancing themselves from the moment in which their real lives are actually transpiring.”
Even if our coping strategies are less extreme, we’re all familiar with the expression “30 is the new 20” and no doubt subscribe to it in some form or another. Deferred careers, postponed partnerships, delayed parenthood—you name it. Unable to shed earlier versions of ourselves, we’re finding it harder and harder to act our age.
Memory is one of our greatest powers. Though we tend to imagine the past as if it’s just “there,” the entirety of it would slip away if it weren’t for our hippocampus. Outside the exercise of memory, the past has no presence to us at all.
And our memories have improved over time. Our verbal, written, and material methods of preserving the past (for much of history, through storytelling) have benefited from more recent technological advances. As Will Self noted on BBC’s A Point of View,
“Before the late nineteenth century, the manufacture of memory was a laborious business, requiring cumbersome mechanical processes and even craft. Offset printing, followed by the mass dissemination of photographic images allowed the generality of people—who heretofore had been denied a record of the times—to line their shelves with them.”
What it must have been like to have only a foggy understanding of the past we shall never know, for our high-res, cloud-stored, time-stamped, and otherwise digitally augmented memories are now so powerful that we can recall great patches of our lives with crystalline clarity. The past is now as present as the present.
But now that we’ve been gifted this power over oblivion, the present is starting to feel a little bit bloated. Saturated with photographs, overwhelmed by “moments,” how can we hope to make sense of all this detail?
In Borges’s story “Funes the Memorious,” a young Uruguayan named Ireneo Funes is thrown from his horse and awakens to find he has a perfect memory.
“He knew by heart the forms of the southern clouds at dawn on the 30th of April, 1882, and could compare them in his memory with the mottled streaks on a book in Spanish binding he had only seen once and with the outlines of the foam raised by an oar in the Río Negro the night before the Quebracho uprising.”
Marvelous though his mind was, the narrator goes on to explain that, precisely because of his strange gift, Funes was left completely unable to think.
“He was, let us not forget, almost incapable of ideas of a general, Platonic sort. Not only was it difficult for him to comprehend that the generic symbol dog embraces so many unlike individuals of diverse size and form; it bothered him that the dog at three fourteen (seen from the side) should have the same name as the dog at three fifteen (seen from the front). His own face in the mirror, his own hands, surprised him every time he saw them.”
As the shutter-speeds on our memories become faster, snapping photos at ever-shorter intervals, we risk becoming a collective Ireneo Funes. Consumed by a world of detail “instantaneously and almost intolerably exact,” are we ever able to get the distance and perspective necessary to understand our world?
“To think is to forget differences, generalize, make abstractions. In the teeming world of Funes, there were only details, almost immediate in their presence.”
If works of fiction leave you unconvinced, the truth, ever stranger, is full of such cases. Hyperthymesia, or Highly Superior Autobiographical Memory (HSAM), is a real-life condition that endows its subject with a near-perfect mental record of their life. The first such case—Jill Price—was diagnosed in a University of California, Irvine study as recently as 2006. Able to recall the day-to-day details of her life—without omission—since she was 14, Price has effectively been given the gift of time travel. Name the date, and she can replay the footage.
But this gift is no doubt also a curse. Whether or not they choose to, hyperthymesics are given to relive painful moments from the past, and to do so with a level of sensory and emotional vividness that can be hard to stomach. Their condition reminds us that forgetting isn’t just a failure to remember, but a deliberate act our minds perform so that we can live happier lives.
The lesson here is an indirect one, because unlike Price’s, our innate memories are in all likelihood getting worse, not better, as we increasingly use technology as a mnemonic crutch. Materially, however, our sense of the past is bigger and richer than it’s ever been—and so too, our temptation to brood on it.
And brood we shall. For now that our memories are literally stored in a steel trap, forgetting is no longer an option. In a cautionary tale that is all-too-imaginable, the television series Black Mirror depicts a world in which brain implants allow us to record and replay everything around us. In the episode “The Entire History of You,” we witness a man driven to madness as he feverishly scrutinizes past scenes of his wife flirting with another man.
Growing up, making sense, being happy. How about starting over? Considering the value of the clean slate in society, Rushkoff laments that “the new permanence of our most casual interactions—and their inextricability from more legal, financial, and professional data about us—turns every transient thought or act into an indelible public recording.”
When we tweet, when we post, when we share, we act as if we were speaking in the ephemeral, off-the-cuff medium of sound rather than the permanent, on-the-record medium of sight. In this way, we make it hard to keep the different sectors of our lives separate from one another, and even put ourselves at risk.
As a case in point, Rushkoff reminds us of an episode where, after Googling his name, “US immigration agents at the Canadian border denied entry to Andrew Feldmar, a seventy-year-old college professor, because they found an obscure reference to the fact that he had taken LSD in the 1960s.”
Are we not entitled to a fresh start—or was that simply the convenience of an earlier and more forgetful age? Has life become a trial in which a single misstep will haunt us forever? When we can’t forget, how well can we forgive?
Fortunately, newer social media services like Snapchat are building forgetting into the structural heart of their applications, deleting photos and videos shortly after they’re viewed. Perhaps that’s the key to their popularity. (Facebook, take heed.)
When it comes to writing about big topics like memory and technology and time, one could of course always say more. But I’ll stop here and leave the concluding words to Faulkner. He said it best when he wrote that, “The past is never dead. It’s not even past.” I remember that line because of an old photo on Facebook.