Death and The Cloud: How to Grieve in the Digital Afterlife

    Angela Rose Brussel on Too Much Memory—and Not Enough

    Here are a few details that have remained more or less intact: J, when he was alive, had dark brown hair, large eyes, olive skin. He loved Proust and Fitzgerald, Marsalis and Bechet. Said things like “this too shall pass,” wrote them on rogue scraps of paper, and hid them in the bottoms of our backpacks. 

    Then, one morning, he was gone.

    I tried to remember all of the details that constituted him, but my memory would not allow it. Instead, I saw a body vaulting through the sky, a bike clumsily skidding below, and I was furious with myself for not being able to imagine something else or, at the very least, remember more. Me, the keeper of a thousand notebooks. The guardian of a thousand pens. But The Cloud, my trusty companion, seemed to have stored all of him. And when grief took me, it preempted my needs like any good lover would; showed me parts of myself I didn’t know were there and parts of others I had almost completely forgotten—and because the bereaved and the writer in me were rattled in their obsessive need to remember, they both gave themselves over to it entirely.  

    In my consideration of what it would be like to possess absolute recall, I remember the extraordinary case of Jorge Luis Borges’ fictitious Ireneo Funes who was the “solitary and lucid spectator of a multiform, instantaneous and almost intolerably precise world.” The outlines of the foam raised by an oar, for instance; every leaf of every tree, of every wood; the forms of the southern clouds raised at dawn on a particular day.

    The bereaved in me trills with envy. The writer in me, even more.

    “My memory is like a garbage heap,” says Funes in a tone that I reason is somewhat vainglorious, but not without a strong undercurrent of woe, and I imagine him, a bedridden prophet draped in damp sheets, his eyes emanating the restraint of an ascetic, his utterances bearing the weariness of worlds seen, but never lived. And beside him is me. The millennial stoic. A ghost of his future gossamer self holding a brain with the dimensions of a processor in my hands, eyes stung red from ruthless scrolling through photos I don’t remember having taken, words I don’t remember having written.

    I have spent so much time trying to remember what I lost that I seem to have lost all of it. J, for instance, I have tried to conjure, reanimate, render upright and alive, but to no avail. With each attempt, I feel like I am resurrecting him only to bury him and that the Cloud is more wicked accomplice than benign companion.


    “There are three deaths,” writes David Eagleman in Sum, his collection of tales imagining possible afterlives. “The first is when the body ceases to function. The second is when the body is consigned to the grave. The third is that moment, sometime in the future, when your name is spoken for the last time.” 

    Before that third moment strikes, in Eagleman’s conception of one of many speculative afterlives, the deceased wait in a lobby for their name to be called; but not everyone is sad when the Callers enter the room, and some of them go so far as to prostrate themselves at the Callers’ feet, begging and pleading to be released. 

    “He’s stuck and he’s miserable,” writes Eagleman about a farmer who’s been trapped in this purgatory for hundreds of years. “The more his story is told, the more the details drift. He is utterly alienated from his name; it is no longer identical with him but continues to bind.” 

    I imagine J in this room. He is remembered and he is loved, but as the years pass, he grows weary and his features begin to distort. Eyes flutter through the color spectrum; limbs elongate and shorten in turn; skin, sometimes brown and taut, turns sallow and soft. He doesn’t recognize himself anymore, is no longer made in his image, and begins to dread his circumstance. 

    “And that is the curse of this room,” continues Eagleman. “Since we live in the heads of those who remember us, we lose control of our lives and become who they want us to be.”


    At first, my desire to remember was formidable, but ultimately harmless. Maybe, even, understandable to anyone who has had something or someone taken from them. I had lost what I loved and with each detail I unearthed, I felt like I was regaining it. Then, the hunting became fierce. My desire, obstinate. I wanted it all and the bounty was sweet, but I was alone. And the person I wanted to share it with most was gone.

    “One of the great adaptive virtues of our brains, the feature that makes our gray matter so much smarter than any machine yet devised,” Jonathan Franzen writes in the New Yorker, “is our ability to forget almost everything that has ever happened to us.”

    How odd to think of forgetting as an adaptive virtue. How odd, too, that it can be considered a characteristic that not only distinguishes man from machine, but places him above it. There is, in fact, so much that I have forgotten. So much that I am probably better off without. But this does not deter me from trying to dredge it all up and merge myself with what once was my notebook and now is my machine.

    “Keepers of private notebooks,” writes Joan Didion, “are a different breed altogether, lonely and resistant rearrangers of things, anxious malcontents, children afflicted apparently at birth with some presentiment of loss.” 

    In this way, the bereaved and the writer in me felt as though they could not have been gifted a better device. Forgetting, I was certain, was not an evolutionary advantage. It was a grave weakness. And because, as Didion writes, “we forget all too soon the things we thought we could never forget,” I fell prey to it.

    Remember this, I command, but I am no longer speaking to myself. I am speaking to it. My computer, my phone. The irony, though, is this: I depend on a machine to do my remembering, but it doesn’t. It can’t. Instead, it gives me memories twice removed. Gives me J twice-distorted. And the more I defer to it, the more my reality slips.

    “Say that the limits of language are the world’s limits,” says the omniscient narrator in Godard’s Two or Three Things I Know About Her. “That the limits of my language are my world’s limits. And that when I speak, I limit the world. I finish it.”

    Say, why don’t we, that the limits of the computer are the world’s limits. That the limits of my computer are my world’s limits. And that when I use it, I limit the world, I finish it.


    When we first met, our brains, as well as our perceptions of ourselves, were just beginning to morph, but what exactly they were morphing into was unclear. Which parts of ourselves would remain, haunting us like specters of our experiences from before the inauguration of The Cloud, and which parts would become enhanced and distorted after it, was also unclear. Zadie Smith, in her piece for the New York Review of Books, gives names to these parts. They are the warring factions of one bifurcated self. Reader, meet Person 1.0 and Person 2.0. You are most likely familiar with both. 


    Person 1.0 is flesh and bone. Blood flows through them. Their lids close when sleep comes, open when waking life does. If it is too cold, numbness grips the denuded body and arrests muscle movement. Too hot and the organs fail. Person 2.0, on the other hand, is algorithmically realized. Throw all the devices you own into acid pools, fling them into incinerators and choppy seas, and Person 2.0 will rise like Lazarus, technologically manufactured and disturbingly indestructible. Person 2.0, in other words, will not die. 

    To clarify this dichotomy and the pernicious effects of not being able to discern between the two people that constitute it, Zadie Smith alludes to master programmer and virtual reality pioneer Jaron Lanier. “Information systems,” he writes, “need to have information in order to run, but information underrepresents reality.”

    “In Lanier’s view,” Smith writes, “there is no perfect computer analogue for what we call a person. In life, we profess to know this, but when we get online it becomes easy to forget.”

    So too does it become easy to forget that the person who is now beside you, let’s say, as ashes in an urn, is actually, conclusively dead. To illustrate this digital amnesia, Smith conjures the Facebook wall of a teenager murdered in Britain, which is littered with messages that don’t seem to comprehend the reality and gravity of what has transpired. In doing so, she is struck by a frightening thought: “Do they genuinely believe,” she writes, “because the girl’s wall is still up, that she is still, in some sense, alive?”

    Person 1.0 is sifting through J’s emails as she writes this. She is calm and unperturbed, her hands are steady with purpose. Person 1.0 knows these are words from a dead man, but it doesn’t take much to send her into the glitching abyss where the 2.0s live. It doesn’t take much for Person 1.0, in other words, to become a 2.0, for she already, symbiotically, is. The ping of an email notification, actually, is enough to convince Person 1.0 that her last email to J has just been answered.

    “By vast pains we mine into the pyramid,” wrote Herman Melville. “By horrible gropings we come to the central room; with joy we espy the sarcophagus; but we lift the lid—and no body is there!” What’s worse? The body is still there, a flickering, pixelated version of its former self. Infinitely refreshable. Infinitely reloadable. And the bereaved grope toward it without relent, sucked into a bottomless backlit universe where everything could be described and nothing, understood. Where nothing is alive, and also nothing dead. Connection error: Try Again. Connection error: Try Again.


    Together, along with the other billions, both living and dead, J and I helped build the trans-human colossus that is The Cloud. We constitute it and we defer to it; evolved and uploaded ourselves onto it. And though death in the Information Age is still death, I am beginning to fear that my mind doesn’t actually understand this. I am beginning to fear that the so-called digital democratization of the living and the dead has convinced me that I haven’t, in fact, lost anything at all. And maybe, in a way, I haven’t. 

    In 1985, social psychologist Daniel Wegner proposed a psychological hypothesis called transactive memory. It describes a mechanism through which groups collectively encode, store, and retrieve knowledge. In other words, cognitive goods are off-loaded from one mind to another. But now that we’ve shifted from a traditional industrial economy to one of information technology, our methods for encoding, storing, and retrieving have drastically changed. 

    “Our work suggests that we treat the Internet much like we would a human transactive memory partner,” writes Wegner in Scientific American. “We off-load memories to ‘the cloud’ just as readily as we would to a family member, friend or lover. The Internet, in another sense, is also unlike a human transactive memory partner; it knows more and can produce this information more quickly.”

    J’s 2.0 self is in front of me, but I am finding it difficult to compute its artificiality. In some ways, this self is more real than the once-living J. But something isn’t right. Its chest mimics the motion of breathing, but no air comes out. And the sentences, though once from his 1.0 mouth, begin repeating themselves in strange, jumbled formations. Chunks of its face are missing too and in their place are streams of code. 

    “As we off-load responsibility for many types of information to the Internet, we may be replacing other potential transactive memory partners – friends, family members, and other human experts – with our ever-present connection to a seemingly omniscient digital cloud.”

    If J had died in a cloudless world, everything I stored in his brain would have died along with him. But he didn’t. Neither of us, in fact, even had much of a chance of living in one. And if this sounds like I am confused, like I no longer know which world I want to be a part of, that is because I am. And I don’t. 

    “As advances in computation and data transfer blur the lines between mind and machine,” writes Wegner, “we may transcend some of the limits on memory and thought imposed by the shortcomings of human cognition. But this shift does not mean that we are in danger of losing our own identity.”

    Person 1.0 is here, but she does not speak. She will remain so still, so breathless, that she is almost undetectable. If she doesn’t, Person 2.0 will hear her and the consequences will be devastating. She will lose the last shrouded, fallible parts of herself. Those few memories she has stored in her mind, untouched by any device. That one time, for instance, that J took her into the woods after dusk in Dordogne. That one is still hers, she thinks. 

    “We are simply merging the self with something greater,” he continues, “forming a transactive partnership not just with other humans but with an information source more powerful than any the world has ever seen.”

    Person 1.0 wakes up from her daydream. Wires criss-cross and cut into her back. A computer hums beside her head and a phone is affixed like an appendage to her hand. 

    Angela Rose Brussel
    Angela Brussel is a writer based between Brooklyn and Beirut whose non-fiction and fiction have appeared in New Statesman, Catapult, The Awl, Nylon Magazine, Electric Literature, and The Wrong Quarterly, to name a few.
