Deepfakes will attack our most precious memories. Are we ready?

Joan Westenberg
Oct 30, 2023

Alzheimer’s has haunted my family for generations. I witnessed my great-grandmother’s excruciating descent into oblivion as her memories slowly eroded, leaving behind a hollow shell of the human being I remembered. The experience instilled in me a profound sense of dread — will I be next?

A new technological threat has magnified that anxiety. Deepfakes, hyperrealistic synthetic media powered by artificial intelligence, are fast developing the capability to distort reality and manufacture false experiences with alarming verisimilitude. For anyone who fears losing identity and self to Alzheimer’s, deepfakes jeopardize the sanctity of our memories.

Sophisticated AI synthesizes source material — photos, videos, voices — into new, manipulated versions that appear genuine. With exponential improvements in quality, deepfakes now simulate expressions and mannerisms with eerie accuracy, making authenticity nearly impossible to verify, bringing unreality to life.

My grandmother’s decline taught me how fragile and precious memories are. They shape our identity, guide our decisions, and nurture relationships. Losing them is a special kind of agony. Deepfakes, however, imperil memories by fabricating experiences that feel real, distorting the truth of our most precious moments.

As the line between real and fake memories blurs, deepfakes have the potential to wreak psychological havoc. What are we left with when the building blocks of selfhood and relationships corrode, and our sense of reality becomes unmoored?

My earliest memories of my grandmother are bathed in a warm glow. I remember her smile. I remember her homemade vanilla slice. As Alzheimer’s progressed, her essence faded. She became unrecognizable — a confused stranger in a familiar body.

Technology might bring my grandmother back — or at least give me a convincing facsimile of her. But I’m uncomfortable imagining bogus memories of time with her generated by deepfake technology, confusing my genuine recollections and rewriting her legacy. And I’m becoming aware that whether or not I develop Alzheimer’s myself, these fabricated experiences could utterly consume my grasp of reality.

Tech optimists argue that the threat of deepfakes is overblown: older photo-editing techniques already allowed manipulation, and with sufficient media literacy, individuals can identify disinformation. But even if true, these counterpoints miss the insidious cognitive and emotional effects of deepfakes.

Photo manipulation is crude compared to deepfakes’ seamless realism. While we can intellectually acknowledge old techniques as imitation, deepfakes often bypass our rational faculties and directly hijack our perceptions. Our minds instinctively accept false memories as real before rational scrutiny intervenes.

Deepfakes leverage the same visual processing pathways as actual memories. Just as true memories can be misremembered, deepfakes may feel accurately recalled. This effect, called the illusion of truth, exploits the brain’s default acceptance of experiences as accurate, regardless of their authenticity. For individuals with declining cognitive function like those with Alzheimer’s — or even those in advanced years — resisting false memories poses a steep challenge.

Beyond these cognitive dimensions, deepfakes’ emotional resonance compounds their danger. Even when identified as fake, a realistic deepfake depicting a loved one can still elicit real feelings. Our minds retain the visceral experience; the label of inauthenticity is applied only afterwards.

My grandmother is gone, but deepfakes could conjure realistic simulations of her that feel painfully real. They could show false moments of her, healthy and cognizant. And I might cling to those illusions, instinctively accepting them as truth before reluctantly acknowledging their fraudulence. In doing so, deepfakes would manipulate my emotions and corrode precious authentic memories.

This emotional manipulation has lasting effects. As memories naturally erode, external sources like photos, videos, and relationships with others scaffold personal identity. Deepfakes introduce instability to these social and cognitive touchpoints. Over time, incorporating synthetic memories lays the foundation for a distorted sense of self and reality.

Our memories and relationships comprise the foundation of personal identity and significance. For Alzheimer’s patients, preserving the sanctity of those remembered connections is paramount. But deepfakes infiltrate and undermine these bonds. They promise the comfort of manufactured moments that never truly existed. And their false emotional resonance creates an irresistible allure that obscures reality’s slow fade.

The AI community should prioritize detecting deepfakes and elevating authentic content. Companies profiting from synthetic media should fund ethical counter-research. Journalists play a critical role. As deepfakes proliferate through social media, professional fact-checking can and should counter them.

Truth and facts matter to me more than ever before. I feel responsible for taking action against deepfakes and their ability to manipulate emotions and exploit people. My strategy is to educate myself and others. By making people aware of the dangers deepfakes pose to society, we can develop defences against them. I warn older people susceptible to cognitive decline about how deepfakes can deceive them.

In my own family, I’m trying to physically preserve real videos, photos, and memories, keeping our genuine history intact and inoculating my loved ones against synthetic memories spread through deepfakes. Knowledge is power, and through understanding and vigilance, I’m doing my part to confront this threat.

My grandmother embodied grace and compassion. Her spirit persists in my heart through treasured memories. But deepfakes imperil these recollections. This technology has the potential to poison our most sacred bonds unless we collectively resist its false appeal. We still have time to prevent deepfakes from ravaging society. But only if we recognize their unique danger before our ability to discern truth disappears.