Lumen Wirltuti:Warltati 2025
If you’ve contributed any form of content or visible interaction
to the Web at all, it lives on in redundant servers, or in archives
such as the Internet Archive’s Wayback Machine. We’re likely to be
remembered whether we want to be or not – on digital CCTV, in
the background of someone’s selfie, or as a fragment deep inside
some company’s lucrative AI large language model. For some, these
digital traces represent a way to remember the lost.
Writing in the New Yorker in 2015, author Matthew Malady
recounted seeing his deceased mother in a StreetView image:
“The confluence of emotions, when I registered what I was looking
at, was unlike anything I had ever experienced – something akin to
the simultaneous rush of a million overlapping feelings. There was
joy, certainly – ‘Mom! I found you! Can you believe it?’ – but also
deep, deep sadness.”
Malady’s reaction is a good summary of how many might feel on
their first encounter with such a digital memory. However, beyond
the emotion lay questions about who owns and maintains that
blurred memory, which, unlike a memorial in a cemetery or at a
roadside, can’t be easily refreshed or taken down by relatives.
There are emerging regulations around deepfakes; however, these
don't address specific post-mortem situations. The race
to build posthumous AI avatars raises ethical and emotional
considerations about the nature of grief and memory, as well as
about control and privacy, and runs far ahead of existing
regulatory frameworks.
In 2022, Amazon demonstrated a scenario that included its Alexa
smart home device reading a story to a child in the voice of his
deceased grandmother.
With our ever-growing digital footprints, there is a wealth of data
to pull from to give a loved one a second life, much of which can be
reconstructed through the use of tools such as voice generators and
language models.
This emerging technology prompts numerous questions: How
does interacting with a digital version of a lost loved one impact
the grieving process? Who controls these digital models, and for
how long? What are the ethical boundaries of recreating someone
posthumously, and who has the right to do so? How is the data for
these avatars sourced, and does control of that data confer rights to
create such avatars?
Should these digital representations be treated as heirlooms,
passed down through generations, or should they have an
expiration date? Additionally, how can we prevent emotional
abuse through malicious use of these avatars?
These questions underscore the need for careful consideration
of the long-term implications of posthumous AI technology on
individuals, families, and society as a whole.
Scott Smith was a visiting research fellow at UniSA, working with
MOD. to create the FOREVER exhibition, examining what the
future of death could, or should, look like. He is a US-educated global
futurist and author. This article is an excerpt from his essay on the topic,
which forms part of the book accompanying the exhibition. The book is
available for purchase online or in person from MOD.
Images created by Lachlan Wallace, Communications Officer for the
University of Adelaide, using ChatGPT.