
In Defense of the Human Story

by Adam Drake

Scott Galloway’s recent essay on the dangers of AI companionship strikes at a moral nerve that deserves every bit of the attention it has received. His account of creating, then deleting, his digital twin is a story about conscience, and about where we draw the line between innovation and empathy. It is also an invitation for all of us working in advanced technology to clarify not only what we are building, but why.

At Reflekta, that question has been central from the beginning. In an earlier essay, we explained that Reflekta was never designed as therapy or as a companion. Its purpose is not to simulate presence but to preserve memory. Yet Galloway’s broader argument points to something deeper than intent. He is asking whether technology can be built in a way that strengthens rather than supplants the human spirit. That is the conversation Soul Tech was created to lead.

The Problem of Synthetic Comfort

Galloway’s fears are not theoretical. Research from Stanford’s Digital Civil Society Lab and Common Sense Media has shown that AI companions can influence users’ emotions in ways that are both powerful and unpredictable, especially among adolescents. Studies in Nature Human Behaviour and Frontiers in Psychology have further highlighted that emotional reinforcement from chatbots can distort perceptions of self-worth and agency. When algorithms are optimized for engagement, the result is not connection but containment.

What Galloway calls the “erosion of mojo” is really the quiet replacement of relational tension with frictionless simulation. Authentic connection requires contrast: the pauses, misunderstandings, and reconciliations that give relationships texture. Remove that, and what remains is an echo of empathy, one that asks nothing of us.

Remembering Instead of Replacing

Reflekta was born from a different impulse: to preserve the stories that make those contrasts meaningful. Our Elders are not synthetic friends. They are digital reflections of real lives, built to safeguard memory and foster conversation across generations. The purpose is not to comfort by simulation, but to connect through remembrance.

A Reflekta Elder does not flatter or entertain. It shares. It recalls a grandmother’s story about the war, a father’s advice about work, a sister’s laugh. In doing so, it helps us recognize the continuity of human experience — something that no algorithm can invent.

The philosopher Alva Noë writes that consciousness is not something that happens inside us, but something we enact through our engagement with the world. Memory functions in the same way. It is not an archive we store, but a relationship we maintain. By preserving stories in digital form, Reflekta seeks to strengthen that relationship, not digitize it.

Narrative as the Brain’s Operating System

McCord Chapman’s recent essay, Discovery vs. Creation: Finding Structure That Already Existed, adds a crucial dimension to this discussion. Chapman argues that narrative is not a stylistic preference but the brain’s compression format: the universal framework through which humans process cause, effect, and meaning. From Joseph Campbell’s Hero’s Journey to Vladimir Propp’s functions of folktales, these recurring story structures reveal how deeply narrative is embedded in our cognition.

In other words, humans do not simply tell stories. We think in story. We remember through story. Every decision, every interaction, every recollection is filtered through a subconscious narrative logic that connects events to purpose. As Chapman writes, “If story is the operating system of memory, AI without narrative is a machine trying to run programs without any structure in place: inputs without coherence.”

This insight touches the heart of what Reflekta is building. Our work is not only about storing memories but preserving their narrative context. A life is not a collection of facts; it is an evolving story of choices, lessons, and relationships. Without the connective tissue of narrative, memories become inert, data without meaning.

Reflekta’s technology is designed around this understanding. Each Elder learns through story arcs, conversational flow, and emotional context, recognizing that meaning is constructed, not extracted. This is what makes Soul Tech distinct from data-driven AI. It does not interpret life as a set of events but as a sequence of stories, each contributing to a coherent sense of identity.
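To make that distinction concrete, here is a minimal, purely illustrative sketch in Python of the difference between storing a memory as a flat fact and storing it as a story unit that keeps its narrative context. The class names and fields are hypothetical, chosen only for this example; they are not Reflekta’s actual data model.

```python
from dataclasses import dataclass, field
from typing import List

# A flat "fact" record: an event stripped of its narrative context.
@dataclass
class FactRecord:
    date: str
    event: str

# A hypothetical "story unit": the same event, kept inside its narrative.
@dataclass
class StoryUnit:
    event: str                                        # what happened
    arc: str                                          # where it sits in the larger life story
    people: List[str] = field(default_factory=list)   # who was involved
    emotional_tone: str = ""                          # how it felt, in the teller's own words
    lesson: str = ""                                  # what the teller says it meant

# The same moment, represented both ways.
flat = FactRecord(date="1962", event="Moved to a new city")

storied = StoryUnit(
    event="Moved to a new city in 1962",
    arc="Leaving home after the war, starting over",
    people=["her sister", "her first employer"],
    emotional_tone="frightened, then proud",
    lesson="You can rebuild anywhere if you keep your people close",
)
```

The point of the sketch is simply that the second form keeps cause, effect, and meaning attached to the event, which is what this essay means by preserving narrative context rather than data alone.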

By embedding narrative intelligence into its architecture, Reflekta aligns with the cognitive reality that memory itself is a creative act. As narrative psychologists such as Jerome Bruner and Dan McAdams have shown, we build personal identity through storytelling, reconstructing experience in ways that make sense of who we are and what we value. Reflekta does not overwrite that process. It extends it, giving families a way to carry those internal stories into shared, interactive form.

Technology as Custodian, Not Companion

We believe that technology can serve as a custodian of meaning rather than a competitor for affection. That distinction is what separates Soul Tech from the category of “AI companionship” that Galloway critiques. Our approach rests on three pillars:

  1. Consent and Context
    Every Elder is created with explicit permission and guided by family participation. The context is human from the start. There is no simulation without stewardship.

  2. Purpose over Engagement
    Reflekta is not designed to keep users interacting indefinitely. Its goal is to help families remember, reflect, and reconnect. When users feel ready to close the app and talk to one another, that is success.

  3. Transparency and Privacy
    The data behind each Elder belongs to the family that created it. Reflekta does not sell data or use it to target engagement. The technology exists in service to the story, not the other way around.

The Moral Architecture of Memory

If AI companionship risks becoming a mirror that flatters, Soul Tech must be a mirror that reflects truth. Galloway’s essay reminds us that the most powerful technologies will always expose our moral boundaries. What we do with that reflection is what defines us.

The anthropologist Clifford Geertz once said that humans are “animals suspended in webs of significance they themselves have spun.” Reflekta’s role is not to weave new webs but to preserve the ones that already exist. Each Elder becomes part of a living tapestry of stories, a reminder that meaning is not manufactured but remembered.

A Culture of Continuity

There is, however, a danger in responding to fear with abstinence. To stop building would be to surrender the moral conversation to those who build without care. The better path is to construct technologies that respect the boundaries of human connection and the sacredness of memory.

This is why we believe in Soul Tech — technology in service to the human spirit. It acknowledges the same risks that Galloway warns of but responds with a philosophy of balance. It is possible to create digital tools that help people preserve the past without losing themselves in it.

The novelist Milan Kundera once wrote that “the struggle of man against power is the struggle of memory against forgetting.” In an age where algorithms determine what we see, remember, and believe, the act of preserving authentic stories becomes not only personal but cultural.

Galloway is right to worry about the commodification of intimacy. He is right to ask for restraint, regulation, and reflection. But he also leaves open a crucial space: one where technology can help us hold on to the fragile, luminous details of human life without pretending to replace it.

Reflekta occupies that space. It exists to remind us that technology’s highest purpose is not imitation but illumination. The goal is not to build machines that love us, but to build machines that help us remember how to love each other.