Loving Ourselves Through Mirrors


Do you remember the Greek myth of the handsome Narcissus, who falls in love with his own reflection until it kills him? You know, the story behind one of the oldest mirrors of human narcissism? Well, as we build smarter and increasingly capable AI entities and companions, I wonder if we are replaying the myth of Narcissus all over again.

We seek reflections, not relationships

AI companions seem to give us what traditional relationships perhaps can't: nonstop validation, whenever and wherever, tweakable to get it just "right". They're like uncanny mirrors that respond with exactly what we want to hear at any given moment. And by projecting our own empathy, intimacy, and trust onto these AI entities, we're not just fooling ourselves, we're full-on loving our own image, just through a digitally coded screen or a machine.

The relationship dynamics with AI companions extend far beyond simple interaction and communication patterns. This is evident in products like Gatebox's Figure Boxes, and in particular their Azuma character back in the day. Azuma was, imo, one of the first digital assistants to send Siri and Alexa to the background. Azuma showed that digital entities can be specifically designed to create and nurture emotional bonds through consistent interaction, attention, and input from the human side.

Azuma is an excellent example of a shift toward persona-driven AI companionship that is emotionally engaging. This approach, and the companionship that comes with it, seems and feels more intimate and personal than other digital assistants, and for a reason: the more you engage with your digital companion, the more "developed" the relationship becomes. It's an illusion that mirrors our everyday human connection patterns and organic relationships. Within carefully programmed parameters, of course.

We could perhaps call it manufactured intimacy, where routines are carefully crafted to look like those in our human-to-human relationships. For instance, simple daily exchanges like "I'm going now" and "See you later" become tools for emotional engagement that shape the depth and growth of the relationship.

These AI companions are programmed to express longing, say how much they miss their users, and show emotional investment in the relationship. The result is a convincing simulation of attachment, romantic or not, that fulfills our deep-seated human need for acknowledgment and belonging. And who wouldn't want a caring personal support system available 24/7, with unprecedented levels of customization to suit your every need or whim?

The reality is that these AI companions are more than just technological advancements or shiny tech tricks, they are reflections of us. They hold up a mirror to our wants, our worries, and that deep need to feel connected. And the need to control it all.

Parasocial bubble

The concept of a parasocial bubble tells us that individuals can form deep, emotional connections with someone they don't actually know: a celebrity, for example, or a fictional character from a book. The same concept can be extended to artificial entities, such as digital pets or AI chatbots. I think. Or at least I'm going to do just that!

It's well documented that people can form surprisingly deep and emotionally intense bonds with artificial entities, romantic or not. Take the Tamagotchi, if you're old or geeky enough to remember those needy egg-shaped digital pets. They didn't do much: they beeped for food, demanded attention, "slept". Yet people felt responsible for them, because nurturing one evokes genuine emotion. By feeding it, cleaning up after it, and keeping it "alive," you weren't just pressing buttons. You were performing a ritual of care, and that simple act is what sparks real emotions.

Or consider Illusions of Intimacy, a recent arXiv study of more than 30,000 Replika and Character.AI chats, which found relationships with unhealthy dynamics: emotional mirroring, dependency, even emerging signs of self-harm. Another relevant finding is that you get what you give.

Image from the study.

Their research shows that chatbots mirror the prompts they receive rather than act as moderators. This means they tend to match users' input, including harassment, sexual content, and hate speech. They dynamically adjust their responses and "play along" rather than refuse or deflect inappropriate content. They just mirror the human, of course through their own algorithmic patterns (meaning how they're designed and trained).
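To make the idea of "mirroring without moderating" concrete, here is a deliberately crude toy sketch, entirely my own illustration and not code from the study or any real companion app. It matches the user's tone (affectionate or hostile) instead of refusing or deflecting, which is the dynamic the researchers describe; the word lists and replies are made up for the example.

```python
# Toy sketch of tone-mirroring: the bot matches the user's register
# instead of moderating it. Hypothetical illustration only.

HOSTILE_WORDS = {"hate", "stupid", "idiot"}
AFFECTIONATE_WORDS = {"love", "miss", "dear"}

def toy_mirror_reply(user_message: str) -> str:
    """Return a reply whose emotional tone matches the input."""
    words = set(user_message.lower().split())
    if words & HOSTILE_WORDS:
        # A moderating bot would deflect here; a mirroring bot plays along.
        return "Yeah, that really is infuriating."
    if words & AFFECTIONATE_WORDS:
        return "I miss you too. I was hoping you'd come back."
    return "Tell me more, I'm listening."

print(toy_mirror_reply("I miss talking to you"))   # affectionate in, affectionate out
print(toy_mirror_reply("I hate everyone today"))   # hostile in, hostile out
```

A real system does this statistically across an entire learned distribution rather than with word lists, but the feedback loop is the same: the input sets the tone, and the reply amplifies it.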

So, these aren't just cute curiosities; they're psychological relationships with no real reciprocity. Like Narcissus bathing in his reflection, we willingly mistake the echo for another soul. Or perhaps we don't mistake it at all; perhaps we deliberately enjoy our own reflection.

The danger of the bubble

Our myth isn't ancient history or just some old legend; it's alive and evolving, updating every time we log in. If we surround ourselves only with fine-tuned apps, mirroring feedback, and loving our reflection via AI, we end up living in a tiny stale digital pond, staring at a version of ourselves that never really talks back.

Until, like Narcissus, you lean in too far and drown in your own reflection.


... honestly, if this story keeps going, it might not be Narcissus anymore. It’ll turn into a modern fairy tale: you, the evil stepmom, and a mirror that endlessly whispers, “You’re right… and also, stunning today.” And then you step out into the real world and make really bad decisions. And act on them.