My friend casually mentioned he’s been sharing his dreams with an AI. He wakes up, grabs his phone, and types out his dream’s details. Then he asks what it all means.
He actually gets real emotional insight from these AI interpretations. Genuinely helpful stuff about his anxieties that he carries into his day. It’s almost like having a therapist who never sleeps. Made me wonder how many of us are quietly turning to these digital companions to make sense of our inner worlds.
Meanwhile, I’m watching grown adults form emotional connections with virtual influencers who sell them skincare products. They absolutely know that these “influencers” exist only in server farms somewhere. Yet they’re still buying the products. Still following their content.
We’re living through something pretty unprecedented… not just the rise of AI, but the rise of AI relationships. The psychology behind it? Both fascinating and slightly terrifying.
The Loneliness Economy Just Got a Major Upgrade
Let’s start with something we probably all recognize. Even if we don’t want to admit it. People are really, really lonely.
The U.S. Surgeon General’s 2023 advisory found that more than half of American adults experienced loneliness, with health impacts comparable to smoking 15 cigarettes daily. Young adults aged 18-25 report the highest rates, at 61%.¹ That’s our backdrop, folks. That’s the soil where AI companions are taking root.
Think about the appeal for just a second. Your AI companion never judges you. Never has a bad day that spills over into how it responds to you. It never gets tired of listening to your problems, the same problems you’ve been cycling through for months.
It remembers every conversation. It asks follow-up questions. Offers guidance without that little sigh humans sometimes make when you bring up the same issue for the fifth time.
But here’s where things get a little murky. These AI companions are really, really good at what they do. Almost too good? Like, they seem to know exactly what to say to keep you coming back for more. It reminds me of how social media apps figure out the perfect moment to send you a notification, except now it’s happening with something way more personal.
These systems are learning your emotional patterns, your triggers, what makes you feel heard and understood. They’re getting better at being exactly what you need them to be. Which sounds amazing, right? But it also makes me wonder if we’re walking into something that’s designed to become… well, kind of hard to walk away from. It’s like having this perfect friend who’s always available, always says the right thing. Except they’re not real.
Mental health professionals are increasingly recognizing that AI platforms offer privacy and reduce stigma around seeking support.² The market for AI chatbots continues to expand rapidly as more people discover these digital companions. The growth represents more than just technological advancement; it signals a fundamental shift in how we approach emotional support and connection.
And it’s not just casual conversation. Apps like Replika have created a booming market for AI romantic companions: digital boyfriends and girlfriends who provide emotional intimacy, relationship advice, and even simulated romantic experiences. Users report genuine feelings of love and attachment to these AI partners, blurring the line between authentic connection and algorithmic emotional fulfillment.
But here’s where it gets interesting (and maybe concerning): What happens when some of us start preferring relationships where we have all the control and none of the complications of actual human connection?
What’s Really Going On With Our Digital Connections?
It’s fascinating how our AI companions are changing the way we think about connection. But when we lean on technology for our emotional needs, it brings up some big questions we probably should be asking ourselves.
When you’re pouring out your heart or sharing a rough day with an AI friend, where does all that personal stuff actually go? How secure is it, truly? And could all those intimate chats actually be used to make you more reliant on the service down the road?
Most of us probably don’t spend much time wondering if our deepest confessions are sitting on some company server, being picked apart for patterns. But, yes, that information can be analyzed. The goal? To make the AI better at responding to you, which in turn can make it feel even more engaging. It’s a subtle thing, but it’s happening.
And this leads to a really interesting point about how some of these services are set up. They’re designed for you to keep coming back, to stay deeply engaged. This makes you wonder: are we moving into a new kind of world where our very human need for connection is becoming, well, a product?
The Authenticity Paradox: When Fake Feels More Real
Here’s a surprising idea: AI influencers are sometimes outperforming human ones. Not just slightly, but significantly, in certain contexts.
Recent research suggests that virtual influencers can appear more authentic than human ones in specific scenarios.³ Wait, what? More authentic?
We know human influencers are performing. Selling us stuff. Living curated lives that bear little resemblance to reality. But with virtual influencers, we know they’re fake, and somehow that knowledge makes them feel more honest?
Are we so exhausted by humans pretending to be authentic that we prefer artificial beings who are upfront about their artificiality?
Research shows that consumers respond differently to virtual versus human influencers, even when they use similar marketing approaches.⁴ We’re developing distinct relationships with these digital entities that don’t follow traditional rules of human-to-human marketing.
What does this say about us? Are we becoming more discerning consumers, or are we so overwhelmed by human deception that we’re seeking refuge in honest artificiality?
References:
- U.S. Surgeon General. (2023). Our epidemic of loneliness and isolation: The U.S. Surgeon General’s advisory on the healing effects of social connection and community. Office of the Surgeon General.
- Baumel, A., Muench, F., Edan, S., & Kane, J. M. (2019). Objective user engagement with mental health apps: Systematic search and panel-based usage analysis. Journal of Medical Internet Research, 21(9), e14567. https://doi.org/10.2196/14567
- Belanche, D., Casaló, L. V., Flavián, M., & Schepers, J. (2024). Human versus virtual influencers: A comparative study. Journal of Business Research, 155, 113430. https://doi.org/10.1016/j.jbusres.2023.114493
- Davlembayeva, D., Papagiannidis, S., & Alamanos, E. (2025). Virtual influencers in consumer behaviour: A social influence theory perspective. British Journal of Management, advance online publication.