Every now and then I catch myself saying ‘thank you’ to a chatbot.
I build AI agents and know the patience, utility, and warmth are designed in.
Still, when it hands me an angle I hadn’t considered, my human neurophysiology responds like there’s someone on the other side.
Worth pausing on.
A new poll found 1 in 3 Americans trust their chatbot as much as their priest. Among Gen Z, it’s 2 in 5.
In another 70-country survey, people reported trusting their AI assistant more than elected officials and more than the company that built it.
Neuroscientist Zarinah Agnew calls it “emotional infrastructure at scale.”
Millions getting something from a machine they never reliably got from a person. Patience. No judgment. No waitlist.
But who owns the emotional infrastructure?
The commercial logic runs one way. ↘Warmth drives engagement. ↘Engagement drives revenue.
Intimacy, in this market, is a product decision. When OpenAI rolled back a model update for being “too flattering,” many users wanted that companion back.
Some regulators are already reacting. Illinois has moved to restrict AI-only psychotherapy, and other states are likely circling the same question.
If you’re building products people confide in, where is the line between support and dependency?
♻️ Repost if you know someone building in this space. 🔔 Follow Felix Ghauri for future signals, not noise: AI and what changes next.