When AI Replaces Empathy, the Customer Always Loses
Inspired by a LinkedIn post by Ambuj Gandhi — his recent story of a food-delivery experience via Zomato sparked this reflection on how AI, when poorly designed, can destroy more than it saves.
A Real Story from Ambuj Gandhi
In a time when companies are increasingly turning to automation and AI to manage customer service at scale, one LinkedIn post by Ambuj Gandhi struck a chord. Ambuj shared a simple yet powerful story from his recent Zomato order. The experience he described wasn’t unique; it illustrates a widespread problem many of us have faced, or will face, when customer support becomes a script rather than a conversation.
Here’s what happened, in his own narrative (lightly paraphrased): He placed an order via Zomato, with a 40-minute ETA. Twenty minutes later, the delivery executive arrived at the restaurant, which he thought was good progress. But rather than picking up the order, the delivery executive started moving away from the restaurant, away from the expected flow. Ambuj reached out to customer support. Instead of a human, an AI chatbot greeted him. And the chatbot kept repeating: “The delivery executive is moving. He has moved X meters.” Even when the delivery executive was 8 km in the wrong direction, the bot’s update stayed the same. No context. No empathy. No way to escalate. Eventually Ambuj was allowed to cancel — 50 minutes in — but by then the damage was done.
That post by Ambuj Gandhi is the inspiration behind this blog. I’m using it as a lens to explore the deeper issues around empathy and AI, how product decisions affect trust, where businesses go right and where they go wrong.
The Zomato Experience: More Than Just a Delay
What Ambuj’s story reveals is not just a failure of service, but a failure of design.
Chronology
- Order placed → ETA 40 minutes.
- 20 minutes later → delivery executive reaches restaurant.
- Then, instead of picking up the order, starts moving away.
- The customer (Ambuj) reaches out.
- Runs into an AI bot that responds with “he is moving X meters” — repeatedly.
- Even when the delivery executive was 8 km away from the restaurant, in the wrong direction, the reply was unchanged.
- No human-intervention option. No acknowledgement of frustration. No explanation.
- After 50 minutes, option to cancel without charge, but by then the customer’s experience was already broken.
Key frustrations
- Lack of situational awareness: The bot “knew” the executive was moving, but not that the movement was wrong.
- Rigid rules & scripts: The AI was locked into repeating metrics (“meters moved”) without interpreting or contextualizing them.
- No path to escalation: The user couldn’t reach a human until much later (if at all).
- Trust eroded: Even though the mechanics of “order fulfillment” might have been underway, the experience itself (how one feels during that journey) was ruined.
The Invisible Glue in Customer Experience
When you buy something, when you order food, or request help, what you really want is not just delivery or resolution, but to be seen, heard, and cared for. Empathy is what turns a transaction into an experience people remember (for good or bad).
Ambuj’s post demonstrates how easily that glue can be broken when automation overshadows human responsiveness. Key facets of empathy include:
- Acknowledgment of frustration: Recognizing that the customer is upset, inconvenienced.
- Understanding context and meaning: Not just “bag A was processed at time B,” but reading what is going on.
- Flexibility: Being able to adapt when something goes wrong — even if it violates a script.
- Offering reassurance and choices: Not just “things are moving,” but “This is not right; we can offer you options or compensate you.”
Why Many Businesses Prioritize Automation (And What They Miss)
There are compelling reasons why businesses invest in AI customer support:
- Scale: Handling thousands or millions of inquiries is expensive with humans.
- Speed: Bots respond instantly (for simple queries), are available 24/7.
- Cost control: Less hiring, training, payroll, turnover.
- Consistency: In theory, fewer human mistakes, consistent availability.
But what gets missed is:
- Situational nuance: Bots often use algorithms that quantify but don’t qualify.
- Emotional intelligence: Important when customers are anxious, angry, or disappointed.
- Trust and long-term loyalty: Many customers will tolerate small issues; poor support sticks with people.
- Word of mouth & reputation: Negative experiences spread quickly in the social media age, through networks like LinkedIn (as Ambuj’s post itself shows).
Where Automation Works — and Where It Doesn’t
To avoid disasters like in Ambuj’s case, it helps to see where automation tends to succeed, and where it tends to fail.
Works Well When:
- Tasks are predictable and repeatable: Order tracking, password resets, basic account info.
- Load is high, queries are low complexity: Massive scale, simple queries = bot advantage.
- Data is clear & clean: When there’s little ambiguity in what the user wants.
- Support designs include escalation paths: Let bots handle basics; hand off to humans for edge cases.
Fails When:
- Customer’s problem is unexpected or complex. E.g., a delivery heading in the wrong direction, something no script anticipated.
- User is emotional or frustrated. Upset, anxious customers need acknowledgment, not a repeated script.
- Context matters but isn’t modeled. For example: Is the executive moving away from customer or toward? What’s the expected route?
- No fallback to a human. Every time you block human interaction, you increase risk of bad outcome.
The Cost of Replacing Empathy
Beyond the obvious annoyance, replacing empathy damages some foundational business assets:
- Trust: Once people feel they are not heard, they doubt you’ll take care of them in other moments too.
- Loyalty: Customers may switch more easily, or avoid the brand in future.
- Brand reputation: Stories of bad experience travel fast — Ambuj’s post is one of many.
- Long-term revenue loss: The money saved on bots may be less than the revenue lost to churn, refunds, and re-acquiring lost customers.
The Classic Product Decision Flaw
What went wrong here? At first glance, it might seem like just a small glitch. But dig deeper, and you’ll see a bigger issue — one that businesses across industries face.
- Business interest → cut costs by automating customer support.
- Customer reality → empathy lost, trust eroded.
It’s easy to understand why companies do this. AI is cheaper than hiring and training thousands of support executives. Bots don’t need sleep, they scale effortlessly, and they never complain.
But what often gets overlooked is the hidden cost. When customers feel unheard, misunderstood, or trapped in endless AI loops, the long-term damage to trust far outweighs the short-term savings.
This is something I had already explored in detail in my article A Costly Tradeoff Between Efficiency and Customer Trust — where I argued that while businesses obsess over efficiency, the real competitive advantage lies in preserving customer trust.
Building AI Support Systems That Don’t Kill the Experience
How do we build systems that harness AI’s power while preserving empathy?
Here are design principles and a possible framework:
Hybrid Support Model
- AI handles the repetitive, low-complexity tasks.
- Human agents handle edge cases, emotional or ambiguous situations.
- The hand-off should be seamless, with all context preserved.
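As a minimal sketch of what "hand-off with context preserved" could look like in code (the `HandoffContext` record, its fields, and the agent queue are all hypothetical names for illustration, not any real Zomato system):

```python
from dataclasses import dataclass, field

@dataclass
class HandoffContext:
    """Everything a human agent needs when the bot steps aside."""
    order_id: str
    transcript: list[str]                      # full bot conversation so far
    gps_trail: list[tuple[float, float]]       # courier positions over time
    flags: list[str] = field(default_factory=list)  # anomalies the bot noticed

def hand_off(context: HandoffContext, agent_queue: list) -> str:
    """Seamless hand-off: the human starts with full context, not a blank chat."""
    agent_queue.append(context)
    return ("Connecting you to a support agent who can already see your "
            "order details and delivery route.")
```

The key design choice is that the customer never repeats themselves: the transcript and GPS trail travel with the ticket.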
Empathy Built into the Bot
- Let bots recognize anomalies (for example, “We see the delivery is moving away from the restaurant rather than toward you”) instead of just movement.
- Bots should apologize or express regret, acknowledging the user’s frustration. (“I’m sorry this seems confusing.”)
- Provide meaningful updates, not just metrics. (“It looks like there’s an unexpected detour. We’re checking with your delivery executive now.”)
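The anomaly flag described above is computable from data the app already has. A minimal sketch, assuming hypothetical GPS pings and a simple is-the-courier-drifting-away check (threshold and message wording are invented for illustration):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two GPS points, in kilometers."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def status_message(pings, restaurant, threshold_km=0.5):
    """Turn raw GPS pings into a contextual update.

    pings: list of (lat, lon) tuples, oldest first; restaurant: (lat, lon).
    Flags an anomaly when the courier has drifted away from the restaurant
    before pickup, instead of just reporting meters moved.
    """
    start_dist = haversine_km(*pings[0], *restaurant)
    now_dist = haversine_km(*pings[-1], *restaurant)
    if now_dist - start_dist > threshold_km:
        return ("We see your delivery executive is moving away from the "
                "restaurant. Sorry about that; we're checking with them now.")
    return "Your delivery executive is near the restaurant and on track."
```

The point is interpretation: the same coordinates that produced “he has moved X meters” can produce an honest, contextual message once direction is modeled.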
Transparent Escalation Paths
- Make sure the customer can request a human agent easily.
- Define clear escalation triggers: time thresholds, detected user frustration, repeated unresolved loops.
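These triggers can be encoded as a small policy function. A sketch with hypothetical thresholds and frustration keywords (real systems would tune these from data):

```python
# Illustrative keyword list; a production system would use sentiment analysis.
FRUSTRATION_WORDS = {"ridiculous", "angry", "refund", "cancel", "human", "agent"}

def should_escalate(elapsed_minutes, messages, unresolved_loops,
                    time_limit=10, loop_limit=2):
    """Decide when a bot conversation must hand off to a human.

    elapsed_minutes: time since the user first asked for help.
    messages: list of the user's messages so far.
    unresolved_loops: times the bot repeated the same scripted answer.
    Any single trigger is enough; escalation should be easy, not rare.
    """
    if elapsed_minutes >= time_limit:
        return True
    if unresolved_loops >= loop_limit:
        return True
    text = " ".join(messages).lower()
    return any(word in text for word in FRUSTRATION_WORDS)
```

Note the loop counter: in Ambuj’s case, a bot repeating “he has moved X meters” for the third time should itself be an escalation signal.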
Continuous Learning
- Use feedback loops to learn from customer complaints.
- Track cases like Ambuj’s, feed them into training data so AI can improve.
Metrics Beyond Efficiency
- Track not just “time to respond” and “cost saved,” but also customer satisfaction, Net Promoter Score (NPS), escalation counts, and cancellations following bad support interactions.
- Use qualitative feedback — what customers feel — not just quantitative.
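As an illustration of a scorecard that looks beyond efficiency, here is a tiny aggregation over hypothetical ticket records (the field names are assumptions, not any real schema):

```python
def support_scorecard(tickets):
    """Summarize support quality, not just speed.

    tickets: list of dicts with illustrative fields:
      'csat' (1-5 rating), 'escalated' (bool), 'cancelled_after_support' (bool).
    Returns rates that surface trust damage a cost dashboard hides.
    """
    n = len(tickets)
    return {
        "avg_csat": sum(t["csat"] for t in tickets) / n,
        "escalation_rate": sum(t["escalated"] for t in tickets) / n,
        "cancel_after_support_rate":
            sum(t["cancelled_after_support"] for t in tickets) / n,
    }
```

A rising cancel-after-support rate is exactly the kind of signal that would have surfaced Ambuj’s failure mode long before it reached LinkedIn.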
Reimagined Version of the Zomato Scenario: What Could Have Gone Right
Let’s re-write Ambuj’s scenario, imagining it had good design:
- Ambuj orders, ETA 40 min.
- 20 min later, delivery executive reaches restaurant. Bot’s job here: check if pick-up is done. If not, initiate context awareness.
- Instead of just “moving X meters,” the system flags: “We see the executive is moving away from the restaurant. Sorry about that; we’re checking with them.”
- Bot immediately gives an option: “Would you like me to reassign a nearby delivery executive / cancel / escalate to a human?”
- If user says “escalate,” the chat switches to a human agent with all context: order number, GPS trail, timing.
- Customer feels heard. Even if delay or issue, trust is preserved.
Lessons for Leaders & Product Teams
If you’re a product manager, CX (customer experience) leader, CEO, or any stakeholder, here are some takeaways from Ambuj’s story:
- Don’t assume automation = better experience. Efficiency is only part of the equation.
- Measure what matters. Use metrics that capture emotion, trust, and resolution satisfaction — not just volume handled.
- Think about worst-case scenarios. Design not just for the standard path, but for when things go sideways.
- Empathy isn’t optional. It’s central to retention, brand equity, word of mouth.
- Use AI to augment, not replace, human touch. Let AI manage scale; let humans manage heart.
Giving Credit & Moving Forward
This blog was sparked by Ambuj Gandhi’s candid LinkedIn post, which serves as a reminder that even in our high-tech age, sometimes the most human moments are missed. When we automate everything, we risk losing not just convenience, but trust, kindness, and connection.
AI can and should be a force multiplier. But if it’s deployed without care, it becomes a barrier. Ambuj’s experience with Zomato shows that an AI that can count meters but cannot understand direction — or worse, cannot understand frustration — isn’t customer support. It’s a wall.
So here’s what I believe:
- Great AI should augment empathy, not eliminate it.
- Cost optimization shouldn’t come at the expense of trust.
- And businesses that get this right, designing AI and humans together, will win. Because at the end of the day, people remember how they were treated more than how fast their food arrived.
