When help is AI
25-12-2025
The other night, I was waiting for dinner ordered from a famous food delivery app.
Literally, the app said "5 minutes away" for more than an hour. I could see the rider on the map, close to my location but never at my door.
I tried calling the delivery person. No answer. I went downstairs to see if he was standing around, talking on his phone. Found no one.
I tried again... nothing.
Eventually, I decided to contact support, wasting another few minutes just to unearth the tiny icon. Relief...
This is what support is for, I thought. Someone will see the problem and fix it. Probably make a call from their side.
Instead, I was greeted by a calm, synthetic cheerfulness. A chat window.
"Hi! I'm here to help."
I explained the situation. Hungry. Order late. Rider unreachable.
The response came instantly.
"I understand your concern. Delays can happen due to high demand."
I felt my irritation sharpen. Not because the sentence was wrong, but because it didn't meet me where I was. I wasn't asking for a sociology lecture on logistics. I was asking for dinner.
I tried again, more directly. "I haven't eaten. It's been over an hour. What can you do?"
The AI apologized. It always does. It offered me options that didn't matter. Wait longer. Track the order again, as if staring harder at the map would summon food into existence.
At some point it clicked: I wasn't talking to a human who could feel urgency. Or embarrassment. Or responsibility. I was talking to a system designed to absorb frustration, not resolve it.
That's when I uninstalled the app. Not just the chat. The whole thing. I haven't installed it since. I uninstalled every similar app from my phone, too.
More like a decision made by my nervous system.
A sense that I didn't want to put myself in this position again.
What AI Gives
We talk a lot about what AI gives us. Speed. Scale. Availability. Intelligence.
But that night made me think about what we lost when we replaced human support with artificial support.
And it wasn't just empathy, though that's the obvious answer.
A frustrated human feels heard when the listener shares some common ground, but here that ground itself is missing.
Support used to be a human-facing boundary between a company and the messiness it creates.
When things went wrong, a human had to absorb the consequences. They had to hear the anger. They had to say, "Yes, this is bad." Even if they were constrained by policy, the act of listening did something. It acknowledged reality.
AI support removes that boundary. It turns conflict into a routing problem.
The system doesn't feel the cost of failure. It doesn't carry the emotional residue of ten bad conversations in a row. So the company doesn't feel it either.
If support is human, friction hurts. Too many failures mean burned-out agents, high turnover, angry escalations. There's a natural pressure to fix root causes. If support is AI, friction is cheap. You can let problems pile up because the system can apologize forever.
What do companies think they're optimizing for here? Efficiency. Reduced costs. Faster resolution times. Fewer support tickets per order.
What are they actually optimizing away? Accountability.
The AI I spoke to wasn't cruel. It wasn't even incompetent. It was doing exactly what it was designed to do: de-escalate, deflect, delay. Keep me inside predefined paths until I either calmed down or gave up.
And I did give up. Just not in the way the system expected.
When I told the AI I was hungry, nothing changed for it. No urgency entered the system. No discomfort. Hunger remained my private problem.
A human support agent might not have fixed it either. Let's be honest. They're often constrained, underpaid, and rushed. But the interaction would have been different in one important way: the frustration would have landed somewhere.
Even a bad human response says, "A human heard you and failed." An AI response says, "The system absorbed you and moved on."
That difference matters.