I’m three feet away from my HomePod and I want to throw it across the room. “Hey Siri! Turn on the dinner table!” I say, for the sixth time. Siri waits and waits, then says cheerfully: “You mean the table lamp?”
I’ve almost given up asking Siri to play songs — I tap my iPhone instead — and later, when I ask Siri for the weather forecast, she ignores me completely. That night I ask Siri to set an alarm, she confirms it, and in the morning I’m late for work because the alarm never goes off.
When Phil Schiller unveiled Siri at the launch of the iPhone 4S in 2011 – yes, Siri really is that old – he said: “For decades, technologists have been teasing us with the dream that you can talk to technology and it will do things for us. Haven’t we seen this before, time and time again? But it never comes true.”

Ah, Phil. It still hasn’t come true.
I loved Siri when it first came out because it felt like the future: it was flawed, sure, but you could see where it was going. And then it just didn’t work. I really hope that changes tomorrow at WWDC 2023.
It just doesn’t work
I know it’s smarter, but Siri in iOS 16 doesn’t feel significantly better than the version first introduced in iOS 5. And that’s because it isn’t.
Don’t get me wrong. There have been improvements: more voices, some third-party access, Siri Shortcuts. But as a personal digital assistant, Siri is often worthless, and while that’s annoying enough on its own, it’s becoming an increasing problem for Apple in a world of large language models like ChatGPT. A lot of those apps give you wrong answers with enormous confidence, but at least they give you an answer. I can barely convince Siri to play a song on my HomePods.
According to recent reports, the Reality Pro team was so unimpressed with Siri that they considered writing a replacement. Given how important voice is to VR/AR, that’s damning.
And it’s not the only criticism coming from Apple insiders. Just a few days ago, The Information published a piece about AI experts who moved from Apple to Google because they feel Apple just doesn’t cut it when it comes to large language models and digital assistants; Google, apparently, does.
What’s the problem?
Siri is janky

The problem is apparently quite simple: Siri is janky.
In 2018, The Information published a piece about seven years of Siri in which it blamed the digital assistant for the disappointing performance of the first-generation HomePod. Apparently caught off guard by Siri’s popularity, Apple scrambled to make improvements — improvements built on code that the article’s sources described as “inflexible,” “brittle,” and “patched but never fully replaced.” This is what DIYers call a bodge job: it looks fine from a distance, but it’s not built to last.
That was already a problem in 2018, but it’s an even bigger problem now because so many more of our interactions with Apple hardware are voice-based. Siri is in your AirPods and your HomePods, your Apple TV remote and your iPhone, and soon it’ll be on your face thanks to the Reality Pro VR headset.

And that’s just for the basics: controlling your smart home, adding things to your reminders, choosing songs on Apple Music. We haven’t even dipped our toes into the overhyped but important waters of large language models like ChatGPT yet.
So what’s actually wrong with Siri?
The first and most obvious problem with Siri is that it’s incredibly slow. I was hoping that the iOS 16 updates, which brought new firmware for HomePods and other hardware, would fix the problem. But no. It’s still slow enough that every interaction leaves me wondering whether Siri heard me at all, and it’s much slower than Amazon’s Echo. I know because I have one of those too, and my kids prefer using it because they don’t have to wait for it.
I know that’s a first-world problem – ooh! My voice-recognizing intelligent speaker takes a few seconds to control my digital home! – but it’s also an I-spent-a-fortune-on-this problem. You wouldn’t accept an iPhone 14 that waited that long to respond to a swipe.
The second issue is that Siri’s voice recognition is still problematic. Maybe it’s because I’m not American – I’ve been using speech recognition since its early days, and as a Scot I got used to putting on a fake American accent to make things like IBM’s ViaVoice understand me – but my Glasgow accent isn’t even that thick, and Amazon’s Alexa can hear me just fine.
Part of the problem may also be that Siri sounds too good. Because Siri sounds human, it’s expected to be as capable as a human – so when it falls short, as technology often does, the frustration is amplified: it’s like asking HAL 9000 to open the pod bay doors and being refused.
So how does Apple fix it?
What I’d like to see at WWDC
We’ve had years of reports of what appears to be a deeply dysfunctional operation around Siri, with multiple management changes and an apparent lack of interest and resources: to this outsider, Siri seems to be treated as a feature of iOS, not a core product. Apple even removed some Siri features in iOS 15.
I’d like to see that lack of focus change, and Apple lay out its vision for Siri. What exactly is Siri for? Because when it comes to simply controlling your Apple kit and your smart home, it lags well behind its rivals.

When it comes to answering all sorts of questions, it lags behind there too. And if it’s meant to be much more than that – possibly the Jarvis to every Apple user’s Iron Man – then Apple needs to take Siri much more seriously. I hope we’ll see signs of that at WWDC.