The State of Siri
I’ve seen Siri referred to as a “sometimes useful dumpster fire,” and while that’s definitely funny, I’m not sure it’s accurate. I understand the sentiment and agree with it to some degree, but it’s too harsh. It’s also understandable: when Siri fails, it’s very frustrating, and that frustration builds into resentment and an overall feeling of disappointment and broken trust.
Siri is at least one level above a “sometimes useful dumpster fire.” It’s more like a “confusing mess.”
The way I see it, the core problem with Siri is that it’s confusing.
There are a handful of things that Siri still does quite well, arguably because they’re simple tasks that don’t require a lot of abstract understanding on the computer’s end. And that’s exactly why Siri is good at handling them. Because she’s not the cognitively coherent companion Apple markets her as. She’s a console. An interface. Not an assistant, a voice assistant. Because it helps you do certain things with your voice.
Apple wanted us to think of Siri as an AI concierge to our apps and by extension the world. They were the first to this party with the iPhone 4S, incidentally also the last iPhone whose development was personally overseen by Steve Jobs, before his death in 2011.
This lofty and vague expectation led people to perceive Siri as something more than she really was. The team built in dozens if not hundreds of cute responses to different questions to make her seem more “real.” In this way, Apple sowed the seeds of Siri’s downfall by pumping up our expectations beyond what it could actually deliver.
“Spaghetti code”
Another problem with Siri is that it’s written in spaghetti code. For the uninitiated, spaghetti code is what you get when failing to plan becomes planning to fail. It is possible to write a working program by tackling one problem at a time in an organic way, and there is a case to be made for this approach. But the drawback is that it makes your code harder to add onto later. If you don’t leave it organized enough that your future self will be able to look at what you wrote and understand it well enough to build on top of it, you’ve written spaghetti code. It’s an easier trap to fall into than to get out of, which is why it’s sometimes better to just throw out the whole program and start from scratch using everything you’ve learned so far. One proposed rule of thumb is to do this every seven years. Going by that, Siri is long overdue for an overhaul.
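To make the contrast concrete, here’s a toy sketch in Python (purely illustrative, nothing to do with Siri’s actual code): a command handler that grew one special case at a time, next to a version organized so future-you can extend it.

```python
# Spaghetti style: every new feature is another branch wedged into
# one function. It works, but each addition means rereading the whole thing.
def handle_spaghetti(cmd):
    if "timer" in cmd:
        if "cancel" in cmd:
            return "timer cancelled"
        return "timer set"
    elif "weather" in cmd:
        return "it is sunny"
    # ...dozens more branches accrete here over the years...
    return "sorry, I didn't get that"

# Organized style: a dispatch table keeps each capability separate,
# so adding one doesn't require understanding all the others.
HANDLERS = {
    "timer": lambda cmd: "timer cancelled" if "cancel" in cmd else "timer set",
    "weather": lambda cmd: "it is sunny",
}

def handle_structured(cmd):
    for keyword, handler in HANDLERS.items():
        if keyword in cmd:
            return handler(cmd)
    return "sorry, I didn't get that"
```

Both versions behave identically today; the difference only shows up years later, when the next person has to bolt on feature number two hundred.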
New Possibilities (speculation)
Generative AI promises to be capable of the thing we thought Siri could do 10 years ago: actually understand natural language.
In 2011–2013, “natural language processing” was a hot buzzword for things like calendar apps that could create events from a phrase: say “lunch with Jim tomorrow at 2” and an actual event would appear on the correct date at the correct time. Or the same idea, but for reminders. This is doable with fairly little compute power, but it requires a lot of human work to build out all the parsing and replacement rules. Making a computer look at some words and replace certain words with commands. That’s basically all it was.
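A minimal sketch of what that kind of rule-based parsing looked like, in Python. This is a hypothetical example in the spirit of those early-2010s apps, not any real product’s implementation; every rule here is a hand-written heuristic, which is exactly the “lot of human work” involved.

```python
import re
from datetime import datetime, timedelta

def parse_event(phrase, today=None):
    """Turn a phrase like 'lunch with Jim tomorrow at 2' into a
    (title, datetime) pair using hand-written keyword rules."""
    today = today or datetime(2013, 5, 1)  # fixed date for a repeatable demo

    # Rule 1: a relative-day keyword maps to a date offset.
    offsets = {"today": 0, "tomorrow": 1}
    day = today
    for word, n in offsets.items():
        if word in phrase:
            day = today + timedelta(days=n)
            break

    # Rule 2: 'at <hour>' gives the time; assume small hours mean afternoon.
    hour = 12
    m = re.search(r"at (\d{1,2})", phrase)
    if m:
        hour = int(m.group(1))
        if 1 <= hour <= 7:  # heuristic: '2' means 2 PM, not 2 AM
            hour += 12

    # Rule 3: strip the recognized words; whatever is left is the title.
    title = re.sub(r"\b(today|tomorrow)\b|\bat \d{1,2}\b", "", phrase)
    title = " ".join(title.split())
    return title, day.replace(hour=hour, minute=0, second=0, microsecond=0)

title, when = parse_event("lunch with Jim tomorrow at 2")
print(title, when)  # → lunch with Jim 2013-05-02 14:00:00
```

Notice how brittle it is: “lunch with Jim the day after tomorrow” or “at half past two” sails right past these rules. Handling each new phrasing means a human writing another rule, which is the core limitation the next section is about.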
But now, OpenAI’s GPT and other large language models can give us an additional layer of contextual awareness. You can actually say what you want to the thing, however you want to. You don’t have to modify your language to be the kind of “natural language” that it’s specifically programmed to understand. Apple is reported to be seriously looking into leveraging these technologies to make Siri easier to use and more helpful. But until they make an official announcement, this is all just speculation.
But for now...
I still think Siri is useful for certain things, but you have to know what it's capable of, which means you have to learn how to use it. And that's still too much for some people when it bills itself as a "virtual assistant." Which, I totally get. If you want to think of it as more of a "voice interface," that might be a helpful framework. It is reminiscent of the early days of computing where you had to learn certain commands to even use the computer at all.
Siri wanted to bring us closer to the idea of a truly personable user interface, one that can meet you where you’re at and learn how you want to use it, rather than making you learn how to use it. Siri may at least have brought us closer to understanding what executing on this vision would have to look like.
For now, if you want something that again promises the “AI assistant” and has an actual plan to make it happen, check out the Rabbit r1. Which, again, at time of writing is not yet released, but will be very soon, so stay tuned for that.