Siri’s Ten-Year Anniversary Is a Reminder of Apple’s Wasted Head Start theverge.com

James Vincent, the Verge:

A decade later, the sheen has worn off Siri’s star. “It is such a letdown,” was how Schiller described the promise of voice interfaces past, and such a description could easily be applied to Apple’s contribution to the genre. Everyone who uses Siri has their own tales of frustration — times when they’ve been surprised not by the intelligence but the stupidity of Apple’s assistant, when it fails to carry out a simple command or mishears a clear instruction. And while voice interfaces have indeed become widespread, Apple, despite being first to market, no longer leads. Its “humble personal assistant” remains humble indeed: inferior to Google Assistant on mobile and outmaneuvered by Amazon’s Alexa in the home.

Looking back on a decade of development for Apple’s personal assistant, there’s one question that seems worth asking: hey Siri, what happened?

Siri in iOS 15 is not without its improvements, but it is still frustratingly limited. It refuses to maintain context; it took until iOS 14.5, released this April, to fix the bug where you tell Siri to remind you of something “at three” and it sets a reminder for 3:00 in the morning; and it has regressed in some areas.

Vincent:

When Schiller introduced Siri in 2011, he stressed time and time again that Siri would understand users — that it knows what they are saying, just like a real person. This set the bar too high for Siri’s functionality. If you treat voice interfaces as if they have the same level of fluency and knowledge as a human being, you will always be disappointed. We speak, and they stumble. We guess what they’re capable of, and they disappoint. Usually because they don’t support the app or command we thought they would. Each failed interaction then teaches users: don’t trust this feature. By comparison, screens and displays tell us clearly what we can and cannot do. They offer menus, directions, and buttons. A voice offers only itself and our projections of intelligence. For Siri, users have been guided by Apple’s flair for the theatrical. They expect too much, and Apple delivers too little.

That is where I am at. Every Siri command — beyond adding Reminders and setting timers — feels like a tightrope walk I should attempt rarely. Sometimes, I am rewarded, like when I told Siri to add something to an existing note titled “Sept 26” and it completed the task successfully. But those moments of delight are often paired with feelings of failure and punishment, like when I told Siri to add something to an existing note titled “Oct 3” and it responded that no such note existed or, on a second attempt, that it could not do that. Why should I try repeatedly if it feels like a waste-of-time crapshoot?

Apple has improved Siri immensely by tying it to Shortcuts. You can build entirely custom Siri commands tailored to your own usage; I have created a few for myself. But being able to build your own is no match for a mythical version of Siri that had built upon the momentum of the one revealed on this day a decade ago.
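
For what it is worth, the plumbing behind those custom commands is also open to third-party developers through the Intents framework. Here is a minimal sketch, assuming a hypothetical note-taking app and a made-up activity identifier, of how an app might donate an action so the system can suggest it as a shortcut and let you attach a custom Siri phrase to it:

```swift
import Intents
import UIKit

// A minimal sketch for a hypothetical note-taking app: donating an
// NSUserActivity lets the system surface the action as a suggested
// shortcut, and lets the user record a custom Siri phrase for it in
// the Shortcuts app or in Settings.
func donateOpenNoteActivity(noteTitle: String, from viewController: UIViewController) {
    let activity = NSUserActivity(activityType: "com.example.notes.open") // hypothetical identifier
    activity.title = "Open “\(noteTitle)”"
    activity.isEligibleForSearch = true
    activity.isEligibleForPrediction = true
    activity.suggestedInvocationPhrase = "Open my \(noteTitle) note"
    activity.persistentIdentifier = noteTitle

    // Attaching the activity to the frontmost view controller marks it as
    // current, which is what counts as the donation.
    viewController.userActivity = activity
    activity.becomeCurrent()
}
```

None of this is required for the shortcuts you build yourself in the Shortcuts app; it is simply how apps expose their own actions for Siri to pick up.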

A few years ago, I tried a bunch of the commands shown in the original Siri demo video from before Apple acquired the company. It did poorly. I ran through the same commands just now and saw broadly similar results to those I got then:

  • “I’d like a romantic place for Italian food near my office” now seems to be parsed more or less correctly. Siri shows five restaurant suggestions that match the search, and it seems to consistently prioritize ones near my work’s address. When I change the command to “… near my home”, the sort order changes. Good.

  • “I’d like a table for two at Il Fornaio in San Jose tomorrow night at 7:30”, when converted for a restaurant in Calgary that I know uses OpenTable, now simply shows a Maps result with a checkmark indicating that reservations are accepted. Tapping on it brings me to the Maps entry, and if I tap the “Reserve” button, I see an OpenTable card with a preselected date of tomorrow, and a table for two people. The 7:30 time was not selected, but I thought Siri had this one licked.

    That is, until I tried changing the request to a table for four on Friday night. Going through the same flow still showed an OpenTable card for a table for two tomorrow night. I was also unable to complete this task using only my voice and a “hey, Siri” command.

  • “Where can I see Avatar in 3D IMAX?”, swapping “Avatar” for a currently-playing film, just showed me web results. Similar queries for theatre showtimes near me also just displayed a web search.

  • “What’s happening this weekend around here?” thankfully no longer displays news headlines, but it also returned a web search. Three suggestions were displayed: the first two websites were generic event aggregator pages not specific to Calgary, and the third result was for event listings in Ottawa, on the other side of the country. The location indicator in my iPhone’s status bar was solid, so I assume Siri was aware of my physical location, yet chose to ignore it.

  • “Take me drunk I’m home” still suggests calling a taxi.

Siri’s development cycle seems defined by a geological time scale. I know I just recently complained about bugginess in Apple’s current software releases that seems to be driven by a relentless and speedy release cycle, but from the outside, Siri languishes for exactly the opposite reason.