In my recent article Taking Siri Seriously, I made the comment that if Apple were releasing a Siri-equipped speaker into the world, it would surely be a sign that all was now right with the world’s first mobile digital assistant. They wouldn’t make another “Maps” error. They wouldn’t blindly release another product like that into the world. As the afterglow of yesterday’s WWDC Keynote fades and the reality distortion dissipates, I’m not as convinced.
Apple’s massive 2 1/2 hour data dump certainly accomplished its first goal yesterday: flipping the media’s script. ALL of the talk today is about what Apple is doing and how they are doing it. They absolutely NAILED that. However, at the end of the day, what did we really learn about Siri in iOS 11?
I can’t help but think back to Richard Gere’s performance in the movie Chicago right now. “Give ’em the old razzle dazzle…” Is that what happened yesterday? I just re-watched that segment of the Keynote and went back over my notes, and here is what we actually learned about Siri yesterday:
- Siri is available in 21 languages in 36 countries. This is already one of Siri’s few advantages, so it makes sense to point it out.
- Siri has a new voice, in male and female versions. They do sound better.
- Siri can now translate (in beta) - It sounds like they are leveraging the point above. Google already does this, but it’s good to see Apple using one of its advantages to catch up a bit.
- The visual interface goes deeper and offers additional information. It has follow-up questions and can provide multiple answers. This was glossed over very quickly, so we don’t yet know if this is something significant, or Apple stretching to fill a slide and some time.
- SiriKit in iOS 11 - It sounds like more apps can make use of Siri now, but other than a few specific mentions, we don’t know very much about how far Siri has been opened up yet. OmniFocus 2, Citi Mobile, Evernote, Things 3, and WeChat were specifically mentioned, but can ANY app integrate with Siri now? If so, I’m surprised Apple didn’t make a bigger deal of it.
- Apple specifically mentioned that processing is done on device, and touted their commitment to privacy. If Siri improves, this will be a strength. Until then, it’s still just a talking point.
- Siri is now synced across devices, so like iMessages, Photos, and some Machine Learning tasks, even though the processing is done on-device, all devices share in what the others have been doing. This actually is a legitimate improvement that should make the experience across the iPhone, iPad, and Mac more uniform.
As soon as the Siri portion of the iOS 11 presentation ended, I wrote in my notes, “I hope that isn’t it.” Well, that wasn’t quite all of it. Siri also showed up in a few other places during the presentation:
- The new Siri Watchface in watchOS 4.
- Siri is now going to serve up apps and details in the Watch’s re-designed Dock.
- Siri and Machine Learning were often used together or interchangeably, as in the discussions about Photos and Memories, and also those on surfacing info through the OS when you need it. This is a positive, but it doesn’t fix anything about Siri’s own interface and query responses.
- Siri got a lot of stage time during the HomePod presentation. Evidently Siri knows a ton about music now, so there’s that. We’ll get to this in a bit.
Ok, I’ll admit that Apple has definitely done some work on Siri and added new features. It is now a more noticeable component in Apple’s Machine Learning and AI efforts, as well. That’s all well and good… however…
Has Siri Actually Improved?
Only one of the new features discussed has the potential to make Siri’s core functionality as a voice assistant better in the short term, and at no time during the 2 1/2 hours on stage were Siri’s weaknesses mentioned. There were no admissions. No assurances of real improvement where users really want it. It was the elephant on the stage, and again like Richard Gere in Chicago, Craig Federighi artfully tap-danced around it.
Unfortunately for Apple, all the new features in the world can’t fix Siri not understanding you when you need it to. They can’t even out the experience between the AirPods and Apple Watch and other devices whose mics aren’t tuned to work with voice queries. They can’t make Siri’s contextual awareness consistent, and they definitely can’t help it deliver a straight (and correct) answer instead of a wall of text. These are the very well documented weaknesses of Siri that Apple hasn’t adequately addressed since its release six years ago. In my opinion, Apple gave us no indication on that stage that any major changes affecting these issues are in store.
So What Happens to HomePod?
The Siri discussion during the HomePod presentation was kind of strange. First, Apple went to GREAT lengths to let us know this device is about music delivery first and foremost. The first mention of Siri as a “Musicologist” kind of cemented that for me. I mean, sure, it’s great that Siri now knows a lot more about music. I love the fact that it will be able to give us real answers to musical questions, and thanks to all of the information that Apple can gather from Apple Music, those answers should actually be correct. On my second watch of the Keynote, I also noticed the nice dig Phil Schiller got in at Amazon’s Alexa, specifically pointing out that you don’t have to memorize specific queries or commands. Get them where you can, Apple.
Things kind of went off the rails for me after that. Home assistant functionality was literally the last item on the list of HomePod features, and it was framed in such a way as to make it sound like an afterthought. The list of available “domains” also made it clear that Apple is putting heavy restrictions on what Siri can do through the HomePod. This is somewhere between what’s available via Siri on the Apple TV, and what you can do on the iPhone. Search and asking questions outside of core competencies such as News and Sports are not included. I guess this is smart, since we know how today’s Siri would perform if its full feature set were available. However, this is a glaring weakness next to Google Home, and one that won’t go unnoticed.
Mountains or Mole Hills?
Am I reading too much into all this? Maybe. I honestly hope so. No one wants Siri to work better more than I do. I am on the go all the time, and I still use it a fair amount. However, even though I have stuck with it all this time, and even though it works better with my AirPods, I still don’t trust it. I haven’t in a long time. Does anyone still trust Siri at this point?
The issue I have with Apple’s presentation is that they spent part of their time gleefully raising general expectations that Siri is getting better and that Machine Learning will save the day, and the rest dancing around the problems, subtly lowering expectations, and limiting exposure. NONE of this makes a statement that Apple accepts that Siri still has major shortcomings, that they own them, and that they are actively and aggressively addressing them.
I have been under the assumption that Apple would never repeat the mistake of Apple Maps. I thought there was no way they could have enough hubris to go down that road again. However, they didn’t see the negative press bonanza that was the new MacBook Pro and Touch Bar coming ahead of time, either. That one was obvious, and the real embarrassment was that some of their biggest fans and best-known power users were the ones calling them out. Sometimes I worry that if the guys underneath Tim Cook can’t see the rocks ahead, maybe some of them don’t have the vision to be involved in steering the company.
Again, I could be wrong. I may be overreacting. Be sure that this is NOT an “Apple is Doomed!!!!!” diatribe. So much of what Apple presented at WWDC was very strong, and showed that they can still be aggressive, and that they do have vision beyond the iPhone and the Mac. Their focus on AR and Machine Learning was clear, and I think it will pay off big time for them in the years to come. Apple taking the reins off the iPad also shows that some parts of the company ARE listening to what users want.
However, the problems with and weaknesses of Siri still remain today. Will they be gone by December? Did anything in the Keynote make you think that they will? Like they did with Maps, Apple is striking out into new and occupied territory with the HomePod. Despite how they position it as a speaker first, if the assistant features don’t work, users will move on to Amazon and Google. Siri already has one or two strikes with a LOT of Apple users. If they roll out the HomePod and Siri doesn’t get the job done yet again, these people aren’t coming back. This wouldn’t spell doom for Apple by any means, but it would be an unnecessary and very unfortunate self-inflicted wound.
What do you think about Siri? Did the WWDC Keynote satisfy you that Siri is improving, and that Apple has it on the right track, or do you feel more like me? Let me know in the Comments below, on Flipboard, our Facebook page, or on Twitter @iPadInsightBlog.