The 2018 WWDC Keynote is just a little over three weeks away. Apple routinely gives us about two hours’ worth of new software, hardware, and services. With the past as a guide, we can expect to see all of the latest on macOS, iOS, watchOS, and tvOS. We will also see any updates that are coming to Apple’s services, such as iCloud, Maps, News, and Music. Hardware announcements and previews come and go, but I would expect to see the new iPad Pro at the very least.
As we move closer to the Keynote, I’ll be making predictions and wishlists covering several of the things that I think we will or won’t hear about on Monday, June 4th. I am going to start the ball rolling with one of Apple’s biggest weaknesses at the moment: Siri. There is absolutely no doubt that we will hear a lot about what Apple is doing to improve the service. The question is, how will it go? Does Apple have anything real and tangible to show us? Will it be more window dressing like the last few years, or will we see something more this time?
Earlier this year, I was trying to look on the bright side when it came to Siri. There were a couple of surveys that seemed to point toward improvement. Apple had changed the way it approached AI and machine learning, opening things up so that they could attract talented engineers and programmers with ties to the academic world. Apple finally seemed to be acknowledging the issues with Siri and their blind spots that would hold their AI and machine learning offerings back.
Part of that optimism was rooted in the belief that Apple would be improving Siri so that it would be up to the task of being the primary user interface for the HomePod. I honestly didn’t believe that Apple would release it with Siri as it stood, but as we all know now, they did. While Siri is capable of handling basic requests like setting up appointments, starting and stopping timers, reading out emails, dictating text messages, and adding reminders, it is only adequate as a music search interface. It is still mediocre to awful as a question/answer interface, especially on the HomePod, where it is limited to a smaller set of commands and requests than is available on the iPhone.
Since the release of the HomePod, the survey results focused on Siri have been decidedly negative. Its role as the primary interface shines a bright spotlight on its flaws. What’s worse, two of those flaws, general question-and-answer ability and consistency of experience and features, happen to be strengths of Google Assistant and Amazon Alexa. While the HomePod’s hardware has actually drawn positive reviews, Siri has been an anchor dragging down public perception of the product. In fact, Siri was the only negative in an otherwise glowing recent customer satisfaction survey of the iPhone X.
So what does Apple need to do? That is too large of a topic to cover in one brief article. However, this is a good time to focus on how Apple plans to start turning things around in the short term. What better time to kick that off than at WWDC?
The first step is acceptance
The first step to fixing a problem is accepting that one exists. Tim Cook did that publicly when the initial rollout of Apple Maps turned out to be a PR disaster. He admitted the mistake, promised that Apple would fix the problems, and they have. Years later, Apple Maps still isn’t on par with Google Maps, but it is a competent and fully functional mapping and navigation service that works across all of Apple’s devices and services. I haven’t had a problem trusting it to get me where I’m going for a while now. Maps is actually just one of several instances where Apple has publicly admitted to having a problem, and then fixed it.
Apple should have done the same thing with Siri several years ago, but there has never been a public admission of problems or failure. In fact, most of Apple’s few public responses to criticism of Siri have been on the defiant and dismissive side. That isn’t a good look. Despite that, several related acquisitions, as well as internal shifts in who handles Siri and Apple’s AI and machine learning programs, certainly point to Apple being aware that the status quo isn’t cutting it.
Apple needs to change the narrative at WWDC. A little humility and contrition on the subject of Siri would make a big difference in public perception as Apple makes changes. A little transparency and honesty would at least give Apple fans and the tech press more confidence that Apple finally gets it and has decided to make changes that will be seen and felt. The key to change is acceptance; the key to changing perception is acknowledging that acceptance publicly.
Meet the new boss, who needs to be different from the old boss
There has been a lot of change in who runs Siri over the last year. Back in September of last year, Apple confirmed that responsibility for Siri had been transferred from Eddy Cue’s Services group to Craig Federighi’s Software Engineering group. This move made sense and at the time gave me hope for some noticeable changes on the horizon. Unfortunately, changes and improvements on that level still haven’t come.
A month ago, Apple finally scored a big win in its effort to gain traction in AI and machine learning by poaching John Giannandrea, Google’s former head of search and AI. Just this week at Google I/O, we saw just how deeply ingrained AI and machine learning have become throughout all of Google’s products. Bringing in someone who had such a profound impact on the competition’s products and services over the past few years could be a game-changer for Apple.
If Apple wants to change the narrative about Siri, introducing this man on stage and letting him tell us how and why things will be different from this point forward is vital. Again, this goes back to Apple being more transparent about the changes going on with Siri in an effort to show us that things are finally going to move in the right direction. As important as voice assistants are becoming, Apple really doesn’t have a choice, so the time to outline the changes that are coming is now.
Consistency is key
If Apple wants people to see positive changes in Siri in the short term, the way to do it will be to pick some winnable battles before moving on to the more difficult long-term challenges. One of the biggest issues with Siri right now is consistency. Siri has different rules, different capabilities, and different features on every Apple device. How are Apple users, especially those who aren’t tech-savvy, supposed to know what they can do where? This inconsistency feeds the belief that Siri just doesn’t work because, for example, things that work well on the iPhone don’t exist on the Mac, HomePod, or Apple TV.
One of the reasons that Siri seems to work a little better on the Apple Watch than on other Apple devices is the inclusion of Handoff. If the Watch can’t do something, it simply redirects the task to its paired iPhone. Considering that the majority of Apple users are going to have their devices linked via iCloud, Handoff could work the same way between any device that can’t natively handle a particular Siri request and another that can. For example, the HomePod could pass requests that it can’t handle because it lacks a screen off to your iPhone.
While enabling Handoff for Siri across more devices would be a step in the right direction, Apple shouldn’t stop there. There are Siri features that Apple’s hardware could support but that were simply omitted. The two best examples are Siri’s limited implementation on the Mac and its extremely limited version on the Apple TV. There is absolutely no reason that most, if not all, of the features available on the iPhone and iPad couldn’t be included on these two platforms.
One of Amazon Alexa’s standout features is its remarkable consistency. It now works pretty much the same wherever you are using it and however you are accessing it. Combined with good voice recognition, that consistency leads to much higher levels of user confidence and satisfaction. Apple should take note. If they want a short-term win for Siri, consistency is it, in large part because Apple directly controls all of its devices and how they work. If they choose to, they can make Siri consistent across all of them. I can’t understand why they haven’t already, but now is the time to clean up this mess.
Double down on privacy
Google put on quite a show with many of their services and new features at Google I/O this week. However, if there was one area where they tripped over themselves, it was their apparent indifference to privacy and their disconnect with how a large segment of the population feels about technology and its proper boundaries. They obviously have no grasp of how important these issues are to some of us, and that was made abundantly clear during the demonstration of their new Duplex feature.
While most of the tech press and fanboys (especially those in attendance) fell all over themselves congratulating Google on its achievements during I/O, enough people called out how out of touch and out of bounds Duplex is for many of us that Google has already had to address the issue publicly. Google has since said that Google Assistant will announce itself during calls to people, and they will likely have other obstacles to clear before they can set people at ease about how invasive they are into every aspect of people’s lives.
While the concern over Duplex is more about the “creepy factor” than user data privacy, it still illustrates one key weakness that Google and Amazon share. A lot of people are nervous about how much power and control those companies have over users’ data and over commerce in general. This is where Apple’s commitment to privacy will serve them very well.
Apple still has work to do here. They have to prove that they can improve their services and their AI and machine learning capabilities while maintaining their focus on privacy. Right now, there is a general consensus in the tech community that Apple’s focus on privacy is holding them back in AI and machine learning. Without empirical data, there is no way to know whether that is true, but Apple won’t shake the perception unless they prove it untrue. On the flip side, if they can start to move the needle, their privacy-centric model could become a far more powerful draw for certain users.
Stay the course
Beyond Tim Cook showing a little humility and contrition on stage when talking about Siri, he needs to assure Apple users that the company is in this AI, machine learning, and voice assistant game for the long haul. They have proven that they will stick with and improve services that don’t have the best of starts. Apple Music and Apple News languished for only a year before Apple took action and got them on track. The original Apple Watch didn’t start off as poorly as Music or News, but its interface was confusing and it lacked a cohesive direction. Apple fixed it by year two as well, and now has a solid hit on its hands.
Then we have Maps, which took more time to turn around than any of the above. However, to Apple’s credit, they have stuck with it, and they are still refining and adding features to it today. That proves to me that they are capable of getting the job done. They just need to show the same kind of commitment to fixing Siri’s problems.
It is up to Tim Cook to explain and emphasize Apple’s long-term commitment to Siri on stage at the WWDC Keynote. Without acknowledging that there are real problems and that they are actively working on them, this will just fall on deaf ears. For us to believe that there is a real long-term commitment, we have to know that Apple is making changes in response to problems, and that they have already identified short-term goals and solutions whose fruit we will see within the year.
One big component of this long-term commitment will be third-party access to Siri. I honestly don’t think that Apple can move forward too aggressively in this regard until they improve Siri to the point that developers will want to use it. If developers end up delivering a poor experience through Siri, they will take as much of the blame as Apple. That is why I list third-party access improvements as a long-term goal. Apple needs to give developers assurance that real changes are coming, and that they will get something meaningful out of those improvements down the road.
Tim Cook, Craig Federighi, and if they allow him on stage (and they should), John Giannandrea all need to tackle all of the above elements head on. They need to let the public know that they understand the nature of Siri’s problems and shortcomings. They need to assure everyone that things will be different going forward, and they need to give us specifics as to why. That includes short-term goals and long-term vision. They need to assure us of their commitment. They need to let us know how privacy fits into the equation going forward as they work to improve. They need to tell us how they will bring consistency to Siri across their hardware, and when it will come to each device.
That is a lot of ground for Apple to cover. This is all more of a wishlist on my part than a prediction, because I can’t say my confidence is high that we will get all of this at WWDC. However, if we do, I think it will restore some of Apple users’ faith that the company finally “gets it.” For Apple to convince us that this time will be different, that this time Siri will start moving in the right direction, all of the issues above need to be addressed in some respect. Here’s hoping.