When Apple acquired Siri Inc., a spin-off of SRI International, in 2010 and subsequently integrated the technology from its app into iOS 5 in 2011, they took an early lead in the race to provide advanced voice assistance on mobile devices. However, as has happened many other times when Apple took a step forward in the smartphone space, the competition quickly closed the gap. Competitors caught up to the capacitive screen, the multi-touch interface, the Retina display, and eventually Siri, as well.
This isn’t to say that Apple can’t keep up in areas where they feel the need to do so, or that they have fallen far enough behind in any aspect to have problems selling iPhones. However, there are times when Apple’s pace feels less innovative and more glacial. Nowhere in iOS is this approach more evident than with Siri. Apple moved very slowly and deliberately over the three years following its appearance in iOS 5, adding only very tightly controlled features and few new data partnerships. This left an opening for Google, Microsoft, and now even Amazon to catch up and gain an advantage.
Apple certainly hasn’t abandoned Siri, as the last two years have seen them finally pick up the pace a bit and show some increased initiative. They added proactive assistance, which serves up data from your personal information, contacts, and calendar based on what you are typing, assists in app searches, and surfaces other information based on time of day and your usage patterns. Apple also beefed up their “Hey Siri” feature, defaulting it to enabled and adding individual user voice training as part of every initial iOS device setup.
The biggest step, however, came last year when Apple opened up Siri to work with third-party apps via their new SiriKit API. In typical Apple fashion, this rollout was initially restricted to only certain classes of apps, and integration had to be manually enabled in Settings. This addition definitely had the feel of a soft launch, much like “Hey Siri” a couple of years before.
(Re)Building From Strength
If you find yourself behind in a category, a great way to move forward is to build around areas where you still have strength. As much criticism as Apple may get for Siri’s relative stagnation, there are still areas where it has advantages that they can rally around. The biggest is multi-language support, where Siri still leads the way among digital assistants in many respects. Siri currently supports at least 21 individual languages, many of which can also be customized by region. Siri can even understand some mixed-language requests, which is pretty impressive and shows just how good Apple is in this category. In comparison, Amazon’s Alexa isn’t even available in Canada yet, much less in multiple languages.
Another advantage Siri has is the way that it’s designed to be contextually aware and conversational. This is another area where Apple has an edge over Amazon’s Alexa, which requires users to phrase queries for its “skills” with set syntax. However, while Apple at one time had an advantage over Google when it came to conversational assistance, Google’s AI and machine learning initiatives have allowed its various voice search and assistant services to surpass Siri. Even though Apple has lost a step to Google and also Microsoft in this area, it is well worth Apple digging in and improving here using their own AI initiatives. A little refinement and consistency in how Siri determines context across apps would go a long way toward closing this gap and making it a strength once more.
An advantage that no other major player in the voice assistant category has is complete control of a comprehensive set of hardware. Amazon and Google have their own hardware lineups, and both have in-home speaker products that Apple doesn’t at the moment. However, it is no secret at this point that Apple will be announcing a competing Siri-enabled speaker next week. On the other hand, Amazon has no phone or computer offerings, Google’s own phones have never sold at scale, and they don’t offer a smartwatch of their own at this time. Microsoft has its Surface line, but it still generates only modest sales, and it seems they have finally given up on mobile phones. In comparison, Apple has a complete and comprehensive set of hardware, including computers, phones, tablets, smartwatches, a set-top box, and now AirPods, all of which are designed to take advantage of Siri in different ways.
Of course, there is always the advantage of Siri being the built-in default voice assistant for all Apple products. Google Assistant, Microsoft Cortana, and Amazon Alexa are all available on iOS, but there will always be things that they aren’t able to do because Apple doesn’t allow for the replacement of their default apps at the system level. This barrier will always give Apple a certain advantage, but it is one they should cease to lean on so heavily. As with their Maps app, Apple needs to deliver an experience good enough that the majority of their users won’t want to look elsewhere for a voice assistant. Maps obviously had a bumpy start, but Apple has put a lot of work into making it into a solid product. It would be nice to see more outward evidence of this kind of work on Siri.
What Can Apple Change to Improve Siri?
Beyond sweeping new features, there are several smaller, more practical changes Apple can make that would greatly improve the Siri experience.
- Make Siri Consistent Across All Devices

In my opinion, this is a huge drawback of using Siri right now. It works differently on the iPhone and iPad than on the Watch or the Mac, and the version you get on the Apple TV isn’t even close to what Apple provides elsewhere. With a new speaker product likely arriving very soon, now is the time for Apple to unify Siri as much as they can across all of their platforms. I picked on Alexa a bit earlier, but this is one area that is really working for Amazon right now. While you may have to learn how to do certain tasks and skills with it, once you do, they work the same across all of Amazon’s devices and apps. Apple should take note of how fast third parties have adopted Alexa as their primary home automation interface because of this.
- Open the Doors to Developers Completely
Apple started this process last year, but they need to take what they have learned and go the rest of the way. Amazon has already gotten out in front of Apple’s HomeKit in terms of third-party support this year because of their platform’s relative openness and ease of implementation. It’s time for Apple to take the reins off and end the restrictions on the types of apps that can integrate with Siri. If done right, this could be the biggest difference maker out of all possible improvements to Siri.
- Improve on-board mics, or add ones designed specifically for voice queries
This may not seem like a big priority to many users, but one of the biggest complaints about Siri is that it often fails to recognize or interpret voice queries correctly. When you think about where and how most people are using Siri, it is often in a crowded or noisy space, and usually on a device that isn’t specifically designed for that type of voice interaction. In contrast, I have found Siri voice queries to be more accurate using Apple Watch, and FAR more accurate using Apple’s AirPods. In both cases, it seems that these products were designed with Siri voice queries in mind.
Amazon’s Echo and Google’s Home, with their multiple beam-forming microphones, do an even better job of correctly interpreting voice queries regardless of conditions. If Apple really wants to move Siri forward and improve their users’ perception of it, voice queries need to work more consistently, regardless of where and on what device they are made. They need to get serious about making Siri work better on the iPhone, iPad, and Mac, and they DEFINITELY need to knock it out of the park with their coming speaker.
- Make Siri even more conversational
We all know that Siri is quick with a joke, and can rattle off things like the weather, sports scores, iMessages, emails, and your calendar and reminders without issue. These conversational interactions were new and exciting when Siri was first released. However, Apple has never been able to bring the same level of interaction to queries that fall back on any kind of web search. Unfortunately for them, this has quickly become one of the major uses of voice assistants. This puts Apple at a distinct disadvantage against Google and Microsoft, who run the world’s two major search engines and, thanks to that, have access to massive amounts of user data to feed their machine learning initiatives. They both have an upper hand in delivering the right answer directly, rather than just a web search that the user has to actively read through.
This is likely going to be the most difficult area for Apple to catch up in because of their disadvantages in search. However, Amazon has shown that it is quite possible to build an effective voice-first assistant platform without owning your own search engine. Apple would be wise to take a page out of their playbook, and work on getting Siri to deliver the answers that it can more directly. It would also be wise to explore an “enemy of my enemy is my friend” partnership with Microsoft against Google to bolster their ability to acquire useful search data.
If a Siri Speaker project is forthcoming, then improving Siri’s ability to deliver full voice responses to queries is absolutely essential, no matter how great the difficulty. Apple will be directly invading territory already well staked out by both Amazon and Google. If they don’t deliver right out of the box, it could be a PR nightmare. That said, I don’t think Apple will ever make the mistake of allowing another Maps-like product rollout. If a Siri Speaker is about to be announced, it is because these issues have already been worked out behind the scenes.
- Less Reliance on Voice Input
It may sound like I am talking out of both sides of my mouth after my comments on how Siri needs to be even more conversational. However, this has more to do with the consistency of use I was referring to earlier. One of the improvements that Apple made to Siri recently was to add some of its search capabilities to iOS’ Spotlight Search via typed queries. This was a welcome addition, but unfortunately, it still doesn’t provide access to all of Siri’s assistant features. While voice is a great way to interact with a device, there are plenty of situations where it either isn’t ideal or isn’t allowed, so a more complete typed alternative would be welcome. The good news is that, thanks to a recent Apple patent award spilling the beans, we know they are already working on this kind of functionality using iMessage. Now it’s likely just a matter of how soon we see it.
iOS was originally designed around the concept of interacting with individual, sandboxed apps. We have already seen a shift away from this philosophy, as more and more app functionality is made available through alternative means. We have voice interaction via Siri. We have app extensibility and share sheets that allow pieces of apps to work within other apps. We have Widgets and 3D Touch interactions, which allow us to see certain data and trigger some actions outside of the app. We also have Notification Center, which provides us another way to work with specific app data. However, none of these methods currently gives users a complete way to work with all app data and functions, all of the time. Assuming that Apple opens up Siri to all developers in iOS 11, if they also add full text query capability to Siri, then it truly becomes the complete package: a digital assistant that can work with any data, anywhere, anytime. There will always be a need to work within apps, but the more users can accomplish quickly without having to open them, the more versatile and flexible iOS becomes.
The Future of Siri
This year’s WWDC should shed more light on how Apple looks at Siri, and how they plan to position it for the future. If you take a look at Google’s Assistant, Microsoft’s Cortana, or Amazon’s Alexa, you can already see a clear direction, focus, and purpose. In comparison, Siri is, as I said earlier, far more inconsistent. Is it an assistant for setting appointments and reminders, making voice calls, and sending text messages? Is it a voice search tool? Is it the AI engine for iOS? All of the above? Again, one of the most important steps Apple can take at WWDC this year is to clarify EXACTLY what Siri is, what it does, and where it does it.
Another subject that Apple would be wise to shed more light on is their AI initiatives, and how they will dovetail with Siri. I was happy to hear something about their ongoing work in AI and machine learning during the WWDC Keynote last year, but that honestly felt somewhat forced. Google had really gotten out in front of them in a big way at I/O a couple of weeks prior, and it seemed like Apple had to show off something to put power users and the press at ease that they weren’t being caught with their pants down. It was more of a proof of concept than anything meaningful feature-wise, though. With a full year under their belts since then, a couple of interesting hires and acquisitions, and a new Siri-specific product likely to be announced, next week’s Keynote will be the perfect time for Apple to lay out their vision of how AI and Machine Learning will directly benefit iOS users, both in iOS 11, and especially going forward into the future.
I have always had a love/hate relationship with Siri. When it works, it can be very useful, but when it doesn’t at the moment that you need it to, it can make you want to smash your iPhone. It really doesn’t take a high failure rate at all to get users to lose trust in a product’s reliability. Unfortunately, Siri has burned enough users enough times over the last six years that it’s hard to find many people who still have a high level of confidence in it at this point. If Apple wants to change the way people think of Siri, the time is now. If Apple wants its virtual assistant to be viewed as equal to or better than Google’s Assistant, Microsoft’s Cortana, and Amazon’s Alexa, it will definitely be an uphill battle that will take time, money, and even more importantly, commitment.
As with Maps, Apple has entered ground with Siri that it absolutely cannot abandon. They have no choice but to apply the money and manpower necessary to see things through and make Siri a viable competitor in this space. Despite my issues with it, I have a feeling of optimism that I honestly didn’t have going into last year’s Keynote. I am confident that Apple knows that they have to make an impact a week from today, and that they are going to deliver the biggest improvements and additions to Siri that we’ve seen since its release.
So what do you think of Siri? Love it? Hate it? A little of both? What do you think Apple needs to do to improve it? What do you think is coming next Monday? Let me know in the Comments section below, on Flipboard, on our Facebook page, or on Twitter @iPadInsightBlog.