More Context For Siri And Search In iOS 9



Siri and Spotlight are merging a little in functionality in iOS 9. The first change is that the Search screen will reside to the left of the first home screen, instead of being activated by a downward swipe. I’m not sure I’m a fan of this change because I like how accessible Search is in iOS 8, but I welcome all of the extra power that Apple has given this core feature.

I really like Search because it keeps me from having to dig through large folders of apps, like the single “Apple” folder that I inevitably dump all of my unused first-party apps into. iOS 9 will make Search more powerful with more data sources — e.g. Vimeo, YouTube, iCloud Drive — and an API that will allow third-party apps to show up, as well. We didn’t see a specific demo of this at WWDC, but it should mean that I can search for “chicken recipe”, tap on an Evernote link in the result, and have the Evernote app open up right to the note containing said recipe. I enjoy that level of deep linking on OS X through Alfred 2 and some plugins, so I’d be delighted to see it in iOS 9 this Fall.
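We didn’t get implementation details on stage, but based on the Core Spotlight framework Apple announced for iOS 9, making an app’s content searchable might look roughly like the sketch below. The function name, note fields, and identifiers are my own invention for illustration; only the CSSearchableItem/CSSearchableIndex API comes from Apple’s announcement.

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical sketch: how a notes app might index a note with the new
// iOS 9 Core Spotlight API so it shows up in the system Search screen.
func indexNote(title: String, contents: String, noteID: String) {
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
    attributes.title = title                 // shown as the result's headline
    attributes.contentDescription = contents // shown as the result's snippet

    // The uniqueIdentifier is handed back to the app when the user taps
    // the result, which is what makes the deep link to the exact note possible.
    let item = CSSearchableItem(uniqueIdentifier: noteID,
                                domainIdentifier: "notes",
                                attributeSet: attributes)

    CSSearchableIndex.defaultSearchableIndex().indexSearchableItems([item]) { error in
        if let error = error {
            print("Indexing failed: \(error.localizedDescription)")
        }
    }
}
```

If something like this is what Evernote adopts, tapping that “chicken recipe” result would launch the app with the note’s identifier, and the app would route straight to it.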

The other big thing is that Siri is getting smarter and more aware as an assistant. Prompts with natural language can now do more, allowing me to say things like:

  • “show me pictures of Patrick from my iPad Insight Party” (loads a specific album in Photos)
  • “remind me about this tonight at 7pm” (sets a reminder with a deep link to a specific webpage)

I find that promising not only because it shows more awareness of context, but also because it hints that Faces will sync to iOS this Fall. I tag the crap out of my pictures on the Mac, and it’s disappointing not to have that metadata sync over to my iPad on iOS 8.

The only thing I’m wondering at this point is whether Siri can respond to text queries (e.g. “how old is Martin Sheen?”), or if those are still reserved purely for voice. That Public Beta in July can’t come soon enough…
