iOS 8 wants: A smarter, contextually aware Spotlight search

What if, in iOS 8 or some future version of Apple's mobile operating system, Spotlight became a secondary, text-based point of access to Siri, able to parse the same kinds of natural language queries and commands, retrieve the same kinds of responses, and perform the same kinds of actions? Back in June of 2012 I hoped for a Spotlight that could access Siri, and I'm still hoping for it today. Why?

Siri, voice free

While Apple has made great progress expanding Siri since its debut in iOS 5, including adding new data sources for movies, sports, and more, the company has also expanded Siri's accessibility. Siri gained Eyes Free in iOS 6 and CarPlay in iOS 7. However, there's still no equivalent for "voice free".

No matter how useful and enabling Siri is, there will be times when it's simply not possible or socially acceptable to talk out loud to our phones. In those situations, being able to type "Cupertino weather?" or even "Text Georgia I'll be late" would be incredibly useful.

Sure, you can pull down Spotlight from any Home screen, search for the Weather or Messages app, enter the right city or add the right contact, type in your message, and so on, but Siri has shown that natural language can be faster with voice. Apps like Fantastical have shown it can be faster with text input as well.

What makes this feature so tantalizing is that Siri can already handle text input. Once you've made a voice query or command, an "edit" button appears. Type in anything you like, tap "Done", and Siri will process it. However, you have to start with your voice, which limits the utility.

Imagine instead if you could simply pull down Spotlight from the Home screen, type your text, and access Siri directly. As I said last time, multiple points of entry into the Siri system don't increase complexity; they increase accessibility.

Quick access to quick actions

Siri can not only answer questions but also execute commands. If Spotlight could parse text and hook into the system the way Siri does, it could be used to execute simple commands as well.

There's lots of precedent for this. Quicksilver, LaunchBar, and Alfred are all text-based quick-action launchers that have existed on the Mac for years. Just Type on webOS and Type and Go on BlackBerry have tried the same on mobile. Launch Center Pro has even made an icon-driven version work as far as current iOS limitations will allow.

With Spotlight hooked up to Siri's action engine, "Text Georgia I'm running late" is just one example of the kind of text-based quick action that could be possible. "Tweet Guy Wow, Arrow was bananas!" could instantly send your status. "Meeting with Ally at 6pm tomorrow" could add an event to your calendar. The sketch below shows how that kind of parsing might look.
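To make that concrete, here's a minimal sketch in Swift of how typed commands like those might be parsed into actions. Nothing here is an actual Apple API; the QuickAction type, the prefixes, and the parse function are purely illustrative assumptions about how such a system could work.

```swift
import Foundation

// Purely hypothetical types: iOS exposes no such API today.
enum QuickAction {
    case sendMessage(to: String, body: String)
    case postTweet(body: String)
    case createEvent(details: String)
    case search(query: String)   // fall back to a plain Spotlight search
}

func parse(_ input: String) -> QuickAction {
    let trimmed = input.trimmingCharacters(in: .whitespaces)
    let lowered = trimmed.lowercased()

    // "Text Georgia I'm running late" -> message Georgia "I'm running late"
    if lowered.hasPrefix("text ") {
        let rest = String(trimmed.dropFirst("text ".count))
        let parts = rest.split(separator: " ", maxSplits: 1).map(String.init)
        if parts.count == 2 {
            return .sendMessage(to: parts[0], body: parts[1])
        }
    }

    // "Tweet ..." -> post whatever follows as a status update
    if lowered.hasPrefix("tweet ") {
        return .postTweet(body: String(trimmed.dropFirst("tweet ".count)))
    }

    // "Meeting with Ally at 6pm tomorrow" -> calendar event
    // (a real parser would extract the date, e.g. with NSDataDetector)
    if lowered.hasPrefix("meeting ") {
        return .createEvent(details: trimmed)
    }

    // Anything else stays a normal Spotlight search
    return .search(query: trimmed)
}
```

Feeding it "Text Georgia I'm running late" would yield .sendMessage(to: "Georgia", body: "I'm running late"), which the system could hand off to Messages the same way Siri does once speech-to-text is finished.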

If Apple gets around to adding a DocumentPicker to iOS, Spotlight could even see into any and all the files you're storing locally and on iCloud. Spotlight on the Mac can already be used to search for OS X files. Searches like "team roster" or "WWDC keynote" would be just as useful on iOS.

I firmly believe iOS should stay as simple as possible for as many users as possible. When power can be added below the surface, however, when functionality can be enabled only for those who want and need it, then it's to everyone's advantage. Just as Notification Center, Control Center, and fast app switching stay completely out of the way and all but invisible unless expressly called upon, so too could Spotlight quick actions.

A smarter Spotlight

There's a lot I'd like to see from Spotlight and Apple's text-based search in general, including and especially how it works in the stores. Nearest-neighbor matching and automatic search widening would make spelling mistakes irrelevant and let queries too narrow to return exact results surface close-enough results anyway.
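As a rough illustration of what I mean by nearest-neighbor matching, here's a small Swift sketch that ranks candidates by edit distance so a typo like "Fantastcal" still surfaces "Fantastical". It's one well-worn way of doing this kind of widening, not a claim about how Spotlight's search actually works.

```swift
// Classic Levenshtein edit distance between two strings.
func editDistance(_ a: String, _ b: String) -> Int {
    let a = Array(a.lowercased()), b = Array(b.lowercased())
    if a.isEmpty { return b.count }
    if b.isEmpty { return a.count }

    var previous = Array(0...b.count)
    var current = [Int](repeating: 0, count: b.count + 1)

    for i in 1...a.count {
        current[0] = i
        for j in 1...b.count {
            let cost = a[i - 1] == b[j - 1] ? 0 : 1
            current[j] = min(previous[j] + 1,        // deletion
                             current[j - 1] + 1,     // insertion
                             previous[j - 1] + cost) // substitution
        }
        swap(&previous, &current)
    }
    return previous[b.count]
}

// Widen a query: keep candidates within a small edit distance, closest first.
func widenedMatches(for query: String, in candidates: [String], tolerance: Int = 2) -> [String] {
    return candidates
        .map { ($0, editDistance(query, $0)) }
        .filter { $0.1 <= tolerance }
        .sorted { $0.1 < $1.1 }
        .map { $0.0 }
}

// "Fantastcal" (missing an 'i') still finds "Fantastical".
let apps = ["Fantastical", "Messages", "Weather", "Launch Center Pro"]
print(widenedMatches(for: "Fantastcal", in: apps))  // ["Fantastical"]
```

Scaling the tolerance with query length, or trying prefix and substring matches first, would keep exact hits on top while still catching typos.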

Server-side delays, the kind that have sometimes been problematic for the current Siri implementation, could also be mitigated. Since the text has already been typed, there's no need to run speech-to-text. Any local actions could be processed locally. Any queries that require a trip to the internet would face little more than the same limitations Spotlight's web and Wikipedia searches already have today.
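To sketch that routing argument in Swift: once a typed query has been parsed, deciding whether it ever needs the network is a simple branch. The Route type, the prefix list, and the placeholder URL below are all assumptions for illustration, not anything iOS actually exposes.

```swift
import Foundation

// Hypothetical routing: typed commands aimed at on-device apps and data
// skip the server round trip entirely; only open-ended queries go out.
enum Route {
    case local(description: String)
    case remote(url: URL)
}

func route(_ query: String) -> Route {
    let lowered = query.lowercased()

    // Commands like "Text ..." or "Meeting ..." can be dispatched straight
    // to the relevant app, with no network needed just to understand them.
    let localPrefixes = ["text ", "meeting ", "remind ", "play "]
    if localPrefixes.contains(where: { lowered.hasPrefix($0) }) {
        return .local(description: "Dispatch to the relevant app on-device")
    }

    // Everything else behaves like today's Spotlight web search:
    // one request out, with the same latency that already implies.
    // (example-search.test is a placeholder, not a real endpoint.)
    var components = URLComponents(string: "https://example-search.test/query")!
    components.queryItems = [URLQueryItem(name: "q", value: query)]
    return .remote(url: components.url!)
}
```

Under those assumptions, route("Text Georgia I'm running late") comes back local immediately, while route("Cupertino weather?") costs a single request, no heavier than the web lookups Spotlight already does.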

Put all that together and Spotlight, like Siri, becomes a parallel, powerful, useful way to interact with the iPhone and iPad.

It makes Spotlight smart, and who doesn't want that?
