Last week, designer Cody Sanfilippo shared (via TUAW) an interesting concept of what Spotlight on iOS could be like. Drawing inspiration from Siri’s results, widgets, and some third-party apps that already enable similar functionality, I find Cody’s concept worthy of a deeper discussion. Particularly following the release of iOS 6 yesterday and the many refinements it brings to several areas of the OS, I think Spotlight is one of the stock features that could use an overhaul in the future.
Cody’s proposed design essentially breaks Spotlight down by filters, and makes it play nicer with third-party apps and their content. It is aimed at reducing the taps required to perform common actions such as calling a contact or playing a song, and, overall, it takes some of Siri’s functionality and translates it into text-based input.
With this new system in place, you can do things a lot more quickly. For example, tap a phone number in a Contacts result to call it. Tap an email address to email that person. Tap “play” on a music result to play the song without ever entering the Music app. Check calendar events, notes, or reminders instantly. You get all this information at a glance, without ever opening the applications. Of course, tapping the entry itself (the arrow in Contacts, the entire clipping in Notes, etc.) will bring you into that application.
In Cody’s vision, Spotlight could gain “filters”, displayed as buttons at the bottom. So rather than searching for a string of text across your entire operating system, you’d gain the option to refine your results by restricting them to a certain application, like Contacts. Searching for “Mike” in the current Spotlight, for instance, brings up results from your Address Book, Music, Mail, Messages, and more. If you know you just want to view Mike’s contact card, though, you could enable a Contacts filter and display a Siri-like card with a phone number, Twitter username, and email addresses you can tap on.
Perhaps contacts aren’t the best way to illustrate how such a concept could help users save time (though I’m a fan of Cody’s design idea for this). Say you want to play a song: with “Spotlight 2.0”, you could hit Play directly from a search result, sparing you from opening the Music app, viewing the playback screen, and so forth. Sanfilippo thought through implementations of this concept for several built-in apps.
In my opinion, however, the really forward-thinking idea is the possible third-party app integration. You know how Spotlight can look at the contents of your Mail messages? Imagine if it could return songs from Rdio or Spotify, or tasks from OmniFocus, without opening those apps.
That’s a very powerful concept, and one that is not too dissimilar from the Siri API many developers have been expecting since last year. Such an idea – a system that can “look into apps” for certain types of content it understands and return as results – would probably require major changes by developers to their apps, although, as we’ve seen in the past week, developers who are truly committed to their apps will always update them with support for the latest technologies and devices.
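To make the idea above concrete – and purely as an illustration, since no such iOS API exists and every name here is invented – the “filters plus third-party content” model could be sketched as a registry where each app exposes a small search interface, and Spotlight queries either every registered provider or only the one a filter selects:

```python
# Hypothetical sketch of a filterable, app-extensible Spotlight.
# None of these class or method names correspond to real iOS APIs.

class SearchResult:
    """One row in Spotlight's results, tagged with the app it came from."""
    def __init__(self, provider, title, action=None):
        self.provider = provider   # e.g. "Contacts", "Rdio", "OmniFocus"
        self.title = title
        self.action = action       # e.g. a callable that plays a song


class SearchProvider:
    """An app would register one of these to expose its content to search."""
    def __init__(self, name, items):
        self.name = name
        self.items = items         # {title: action}

    def search(self, query):
        return [SearchResult(self.name, title, action)
                for title, action in self.items.items()
                if query.lower() in title.lower()]


class Spotlight:
    def __init__(self):
        self.providers = []

    def register(self, provider):
        self.providers.append(provider)

    def search(self, query, filter_name=None):
        # A filter restricts the search to a single provider (app);
        # with no filter, every registered app is queried.
        active = [p for p in self.providers
                  if filter_name is None or p.name == filter_name]
        return [result for p in active for result in p.search(query)]


# Usage: searching "mike" everywhere vs. only within Contacts.
spotlight = Spotlight()
spotlight.register(SearchProvider("Contacts", {"Mike Smith": None}))
spotlight.register(SearchProvider("Rdio", {"Mike's Mixtape": None}))

everywhere = spotlight.search("mike")                  # both apps match
contacts_only = spotlight.search("mike", "Contacts")   # filtered to one app
```

The design choice worth noting is that apps push a search interface, not their raw data: Spotlight never opens Rdio or OmniFocus, it just asks each registered provider for matches – which is what makes the “results without launching the app” behavior possible.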
If you’re interested in Cody Sanfilippo’s concept for a better Spotlight, I suggest you head over to his website, where you can find a detailed explanation of his ideas, a UI breakdown, and some thoughts on the implications of a different Spotlight. If you’re interested in reading more on the subject, I also recommend this article by Rene Ritchie, which touched upon many similar points back in June.