Apple’s Rumored Virtual Assistant Could Outshine the New iPhone

Buzz surrounding Apple’s Tuesday event has never been higher, as consumers eagerly await the announcement of the next-generation iPhone. But the new hardware could take a back seat to a bigger announcement: a voice control feature that could debut with the latest version of iOS 5.

The voice control feature — referred to by Apple pundits and bloggers as “Assistant” — could change the way people interact with their iPhones, using conversation with an artificially intelligent assistant to help make decisions and schedule daily activities.

“This is an area in which Apple has been trailing Google and is playing catch-up,” Forrester analyst Charles Golvin said in an interview. “Similar to the notifications improvements [in the iOS 5 beta] and the ability to use the volume control button as a camera shutter.”

This type of service has been a long time coming for Apple. Former Apple CEO John Sculley first described such an experience in his 1987 book Odyssey. He called the concept the “Knowledge Navigator,” and Apple subsequently released several video demos over the following years illustrating how the idea would work. The Knowledge Navigator concept runs on a tablet computer (decades before Apple unveiled the iPad), incorporating advanced text-to-speech, a powerful speech-comprehension system and a multi-touch gesture interface much like the one used in iOS.

Back in the late ’80s, Sculley’s lofty visions of the future were the stuff of dreams. Today, we’re much closer to making them a reality: intuitive, portable touchscreen devices house powerful processors with enough memory for such demanding tasks, and the chips and software can now support the processing required for complex speech analysis.

With the latest iterations of the iPhone and iPad, Apple had the hardware side of its Knowledge Navigator concept pretty much nailed down, but it lacked the text-to-speech and speech-understanding chops. Until a start-up named Siri came along.

Siri began as a voice recognition app for the iPhone, similar on the surface to Google’s voice search, which is integrated into Google Search on iOS and available as a standalone app on Android and other platforms. With Siri, instead of just searching for a specific topic, place or person with your voice, you give it more descriptive instructions. One command, for example, might be “Find the nearest good Chinese food restaurant.” At launch, Siri was integrated with about 20 different web information services, so rather than just taking you to the search results page for “good Chinese food restaurants,” it would bring up Yelp results for the highest-rated restaurants near your GPS-determined location.
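That flow — parse the spoken request, pick the best-suited service, and answer with results rather than links — can be sketched roughly as follows. This is a toy illustration of the idea, not Siri’s actual implementation; the keyword matching and service names here are assumptions for demonstration only.

```python
def parse_command(text):
    """Crude keyword-based intent parser (illustrative only)."""
    text = text.lower()
    if "restaurant" in text or "food" in text:
        # Pull out a cuisine keyword if one is present.
        cuisine = next((w for w in ("chinese", "italian", "thai") if w in text), None)
        return {"intent": "find_restaurant", "cuisine": cuisine}
    # Fall back to a plain web search for anything unrecognized.
    return {"intent": "web_search", "query": text}

def route(parsed, location):
    """Dispatch the parsed intent to the service best suited to answer it."""
    if parsed["intent"] == "find_restaurant":
        return f"Yelp: top-rated {parsed['cuisine']} restaurants near {location}"
    return f"Search results for '{parsed['query']}'"

command = parse_command("Find the nearest good Chinese food restaurant")
print(route(command, "San Francisco"))
# Yelp: top-rated chinese restaurants near San Francisco
```

The key difference from plain voice search is the routing step: the same spoken sentence ends up at a structured data source (Yelp, in this example) instead of a generic results page.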

But it’s much more than just a digital Zagat. Siri calls itself a “Virtual Personal Assistant.” Rather than issuing the app commands or Google-style search phrases, you interact with it through conversation. Saying something like “I’d like a table for six at Flour and Water” would prompt the app to make a reservation through OpenTable, and if you haven’t provided enough information to complete a task, it will ask you to elaborate. Siri also draws on your personal preferences and interaction history, so the more you use it, the better it performs at specific tasks.
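The “ask you to elaborate” behavior is essentially slot-filling: a task has required details, and the assistant keeps asking follow-up questions until they’re all supplied. A minimal sketch of that loop, using the reservation example above (the slot names and prompts are assumptions for illustration, not Siri’s actual code):

```python
def missing_slots(request):
    """Return the details a reservation still needs, in priority order."""
    return [slot for slot in ("restaurant", "party_size", "time")
            if slot not in request]

def next_prompt(request):
    """Either ask for the first missing detail or confirm the booking."""
    missing = missing_slots(request)
    if not missing:
        return (f"Booking {request['restaurant']} for "
                f"{request['party_size']} at {request['time']}.")
    questions = {
        "restaurant": "Which restaurant?",
        "party_size": "For how many people?",
        "time": "What time?",
    }
    return questions[missing[0]]

# "I'd like a table for six at Flour and Water" fills two slots but not the time:
request = {"restaurant": "Flour and Water", "party_size": 6}
print(next_prompt(request))   # What time?
request["time"] = "7:30 pm"
print(next_prompt(request))   # Booking Flour and Water for 6 at 7:30 pm.
```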

Merging Siri’s virtual assistant software with iOS would turn it into an extraordinarily powerful tool. You could tell Assistant to email friends, coordinate dinner plans and reschedule meetings, as it would have access to things like your contacts, email and calendar. Your favorite multi-touch device would get a robust new touch-free way of interaction.

Independent iOS developer Will Strafach is hopeful that it will be more robustly integrated into iOS. “I’m hoping that Apple will release some developer APIs so that devs will be able to integrate their apps with the Assistant feature,” says Strafach. One use case could involve the Delivery Status package-tracking app: you could ask Assistant for the status of a package, or have it notify you when one has been delivered.
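The developer API Strafach imagines might look something like a registry where apps declare the kinds of questions they can answer. To be clear, no such Apple API exists as of this writing; every name below is hypothetical, and the sketch only illustrates the idea:

```python
# Hypothetical Assistant plug-in registry (illustrative, not a real Apple API).
handlers = {}

def register(keyword, handler):
    """An app registers a handler for questions containing a keyword."""
    handlers[keyword] = handler

def ask(question):
    """Route a question to the first app that claims it, else apologize."""
    for keyword, handler in handlers.items():
        if keyword in question.lower():
            return handler(question)
    return "Sorry, no app can answer that."

# A package-tracking app (like Delivery Status) could register itself:
register("package", lambda q: "Your package is out for delivery.")

print(ask("Where is my package?"))
# Your package is out for delivery.
```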

Apple acquired Siri a few months after the app officially launched over a year ago, and we haven’t heard much about it since.

But a number of signs are now indicating that the Siri-developed Assistant could be the primary focus of Apple’s iPhone event.

Siri co-founder and former VP of engineering Adam Cheyer now plays an influential role at Apple as Director of Engineering in the iPhone group.

And if you look at the four icons included in the invite image (bear with me here), they can be interpreted as signs that this Assistant feature will be the key component of Tuesday’s event. The calendar and clock are positioned at the top because, first and foremost, Assistant will help you manage your time and schedule. The maps icon indicates that Assistant will be location-aware, helping you find nearby businesses and restaurants. Lastly, the phone icon is included because the new feature will help manage your social connections, relationships and obligations. Of course, we could be reading into it far more than we should.

Rumors that Apple would add a powerful voice recognition feature to iOS have persisted off and on for some time (especially since Knowledge Navigator has been a known conceptual goal of Apple for many years). And the concept of voice control is nothing new, already used to execute basic phone commands on most major smartphone operating systems. But like Apple’s FaceTime feature did with video chatting, Assistant could potentially make that feature dead-simple to use, and of course, exclusive to Apple devices.

“Siri was many iterations ahead of these technologies, or at least it was two years ago,” Siri co-founder and board member Norman Winarsky said in an interview with 9to5Mac. “If the rumors are true, Apple will enable millions upon millions of people to interact with machines with natural language.”

Golvin expects that a more complete voice interface solution will be part of the iPhone announcement tomorrow, but doesn’t think that it will be baked right into iOS. “I suspect these unique features will continue to be delivered via an application rather than as an embedded function of the OS across a variety of applications.”

And with more signs that the hardware debuting this week may only be an incremental upgrade (examples include leaked parts and mentions of an iPhone 4S in the most recent iTunes beta), it seems logical that the voice-controlled Assistant could be the focal point of Tuesday’s presentation.

Of course, this is all subject to Tuesday’s announcement. But if true, Apple’s “Let’s Talk iPhone” invite teaser line may be far more important than we first realized.