Welcome to the MacNN Forums.

Apple appears to have dropped the "beta" tag from iOS' native voice command system, Siri, a look at related pages on Apple's website reveals. All references to beta status have been removed, both graphically and in text. Siri had carried the beta label since its arrival on iOS in October 2011.

Siri is due to get a number of improvements when iOS 7 is released on September 18. These include a redesigned interface and new, more natural voices, including the option of choosing a male or female voice.

Shame that, with iOS 7 (even the GM), it feels more beta than ever before. More than half of my attempts get shot down by Siri because "there's a problem".

And don't get me started on the fact that Apple still hasn't managed to give Siri much-needed offline functionality... something that was there in the pre-Siri days (and still is), with Voice Control on the iPhone 4 and below. Why do I need an online connection to dial contacts or play music?

Most likely because all speech requests need to be sent through Apple's servers for processing, even when the spoken request only accesses local content.

In other words, the old voice-control technology for playing a song or dialing a contact was limited precisely because speech processing was done on the device itself. The processing is no longer done on the device, with the tradeoff that recognition is far more powerful and far more accurate.

Sure, a compromise could be to run certain commands through the old-style local voice-control processor, and more complex commands through Siri online. Maybe Apple ditched that solution for one reason or another.
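For the sake of argument, the hybrid approach described above could look something like this: commands the old on-device Voice Control could already handle (dialing, music playback) are matched against a simple local grammar, and everything else is forwarded to the server-side recognizer. This is a hypothetical sketch, not Apple's actual implementation; the function names, command set, and routing labels are all made up for illustration.

```python
import re

# Minimal grammar covering the kind of commands the pre-Siri on-device
# Voice Control already supported (hypothetical, for illustration only).
LOCAL_PATTERNS = {
    "call": re.compile(r"^(?:call|dial)\s+(?P<target>.+)$", re.IGNORECASE),
    "play": re.compile(r"^play\s+(?P<target>.+)$", re.IGNORECASE),
}

def route_command(transcript: str) -> tuple[str, str]:
    """Return (handler, payload): 'local' for commands the simple
    on-device grammar can parse, 'server' for everything else."""
    for action, pattern in LOCAL_PATTERNS.items():
        match = pattern.match(transcript.strip())
        if match:
            # Old-style command: handle it on the device itself.
            return ("local", f"{action}:{match.group('target')}")
    # Anything more complex (weather, stocks, ...) needs the server.
    return ("server", transcript)
```

Under this scheme, "Call Mom" or "Play Hey Jude" would be dispatched locally, while "What's the weather tomorrow?" would still go through Apple's servers.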

This isn't an opinion one way or the other, just an answer to the question "why?"

Diablo, I get why Apple needs to route more complex commands (especially the ones that need to pull further info, like weather or stocks) through their servers. And I'm precisely asking for that "compromise" you mention, although I see it more as a vast improvement than a compromise.

All I want is a simple routine implemented in Siri that checks if a command can be processed onboard. This would not only make it faster to complete the task of e.g. calling a contact - I'm on EDGE speeds more often than I care for, and Siri takes a good few seconds to react then. It would also drastically reduce the number of failed requests due to a slow or non-existent net connection.

Siri could still send that recorded request to Apple's servers for analysis later, so that Apple can optimize their algorithms (or whatever they do). The technology is there, heck the code is there, why not just throw in a simple "IF no internet THEN onboard speech processing" routine to increase that customer satisfaction Cook is always on about?
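The "IF no internet THEN onboard speech processing" routine being proposed could be sketched roughly like this. Everything here is a hypothetical stand-in - the connectivity flag, the onboard and server recognizers, and the deferred-upload queue are all illustrative, not real iOS APIs:

```python
from collections import deque

# Recordings to send to Apple's servers for later analysis once online
# (hypothetical queue, per the suggestion above).
upload_queue = deque()

def handle_request(recording, is_online, onboard_recognize, server_recognize):
    """Use the server recognizer when a connection is available; otherwise
    fall back to the limited onboard one and queue the audio for later."""
    if is_online:
        # Normal path: full server-side speech processing.
        return server_recognize(recording)
    # No connection: process onboard instead of failing the request...
    result = onboard_recognize(recording)
    # ...and keep the recording so the servers can still learn from it later.
    upload_queue.append(recording)
    return result
```

With stub recognizers plugged in, an offline "call a contact" request would complete via the onboard path and its audio would land in the upload queue, instead of failing with "there's a problem".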