
With the iPhone 4S launch nearly two years ago, Apple introduced the Siri voice control system to its customers. At launch, Siri was a gimmicky feature at best, released with bugs, a highly computerized voice, sluggish content loading, and unreliable servers. In addition, the Siri of 2011 was limited in language support, launching with knowledge of only English, French, and German. Apple certainly did not deny the early issues with Siri: the company launched the product in “beta,” a tag that has remained on the software ever since.

Late this past week, Apple updated its Siri webpages to drop all references to the product being in beta. Prior to this past week, the bottom of the Siri informational page read:

Siri is available in Beta only on iPhone 4S, iPhone 5, iPad with Retina display, iPad mini, and iPod touch (5th generation) and requires Internet access. Siri may not be available in all languages or in all areas, and features may vary by area. Cellular data charges may apply.

Now, that same section reads:

Siri is available on iPhone 4s or later, iPad with Retina display, iPad mini, and iPod touch (5th generation) and requires Internet access. Siri may not be available in all languages or in all areas, and features may vary by area. Cellular data charges may apply.

Additionally, Apple has removed its Siri FAQs website (Google Cache) that explained some of the finer details of the service and the supported languages. Because Siri seems to no longer be in beta, perhaps Apple feels that the service now performs well enough to not need an additional page of explanations.

Did you not read the article? It clearly stated that Siri has exited beta in iOS 7. It can still be updated, but the lag and server issues you mention aren’t because it carries the ‘beta’ tag.

No, he’s saying iOS7 is still in Beta because it hasn’t been released yet. Any copy that someone is using (a developer probably) is the Beta version. So the iOS7 Siri is not using the production servers or backend yet. When you correct someone, you should take pains not to be wrong about it.

Technically, my current developer version of iOS 7 is no longer beta. The Gold Master is the release candidate that will be released to the public on the 18th. Installation of the GM can be time consuming: it is not available via OTA update, so you must back up your device, download the GM package, and perform a Restore via iTunes 11.1 Beta 2 (which will presumably leave beta and be released to the public around the same time as iOS 7). If your iOS device has a large number of apps and lots of data, this can take several hours. Consequently, there may be some devs out there who have not updated to the GM. Fortunately, the general public will be spared this process; everything I have read and heard indicates that upon official release, updating from iOS 6 to 7 will be available via OTA.

It’s not exactly difficult to stop receiving updates to these comments. Look near the bottom of each email response. You will see text that reads “Want less email? Modify your Subscription Options.” “Subscription Options” is a hyperlink to change those options.

Regardless, you would never have received ANY further comments from this thread if you had not checked one of the two boxes (“Notify me of follow-up comments via email” or “Notify me of new posts via email”) when you posted your own comment. Don’t blame the community for your own actions.

Much of the time, there are a lot of Apple news topics to blog about. However, I don’t always have the extra free time needed to write my own posts for every topic; thus, I reblog 9to5Mac’s posts to share with my readers. And like what robertvarga79 said, “Reblogging is not spamming … but sharing articles on a fair, honest way (to not have to copy others’ work).” Whenever I reblog an article, a computer at WordPress automatically generates those “Reblogged this on…” comments; those comments started showing up after 9to5Mac switched to the WordPress commenting system.

I hope you understand my side of the case and I apologize if I didn’t make that clear.

Looking forward to the new Siri. If you guys use iOS 7 already, have you by any chance noticed whether Siri reads calendar appointments for you? A couple of weeks ago it stopped; now it just pulls up the list. Before, it used to go meeting by meeting and tell me what my day looked like during my drive to work. Having a list on screen whilst driving is not very useful, and I hope they go back…

I’ve been able to get Siri to read my calendar appointments in the most recent iOS7 build. In *most* cases when Siri brings up reminders, notes, messages, emails, calendar events, etc. on the screen and doesn’t read them out loud, I’m able to just say “read it” or “read them to me”, and it will do so. I hope it starts working for you again in iOS7!

I just tried, “Can you read me my schedule?” Siri answered, “Tomorrow, you have an appointment at 12:00 PM. Blah blah blah blah.”
So, yes, it gave me the schedule perfectly in a very well pronounced voice.

I also agree that Siri should work offline for creating / editing notes, calendars, reminders, contacts, music, etc., searching and reading messages and email, dealing with settings, alarms, timers, etc. There isn’t really any reason why an Internet connection is needed for that. For things that require web access, obviously Siri will never work offline. But it seems silly that if I have a flaky Internet connection somewhere and say “Siri, set an alarm for 7:00am” the response is “Sorry, I’m not taking requests right now”.

Partial offline functionality of Siri services is something that I wholeheartedly believe should be implemented. I realize that the full Siri package is quite complex, but it is extremely frustrating when I am driving and (using the headset activation button) ask Siri to call or dial someone in my contacts, only to have Siri respond “I’m unable to respond right now; please try again later.” Forget about the future “Eyes-Free” Siri in the car; if it can’t dial a phone number stored on my phone, how is it going to change radio stations or anything else?

There should be some method available to trigger Siri into handling basic requests offline. Before Siri on the 4S (and with countless other cell phones; whether they be Smart or Feature Phones) Voice Dialing had become an integral part of most devices. I’m guessing that Siri requests (and responses) are offloaded to central servers in order to lighten the load on the phone’s processor; while simultaneously providing a uniform experience across all compatible devices. (With Multi-language, Multi-Dialect support; that’s a lot of information that would tie up local resources)

I do not know how Siri is coded, but it appears that there is no attempt within the program to determine whether a request is one that can be processed directly on the device, thereby avoiding access to its web servers. Such functionality would also lighten the load on the central servers. (JMHO)
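For anyone curious what that kind of local-vs-server routing might look like, here is a rough sketch in Python. To be clear, this is purely hypothetical — nobody outside Apple knows how Siri actually works, and the intent names and keyword rules below are invented for illustration only.

```python
# Hypothetical sketch of on-device request routing: match simple requests
# against local handlers, and fall back to the server only when nothing
# matches. All intent names and trigger phrases are made up for this example.

LOCAL_INTENTS = {
    "set_alarm": ("set an alarm", "wake me"),
    "dial_contact": ("call", "dial"),
    "create_note": ("make a note", "note that"),
}

def classify(request: str) -> str:
    """Return a local intent name, or 'server' if nothing matches."""
    text = request.lower()
    for intent, phrases in LOCAL_INTENTS.items():
        if any(phrase in text for phrase in phrases):
            return intent
    return "server"

def handle(request: str, online: bool) -> str:
    """Dispatch a request locally when possible; otherwise require a connection."""
    intent = classify(request)
    if intent != "server":
        return f"handled on device: {intent}"
    if online:
        return "forwarded to server"
    return "Sorry, I'm not taking requests right now."
```

Even a crude first pass like this would let alarm setting and voice dialing keep working with no signal, while anything unrecognized still goes to the server.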

Actually, under iOS 7 (currently running the GM on a 4S), Siri has 3 options for Spanish: Mexico, Spain, and United States. It also offers 3 variants of Chinese, 4 of English, 3 of French, 2 of German, 2 of Italian, plus Japanese and Korean. In addition, many of these languages offer the option of a male or female voice. The familiar female U.S. English voice associated with Siri since its “Beta” launch in 2011 has been updated with a more natural sounding variant of the original.

Good catch, Mark! I have used Siri only infrequently for about a year, and a couple days ago I decided to test out what’s changed now that I’m running the iOS7 GM build on my iPhone.

I’m actually pretty impressed with the progress Siri has made in “filling in the holes” of functionality that used to be frustrating. Just on a whim, I asked Siri to search my emails for messages containing the word “flight” (which she found right away), and then I was able to say “read me the first one”, and then “read me the next one”, etc. all with no issues!

Similarly, I asked Siri to search my notes for any having to do with “furniture”, and she did this successfully and quickly, and was able to read me the note aloud when requested. The first “hole” I stumbled upon in the current version of Siri was sharing. I asked Siri to share the note with my wife and she said “I can’t share your notes”. However, Siri was able to find a reminder that I had previously created and share it with my wife. She was also able to pull up and read specific reminder lists, but not mark any as complete (“I can’t update reminders. Sorry about that.”). She also was able to tell me what notifications I had missed and read them to me.

Overall, Siri feels a lot more complete to me now, and more useful as a separate and easier way of doing common tasks. The response time generally felt quicker as well. I think I’ll probably use Siri more throughout the next year than I did in the last year, but I really look forward to:

1) Filling in more “holes”. I would like to be able to conversationally ask Siri to do anything I can do manually in apps, particularly sharing photos, notes, etc. As an observation though – Siri can do a really large percentage of the functionality of built-in apps automatically or through voice requests. Pretty impressive and complete coverage.

2) Local processing of voice requests, so Siri can do a subset of things without an Internet connection.

3) I think an “always listening” option would be cool, so if my phone is on my desk, I can just ask my question to Siri out loud without picking it up.

I don’t find myself wishing for Siri integration with 3rd party apps as many people do, perhaps because I use the stock Calendar, Mail, Messaging, Notes and Reminders apps for everything anyway. I doubt 3rd party apps will get the level of Siri integration available in Apple built-in apps any time soon, if ever.

I would like to see more “serendipitous” or “proactive” notifications in the style of Google Now, to let me know about traffic conditions, package deliveries, nearby transit schedules, points of interest, etc. But I don’t see any reason why this needs to be through Siri. Just normal notifications would work fine for me.

I’m definitely interested in seeing what other web-based services Apple can integrate with Siri – like calling a cab through Uber, finding me a good price on a plane flight or car rental through Kayak and booking it, and giving me information about my recent spending and budgeting through Mint. It would be cool to stand in the supermarket and ask “how much have I spent here this month?” or “how much did I spend here last month?” or “how much do I have left in my grocery budget”?

Also, integration with my iWork documents in the cloud would be cool, particularly once they add collaboration features. I’d like to have Siri add comments to documents for me, read recent changes, and share access to documents with new people.

Really excited to see how Apple keeps building out this “personal assistant” concept!

Never mind the server, Siri has simply never actually understood my voice to a degree that makes it really useful for anything.

If you have to say something twice, you can type the query faster; it’s as simple as that. And if it gets one word out of twenty wrong and doesn’t do punctuation, it’s useless for dictation as well.

My suspicion is that this is one of those things that works great for Americans and that Siri is actually tuned to the American voice, because the things it gets wrong are consistently those things that Americans say or pronounce differently from the rest of the world.

Yeah, this is a good point. Siri is still pretty bad with accents as well. My wife speaks English with a very slight Russian accent, but it’s enough to throw Siri off. I had a coworker with an Indian accent who tried Siri, and the voice recognition only got about 10% of it right.

For now, it’s only a good option for people who speak the exact supported languages and dialects, without any accent. Since Nuance, who supposedly does the voice recognition software layer for Siri, has options in their desktop software to “train” their system to recognize a specific accent or dialect or user’s voice, it would be really nice if a Siri user’s iCloud account could store such individual “training” in Apple’s data center, and give each user a personalized, highly accurate voice recognition configuration.

Hey, have you tried holding the phone away from your mouth? I find that people who have just gotten a new phone want to hold the microphone right up to their lips. Siri always seems to make a mistake when people do that. I find that if I hold the phone in front of me with the screen facing me, not the microphone, and speak in a normal voice without feeling that I have to talk too loudly, she gets it right every time. By the way, this comment is 100% dictation with no corrections after I spoke. That includes punctuation. If I say “question mark,” she gets all of that correct. (To make the “?” appear in that earlier sentence, I said quote, question mark, end quote — and I had to type this sentence.) With the rear microphone, I find that Siri can hear me even in a store where there’s lots of background noise, or in a restaurant.

It has an awful time with my American “New England” accent. I grew up straddling Massachusetts and Rhode Island, and Siri is pretty much a time waster for me, unfortunately. She does, however, give me plenty of practice in trying to pronounce my “ahhs” — uh, I mean “Rs”.

Yeah, Damn Siri for expecting you to properly speak the language you have it set on. I know plenty of non-native US citizens that think their English is “great”…and maybe it is for someone who natively speaks Russian or Mandarin….but ‘good for them’ isn’t necessarily ‘good’ at all.

2 years later and I still don’t understand why people who barely speak English expect Siri to understand the non-language coming out of their mouths.

I think the problem isn’t just with non-native US citizens who have a very heavy accent. My wife has lived in the US most of her life, and has an accent so faint that many people don’t notice it until they’ve known her for a while. My mom has a strong old school New York accent and says things like “waduh” instead of “water” and “idear” instead of “idea”. Siri simply does not get a lot of words right when they use it. I notice that for my own speech (native US speaker) Siri will tend to get certain specific words wrong over and over again and I have to carefully emphasize them in a way that tries to nudge Siri towards the right transcription. I do believe that being able to “train” Siri for individual user accents and specific word pronunciations would be a really big step towards better accuracy for more people. The technology exists, I would love to see Apple make use of it.

While there are some programming fixes that can increase the usability of Siri, you should not expect such technology to adapt to millions of language variants. It’s like the age-old tradition of Americans visiting foreign countries and expecting everyone there to understand English. If you want to communicate effectively, you need to at least make some effort to learn the language of the recipient. In this case, the recipient happens to be a machine.

It’s actually very bad with an Arabic accent too. On the other hand, Google Chrome is strikingly accurate with Arabic accents and supports all the Arabic-speaking varieties as well. This only applies to the recognition part of the service. I’m actually sad Siri can’t speak Arabic.

I don’t understand how they can NOT call it beta when it doesn’t support even a fraction of the languages iOS supports. I can’t use Siri, since I don’t want to speak English to my phone when my native language is Norwegian.

First impression from reading the title alone: “seems” sounds like doubt. Knowing how bloated and buggy software can get while still appearing functional, it makes me wonder what can be wrong with it that may not have come out during testing.

Now, after reading the article, it still sounds like doubt about its abilities, so all I am left with is the question I have to ask myself all the time: should we (developers) have an obligation to the customer to speak up when there is a problem, or allow bugs that can be fixed but will push deadlines back, for the greater good of the company and its image? Or should we keep quiet and not express our concerns because we are the new person, just listen to our leaders, and allow the pressure of the customer to inadvertently influence our judgment and decisions? As a developer I know, at least now, that people will not listen unless you can give a sound argument backed by solid evidence. In essence, we have to become lawyers in everything we do, no matter the career choice.

From what I have read so far in other articles, Apple is losing its stronghold on the mobile phone market, at least. Could it be that people are becoming aware of Apple’s motivations, or is it just that its failures are shedding light on, or distorting, what Apple is trying to achieve?

Beta shmata! What is the purpose of this article? I want more AI feedback from my devices. If I’m stuck on my phone all day and night, I want Siri or whatever to be able to carry on an intelligent interaction (conversation?). How long will we have to wait to have a real AI experience? It is the logical future of these devices. Siri is just a baby step.

I have found Siri to be a major distraction, and I have yet to find a way to disable the damn thing. No matter how much I try to follow the recommended steps, I am unable to disable it.

I use Waze for my GPS and when I am driving down the Interstate at more than 65 miles an hour in heavy traffic, Siri pipes up, “How can I help you?” (Thankfully there are no children in my car then otherwise I would be guilty of indecent language!)

One of the first questions that I asked Siri was “Where is Steve Jobs?” Jobs had just passed away then. I got a reply, “I don’t know what Steve Jobs is.”

It seems a very innovative system. The application is very useful for people who have physical limitations, and its functionality is not limited merely to use of the hands, making it a system with great potential for the future. It is logical for it to have flaws, and those errors simply weren’t caught during the development stage. I have faith in the product.