I was surprised the Siri demo didn't have it reading the answers back to the user rather than just saying, hey, look at this! If you ask with your voice, shouldn't you get the answer the same way? It's hard to be hands-free when you have to look at your screen for the info you wanted. And am I the only one who wanted more info in the weather and stock reports? Instead of just the overall price, wouldn't you expect stock quotes to include the day's up/down as well? For weather, wouldn't you expect it to read out an actual forecast?

I was surprised the Siri demo didn't have it reading the answers back to the user rather than just saying, hey, look at this! If you ask with your voice, shouldn't you get the answer the same way? It's hard to be hands-free when you have to look at your screen for the info you wanted. And am I the only one who wanted more info in the weather and stock reports? Instead of just the overall price, wouldn't you expect stock quotes to include the day's up/down as well? For weather, wouldn't you expect it to read out an actual forecast?

Well, it depends. If you ask it to 'tell me' or 'read me' it will do that in many cases. Not for traffic or for purely visual stuff, though.

I think we're going to find that Siri is somewhat variable and trainable. Should be kind of a fun sandbox to play in, actually.

Can Siri listen to your phone conversations and provide useful information as the conversation proceeds? Siri could navigate to useful websites with facts for you to add to the conversation, or in extreme cases Siri could interrupt with corrections.

I imagine this will be pretty nice for people with disabilities (and is subtly hinted at in the last part of the promo video). I've read iOS is already pretty well regarded for accessibility, but voice control seems like it could provide a pretty significant jump.

Ironically, I'd like keyboard access to it too, for times when I'd like to search but can't talk for whatever reason. Maybe there's a Siri option on the Spotlight page or something... if not, that seems like a logical place for it.

alexr wrote:

Abulia wrote:

I had no interest until I just watched the Event stream from Apple. Wow. That's amazing. I really want Siri now!

It really sells it. Anyone else think that the voice synth in the event stream sounds noticeably better than the voice in the promo videos on Apple.com?

I haven't seen the keynote demo yet, but in the promo the voice sounds like the existing VoiceOver voice. At least in iOS 4 you can change the speed and, I think, the amount of inflection in the voice, which might explain the variation if it sounds similar yet different in the keynote demo.

Ironically, I'd like keyboard access to it too, for times when I'd like to search but can't talk for whatever reason. Maybe there's a Siri option on the Spotlight page or something... if not, that seems like a logical place for it.

I suspect this is a psychological move by Apple. If there were a keyboard interface, users would yield to their self-conscious instinct and type rather than talk. This forces you to talk to your phone (which, to everyone but you, looks exactly the same as talking on your phone, and eventually people will acclimate to it). Not breaking down this barrier would limit Apple's ability to mainstream this tech.

Alternatively, because it's a natural-language processor designed to be conversational, and we don't write the way we speak, it may not have worked as well from the keyboard. At the very least, it would be slower in use, and that may hamper its utility: it's supposed to make things easier/faster, not harder/slower. And Siri is designed to ask questions and seek clarification, so I think people may find rather extensive conversations developing. I agree that there are situations where voice isn't appropriate, but that might be a necessary sacrifice.

Getting over this aversion to talking to your phone is going to be necessary and won't be particularly easy for many people, so Apple may just be forcing it along.

Can Siri listen to your phone conversations and provide useful information as the conversation proceeds? Siri could navigate to useful websites with facts for you to add to the conversation, or in extreme cases Siri could interrupt with corrections.

Yeah, I had the same idea. It would be very interesting if Siri could do that. Like having a virtual butler always around:

(blah blah blah with friend on phone) .."Hey Siri, do you remember what date that was?" ..."Oct 5th, Sir."

I just saw Siri is using Yelp! to locate businesses. I tried Yelp! on my iPad and found it lousy. Maybe it's my area, but I doubt that since it's fairly populated. I couldn't get Yelp! to find the nearest grocery store or liquor store reliably. Looking out the window literally worked better about half the time.

I know what you are getting at here, and it is a valid point, but I can't stop laughing at the complaint that "my phone is less accurate for finding things near me than looking out the window."

(blah blah blah with friend on phone) .."Hey Siri, do you remember what date that was?" ..."Oct 5th, Sir."

"Oh and Siri, was <fiancee> really working late at the office or was he banging that slut Michelle from accounts at his apartment last night?" "Banging madam. His calendar, invitees, location records and voice messages all suggest you are correct. Replaying now…"

Hopefully you can train your own Siri to distinguish whose voice is doing the asking in a conversation between two Siri-enabled phones.

I'm French. English voice control is bloody useless to me and my 'Authentic Froggie Accent'.

Why not use the French recognition then? I'm an expat. All my contacts, apps, notes are written in English. Ever tried googling for an English term on a French locale? It brings back French results... Useless.

I'm French. English voice control is bloody useless to me and my 'Authentic Froggie Accent'.

Why not use the French recognition then? I'm an expat. All my contacts, apps, notes are written in English. Ever tried googling for an English term on a French locale? It brings back French results... Useless.

I don't know how well it will work with French, but it definitely supports French according to Apple.

I suspect it will not be supported as well as English, because French is just harder to learn. In particular, conjugation is a mess. In English, you would just say "Read my mail". In French, you might ask 'lire mon courrier', 'lisez mon courrier' or 'lis mon courrier'.

OTOH, thinking about it, variations might help Siri understand you. Like cases in German or Latin, which tell about the function of a word in a sentence.

I suspect it will not be supported as well as English, because French is just harder to learn. In particular, conjugation is a mess. In English, you would just say "Read my mail". In French, you might ask 'lire mon courrier', 'lisez mon courrier' or 'lis mon courrier'.

OTOH, thinking about it, variations might help Siri understand you. Like cases in German or Latin, which tell about the function of a word in a sentence.

Forstall said in the keynote that Siri doesn't just understand the words, it understands the meaning of your message, so it doesn't seem to matter how you phrase it. If it works as advertised, it should read your mail whether you ask 'lire mon courrier', 'lisez mon courrier' or 'lis mon courrier'.

I think Siri has all kinds of useful applications, and I am looking forward to picking one up after skipping the iPhone 4. I'm in my car a LOT, and the ability to translate text to voice and vice versa is worth the price of admission.

I'm kind of curious about what kind of address book metadata Siri will require to work effectively. The demo examples like "call my wife when I leave work" require your iPhone to know who your wife is, where you work, etc.

Will there be some kind of training required to configure it? Does it require some extra tags in your address book, like "I work at 1313 Mockingbird Lane" and "my spouse is so-and-so"?

I'm kind of curious about what kind of address book metadata Siri will require to work effectively. The demo examples like "call my wife when I leave work" require your iPhone to know who your wife is, where you work, etc.

Will there be some kind of training required to configure it? Does it require some extra tags in your address book, like "I work at 1313 Mockingbird Lane" and "my spouse is so-and-so"?

My sisters are listed under "family" in my address book. I guess Apple will eventually update the address book fields to add relationships. Other than that, the address book already lets you add home and work addresses.
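For what it's worth, the desktop Address Book already has a "Related Names" field with labels like spouse or mother, and in vCard exports that shows up as grouped X-ABRELATEDNAMES / X-ABLabel lines. Here's a rough sketch of how a "who is my wife?" lookup could work against that data; the field names and the label format are what my own exports look like, so treat them as assumptions rather than a documented API:

```python
# Hedged sketch: mapping vCard-style "related names" to relationships.
# X-ABRELATEDNAMES / X-ABLabel mimic Apple's Address Book export format,
# which is an assumption here, not a documented interface.

SAMPLE_VCARD = """\
BEGIN:VCARD
FN:John Appleseed
item1.X-ABRELATEDNAMES:Jane Appleseed
item1.X-ABLabel:_$!<Spouse>!$_
END:VCARD"""

def related_names(vcard_text):
    """Return a {relationship: name} map from grouped vCard lines."""
    names, labels = {}, {}
    for line in vcard_text.splitlines():
        # Grouped properties look like "item1.X-ABRELATEDNAMES:Jane ..."
        if "." not in line or ":" not in line:
            continue
        group, rest = line.split(".", 1)
        key, value = rest.split(":", 1)
        if key == "X-ABRELATEDNAMES":
            names[group] = value
        elif key == "X-ABLabel":
            labels[group] = value.strip("_$!<>")  # "_$!<Spouse>!$_" -> "Spouse"
    # Join the name and its label via the shared group prefix.
    return {labels[g].lower(): n for g, n in names.items() if g in labels}

print(related_names(SAMPLE_VCARD))  # {'spouse': 'Jane Appleseed'}
```

Something like this would let "call my wife" resolve to a contact without any new database fields, just the existing related-names data.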

I'll use it when it fits. I won't try to force it onto a task I know I can do with my fingers "just to use it". But I won't ignore it for fear of looking odd, either (because why should you care what stuffy jerks on the street think?).

When I'm out in the sun and can't see my phone? I'll use it. When I don't have my glasses on before bed? I'll use it. When I want to call someone quickly? I'll use it.

News flash, people: you look just as goofy walking down the street with a Bluetooth headset, iPhone earbuds, texting, or even talking on the phone normally. Using this is no different.

I'll roll my eyes at people who walk and talk/text, because they walk slowly, don't look where they're going, and usually take up the middle of the sidewalk when I'm walking with a purpose.

At least this way I can take my phone out, say "Call Steve", and put it back in my pocket.

Am I the only one who wants to see Siri come to the Mac? It would be nice to, say, have my Mac sitting in the corner while I shout commands across the room to read new email, dictate new email, add events to iCal, Skype/FaceTime people, etc.

News flash, people: you look just as goofy walking down the street with a Bluetooth headset, iPhone earbuds, texting, or even talking on the phone normally. Using this is no different.

- CG

If Siri is accurate and can do certain tasks faster than you can with the touch UI, then people might be willing to use it. If for instance you tell it to text a friend and it opens Messages.app and fills in the recipient and transcribes the message accurately and sends it off, then people might be willing to put up with it in scenarios other than driving and other situations where they can't use their hands.

But if you have to make corrections or keep repeating instructions or dictation before it gets things right, and it ends up taking much longer to use Siri than to do a few taps and swipes and tap out the message yourself, then obviously it'll flop.

Am I the only one who wants to see Siri come to the Mac? It would be nice to, say, have my Mac sitting in the corner while I shout commands across the room to read new email, dictate new email, add events to iCal, Skype/FaceTime people, etc.

++I've been waiting for Siri since I spent $30 for a PlainTalk mic for my 6100 running OS 7.5.

I named him Nixon, and he was just useful enough for me to try using him many times, but not useful enough for me to keep using him on a regular basis. Speech rec seems like it's been on that cusp for 20 years now. From the videos and reviews I've seen, maybe Skynet Siri is the tipping point.

I'm kind of curious about what kind of address book metadata Siri will require to work effectively. The demo examples like "call my wife when I leave work" require your iPhone to know who your wife is, where you work, etc.

Will there be some kind of training required to configure it? Does it require some extra tags in your address book, like "I work at 1313 Mockingbird Lane" and "my spouse is so-and-so"?

The current Reminders app in iOS 5 that works off of GPS location requires each "location" to be pre-entered under a contact; it won't let you input a location while creating the reminder. I assume Siri will have the same requirement. It does make it easy when you ask for directions to so-and-so's house.

Am I the only one who wants to see Siri come to the Mac? It would be nice to, say, have my Mac sitting in the corner while I shout commands across the room to read new email, dictate new email, add events to iCal, Skype/FaceTime people, etc.

++I've been waiting for Siri since I spent $30 for a PlainTalk mic for my 6100 running OS 7.5.

I named him Nixon, and he was just useful enough for me to try using him many times, but not useful enough for me to keep using him on a regular basis. Speech rec seems like it's been on that cusp for 20 years now. From the videos and reviews I've seen, maybe Skynet Siri is the tipping point.

The actual *speech* recognition is easy, and that part is old. Recognizing context is what's hard, and that's what will hopefully make Siri good and speech recognition in general useful. It's also what sets it apart from all of the other phones people keep saying "already do this". No, they don't.