Siri

Siri is Apple's virtual personal assistant, available on iOS devices, the Apple Watch, and the Apple TV. When activated, Siri can respond to natural language requests to help Apple product owners find information, complete simple tasks, and get recommendations.

Siri was introduced in 2011, and since that time, Apple has expanded Siri's capabilities quite a bit. On the iPhone and Apple Watch, Siri can be used for a long list of tasks, such as making phone calls, sending messages, identifying songs, downloading apps, changing device settings, searching the web, finding movie and restaurant reviews, making dinner reservations, creating reminders and calendar events, calculating tips, and more.

On the Apple TV, Siri is used as a main method of input for finding specific TV and movie-related content, and Siri is also an integral aspect of HomeKit, Apple's home automation platform.

Siri is activated on the iPhone by holding down the Home button, on the Apple Watch by holding down the Digital Crown, and on the Apple TV (where available) by holding the Siri button on the included Siri Remote. Hands-free Siri is available by saying "Hey Siri" to an iOS device that is plugged in or to an iOS device with an integrated motion coprocessor.

Siri was added to the Mac in 2016 and the HomePod in 2018.

'Siri' How Tos

iOS 11 brings new functionality to Siri, including a translation feature that allows Siri to translate words and phrases spoken in English into a handful of other languages. Translation is simple to use, and while the translations aren't always perfect, they convey the gist of what you're trying to say to someone who speaks another language.
Using Siri Translate
Activate Siri, either by holding down the Home button or using a "Hey Siri" command.
Tell Siri the phrase you want to translate and the language you want it in. For example: "Siri, how do I say where's the bathroom in Spanish?"
Siri will respond with the appropriate translation, both in text form and vocally. The vocal component can be replayed by pressing the play button located at the bottom of the translation.
There are multiple ways to phrase your translation requests. Siri will respond to "Translate X to X language" or "How do I say X in X language?"
Available Languages
Siri can translate English to Mandarin, French, German, Italian, and Spanish. There's no two-way translation available yet; translation works only from English to the languages listed above. Apple has said it plans to add additional languages to the Siri translation feature following the release of iOS 11.
Apple appears to be using an in-house translation engine for Siri, as the translations do not match up with translations provided by popular services like Google Translate or Bing Translate. Also of note, while Siri can translate from English to several other languages, the translation features do not work with British,

With macOS Sierra, Apple has finally brought its well-known personal assistant, Siri, to the Mac. Siri for Mac differs from iOS' version of Siri in several ways, taking advantage of the larger real estate of a Mac's display and the Finder file system. Users can also easily transfer or pin Siri's search results to the Notification Center or documents they're working on. To help you get started with Siri for Mac, we've put together a guide outlining what it's capable of.
Activating Siri
There are three ways to activate Siri in Sierra. Two of the methods are visually obvious while the third is not.
The Dock icon sitting in between the Finder and Launchpad logos.
The Menu Bar toggle in between the Spotlight search and Notification Center icons.
The keyboard command. Hold the Command and Space buttons for approximately two seconds.
Siri can be enabled two ways. While you're installing macOS Sierra, there'll be a prompt asking you whether you'd like to enable Siri. Additionally, Siri can be enabled and disabled in the Siri section of System Preferences. There are several other options for Siri in System Preferences, including language, voice, voice feedback, mic input and customized keyboard shortcuts.

If you are signed up for the free, three-month trial of Apple Music, you probably know by now many of the cool features the streaming music service has to offer. But, did you know that Siri can make the experience even better? We've got a few tips for getting Siri to act as your digital deejay.
To make full use of Siri with Apple Music, make sure you are subscribed and that iCloud Music Library is turned on.
Play a Radio Station or Beats 1
Not only can Siri play a radio station like Electronic or Oldies, but now the personal assistant can also start playing live Beats 1 programming. Just ask her to "Play Beats 1."
Play an Apple Music Playlist
One of the things I love about Apple Music is the playlist feature in the For You section. If I've recently "liked" a particular song, a new playlist based on it will show up. If you know the name of an Apple Music-created playlist, ask for it specifically. For example, "Play Siouxsie & The Banshees: Deep Cuts."
What Song is Playing
If Apple Music is playing a song you don't recognize, you can ask for more information. Just say, "What song is this?" to discover the artist and song title.
Add an Album to Your Playlist
If you like the song that is playing and want to hear the whole album, ask Siri to add the album to your playlist and it will begin playing after the current track is finished.

Apple Watch is the perfect device for quickly glancing at the things you need to do today or to fill you in on your plans for the weekend. It is also a useful device for quickly setting up a reminder without needing to pull out an iPhone.
While much of the setup for Calendars is done on iPhone, you can use Apple Watch to respond to invites, add a quick event, and get alerts to remind you when to leave for your next appointment.
Using the Calendar App
The Calendar App on Apple Watch is tied to Apple's native Calendar app on iOS, which is also compatible with OS X. I sync my Calendar app with Google Calendar, but it is compatible with a number of services, like Exchange, Facebook, Yahoo, and remote servers via CalDAV. In order to use the Calendar app on Apple Watch, you must be using it in some form on iPhone.

Siri is a workhorse of a virtual assistant for iOS, but I rarely see anyone actually using the feature on the iPhone. Maybe it is because most people don't know all of the amazing things she (or he) can do. Siri's improved a lot over the past several years and there's now a long list of tasks she can accomplish, so if you haven't been using Siri it might be time to give it another look.
Siri can schedule appointments, call your friends, read your text messages to you, play back your music, and much more. Apple recently updated Siri's webpage with more details on the different commands. Today, we've got a quick setup guide for using Siri, plus a list of features that Apple's virtual assistant can perform once you know how to use it.
Set up Siri
Open the Settings app.
Select General from the menu.
Select Siri from the list of available options.
Turn the toggle switch on.
Optionally, turn on the toggle switch for "Hey Siri" to use the feature hands-free while your device is connected to a power source.
Select "My Info" to add your contact details to Siri's database.
Once activated, to use Siri, simply hold the Home button on the iPhone (or iPad) until the microphone icon appears, or simply say, "Hey Siri" when your iPhone is connected to power. On the Apple Watch, you can hold the Digital Crown to bring up Siri, or just say "Hey Siri" immediately after raising your wrist or tapping the screen to wake up the watch.
Siri can perform a variety of tasks to make your life easier. Below is a list of phrases that you can use to make the most of your virtual

Anyone with an iPhone 4S or newer knows how to use Siri. Even if you've never used the "Hey Siri" feature on iOS 8, you can probably figure it out fairly easily. However, on a completely different device, like Apple Watch, accessing Siri may need a little bit of extra training.
If you are having trouble getting Siri to activate, we've got a tutorial that may help shed some light on how to get her attention. Plus, with one simple question, you can find out everything your personal assistant can help you with on Apple Watch.
Using "Hey Siri"
You can get Siri's attention by raising your wrist and speaking the words "Hey Siri" within range of Apple Watch. You can also ask follow up questions the same way.
If you have experienced problems getting the Hey Siri feature to work, there are a couple of factors that may be keeping her from responding.
First of all, if you navigated to the watch face by pressing the Digital Crown from another view, like the Home screen, Hey Siri won't work. It also doesn't work if you are in the Glances screen.
Instead, you will have to lower your wrist until the Apple Watch screen goes to sleep. Then, wake Apple Watch by lifting your wrist again and say "Hey Siri" to activate your personal assistant.
You can use Hey Siri while you are viewing an app, in the Notifications screen, or on the Home screen. But if you are having trouble getting her attention, try the steps above.
One other reason that you may be having trouble using Hey Siri is if the microphone is blocked. Apple Watch's microphone is on the side of the

'Siri' Guides

HomeKit is Apple's home automation platform for controlling smart home products with iOS apps and Siri voice commands. The platform was announced at WWDC 2014, and the first HomeKit-enabled products were released one year later.
The software framework communicates directly with connected accessories within the home, securely encrypts all data, and even works remotely via iCloud with a third-generation Apple TV or later when you are away from

'Siri' Articles

Apple's new HomePod is late to the smart speaker market, which is already crowded with speakers from companies like Amazon, Google, and Sonos. The latter two companies, Google and Sonos, have released speakers with high-quality sound and robust voice assistants, giving the HomePod some serious competition.
We decided to pit Apple's $349 HomePod against both the $399 Google Home Max, which comes with Google Assistant, and the $199 Alexa-powered Sonos One to see how the HomePod measures up.
To compare the three speakers, we focused on design, sound quality, and the overall performance of Siri, Alexa, and Google Assistant.
When it comes to design -- and this is certainly subjective -- we preferred the look of the HomePod with its fabric-wrapped body and small but solid form factor. The Sonos One looks a little more dated with its squarer body and standard speaker mesh, while the Google Home Max has a much larger footprint that's going to take up more space.
Apple's HomePod
All three offer touch-based controls at the top of the device, but the Google Home Max has one design edge - a USB-C port and a 3.5mm audio jack for connecting external music sources. The Sonos One has a single Ethernet port, while the HomePod has no ports.
Though we liked the HomePod's design, Siri, as you might expect, did not perform as well as Alexa on Sonos One or Google Assistant on Google Home Max.
Google Home Max
On questions like "Is Pluto a planet?" or "What's the fastest car?" both Alexa and Google Assistant were

Apple's alleged plans to double down on the quality of its iPhone, iPad, and Mac software platforms, rather than rush to introduce new features, have been revealed in more detail by Mark Gurman at Bloomberg News.
The report claims that Apple's software engineers will have more discretion to delay features that aren't as polished, with the company essentially shifting to more of a two-year roadmap for iOS and macOS, rather than trying to release major feature-packed upgrades every single year without question.

Instead of keeping engineers on a relentless annual schedule and cramming features into a single update, Apple will start focusing on the next two years of updates for its iPhone and iPad operating system, according to people familiar with the change. The company will continue to update its software annually, but internally engineers will have more discretion to push back features that aren't as polished to the following year.

The report describes Apple's new strategy as a "major cultural shift," and an admission that its recent software updates have suffered from an uncharacteristic number of bugs, ranging from a critical root superuser vulnerability on Mac to iMessages appearing in the wrong order across Apple devices.
Apple's commitment to a fast-paced iOS release schedule already led some features to be delayed regardless, including Apple Pay Cash and Messages on iCloud, so the new strategy would likely involve not announcing or testing those features in beta until they are much closer to being ready for public release.
Despite the increased focus on

Uber yesterday updated its iPhone app, and while the release notes do not mention any specific changes, the latest version appears to re-enable the ability to request a vehicle for pickup using Siri or Apple Maps.
After updating the Uber app, we were successfully able to ask Siri to hail us a ride, while tapping on the Ride tab in Apple Maps once again listed Uber as one of the ride-hailing services available alongside Lyft.
While the Siri and Apple Maps integrations are working again in the United States, we encountered errors when trying to hail an Uber with Siri and Apple Maps in Toronto, Canada, where the features were previously supported.
As noted by Christian Zibreg at iDB, some users may need to manually re-enable the Siri and Apple Maps integrations in Settings → Uber → Siri & Search and Settings → Maps under "Ride Booking Extensions."
The ability to hail an Uber ride with Siri or Apple Maps had disappeared in late January following an earlier update to the Uber app. Both features were originally added in iOS 10, and it's unclear what prompted their temporary removal.
Uber's app is available for free on the App Store

Uber's latest app update appears to have removed several important iOS integrations, with the service now unavailable to both Siri and Apple Maps.
If you ask Siri to get you an Uber, a feature that has been available since the launch of iOS 10, Siri will say that Uber hasn't activated that feature. In the "Siri & Search" section of the Uber options in the Settings app, there's also no longer a "Use with Siri" toggle.
Similarly, in Apple Maps, you can no longer select Uber as an option when choosing "Ride" when getting directions. This is also a feature that debuted in iOS 10.
Both Siri and Apple Maps integrations are still available for other ride sharing apps like Lyft, so the problem seems to be with the Uber app rather than with Apple's services.
The removal of both features was noticed by MacRumors readers and reddit users starting last week. It is not clear if Uber has deliberately removed these features or if it's a bug, and the company did not respond to a request for comment when contacted by MacRumors earlier this afternoon. We have also contacted Apple and will update this post when we hear back.
Uber integration with Siri, enabled through the SiriKit API, was a much touted feature when iOS 10 first launched, as was Apple Maps integration. Both Apple and Uber heavily promoted the two options when iOS 10 rolled

Siri has been updated with additional sports information, allowing the personal assistant to provide details about a range of tennis and golf events. Siri's new knowledge has been introduced just ahead of the Australian Open, which is set to kick off this weekend, and it joins other sports data Siri offers for baseball, basketball, hockey, and football.
As noted by 9to5Mac, Siri can provide information on both upcoming tournaments and past events from recent years, along with details on player backgrounds and statistics.
For tennis, the personal assistant can answer queries about the ATP World Tour and the Women's Tennis Association, offering up data from 2016-2018. For golf, Siri can provide details about the men's PGA Tour and the women's LPGA Tour.
The new golf and tennis data available from Siri is accessible on iOS devices running the latest version of iOS, and it is also available on Macs, the Apple TV, and the Apple

iOS 11.2 beta, released this morning, introduces SiriKit support for the HomePod, according to Apple. With SiriKit for HomePod now available, Apple is asking developers to make sure SiriKit-compatible apps are optimized for HomePod ahead of the device's release.
SiriKit is designed to allow iOS and watchOS apps to work with Siri, so users can complete tasks with Siri voice commands. SiriKit is available for a wide range of apps on those two platforms, but its availability is slightly more limited when it comes to HomePod.
Third-party apps that use SiriKit Messaging, Lists, and Notes are compatible with the HomePod. Siri will recognize voice requests given to the HomePod, with those requests carried out on a linked iOS device. So, for example, users can ask HomePod to send a message to a friend, add an item to a list, or create a new note. Sample HomePod requests:
- Send a text to Eric using WhatsApp
- In WeChat, tell Eric I'll be late
- Add chocolate and bananas to my list in Things
- Create a note that says "hello" in Evernote
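The relay model described above, where HomePod recognizes the request and the linked iOS device carries it out, can be sketched as a toy dispatcher. This is purely illustrative Python, not Apple's API: the domain names mirror the three SiriKit domains HomePod supports, and every function name and payload field here is hypothetical.

```python
# Illustrative model of how HomePod hands SiriKit requests to a paired
# iOS device. All names here are hypothetical; the real flow uses
# Apple's private protocols and the SiriKit intent classes on iOS.

SUPPORTED_DOMAINS = {"messaging", "lists", "notes"}  # HomePod-compatible SiriKit domains

def route_request(domain: str, payload: dict) -> str:
    """Relay a recognized voice request to the linked iOS device."""
    if domain not in SUPPORTED_DOMAINS:
        return "Sorry, I can't help with that on HomePod."
    # On a real HomePod, the request is forwarded over the local link and
    # the iOS app's Intents extension performs the task.
    if domain == "messaging":
        return f"Sending '{payload['text']}' to {payload['recipient']} via {payload['app']}"
    if domain == "lists":
        return f"Adding {', '.join(payload['items'])} to your list in {payload['app']}"
    return f"Creating a note in {payload['app']}"

print(route_request("messaging",
                    {"text": "I'll be late", "recipient": "Eric", "app": "WeChat"}))
```

The key design point is that HomePod only performs speech recognition and domain routing; anything outside the three supported domains is refused rather than forwarded.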
Developers can test the voice-only experience of their apps using Siri through headphones connected to an iOS device with the iOS 11.2 beta.
Apple plans to release the HomePod this December, but a specific launch date for the speaker has not yet been provided. When it becomes available, the HomePod will cost

A new entry in Apple's Machine Learning Journal provides a closer look at how hardware, software, and internet services work together to power the hands-free "Hey Siri" feature on the latest iPhone and iPad Pro models.
Specifically, a very small speech recognizer built into the embedded motion coprocessor runs all the time and listens for "Hey Siri." When just those two words are detected, Siri parses any subsequent speech as a command or query.
The detector uses a Deep Neural Network to convert the acoustic pattern of a user's voice into a probability distribution. It then uses a temporal integration process to compute a confidence score that the phrase uttered was "Hey Siri."
If the score is high enough, Siri wakes up and proceeds to complete the command or answer the query automatically.
If the score exceeds Apple's lower threshold but not the upper threshold, however, the device enters a more sensitive state for a few seconds, so that Siri is much more likely to be invoked if the user repeats the phrase—even without more effort.
"This second-chance mechanism improves the usability of the system significantly, without increasing the false alarm rate too much because it is only in this extra-sensitive state for a short time," said Apple.
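The two-threshold logic Apple describes can be sketched in a few lines of Python. This is a simplified illustration rather than Apple's implementation: the threshold values and the length of the extra-sensitive window are invented for the example.

```python
# Simplified sketch of the "Hey Siri" two-threshold detector. The DNN's
# confidence score is compared against an upper and a lower threshold;
# a near miss temporarily lowers the bar so that a repeated "Hey Siri"
# is more likely to trigger. All numeric values are made up.

UPPER = 0.9             # score above this: wake Siri immediately
LOWER = 0.6             # between LOWER and UPPER: enter sensitive state
SENSITIVE_WINDOW = 3.0  # seconds the relaxed threshold stays in effect

class HeySiriDetector:
    def __init__(self):
        self.sensitive_until = 0.0  # timestamp when sensitive state expires

    def process(self, score: float, now: float) -> bool:
        """Return True if Siri should wake for this confidence score."""
        threshold = LOWER if now < self.sensitive_until else UPPER
        if score >= threshold:
            self.sensitive_until = 0.0
            return True
        if score >= LOWER:
            # Near miss: relax the threshold briefly (the "second chance").
            self.sensitive_until = now + SENSITIVE_WINDOW
        return False

detector = HeySiriDetector()
print(detector.process(0.7, now=0.0))  # near miss: no wake, but now sensitive
print(detector.process(0.7, now=1.0))  # repeat within the window: wakes Siri
```

Because the relaxed threshold expires after a few seconds, the false-alarm rate rises only briefly, which is the trade-off Apple's quote highlights.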
To reduce false triggers from strangers, Apple invites users to complete a short enrollment session in which they say five phrases that each begin with "Hey Siri." The examples are saved on the device.

We compare the distances to the reference patterns created during enrollment with another threshold to decide whether

Apple this week "acqui-hired" the team from Init.ai, a startup that designed a smart assistant to allow customer service representatives to easily parse through and automate some interactions with users, reports TechCrunch.
The startup focused on creating an AI with natural language processing and machine learning to analyze chat-based conversations between humans.
Init.ai announced that it was shutting down its service earlier this week to join a new project.

Today is an exciting day for our team. Init.ai is joining a project that touches the lives of countless people across the world. We are thrilled and excited at the new opportunities this brings us. However, this means Init.ai will discontinue its service effective December 16th 2017. While we wish to make this transition as smooth as possible, we cannot continue to operate Init.ai going forward.

Apple did not purchase Init.ai and will not obtain any intellectual property, nor is there any indication that Apple plans to use existing Init.ai services. Instead, Apple has taken on the Init.ai team, who will now work on Apple's Siri personal assistant.
The addition of the Init.ai team may hint at Apple's future Siri plans, with the company perhaps planning to build out more business integrations to supplement Business Chat, the iOS 11 iMessage feature that allows businesses to communicate with customers.
TechCrunch says it's not entirely clear how many people from Init.ai will be transitioning to Apple, but the startup only employed six

On October 4, 2011, Apple held a media event in which it introduced Find My Friends, refreshed the iPod Nano and iPod touch, and revealed the iPhone 4S with its all-new Siri voice assistant. This means that today marks the sixth anniversary of Apple's Siri being first introduced to the world, although the AI helper wouldn't be available to the public until the iPhone 4S launch on October 14, 2011.
In the original press releases for Siri, Apple touted using your voice to send text messages, schedule meetings, set timers, ask about the weather, and more. Apple explained Siri's understanding of context and non-direct questions, like presenting you with a weather forecast if you ask "Will I need an umbrella this weekend?"
The original Siri interface on iOS 5

Siri on iPhone 4S lets you use your voice to send messages, schedule meetings, place phone calls, and more. Ask Siri to do things just by talking the way you talk. Siri understands what you say, knows what you mean, and even talks back. Siri is so easy to use and does so much, you'll keep finding more and more ways to use it.
Siri understands context allowing you to speak naturally when you ask it questions, for example, if you ask “Will I need an umbrella this weekend?” it understands you are looking for a weather forecast. Siri is also smart about using the personal information you allow it to access, for example, if you tell Siri “Remind me to call Mom when I get home” it can find “Mom” in your address book, or ask Siri “What’s the traffic like around here?” and it can figure out

Starting today, Apple search results from Siri and Spotlight on Mac and iOS will be provided by Google rather than Microsoft's Bing. Apple announced the news in a statement that was given to TechCrunch this morning, claiming consistency across iOS and Mac devices is the reason behind the switch.
"Switching to Google as the web search provider for Siri, Search within iOS and Spotlight on Mac will allow these services to have a consistent web search experience with the default in Safari," reads an Apple statement sent this morning. "We have strong relationships with Google and Microsoft and remain committed to delivering the best user experience possible."

Prior to this morning, all results from a search conducted on Spotlight using Finder on Mac or the swipe-down search bar on iOS were Bing search results, as was all search information provided by Siri. Now, when you search using Spotlight or when you ask Siri a question that ends up involving a web search, the info will come from Google.
According to TechCrunch, the swap will include both web links and video results from YouTube, but web image results in Siri and Spotlight searches will continue to be provided by Bing for the time being. Google searches will use the standard search API and will provide the same search results you'd get from a Google.com search.
While Apple has used Bing for search results for things like Siri and Spotlight, Google has remained the default search engine on iOS and Mac devices. Earlier this year, reports suggested Google paid Apple nearly $3 billion to maintain its position as the

Ahead of the launch of iOS 11, Apple VP of marketing Greg Joswiak sat down with several publications to talk about Siri, the personal assistant built into all major Apple devices. His interview with Wired was published last week, and today, Fast Company published its interview, in which Joswiak talks Siri and privacy, among other topics.
It has long been believed that Apple's Siri development has been hindered by the company's deep commitment to privacy, but according to Joswiak, privacy, respect for user data, and an intelligent AI can co-exist.
"I think it's a false narrative," he told Fast Company. "We're able to deliver a very personalized experience... without treating you as a product that keeps your information and sells it to the highest bidder. That's just not the way we operate."
Much of Apple's Siri functionality is handled on-device, rather than in the cloud like other services. In Apple's 2017 software updates, that's shifting slightly, with the company planning to allow Siri to communicate across devices to learn more about users. Still, many things, like Siri's ability to find photos with a specific person or date, are powered on-device.

"Your device is incredibly powerful, and it's even more powerful with each generation," Joswiak said. "And with our focus on privacy, we're able to really take advantage of exploiting that power with things like machine learning on your device to create an incredible experience without having to compromise your data."

Apple does use the cloud to answer requests and to train Siri, but it strips all user identifiable data.

In iOS 11, Apple's AI-based personal assistant Siri has a much more natural voice that goes a long way towards making Siri sound human. Siri speaks with a faster, smoother cadence, with elongated syllables and pitch variation, a noticeable departure from the more machine-like sound in iOS 10.
The team behind Siri, including Siri senior director Alex Acero, has worked for years to improve the way Siri speaks, according to a new interview Acero did alongside Apple VP of marketing Greg Joswiak with Wired. While Siri's voice recognition capabilities were powered by a third-party company early on in Siri's life, Acero's team took over Siri development a few years back, leading to several improvements to the personal assistant since then.
Siri is powered by deep learning and AI, technology that has greatly improved her speech recognition capabilities. According to Wired, Siri's raw voice recognition is now able to correctly identify 95 percent of users' speech, on par with rivals like Alexa and Cortana.
Apple is still working to overcome negative perceptions about Siri, and blames many of the early issues on the aforementioned third-party partnership.

"It was like running a race and, you know, somebody else was holding us back," says Greg Joswiak, Apple's VP of product marketing. Joswiak says Apple always had big plans for Siri, "this idea of an assistant you could talk to on your phone, and have it do these things for you in a more easy way," but the tech just wasn't good enough. "You know, garbage in, garbage out," he says.

Joswiak says Apple's aim from

Apple has updated its executive leadership page to acknowledge that software engineering chief Craig Federighi now officially oversees development of Siri. The responsibility previously belonged to Apple's services chief Eddy Cue.
Craig Federighi is Apple's senior vice president of Software Engineering, reporting to CEO Tim Cook. Craig oversees the development of iOS, macOS, and Siri. His teams are responsible for delivering the software at the heart of Apple's innovative products, including the user interface, applications and frameworks.

Apple's leadership page is only now reflecting Federighi's role as head of Siri, but the transition has been apparent for several months, based on recent interviews and stage appearances at Apple's keynotes.
At WWDC 2016, for example, Federighi and Apple marketing chief Phil Schiller joined Daring Fireball's John Gruber to discuss how Apple was opening Siri up to third-party developers with SiriKit later that year.
At WWDC 2017, Federighi was on stage to discuss improvements to Siri in iOS 11, including more natural voice, built-in translation capabilities, and advances in machine learning and artificial intelligence.
Cue continues to oversee the iTunes Store, Apple Music, Apple Pay, Apple Maps, iCloud, and the iWork and iLife suites of apps, and handing off Siri should allow him to focus more on Apple's push into original content.
Apple's updated leadership page also now lists profiles for recently promoted employees Deirdre O'Brien, Vice President of People, and Isabel Ge Mahe, Vice President and Managing Director of

Back in July, Apple introduced the "Apple Machine Learning Journal," a blog detailing Apple's work on machine learning, AI, and other related topics. The blog is written entirely by Apple's engineers, and gives them a way to share their progress and interact with other researchers and engineers.
Apple today published three new articles to the Machine Learning Journal, covering topics that are based on papers Apple will share this week at Interspeech 2017 in Stockholm, Sweden.
The first article may be the most interesting to casual readers, as it explores the deep learning technology behind the Siri voice improvements introduced in iOS 11. The other two articles cover the technology behind the way dates, times, and other numbers are displayed, and the work that goes into introducing Siri in additional languages.
Links to all three articles are below:
Deep Learning for Siri's Voice: On-device Deep Mixture Density Networks for Hybrid Unit Selection Synthesis
Inverse Text Normalization as a Labeling Problem
Improving Neural Network Acoustic Models by Cross-bandwidth and Cross-lingual Initialization
Apple is notoriously secretive and has kept its work under wraps for many years, but over the course of the last few months, the company has been more open to sharing some of its machine learning advancements. The blog, along with research papers, allows Apple engineers to participate in the wider AI community and may help the company retain employees who do not want to keep their progress a

Back in April, Apple product designer and Siri co-founder Tom Gruber gave a TED Talk, where he spoke about his vision of the future of computers and artificial intelligence.
The full 10-minute TED Talk was today published on YouTube, giving us a chance to hear his complete thoughts on the future of AI and Siri.
In his talk, Gruber says computers should be used to compensate for human shortcomings, like fallible memory, and to augment human capabilities. He believes computers should log all aspects of our lives, allowing us to remember the people we've met and details about them, like favorite sports, family members, and name pronunciation.
Gruber's talk also covers the importance of privacy and a range of useful applications for AI, like cancer detection and advanced personal assistants like

Apple today uploaded two additional videos in its series that stars Dwayne "The Rock" Johnson and Apple's personal assistant Siri, both of which are meant to show off Siri's range of functionality.
In the first video, The Rock uses Siri to activate the HomeKit-connected lights in his gym, and in the second, he speaks in Mandarin, demonstrating Siri's ability to work with multiple languages.
Support for multiple languages is one of the main differentiating factors between Siri and other voice-based AI assistants like Alexa and Cortana.
Both of today's videos are new, but look similar to scenes that were in the original "The Rock x Siri Dominate the Day" spot, which is three and a half minutes in length. In the first ad, The Rock is seen using Siri throughout an entire day as he commandeers a plane, cooks a meal in a high-end restaurant, and ends up in space.
Apple yesterday shared three other short ads depicting The Rock asking the personal assistant to set a reminder, set a timer, and take a selfie.
Apple's partnership with The Rock to highlight Siri features comes as Apple gears up to release its first Siri-based speaker at the end of this year, the HomePod. Siri is also an important part of the iPhone, iPad, and Mac, with Apple undoubtedly hoping to increase awareness about all of the things Siri can do through the ad campaign.

Apple today uploaded three short 15-second ads in its "The Rock x Siri" series, with content that's primarily been pulled from the main three-and-a-half-minute "The Rock x Siri Dominate the Day" video, which was originally released on July 23.
Each video features The Rock interacting with Siri to set a reminder, take a selfie, and set a timer. The reminder video is new, while the selfie and timer videos feature scenes from the original spot.
Dwayne "The Rock" Johnson originally teased his partnership with Apple and Siri as a movie, complete with a movie poster, but it turned out that instead of a short film, his feature with Apple was simply an ad spot.
"I partnered with #Apple to make the BIGGEST, COOLEST, CRAZIEST, DOPEST, MOST OVER THE TOP, FUNNEST (is that even a word?) movie ever," The Rock wrote on Facebook ahead of the ad's launch.
Apple is using The Rock to show off the range of tasks that can be completed using the Siri personal assistant built into the iPhone and the iPad. "You should never, ever, under any circumstances underestimate how much Dwayne Johnson can get done in a day with Siri," reads the description for the first video.

Last week, a source in the Taiwanese supply chain reported that Facebook has begun small-scale production of a smart home speaker with a touchscreen, preparing to compete with companies like Apple and Amazon in the smart speaker market. A report by Bloomberg this week has continued that rumor, adding that the company is in fact working on two separate speaker devices and that it's hiring from Apple to get a "Siri-style" AI voice assistant up and running for the devices' launch.
Coming out of Facebook's Building 8 lab, today's report confirmed many of the features already discussed regarding the touchscreen-enabled speaker. Facebook plans to launch it with a screen size between 13 and 15 inches, a wide-angle lens, and microphones and speakers all powered by artificial intelligence. The screen rests on a thin, vertical stand and Facebook is now deciding whether the UI will run on a version of Android or if it will build its own OS, according to people familiar with the plans.
Although the touchscreen speaker is only in the prototype stage, Facebook has begun testing it in employee homes.
Featuring a laptop-sized touchscreen, the device represents a new product category and could be announced as soon as next spring's F8 developer conference, according to people familiar with the matter. They say the large screen and smart camera technology could help far-flung people feel like they're in the same room, which aligns with Chief Executive Officer Mark Zuckerberg's mission of connecting people.

Dwayne "The Rock" Johnson today announced that he's teamed up with Apple for a movie that co-stars Apple's AI-based personal assistant, Siri.
According to the poster tweeted by Johnson, the movie is called "The ROCKxSIRI Dominate the Day." There are no details on how long the film is or what it's about, but it apparently debuts tomorrow on Apple's YouTube channel.
Based on the image, it appears to feature car chases, space travel, an alien fight, and a concert performance. In a Facebook post, Johnson calls it the "biggest, coolest, craziest, dopest, most over the top, funnest" movie ever.

I partnered with #Apple to make the BIGGEST, COOLEST, CRAZIEST, DOPEST, MOST OVER THE TOP, FUNNEST (is that even a word?) movie ever.
And I have the greatest co-star of all time - #SIRI.
I make movies for the world to enjoy and we also made this one to motivate you to get out there and get the job done.
I want you to watch it, have fun with it and then go LIVE IT.

The film will premiere on Apple's YouTube channel on Monday, July 24, but it's not yet clear what time it will debut.
This is the second time Apple has teamed up with a partner to release a short film. Last month, Apple highlighted "Détour," a film French director Michel Gondry shot on the iPhone in partnership with Apple and in a decidedly more Apple style.
The project with Dwayne Johnson is unusual, but it comes at a time when Apple is preparing to release the Siri-based HomePod, so that may be why the company has decided to promote its personal assistant in a fun and unique way.
Update: The short film is now available on Apple's YouTube channel.

In iOS 11, Siri has a new feature that lets you ask the personal assistant to translate English into one of several different languages. We already did a Siri overview video showing off Siri's new capabilities, but we thought we'd take a closer look at translate, which can come in handy when you're traveling.
Subscribe to the MacRumors YouTube channel for more videos.
Translate works as expected -- ask Siri how to say something in another language, and Siri does it. For example, you might say "Translate 'Where's the bathroom' to Spanish," or "How do I say 'I am a vegetarian' in Chinese?"
When translating, Siri speaks the translation aloud, so whomever you're speaking to can hear what you have to say directly in their language. There's also a button for quickly repeating the spoken translation.
Siri can translate English to Mandarin, French, German, Italian, and Spanish, but not the other way around. Apple says there are plans to expand the feature to include additional languages in the coming months.
As you might have noticed, Siri's new translating abilities are accompanied by a more natural, human sounding voice with better pronunciation and inflection. Siri's also much smarter in iOS 11, thanks to new on-device learning functionality and cross-device syncing. Siri can also make suggestions based on your browsing habits, and it knows more about music.
For details on Siri's new capabilities and all of the other features in iOS 11, make sure to check out our iOS 11 roundup.
