How to Get the Most out of Siri in iOS 12

iOS 12 isn’t a radical change from iOS 11, but Apple did make notable progress on several fronts. iOS 12 improves performance: older devices may run faster and with fewer bugs than they did on previous versions of the operating system. iOS now groups notifications by app, a subtle change that, paired with a few extra controls, makes notification management more efficient. iOS 12 also includes some upgrades to Siri, Apple’s digital assistant.

Siri was the first mainstream virtual assistant to land in our electronics, but Google, Amazon, and others have since developed artificial intelligence helpers that surpass Siri’s abilities. With iOS 12, Siri closes the gap: Apple has expanded Siri’s knowledge graph, added A.I.-driven proactive suggestions, and introduced automation tools for customizing commands and behaviors.

It would have been disappointing if Apple hadn’t used A.I. to make Siri more proactive, given the improvements we’ve seen with other operating systems and digital assistants. iOS 12 delivers on this front with Siri Suggestions. In iOS 12, your iPhone begins to learn your typical behaviors: You may play a certain Spotify playlist each morning, search for lunch spots around noon each day, or habitually look up Google Maps directions for calendar meetings an hour before they’re scheduled. Siri Suggestions considers about 100 different factors, including time of day, location, upcoming calendar events, and your Wi-Fi network (which can also add context about your location), to decide which apps and shortcuts to surface as a banner on your iPhone or Apple Watch home screen. The goal is to make accomplishing those routine tasks faster and easier.

When you download iOS 12, Siri Suggestions won’t appear instantly; over time, as Siri learns your habits, you’ll begin to see them. This sort of predictive behavior may raise privacy concerns for some users because of the personal data that’s actively monitored. Apple mitigates this risk by keeping its machine learning on the device rather than sending potentially sensitive information to cloud servers. You can also prevent Siri Suggestions from appearing on the lock screen by heading into your phone’s settings, exclude certain apps from Suggestions, or disable the feature more broadly across the OS.

If you prefer a more hands-on approach to improving Siri, there’s also Siri Shortcuts, Apple’s answer to Amazon Alexa features like Routines and Blueprints. Rather than relying purely on A.I., Shortcuts gives iOS users a more granular way to put Siri to work through automation. You can program a specific phrase (“Hey Siri, I want coffee”) to initiate a specific string of actions (such as searching Yelp for coffee shops and then providing turn-by-turn navigation to the nearest option). While the Wall Street Journal called these handmade, step-by-step shortcuts “maddeningly complicated” to set up, the end result is a set of unique Siri commands that do exactly what you want them to do, whether that’s making it easier to share photos or information with loved ones or changing connected-home device settings. To make the process easier, when your phone offers a Suggestion based on your habits, it will also offer to turn that Suggestion into a dedicated Shortcut for you. (You can try this out by heading into Settings, then Siri & Search, and looking at iOS 12’s list of Suggested Shortcuts and All Shortcuts.)
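For the curious, here’s a rough idea of how apps feed this system: in iOS 12, an app “donates” its actions so Siri can surface them as Suggestions or let you bind them to a custom phrase. The sketch below, in Swift, uses Apple’s NSUserActivity donation mechanism; the activity type, title, and invocation phrase are hypothetical examples, not taken from any real app.

```swift
import Foundation

// A minimal sketch of an iOS 12 shortcut "donation" via NSUserActivity.
// The activity type, title, and phrase below are hypothetical.
func makeCoffeeOrderActivity() -> NSUserActivity {
    // The activity type should match an entry in the app's
    // NSUserActivityTypes array in Info.plist.
    let activity = NSUserActivity(activityType: "com.example.coffee.order")
    activity.title = "Order my usual coffee"

    // New in iOS 12: opt the activity into Siri's prediction engine
    // so it can appear as a Siri Suggestion...
    activity.isEligibleForPrediction = true
    // ...and propose a spoken phrase the user can record in
    // Settings > Siri & Search to trigger it by voice.
    activity.suggestedInvocationPhrase = "I want coffee"

    return activity
}

// In a view controller, the app would assign this activity to its
// `userActivity` property (or call becomeCurrent() on it) at the
// moment the user performs the action; that act is the donation.
```

Each donation tells the system “the user did this here, at this time,” which is exactly the signal Siri Suggestions mines when deciding what to surface.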

When it comes to general queries, Siri can now answer questions on more topics. The assistant can tap into a USDA database for information about food and nutrition: You can ask things like “How many calories are in a pint of beer?” or “How much cholesterol is in a burger?” or even whether a food is healthy. Siri will return the nutritional and macronutrient stats for the food in question, including its calories and its total fat, carbohydrate, and protein breakdown. It’s arguably not as extensive as the databases in apps such as MyFitnessPal, but it’s a step toward a more knowledgeable assistant (and it bests Google Assistant at these types of queries). Siri’s knowledge has also expanded in two other, unrelated areas: celebrity trivia and motor sports. You can ask questions such as “How old is Harrison Ford?” or “Who is Beyoncé married to?” You can also ask about NASCAR or Formula One standings.

You could previously use Siri to search your photos, but the assistant was mostly limited to pulling up photos by location or date taken. In iOS 12, improved object and facial recognition makes photo search genuinely useful. You can ask Siri to pull up photos of your partner or a friend; photos taken at the beach; photos that include a specific object or animal, like cats or dogs; or photos taken at a specific establishment. You can also combine these criteria into a single query, like “Show me photos of Anita at the beach.”

Siri also gains an ability that makes HomePod more useful: You can use the assistant to find your devices. If you’ve misplaced your iPhone, say “Hey Siri, find my iPhone,” and the device will play a sound to help you locate it.

For a long time, it felt like Apple was neglecting Siri as competing assistants such as Alexa and Google Assistant gained feature after feature. With iOS 12, Apple addresses some of those shortcomings with a mix of A.I., greater user control, and expanded knowledge. iOS 12 shows promise that Siri may finally evolve from a rarely helpful gimmick into a genuinely useful tool. If you gave up on Siri, it may be time to give the assistant another shot.