Siri Shortcuts is the name for a new feature coming in iOS 12 that'll proactively suggest actions to you on the Lock screen and in Search, and allow you to make your own shortcuts and automations, complete with easy-to-use voice triggers.
That's led some people to question just how Siri can predict what you'll want to do before you do it, and how it will protect your privacy when syncing and communicating across devices and the network.
The good news is, while Siri Shortcuts is a new feature, the security and privacy technology it's been built on has been around for a while, and is both hardened and rock solid.
Here's how it works.
Apple has been surfacing data detectors in its products for years now. At first it was for obvious things like making email addresses, telephone numbers, map locations, event dates, and the like tappable and actionable.
Now, iOS can detect a wider, looser range of key data in things like Messages as well. The contents of the messages are never read on Apple's servers. They're analyzed on-device. That also means the contents aren't and can't ever be used for ad targeting or marketing. The on-device analysis means it can only be used for you.
So, if your significant other iMessages you about having brunch tomorrow morning, when tomorrow morning comes, Siri Shortcuts might suggest you turn on Do Not Disturb during brunch so you can, you know, pay attention to each other instead of your phones.
Extensibility was introduced back in iOS 8 as a way to let apps securely surface functionality to the OS and other apps.
Basically, it maintains app sandboxes to protect user data and protect users from things like malicious code, but lets a container app offer up functionality to the OS. At that point, the data can be displayed without allowing it to be read by any other processes or apps.
For Siri Shortcuts, there are two ways apps can surface this data.
- The existing User Activity API, which has also been around since iOS 8 and has previously enabled everything from Continuity Handoff to deep-linking in Search to "Siri, remember this!". It bookmarks where you were in an app so you can get back there quickly and conveniently.
- The new Intents API, which lets apps define what they can do and donate what you do with them to the Siri system for potential suggestion. (Siri may surface them when frequency, time, location, or other signals add up to relevance.)
In this context, NSUserActivity is again only used on-device and for your convenience, and no data is collected or exploited by Apple. (For Handoff, it's limited to direct sync in immediate proximity to same Apple ID only. Deep links were originally announced with an anonymized server-side machine learning component but ultimately shipped without it. Siri functionality was and remains on-device.)
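To make that concrete, here's a minimal sketch of how an app might donate an NSUserActivity so the system can suggest it later. The activity type string and the `viewController` variable are hypothetical placeholders, not Apple's or any real app's identifiers:

```swift
import UIKit

// Hypothetical activity type; real apps declare theirs in Info.plist
// under NSUserActivityTypes.
let activity = NSUserActivity(activityType: "com.example.soup.orderLunch")
activity.title = "Order lunch"
activity.isEligibleForSearch = true
activity.isEligibleForPrediction = true  // New in iOS 12: opt in to Shortcuts suggestions.
activity.persistentIdentifier = NSUserActivityPersistentIdentifier("order-lunch")

// Attaching the activity to the current view controller is the "donation":
// the system can later surface it on the Lock screen or in Search,
// all without the activity's contents leaving the device.
viewController.userActivity = activity
```

The `isEligibleForPrediction` flag and `persistentIdentifier` are the iOS 12 additions; the rest of NSUserActivity has worked this way since iOS 8.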
The Intents API takes things even further. If you remove information or stop using a feature, the app is expected to remove the donation. So, not only does Apple not want to know what you're doing, it doesn't want developers to even bother offering things you're not currently doing.
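A rough sketch of that donate-then-delete lifecycle, where `orderSoupIntent` stands in for a custom intent an app would define in its Intents definition file, and the identifier string is hypothetical bookkeeping:

```swift
import Intents

// Donate: tell the system the user just performed this action.
let interaction = INInteraction(intent: orderSoupIntent, response: nil)
interaction.identifier = "order-soup-tomato"
interaction.donate { error in
    if let error = error {
        print("Donation failed: \(error)")
    }
}

// Later, if the user deletes that item or stops using the feature,
// the app is expected to remove the matching donation so Siri
// stops suggesting it.
INInteraction.delete(with: ["order-soup-tomato"]) { error in
    if let error = error {
        print("Deletion failed: \(error)")
    }
}
```

The deletion call is the part that enforces Apple's "don't offer what the user isn't doing" expectation: donations aren't permanent records, they're revocable hints.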
You can optionally create voice activated triggers for your Siri Shortcuts. Simply tap, say the trigger phrase you want to use, and it's set up and ready to go.
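Under the hood, that tap-and-record flow is the system's add-voice-shortcut sheet. A sketch of how an app presents it from inside a view controller, assuming `activity` is an NSUserActivity the app already donates:

```swift
import IntentsUI

// Wrap an existing donation in an INShortcut, then present the
// system sheet that records the user's custom trigger phrase.
let shortcut = INShortcut(userActivity: activity)
let addVC = INUIAddVoiceShortcutViewController(shortcut: shortcut)
addVC.delegate = self  // Conform to INUIAddVoiceShortcutViewControllerDelegate.
present(addVC, animated: true)
```

The phrase itself is recorded and managed by the system, not the app, which is why the app never sees what you said.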
When you later invoke Siri to activate the shortcut, Siri handles it the same way as any other voice command. When Siri is enabled for the first time on a device, it creates a random identifier that's used for voice recognition on the Siri servers. It's not shared with any other service and is used only to make Siri a better service. Any time Siri is turned off and back on, Siri will generate a new random identifier.
Any and all communications between the device and the server are done through HTTPS.
(Apple keeps anonymous utterances for a short period of time both to better train Siri to understand your voice and, for a slightly longer period of time, to improve overall Siri quality of service in general.)
So, nothing you say to Siri can ever be traced back to you.
It took Apple a while to start syncing Siri data between your devices. Previously, if you picked up a new device, you'd have to train Siri all over again. Now, your Apple ID will let Siri pick up right where it left off. It'll also let additional devices, like iPad and HomePod benefit from what you set up on iPhone, for example.
The way Apple handles this is through your Cloud ID, which is the same infrastructure used for iCloud, only set up for end-to-end encryption.
In other words, Apple securely syncs Siri — and now Siri Shortcuts — the same way it securely syncs iMessages, HomeKit setups, and even Health data.
The encryption is done on-device, signed with keys available only on your device, transmitted over HTTPS, and then decrypted on your other device(s), again using keys only available on those device(s).
During transit, all your data is reduced to pseudorandom gibberish generated by the encryption.
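Conceptually, end-to-end sync looks like the sketch below. This is purely illustrative, not Apple's actual mechanism: it uses CryptoKit (a later Apple framework) and a single shared symmetric key to stand in for the per-device keys that, in Apple's scheme, never leave your hardware:

```swift
import CryptoKit
import Foundation

do {
    // Stand-in for keys provisioned per-device; never transmitted.
    let sharedKey = SymmetricKey(size: .bits256)

    // Device A: encrypt before anything touches the network.
    let plaintext = Data("Turn on Do Not Disturb at brunch".utf8)
    let sealed = try AES.GCM.seal(plaintext, using: sharedKey)

    // `wire` is the pseudorandom gibberish that travels over HTTPS.
    // The server relays it but can't read it.
    let wire = sealed.combined!

    // Device B: decrypt with its own copy of the key material.
    let box = try AES.GCM.SealedBox(combined: wire)
    let decrypted = try AES.GCM.open(box, using: sharedKey)
    print(String(decoding: decrypted, as: UTF8.self))
} catch {
    print("Crypto error: \(error)")
}
```

The point of the design is in the middle line: the server only ever holds `wire`, so a breach of the sync infrastructure exposes ciphertext, not your shortcuts.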
The great data debate
All design is compromise, and every approach has advantages and disadvantages. Some people will be absolutely thrilled with the lengths to which Apple goes to maintain user privacy and security.
Others think that if Apple did take customer data and use it to train its machine learning models, they would be better and faster for everyone. (And make Apple more competitive against the likes of Google and Amazon.)
In some few cases, Apple will let you explicitly opt-in to make your data available so that it can be used for other services — though still strictly for your benefit and still never exploited for advertising or marketing purposes.
But I think those cases will remain few and far between: Apple believes, deeply, that privacy and security aren't just product differentiators but moral imperatives.
Siri Shortcuts, coming soon!
Siri Shortcuts will ship as part of iOS 12 this fall. They're currently available for limited testing in the developer beta and, soon, in the public beta.
I think they're the next leap forward towards push interface but I'd love to hear what you think!
Rene Ritchie is one of the most respected Apple analysts in the business, reaching a combined audience of over 40 million readers a month. His YouTube channel, Vector, has over 90 thousand subscribers and 14 million views and his podcasts, including Debug, have been downloaded over 20 million times. He also regularly co-hosts MacBreak Weekly for the TWiT network and co-hosted CES Live! and Talk Mobile. Based in Montreal, Rene is a former director of product marketing, web developer, and graphic designer. He's authored several books and appeared on numerous television and radio segments to discuss Apple and the technology industry. When not working, he likes to cook, grapple, and spend time with his friends and family.
I’m very impressed with Siri learning even on 11.4. This week we are away from home on holiday and in a rented property. We’ve been out-and-about in the car a few times and last night, unprompted, when we got in the car to go back to the let, Siri prompted ‘22 minutes to get to X’ where X was a pretty accurate description of the holiday home address. I hadn’t provided this information in any way; Siri had just worked out this was a hub location for us at the moment. Typical behaviour would be to say ‘X minutes to get to home’.
Rene, will you please make this privacy point clear on the next MBW? People need to understand that the power of the Apple processors allows Apple to keep information off their servers, and Leo keeps saying that Apple "finally gave in" on privacy to make Siri better and that's a false and bad message to put out. Leo keeps using the same wrong example that Google could tell you when you needed to leave for the airport, etc., and Siri couldn't, when Siri already could do that before short cuts.
While true, the cost of this is Siri is easily the dumbest and most limited of mobile assistants out there.