iOS 10 will identify faces, objects, landscapes, and other elements in your photos without forcing you to share your data.

As part of the live episode of The Talk Show at WWDC 2016, Apple's senior vice president of software engineering, Craig Federighi, and senior vice president of worldwide marketing, Phil Schiller, provided more details on how deep learning and artificial intelligence are being used in iOS 10 to surface search results without requiring you to share your data with Apple.

How will Apple index my existing photos?

When you first install iOS 10, if you have existing photos in your library, your iPhone or iPad will begin processing them in the background overnight while it's plugged in. That way you won't see any performance degradation or excessive battery drain during the day when you're actually using your iPhone or iPad.
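
As a rough illustration of that kind of gating, here's a minimal Swift sketch that only kicks off an indexing pass when the device is on external power. The PhotoIndexScheduler type and indexUnprocessedPhotos() call are hypothetical stand-ins, not Apple's actual implementation.

```swift
import UIKit

/// Hypothetical background indexer: heavy photo analysis only runs while the
/// device is on external power, so daytime use isn't slowed and the battery
/// isn't drained.
final class PhotoIndexScheduler {
    func runIfCharging() {
        // Battery state reads as .unknown unless monitoring is enabled first.
        UIDevice.current.isBatteryMonitoringEnabled = true

        // .charging and .full both mean the device is plugged in.
        switch UIDevice.current.batteryState {
        case .charging, .full:
            indexUnprocessedPhotos()
        default:
            print("On battery power; deferring photo indexing.")
        }
    }

    private func indexUnprocessedPhotos() {
        // Placeholder for the on-device face/object/scene analysis pass.
        print("Indexing existing photos in the background…")
    }
}
```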

Once that processing is complete, all your old photos will be indexed for the new, smarter search.

What about on macOS?

Same thing. iOS typically ships a few weeks before macOS (formerly OS X), though, and not everyone with an iPhone or iPad has a Mac, so Apple wants you to be able to enjoy the new search benefits on your iOS devices right away.

When the macOS Sierra update arrives later this fall, it'll index your Mac Photos library the same way.

Wait, won't the search index just sync between devices?

Not right now, but maybe one day. Apple would first need to build a system that securely and privately shares metadata and index information between iPhone, iPad, Mac, and other products.

What about new photos? Do I have to wait for them to index?

Nope! Apple enjoys a tremendous lead when it comes to mobile chipset architecture, and they're "spending" some of that lead on deep learning and AI processing the moment photos are captured.

The image signal processor (ISP) inside the Apple A9 already handles an incredible number of calculations for everything from white balance to burst selection. Deep learning and AI take a couple billion more operations, but the A9 GPU can still handle those near-instantly.
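
For a sense of what on-device image analysis looks like from a developer's perspective, here's a small Swift sketch using Core Image's CIDetector for face detection, which has run locally on iPhone hardware for years. It's not Apple's internal Photos pipeline, just an example of the same class of work staying entirely on the device.

```swift
import CoreImage
import UIKit

/// Example of purely on-device analysis: find face bounding boxes in an image
/// with Core Image's CIDetector. No image data ever leaves the device.
func detectFaces(in image: UIImage) -> [CGRect] {
    guard let ciImage = CIImage(image: image) else { return [] }

    // Configure a face detector; higher accuracy costs a bit more compute.
    let detector = CIDetector(ofType: CIDetectorTypeFace,
                              context: nil,
                              options: [CIDetectorAccuracy: CIDetectorAccuracyHigh])

    // Return the bounding rectangle of each detected face.
    let features = detector?.features(in: ciImage) ?? []
    return features.compactMap { ($0 as? CIFaceFeature)?.bounds }
}
```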

Why doesn't Apple need me to send my photo data to its servers to get the indexing done?

According to Federighi, Apple doesn't need our photos to figure out what a mountain looks like in a photo. Their "detectives" managed to look at public domain images and figure that out.

But will it work as well as services that do require photo data sharing?

To be determined. We'll have to wait for iOS 10 to launch this fall and really put it through its paces. Personally, I value privacy as much as money, time, or attention, so having a privacy-preserving option is great for customers.

I don't use Google Photos or Facebook's photo services today, so for me any upgrade will be great. If you aren't sure yet, you'll need to see how much functionality it gives you, weigh the options, and make the best choice for you.