This app claims to detect intimate photos using artificial intelligence and lock them away safely

The new app, first reported on by Casey Newton over at The Verge, was created by UC Berkeley entrepreneurs Jessica Chiu and Y.C. Chen and grew out of conversations Chiu had with Hollywood actresses and friends alike:

Each had sensitive images on their phones or laptop, she said, and expressed doubts about how to keep them secure. When Chiu returned to Berkeley, friends would pass her their phones to look at recent photos they had taken, and she would inevitably swipe too far and see nudity.

To solve this problem, Chiu, Chen, and their small team created the most sophisticated and intelligent photo vault app on the market right now: Nude. Unlike other vault apps, Nude automatically scans your camera roll for sensitive content. When it flags a photo as NSFW, it moves it into the app and locks it in a PIN-protected vault, then deletes the image from your camera roll and from iCloud so nobody will accidentally see it while looking over your shoulder at the family Christmas party.
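For a rough sense of what that scan-and-lock flow involves on iOS, here's a minimal sketch using Apple's Photos framework. The classifier call (`isNSFW`) and vault function (`storeInVault`) are hypothetical placeholders for illustration; this isn't Nude's actual code.

```swift
import Photos
import UIKit

// Hypothetical sketch: scan the camera roll, copy flagged images into an
// app-private vault, then ask Photos to delete the originals.
// Assumes photo library authorization has already been granted.
func sweepCameraRoll(isNSFW: @escaping (UIImage) -> Bool,
                     storeInVault: @escaping (UIImage) -> Void) {
    let assets = PHAsset.fetchAssets(with: .image, options: nil)
    var flagged: [PHAsset] = []

    let manager = PHImageManager.default()
    let options = PHImageRequestOptions()
    options.isSynchronous = true   // keeps the example simple; real code would be async

    assets.enumerateObjects { asset, _, _ in
        manager.requestImage(for: asset,
                             targetSize: CGSize(width: 224, height: 224),
                             contentMode: .aspectFit,
                             options: options) { image, _ in
            guard let image = image else { return }
            if isNSFW(image) {
                storeInVault(image)      // keep a copy in the PIN-protected vault
                flagged.append(asset)    // queue the original for deletion
            }
        }
    }

    // Deleting assets prompts the user for confirmation; with iCloud Photo
    // Library enabled, the photos are removed from iCloud as well.
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.deleteAssets(flagged as NSArray)
    }, completionHandler: nil)
}
```

Note that Photos always asks the user to confirm deletions, which is consistent with the app removing images from both the camera roll and iCloud once you approve.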

If you're concerned about Nude itself getting ahold of your lewd selfies, worry not: Apparently, as long as you're using iOS 11, your photos are never actually sent to the app's servers. As Newton explains, this is where CoreML comes in:

Crucially, the images on your device are never sent to Nude itself. This is possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11. These libraries allow developers to do machine learning-intensive tasks such as image recognition on the device itself, without transmitting the image to a server. That limits the opportunity for would-be hackers to get access to any sensitive photos and images.
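To make the on-device part concrete, here's a hedged sketch of what image classification with Core ML and Vision looks like on iOS 11. `NudityClassifier` is a hypothetical Xcode-generated model class standing in for whatever model Nude actually ships, and the "nsfw" label and 0.8 threshold are assumptions; the point is simply that the image is analyzed without ever leaving the phone.

```swift
import CoreML
import Vision
import UIKit

// Minimal sketch of on-device classification with Core ML and Vision (iOS 11+).
// `NudityClassifier` is a hypothetical compiled .mlmodel class; Nude's actual
// model and labels are not public. Nothing here is sent to a server.
func classify(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: NudityClassifier().model) else {
        completion(false)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Take the top label; treat a confident "nsfw" result as sensitive.
        let top = (request.results as? [VNClassificationObservation])?.first
        completion(top?.identifier == "nsfw" && (top?.confidence ?? 0) > 0.8)
    }

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```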

However, it's worth noting that devices incapable of running iOS 11 will instead use Caffe2, Facebook's open-source machine learning framework, to detect whether your photos are risqué. Caffe2 can also run its models on the device itself, and The Verge reports that Nude still performs the analysis locally rather than on a server.

Chiu and Chen taught their program to recognize nude photos by building software that scours sites like PornHub for nude images, eventually collecting a library of roughly 30 million reference photos and making sure the app would work properly for everyone, including people of color. The process still isn't perfect and will, according to Chen, sometimes flag and lock away "man boobs," but the developers say they will keep improving the algorithm. Our colleague Phil recently tested the app's analysis, and unfortunately it didn't work too well for him:

https://twitter.com/mdrndad/status/920705220878979072

If you have some sensitive content you'd like to safeguard, you can download Nude for free, but the service itself costs $0.99 per month.

Thoughts? Questions?

Would you consider giving Nude a try? What are your experiences with photo vault services in general? Let us know in the comments!

Updated October 2017: Added information about Nude's AI analysis on devices incapable of running iOS 11.

Tory Foulk

Tory Foulk is a writer at Mobile Nations. She lives at the intersection of technology and sorcery and enjoys radio, bees, and houses in small towns. When she isn't working on articles, you'll likely find her listening to her favorite podcasts in a carefully curated blanket nest. You can follow her on Twitter at @tsfoulk.