This app claims to detect intimate photos using artificial intelligence and lock them away safely

The new app, first reported on by Casey Newton over at The Verge, was created by UC Berkeley entrepreneurs Jessica Chiu and Y.C. Chen and grew out of conversations Chiu had with Hollywood actresses and friends alike:

Each had sensitive images on their phones or laptop, she said, and expressed doubts about how to keep them secure. When Chiu returned to Berkeley, friends would pass her their phones to look at recent photos they had taken, and she would inevitably swipe too far and see nudity.

To solve this problem, Chiu, Chen, and their small team built what is currently the most sophisticated photo vault app on the market: Nude. Unlike other vault apps, Nude automatically scans your camera roll for sensitive content. Once it detects a photo as NSFW, it moves the image into a PIN-protected vault inside the app, then deletes it from your camera roll and from iCloud so nobody will accidentally see it while looking over your shoulder at the family Christmas party.

If you're concerned about Nude itself getting ahold of your lewd selfies, worry not: Apparently, as long as you're using iOS 11, your photos are never actually sent to the app's servers. As Newton explains, this is where CoreML comes in:

Crucially, the images on your device are never sent to Nude itself. This is possible thanks to CoreML, the machine learning framework Apple introduced with iOS 11. These libraries allow developers to do machine learning-intensive tasks such as image recognition on the device itself, without transmitting the image to a server. That limits the opportunity for would-be hackers to get access to any sensitive photos and images.
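Nude's own code isn't public, but the on-device flow Newton describes can be sketched with Apple's Vision and Core ML APIs. Everything below is a hypothetical illustration: the model name `NSFWDetector`, its `"nsfw"` label, and the confidence threshold are stand-ins, not details from the app.

```swift
import UIKit
import Vision
import CoreML

// Hypothetical on-device NSFW check, assuming a Core ML classifier
// bundled with the app as "NSFWDetector" (a stand-in name; Nude's
// real model and labels are not public).
func isSensitive(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: NSFWDetector().model) else {
        completion(false)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Classification runs entirely on the device; the photo
        // is never uploaded to a server.
        guard let top = (request.results as? [VNClassificationObservation])?.first else {
            completion(false)
            return
        }
        completion(top.identifier == "nsfw" && top.confidence > 0.8)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```

The key point is architectural: because the `VNImageRequestHandler` runs the model locally, an app like Nude can scan every photo without any image data leaving the phone.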

It's worth noting, however, that devices incapable of running iOS 11 fall back on Facebook's Caffe2 to detect whether your photos are risqué. Caffe2 is an open-source machine learning framework built to run on mobile hardware, so The Verge reports the analysis still happens locally on the device.

Chiu and Chen trained their program to recognize nude photos by building software that scours sites like PornHub for nude images, eventually collecting a library of around 30 million references to make sure the app would work properly for everyone, including people of color. The process still isn't perfect and will, according to Chen, detect and lock away "man boobs," but the developers assure users they will keep improving the algorithm. Our colleague Phil recently tested the app's analysis capabilities, and unfortunately it didn't work too well for him.

If you have some sensitive content you'd like to safeguard, you can download Nude for free; the service itself costs $0.99 per month.

Thoughts? Questions?

Would you consider giving Nude a try? What are your experiences with photo vault services in general? Let us know in the comments!

Updated October 2017: Added information about Nude's AI analysis on devices incapable of running iOS 11.

Tory Foulk is a writer at Mobile Nations. She lives at the intersection of technology and sorcery and enjoys radio, bees, and houses in small towns. When she isn't working on articles, you'll likely find her listening to her favorite podcasts in a carefully curated blanket nest. You can follow her on Twitter at @tsfoulk.

  • Well, I wonder if my recent photos of my recovery from open heart surgery will trip the system. I don't want anyone seeing those either. But it is interesting to see the healing process and remember just how traumatized my body was in saving my life.
  • Doesn't BlackBerry have something similar baked into the upcoming BlackBerry Motion?
  • So, you're supposed to trust a service to hide photos on your own device... I don't think so. You're better off just doing it yourself for free.
  • Ok so downloaded to see if it works. So what did it find? A couple of pics of Emilia Clarke (clothed) a pic of a bucket car seat, pics of my wife & family in Cuba, as well as a family pic from NB Canada from 1967. No nudes (which should have been the case). Not sure if because I had none it looked for something close (Front Bucket Car Seat?) like bathing suits.
    I guess the idea is a noble one but selecting a bucket seat??? Couldn’t be more far off.
  • Not surprised. Wouldn’t use a phone or anything else connected to the Internet for nude photos, if I took any (I wouldn’t).