‘Nude’ Uses Machine Learning on iOS Devices to Secure Sensitive Photos

While some manufacturers have included ways to lock down certain folders on devices, Apple hasn’t jumped on that bandwagon, leaving it open for third-party developers to come up with solutions.

A pair of 21-year-old developers from UC Berkeley have come up with what they think is the best possible option for securing sensitive photos, documents, and videos. It’s a new app called “Nude,” and it uses machine learning libraries on an iOS device to scan the owner’s camera roll, find sensitive photos, and automatically move them into a private camera “vault.”

Y.C. Chen and Jessica Chiu, who put together the app with a small team, saw an opportunity in locking down sensitive photos. Chiu says the idea grew out of conversations with Hollywood actresses during a film-related project, and out of her own experience of accidentally running into nude photos on friends’ phones:

“Chiu says she became interested in nudes-related business models after speaking with Hollywood actresses as part of a movie project she’s working on. Each had sensitive images on their phones or laptop, she said, and expressed doubts about how to keep them secure. When Chiu returned to Berkeley, friends would pass her their phones to look at recent photos they had taken, and she would inevitably swipe too far and see nudity.”

Right now, the app can only automatically detect nude photos; sensitive videos have to be moved into the private vault manually. The vault itself is PIN-protected, and the developers say the app monitors the camera roll in the background so that any new nude images are quickly moved over. Once an image lands in the vault, a confirmation dialog asks whether you also want to delete it from your camera roll and from your iCloud photo storage.
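Nude hasn’t published its source, but the background monitoring the developers describe maps neatly onto the Photos framework’s change-observer API. The sketch below is only an assumption of how such monitoring could look; `classifyAndVault(_:)` is a hypothetical stand-in for the app’s own detection and vault logic.

```swift
import Photos

/// Sketch: watch the camera roll for newly added images so they can be
/// classified and, if flagged, moved into the app's private vault.
final class CameraRollMonitor: NSObject, PHPhotoLibraryChangeObserver {
    private var fetchResult: PHFetchResult<PHAsset>

    override init() {
        // Snapshot of all images currently in the library, newest first.
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        fetchResult = PHAsset.fetchAssets(with: .image, options: options)
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    deinit {
        PHPhotoLibrary.shared().unregisterChangeObserver(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        guard let changes = changeInstance.changeDetails(for: fetchResult) else { return }
        fetchResult = changes.fetchResultAfterChanges

        // Only newly inserted assets need to be scanned.
        for asset in changes.insertedObjects {
            classifyAndVault(asset) // hypothetical: run the classifier, vault if flagged
        }
    }

    private func classifyAndVault(_ asset: PHAsset) {
        // Placeholder for on-device classification + vaulting (see the Core ML sketch below).
    }
}
```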

As a bonus security measure, if someone enters the wrong PIN while trying to get into the secure vault, the app uses the front-facing camera to snap a photo of them so you can see who was trying to break in.
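Again, this isn’t Nude’s actual code, but the standard AVFoundation path an iOS app could take to grab a single front-camera frame after a failed PIN attempt looks roughly like this; `saveIntruderPhoto(_:)` is a hypothetical persistence helper.

```swift
import AVFoundation

/// Sketch: capture one photo from the front-facing camera after a failed PIN.
/// Requires NSCameraUsageDescription in Info.plist.
final class IntruderSnapshot: NSObject, AVCapturePhotoCaptureDelegate {
    private let session = AVCaptureSession()
    private let output = AVCapturePhotoOutput()

    func captureAfterFailedPIN() {
        guard
            let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                 for: .video,
                                                 position: .front),
            let input = try? AVCaptureDeviceInput(device: camera),
            session.canAddInput(input), session.canAddOutput(output)
        else { return }

        session.addInput(input)
        session.addOutput(output)
        session.startRunning()
        output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        session.stopRunning()
        guard let data = photo.fileDataRepresentation() else { return }
        // Keep the snapshot so the owner can see who tried to open the vault.
        saveIntruderPhoto(data) // hypothetical storage helper
    }

    private func saveIntruderPhoto(_ data: Data) { /* write to app-private storage */ }
}
```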

With Core ML, Apple’s on-device machine learning framework, all of this is handled on the device itself. None of the nude photos are ever sent to Nude’s developers, which removes the fear of these photos being hacked off a server. That is a big deal, considering the “Celebgate” breach that happened back in 2014.
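To illustrate what “handled on the device” means in practice, here is a minimal Vision + Core ML sketch that classifies an image without making any network call. `NudityClassifier` is a hypothetical compiled model name, and the `"nude"` label and 0.8 threshold are placeholders; Nude’s real model and labels aren’t public.

```swift
import UIKit
import CoreML
import Vision

/// Minimal sketch of on-device classification with Vision + Core ML.
/// No image data leaves the device at any point.
func isSensitive(_ image: UIImage, completion: @escaping (Bool) -> Void) {
    guard
        let cgImage = image.cgImage,
        let model = try? VNCoreMLModel(for: NudityClassifier().model) // hypothetical auto-generated model class
    else { return completion(false) }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Inspect the top classification label returned by the model.
        if let top = (request.results as? [VNClassificationObservation])?.first {
            completion(top.identifier == "nude" && top.confidence > 0.8) // placeholder label and threshold
        } else {
            completion(false)
        }
    }

    // Run the request off the main thread; everything stays local.
    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```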

Nude is free to install, but the service itself costs $1 per month to use. The developers also note that the added protection of on-device machine learning is only available on devices running iOS 11, so upgrading to that version of the mobile OS is recommended for the best possible experience.

An Android version of the app is also in development.

Download

Nude — Free
[via The Verge]
