Look, it’s no secret that the cameras on our computers, smartphones and tablets can spy on us. The thing is, 99.9% of the time, we grant that access.
Take your iPhone for example. How many apps do you use that require access to your photos, camera and mic?
And we grant that access, because otherwise those apps are pretty much useless.
Google engineer Felix Krause has detailed an alarming (but expected) privacy setting in iOS that enables iPhone apps with camera permission to take photos and videos of us. Without our knowledge.
Every time you install a new app on your iOS device and grant it access to your camera, it can:
- access both the front and the back camera
- record you at any time the app is in the foreground
- take pictures and videos without telling you
- immediately upload the pictures and videos it takes
- run real-time face recognition to detect facial features or expressions
All without indicating that your phone is recording you and your surroundings: no LED, no light, no other signal of any kind.
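To see how little is involved, here is a minimal Swift sketch (class and queue names are hypothetical; it assumes camera permission has already been granted and the app is in the foreground) of an app pulling frames from the camera with no preview layer and no visible indication:

```swift
import AVFoundation

// Hypothetical sketch: capture camera frames silently.
// Nothing here shows any UI — no preview layer, no indicator.
final class SilentCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let output = AVCaptureVideoDataOutput()

    func start() {
        // Pick the front camera; .back works the same way.
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input), session.canAddOutput(output)
        else { return }
        session.addInput(input)
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning() // no LED, no on-screen indication
    }

    // Every camera frame lands here; an app could save or upload it.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // e.g. convert sampleBuffer to an image and POST it to a server
    }
}
```

The only user-visible step is the one-time camera permission prompt; once granted, everything above runs without further notice.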
Freaked out yet? What’s even more alarming is that this is not a bug, it’s a feature. This is exactly how this privacy setting is expected to work by design.
These apps can easily track the user's face, take pictures, or live-stream the front and back cameras, all without the user's consent. An app with camera permission can:
- Get full access to the front and back camera of an iPhone/iPad any time your app is running in the foreground
- Use the front and the back camera to know what your user is doing right now and where the user is located based on image data
- Upload random frames of the video stream to your web service and run proper face-recognition software, which enables you to:
  - Find existing photos of the person on the internet
  - Learn what the user looks like and create a 3D model of the user's face
- Live-stream their camera onto the internet (e.g. while they sit on the toilet); with recent innovations in faster internet connections, faster processors and more efficient video codecs, this is hard for the average user to detect
- Estimate the mood of the user based on what you show in your app (e.g. news feed of your app)
- Detect if the user is on their phone alone, or watching together with a second person
- Record stunning video material from bathrooms around the world, using both the front and the back camera, while the user scrolls through a social feed or plays a game
- Use the built-in iOS 11 Vision framework, with which every developer can easily parse facial features like the eyes, mouth, and face frame in real time
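For a sense of how accessible this is, the following Swift sketch uses the Vision framework's face-landmark request (iOS 11+). The `pixelBuffer` parameter is assumed to be a camera frame, e.g. one delivered by `AVCaptureVideoDataOutput`:

```swift
import Vision

// Hypothetical sketch: detect faces and facial landmarks in one frame.
func detectFace(in pixelBuffer: CVPixelBuffer) {
    let request = VNDetectFaceLandmarksRequest { request, _ in
        guard let faces = request.results as? [VNFaceObservation] else { return }
        for face in faces {
            // The face frame, in normalized image coordinates.
            print("face frame:", face.boundingBox)
            // Individual features: eyes, mouth, etc.
            if let landmarks = face.landmarks {
                print("left eye:", landmarks.leftEye?.normalizedPoints ?? [])
                print("mouth:", landmarks.outerLips?.normalizedPoints ?? [])
            }
        }
    }
    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
    try? handler.perform([request])
}
```

No special entitlement is needed beyond the ordinary camera permission; the landmark data alone is enough to track expressions frame by frame.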
Felix has disclosed the issue to Apple (I'm sure they weren't aware of it; but since nobody complains, it's not a bug, it's a feature) and has also offered some tips on how the issue could be solved.
To read more about this issue, including what you can do to protect yourself, go read Felix's full blog post here.