Apps are watching you

Dillon
3 min read · Sep 19, 2020
Photo by William Hook on Unsplash

It sounds like a modern take on the classic science fiction novel 1984. People sit on their phones scrolling through their friends' photos on Instagram. As they scroll, the front camera watches their face as each post comes into view, noting which ones elicit a reaction, positive or negative. That feedback improves an algorithm that becomes ever more personalised to them, an algorithm that ensures people see the content that releases the most dopamine and, in turn, keeps them on the app longer… but maybe this isn't just science fiction.

I see so many articles debating whether companies like Facebook are watching you through your phone camera, and as a developer I wouldn’t doubt it for a second. Knowing how simple it is to do and how much some companies rely on data, why wouldn’t they?

Let's look at just how easy it is to build a very basic version of this feature into a simple app.

Setup

For this example I'm using a basic Expo app. Simply install the Expo CLI, run expo init face-reaction-app and pick the blank managed template.

The app

To demo this we’ll create a basic app that lets the user scroll through photos. We’ll use some photos from Unsplash.
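A minimal sketch of that gallery could look something like the following; the Unsplash URLs here are just placeholders, so swap in whichever photos you like.

import React from 'react';
import { FlatList, Image, StyleSheet } from 'react-native';

// Placeholder Unsplash photos; swap in whichever images you like.
const PHOTOS = [
  'https://images.unsplash.com/photo-1501785888041-af3ef285b470?w=800',
  'https://images.unsplash.com/photo-1441974231531-c6227db76b6e?w=800',
  'https://images.unsplash.com/photo-1470071459604-3b5ec3a7fe05?w=800',
];

export default function App() {
  return (
    <FlatList
      data={PHOTOS}
      keyExtractor={(url) => url}
      renderItem={({ item }) => (
        <Image source={{ uri: item }} style={styles.photo} />
      )}
    />
  );
}

const styles = StyleSheet.create({
  photo: { width: '100%', height: 300, marginBottom: 8 },
});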

Just run yarn start to start the app.

The goal

We want to make sure our users are seeing photos that make them happy. The best way to check this is to flag the images that make the user smile.

Detecting a smile

If you asked a developer just 5 years ago to detect whether a person is smiling from a live video feed, they'd probably run a mile. These days it's literally an out-of-the-box feature in Expo.

Just run expo install expo-face-detector expo-camera and you’ll have everything you need.

Implementing

First, we'll write the detector.
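Here's a minimal sketch of what it could look like. The SmileDetector name and the tiny 1×1 camera style are just illustrative choices; the isSmiling prop is the callback described below.

import React, { useEffect, useState } from 'react';
import { Text, View } from 'react-native';
import { Camera } from 'expo-camera';
import * as FaceDetector from 'expo-face-detector';

export default function SmileDetector({ isSmiling }) {
  const [hasPermission, setHasPermission] = useState(null);

  // Ask for camera permission once when the component mounts.
  useEffect(() => {
    (async () => {
      const { status } = await Camera.requestPermissionsAsync();
      setHasPermission(status === 'granted');
    })();
  }, []);

  // Called every time the face detector reports results.
  const handleFacesDetected = ({ faces }) => {
    if (faces.length > 0) {
      // smilingProbability is between 0 and 1; treat anything over 0.5 as a smile.
      isSmiling(faces[0].smilingProbability > 0.5);
    } else {
      isSmiling(false);
    }
  };

  if (hasPermission === null) return <View />;
  if (hasPermission === false) return <Text>No access to camera</Text>;

  return (
    <Camera
      // Tiny view so the preview is effectively invisible to the user.
      style={{ width: 1, height: 1 }}
      type={Camera.Constants.Type.front}
      onFacesDetected={handleFacesDetected}
      faceDetectorSettings={{
        mode: FaceDetector.Constants.Mode.fast,
        detectLandmarks: FaceDetector.Constants.Landmarks.none,
        runClassifications: FaceDetector.Constants.Classifications.all,
        minDetectionInterval: 500,
        tracking: true,
      }}
    />
  );
}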

This code creates a new component that handles the camera permissions and renders the camera with the FaceDetector settings, locked to the front-facing camera.

The Camera component has an event handler that gets called whenever it detects a face, passing a number of properties about that face. We can add simple logic here to call our isSmiling prop with true if the smiling probability is over 50%, and with false otherwise.

Now we can use our new detector component in the app to show the user a message when they smile.
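Wired together, the app could look something like this, reusing the placeholder photos and the SmileDetector sketch from above.

import React, { useState } from 'react';
import { FlatList, Image, StyleSheet, Text, View } from 'react-native';
import SmileDetector from './SmileDetector';

// Same placeholder Unsplash photos as before.
const PHOTOS = [
  'https://images.unsplash.com/photo-1501785888041-af3ef285b470?w=800',
  'https://images.unsplash.com/photo-1441974231531-c6227db76b6e?w=800',
  'https://images.unsplash.com/photo-1470071459604-3b5ec3a7fe05?w=800',
];

export default function App() {
  const [smiling, setSmiling] = useState(false);

  return (
    <View style={{ flex: 1 }}>
      {/* Near-invisible camera that reports whether the user is smiling */}
      <SmileDetector isSmiling={setSmiling} />

      {smiling && <Text style={styles.banner}>That one made you smile!</Text>}

      <FlatList
        data={PHOTOS}
        keyExtractor={(url) => url}
        renderItem={({ item }) => (
          <Image source={{ uri: item }} style={styles.photo} />
        )}
      />
    </View>
  );
}

const styles = StyleSheet.create({
  banner: { textAlign: 'center', padding: 12, backgroundColor: '#ffe082' },
  photo: { width: '100%', height: 300, marginBottom: 8 },
});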

And that’s it! An image gallery app that detects a user smiling at an image, in under 100 lines of code.

Overview

The app we just built only shows a message when the user smiles, but we could easily call an API to flag the images the user smiles at. We could use that data to add a wholesome feature that lets the user look back at all the images that made them smile. Or we could use it alongside many other data points to continuously train an AI that decides what content to show, keeping the user scrolling for longer and showing them more ads.

This is just a basic example, but with a bit of work there are so many other data points that you could get from the camera:

  • Emotion detection
  • Age validation
  • Ethnicity
  • Gender
  • Whether the user shows the screen to another person

Plus many more with similar techniques using location services and the microphone.

The question is no longer whether this is possible. The question now is how ethical this kind of data collection is, and whether the practice should really be used at all.

