A new study from Queen Mary University of London used radio waves, like those used in WiFi, to measure heart rate and breathing signals that can be used to determine how a person is feeling.
How did they do it?
Participants in the study watched a video that was selected by the researchers to evoke one of four emotional responses: anger, sadness, joy or pleasure.
While the participants watched, WiFi signals were harmlessly transmitted at them, and the researchers studied the radio signals that bounced back. By analyzing those reflected signals with AI, the researchers could extract heart rate and breathing rate, which in turn could be interpreted to accurately determine each participant’s emotional state.
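For the curious, here’s a rough, hypothetical sketch of what the rate-extraction step of that pipeline could look like: the tiny chest movements from breathing and heartbeats modulate the reflected signal, and the dominant frequencies in the right bands give breathing and heart rates. The function names, frequency bands and simple FFT approach below are our own illustrative assumptions, not the study’s actual method.

```python
# Illustrative sketch only: recover breathing and heart rates from the phase of
# a reflected radio signal via a simple FFT peak search. The study's real
# signal processing and network architecture are not described in this post.
import numpy as np

def estimate_rates(reflected_phase: np.ndarray, sample_rate_hz: float) -> tuple[float, float]:
    """Return (breaths per minute, beats per minute) estimated from the signal."""
    spectrum = np.abs(np.fft.rfft(reflected_phase - reflected_phase.mean()))
    freqs = np.fft.rfftfreq(len(reflected_phase), d=1.0 / sample_rate_hz)

    def peak_in_band(lo_hz: float, hi_hz: float) -> float:
        band = (freqs >= lo_hz) & (freqs <= hi_hz)
        return float(freqs[band][np.argmax(spectrum[band])]) * 60.0  # Hz -> per minute

    breathing_bpm = peak_in_band(0.1, 0.5)  # ~6-30 breaths per minute
    heart_bpm = peak_in_band(0.8, 2.0)      # ~48-120 beats per minute
    return breathing_bpm, heart_bpm

# Synthetic example: a 12 breaths/min (0.2 Hz) + 72 beats/min (1.2 Hz) mixture.
fs = 50.0
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * 0.2 * t) + 0.3 * np.sin(2 * np.pi * 1.2 * t)
print(estimate_rates(signal, fs))  # roughly (12.0, 72.0)
```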
Innovation: the study used deep learning instead of traditional machine learning
Most approaches in the emotion detection space rely on traditional machine learning and are subject-dependent: data collected from a specific person is used to predict that same individual’s emotional state at a later date.
This study used a deep learning-based approach in a subject-independent manner, as one of the researchers describes:
- “With deep learning we’ve shown we can accurately measure emotions in a subject-independent way, where we can look at a whole collection of signals from different individuals and learn from this data and use it to predict the emotions of people outside of our training database.”
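To make “subject-independent” concrete, here’s a minimal sketch of that kind of evaluation: train on every participant except one, test on the held-out person, and repeat. The data, features and classifier below are placeholders we made up for illustration (the study used a deep neural network, not logistic regression); only the leave-one-subject-out structure is the point.

```python
# Hedged sketch of subject-independent evaluation: the model never sees data
# from the person it is tested on. Features, labels and classifier are
# placeholders, not the study's actual data or network.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(0)
n_windows, n_features = 200, 8                  # e.g. heart/breathing-rate features per time window
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 4, size=n_windows)          # anger / sadness / joy / pleasure
subjects = rng.integers(0, 10, size=n_windows)  # which participant each window came from

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    model = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))  # accuracy on the held-out subject

print(f"mean accuracy across held-out subjects: {np.mean(scores):.2f}")
```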
Innovation: no visual or speech data used to detect emotion
Notably, this study didn’t rely on visual cues or speech, just heart rate and breathing rate detectable “invisibly” via radio signals.
Traditional emotion detection systems assess visual or audio-based signals like facial expressions, speech, body gestures or eye movements. Often, these signals lead to inaccurate conclusions about a person’s underlying emotional state.
ECG signals measure electrical activity in the heart, which provides a much more accurate link between heart rate and the nervous system, similar to a lie detector. It’s not 100% accurate, of course, but it’s far better than pure visual cues.
This study takes that logic one step further by using radio signals instead of relying on sensors physically placed on the subject’s body. That makes it non-invasive, and also potentially far more pervasive in terms of whose data can be captured…
If the idea of thought detection like this sounds eerily familiar, it’s because it is:
- The ‘Crocodile’ episode of Black Mirror;
- The machine that reads emotions in Blade Runner 2049; and
- Of course, the thought police from George Orwell’s 1984.
There are many positive use cases for this tech, but, in our humble opinion, it’s important to highlight the negative to help ward off an Orwellian future.