Researchers in the UK have used wireless signals and machine learning to remotely detect moods.
The team at Queen Mary University of London collected heartbeat and breathing signals from 15 participants via radio frequency (RF) reflections off the body, then applied novel noise-filtering techniques.
The key to interpreting the wireless data was a new deep neural network (DNN) architecture based on the fusion of the raw RF data and the processed RF signal, used to classify and visualise various emotional states.
The model achieved a classification accuracy of 71.67% on independent subjects and outperformed five other machine learning algorithms when given only limited amounts of raw RF and post-processed time-sequence data. The deep learning model was also validated by comparing its results against signals from ECG heart monitors.
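The paper's exact architecture is not reproduced here, but the described fusion of a raw RF stream with a processed RF stream can be sketched as a minimal two-branch network: each stream is encoded separately, the encodings are concatenated, and a final layer produces probabilities over the four emotion classes. The layer sizes, input dimensions, and (untrained) random weights below are illustrative assumptions, written in plain NumPy for clarity rather than a deep learning framework.

```python
import numpy as np

rng = np.random.default_rng(0)

def dense(x, w, b):
    """Fully connected layer with ReLU activation."""
    return np.maximum(0.0, x @ w + b)

def softmax(z):
    """Row-wise softmax turning class logits into probabilities."""
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Illustrative sizes: 128 raw RF samples and 32 processed
# (e.g. filtered heartbeat/breathing) features per time window.
RAW_DIM, PROC_DIM, HIDDEN, N_CLASSES = 128, 32, 16, 4  # 4 emotions

# Randomly initialised weights stand in for trained parameters.
w_raw, b_raw = rng.standard_normal((RAW_DIM, HIDDEN)) * 0.1, np.zeros(HIDDEN)
w_proc, b_proc = rng.standard_normal((PROC_DIM, HIDDEN)) * 0.1, np.zeros(HIDDEN)
w_out, b_out = rng.standard_normal((2 * HIDDEN, N_CLASSES)) * 0.1, np.zeros(N_CLASSES)

def fusion_forward(raw_rf, proc_rf):
    """Encode each stream separately, concatenate, then classify."""
    h = np.concatenate([dense(raw_rf, w_raw, b_raw),
                        dense(proc_rf, w_proc, b_proc)], axis=1)
    return softmax(h @ w_out + b_out)

# Forward pass over a batch of 5 windows: one probability row each.
probs = fusion_forward(rng.standard_normal((5, RAW_DIM)),
                       rng.standard_normal((5, PROC_DIM)))
print(probs.shape)  # (5, 4)
```

The design point is simply that fusing both streams lets the classifier exploit information the noise-filtering step removes as well as the cleaner processed signal.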
The study, published in the journal PLOS ONE, asked participants to watch a video selected by the researchers for its ability to evoke one of four basic emotion types: anger, sadness, joy and pleasure.
Previous research has used similar non-invasive or wireless methods of emotion detection; however, in those studies data analysis depended on classical machine learning approaches, in which an algorithm is used to identify and classify emotional states within the data. For this study the scientists instead employed deep learning techniques, in which an artificial neural network learns its own features from time-dependent raw data, and showed that this approach could detect emotions more accurately than traditional machine learning methods.
"Deep learning allows us to assess data in a similar way to how a human brain would work looking at different layers of information and making connections between them. Most of the published literature that uses machine learning measures emotions in a subject-dependent way, recording a signal from a specific individual and using this to predict their emotion at a later stage,” said researcher Achintha Avin Ihalage.
"With deep learning we've shown we can accurately measure emotions in