AI recognizes personality through eye movements
“With our eyes we not only perceive our surroundings, they are also a window to our soul. They reveal who we are, how we feel and what we do,” explains Andreas Bulling, who heads the research group “Perceptual User Interfaces” at Saarland University. Together with fellow scientists in Stuttgart and Australia, Bulling has trained his software system to evaluate eye movements and draw conclusions about a person’s character traits. The research team used machine learning methods for this purpose.
To obtain the data for training and evaluation, 50 students (42 women and eight men) participated at Flinders University in Australia. The test subjects were equipped with an eye tracker, which recorded their eye movements while they strolled around the university campus for about ten minutes and bought a coffee or other items in the campus shop. The scientists then asked the students to take off the eye-tracking glasses and fill in established questionnaires to determine their personality and degree of curiosity in the traditional way.
“In order to analyse the recorded eye data independently of the duration of the recording, we worked with a sliding time window, as this does not attenuate any characteristics,” explains Bulling. From each of the resulting time windows, the researchers extracted 207 features. These included statistics on eye fixations as well as the blink rate. Based on these features and on the information from the questionnaires, the researchers combined around 100 decision trees per personality trait into a classifier and trained it. The result: in a subsequent test on data that had not been used during training, they were able to show that the software system reliably recognizes traits such as emotional instability (neuroticism), sociability, agreeableness and conscientiousness.
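To illustrate the described approach, the following is a minimal sketch in Python: a sliding window is moved over each gaze recording, per-window statistics are extracted, and roughly 100 decision trees per trait are trained as a random-forest classifier. The window length, sampling rate, feature definitions and label encoding are illustrative assumptions, not the values used in the study.

```python
# Hypothetical sketch of the described pipeline (assumed parameters throughout).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

WINDOW_SECONDS = 15          # assumed window length
SAMPLE_RATE_HZ = 60          # assumed eye-tracker sampling rate
WINDOW_SIZE = WINDOW_SECONDS * SAMPLE_RATE_HZ

def sliding_windows(gaze, step=WINDOW_SIZE):
    """Yield successive windows of gaze samples (gaze: array of shape [n_samples, 2])."""
    for start in range(0, len(gaze) - WINDOW_SIZE + 1, step):
        yield gaze[start:start + WINDOW_SIZE]

def extract_features(window):
    """Toy stand-ins for the 207 per-window statistics (fixations, blink rate, etc.)."""
    deltas = np.diff(window, axis=0)
    speeds = np.linalg.norm(deltas, axis=1)
    return np.array([
        speeds.mean(),           # mean gaze velocity
        speeds.std(),            # velocity variability
        (speeds < 0.1).mean(),   # crude proxy for the proportion of fixation samples
    ])

def train_trait_classifier(recordings, trait_labels):
    """Train one classifier per trait; trait_labels are per-participant
    questionnaire scores binned into classes (an assumption about the encoding)."""
    X, y = [], []
    for gaze, label in zip(recordings, trait_labels):
        for window in sliding_windows(gaze):
            X.append(extract_features(window))
            y.append(label)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(np.array(X), np.array(y))
    return clf
```

In such a setup, one classifier of this kind would be trained for each personality trait, and unseen recordings would be windowed and featurized in the same way before prediction.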
The knowledge gained about non-verbal behavior can also be transferred to robots, so that they behave more like humans, explains the researcher. Such systems would then communicate with humans in a much more natural way and would thus be more efficient and flexible than today’s robots or interactive systems. Together with Sabrina Hoppe from the University of Stuttgart, Tobias Loetscher from the University of South Australia in Adelaide, Australia, and Stephanie Morey from Flinders University, also in Adelaide, Andreas Bulling discusses the results in the article “Eye Movements During Everyday Behavior Predict Personality Traits”, which the researchers published in the journal “Frontiers in Human Neuroscience”.
Data and source code: https://github.molgen.mpg.de/sabrina-hoppe/everyday-eye-movements-predict-person…