Scientists Show Risks of Emotion Recognition Software Through Online Game

Technology designed to identify and recognize human emotions using machine learning has been frowned upon by ethicists. Its proponents claim it could be useful in situations such as road safety or market research, while critics allege that the technology is not only a breach of privacy, but is also racially biased and inaccurate. Now, researchers are on a quest to unmask its reality.

A research team has created a website where the public can test emotion recognition systems through their own computer cameras. The site hosts two games: one focuses on pulling faces to trick the technology, while the other explores how such systems can struggle to read facial expressions in context.

The researchers say their objective is to increase awareness about the technology and encourage conversations about its use.

‘It is a form of facial recognition, but it goes further because rather than just identifying people, it claims to read our emotions, our inner feelings from our faces,’ said Dr. Alexa Hagerty, project lead and researcher at the University of Cambridge Leverhulme Centre for the Future of Intelligence and the Centre for the Study of Existential Risk.

Facial recognition has come under intense scrutiny in recent years. Just last year, the Equality and Human Rights Commission said its use for mass screening should be stopped, claiming that it could increase police discrimination and harm freedom of expression.

Hagerty, however, said that many people were not aware how common emotion recognition systems are: they are used in job hiring, customer insight work, airport security and even in education, to observe whether students are engaged.

The technology is being used across Europe, China and the U.S. Taigusys, a company dealing in emotion recognition systems with its main office in Shenzhen, says the technology has been employed in settings ranging from care homes to prisons. According to reports from earlier this year, the Indian city of Lucknow plans to use the technology to spot distress in women being harassed, a move met with criticism, particularly from digital rights organizations.

While Hagerty said emotion recognition technology may have potential advantages, these must be weighed against concerns about accuracy, racial bias, and whether it is even the right tool for a particular job.

‘We need to be having a much wider public conversation and deliberation about these technologies’, she said.

The site notes that ‘no personal data is collected and all images are stored on your device’. In one game, users are asked to pull a series of faces to fake emotions and see if they can trick the system.

‘The claim of the people who are developing this technology is that it is reading emotion’, said Hagerty. In reality, she said, the system reads facial movement and combines it with the assumption that those movements are linked to emotions.

‘There is lots of really solid science that says that is too simple; it doesn’t work quite like that’, said Hagerty who added that human experience showed it was possible to fake a smile. ‘That is what that game was: to show you didn’t change your inner state of feeling rapidly six times, you just changed the way you looked [on your] face’, she said.

Some emotion recognition researchers have admitted that they are aware of these limitations. Hagerty said the hope was that the project, funded by NESTA (the National Endowment for Science, Technology and the Arts), would create awareness of the technology and encourage discussions on its use.

‘I think we are beginning to realize we are not really “users” of technology, we are citizens in a world being deeply shaped by technology, so we need to have the same kind of democratic, citizen-based input on these technologies as we have on other important things in societies,’ she said.

Vidushi Marda, Senior Programme Officer at the human rights organization Article 19, said it was vital to press ‘pause’ on the growing market for emotion recognition systems.

‘The use of emotion recognition technologies is deeply concerning as not only are these systems based on discriminatory and discredited science, their use is also fundamentally inconsistent with human rights’, she said. ‘An important learning from the trajectory of facial recognition systems across the world has been to question the validity and need for technologies early and often, and projects that emphasize the limitations and dangers of emotion recognition are an important step in that direction.’

By Marvellous Iwendi.

Source: The Guardian