Imagine you are in a job interview. As you answer the recruiter’s questions, an artificial intelligence (AI) system scans your face, scoring you for nervousness, empathy, and dependability. It may sound like science fiction, but such systems are increasingly deployed, often without people’s knowledge or consent.
Emotion recognition technology (ERT) is in fact a burgeoning multi-billion-dollar industry that aims to use AI to detect emotions from facial expressions. Yet the science behind these systems is controversial: biases are built into them.
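To make the mechanics concrete, here is a minimal sketch of the pipeline such systems typically follow: detect a face, crop it, and score the crop against a fixed set of emotion labels. The face detector below is OpenCV’s bundled Haar cascade; the classifier is a hypothetical stand-in (it returns random scores), since commercial models are proprietary, and the file name applicant.jpg is illustrative.

```python
# Minimal sketch of a typical ERT pipeline (illustrative only).
# Face detection uses OpenCV's bundled Haar cascade; the emotion
# classifier is a hypothetical stand-in, not any vendor's real model.
import cv2
import numpy as np

EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def detect_faces(gray):
    """Return (x, y, w, h) boxes for faces found in a grayscale image."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    return cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

def classify_emotion(face_crop):
    """Hypothetical classifier: a real system would run a trained CNN here.
    Random logits keep the sketch self-contained and runnable."""
    logits = np.random.randn(len(EMOTIONS))
    scores = np.exp(logits) / np.exp(logits).sum()  # softmax over labels
    return dict(zip(EMOTIONS, scores))

def score_image(path):
    """Score every detected face in an image file."""
    image = cv2.imread(path)
    if image is None:
        raise FileNotFoundError(path)
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in detect_faces(gray):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # common input size
        results.append(classify_emotion(face))
    return results

if __name__ == "__main__":
    for scores in score_image("applicant.jpg"):  # illustrative file name
        print(max(scores, key=scores.get), scores)
```

The pipeline’s fragility is easy to see: everything downstream of the classifier treats a probability over a handful of labels as if it were a person’s inner state.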
In a 2012 study, neuroscientist Aron K. Barbey and colleagues found that emotional intelligence and cognitive intelligence share many neural systems for integrating cognitive, social, and affective processes. The study supports what psychologists have suspected for decades: that emotional intelligence and general intelligence are interdependent.
Remember Jarvis from the movie Iron Man or TARS from the movie Interstellar? Audiences were nearly moved to tears when those artificially intelligent beings conveyed a sense of love and care.
Many companies use ERT to test customer reactions to their products, from cereal to video games. But it can also be used in situations with much higher stakes: in hiring, in airport security to flag faces as revealing deception or fear, in border control, in policing to identify “dangerous people”, or in education to monitor students’ engagement with their homework.
AI Demonstrating Empathy & Strong Problem-Solving Skills
Artificial intelligence is not only good at problem-solving; it is also beginning to demonstrate something like empathy. AI has not developed spontaneous emotions of its own, but it can increasingly model and respond to human emotional cues. The company Cogito, founded by Joshua Feast and Dr. Sandy Pentland, melds machine learning with behavioral adaptation, supported by the latest breakthroughs in behavioral science.
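Cogito’s best-known product analyzes conversational signals on live customer-service calls and nudges agents in real time. As a hedged illustration of the kind of behavioral cue such systems track (not Cogito’s actual method), the sketch below estimates how much of an audio clip contains speech using simple energy-based voice activity detection; the frame size and threshold are illustrative assumptions.

```python
# Sketch of one behavioral cue a Cogito-style system might track:
# the fraction of time a speaker is talking, via energy-based voice
# activity detection. Parameters are illustrative assumptions.
import numpy as np

def speech_ratio(samples, rate, frame_ms=30, energy_threshold=0.02):
    """Fraction of frames with speech-like energy.

    `samples` is assumed to be a float array normalized to [-1, 1].
    """
    frame_len = int(rate * frame_ms / 1000)
    n_frames = len(samples) // frame_len
    if n_frames == 0:
        return 0.0
    frames = samples[: n_frames * frame_len].reshape(n_frames, frame_len)
    rms = np.sqrt((frames ** 2).mean(axis=1))  # per-frame loudness
    return float((rms > energy_threshold).mean())

# A live system might nudge an agent when the ratio stays near 1.0
# (talking over the caller) or near 0.0 (a long, awkward silence).
```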
Fortunately, facial recognition technology is receiving public attention. The award-winning film Coded Bias, recently released on Netflix, documents the discovery that many facial recognition technologies do not accurately detect darker-skinned faces. And the research team managing ImageNet, one of the largest and most important datasets used to train facial recognition, was recently forced to blur 1.5 million images in response to privacy concerns.
With the increasing volume of visual, audio, and text data in commerce, there have been many business applications of emotion AI. For example, Affectiva analyzes viewers’ facial expressions from video recordings while they watch video advertisements in order to optimize the content design of video ads.
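In outline, that kind of ad testing is a frame-sampling loop: decode the recording of the viewer, score each sampled frame for emotion, and look for moments where the desired reaction drops off. The sketch below assumes a classify_frame function like the hypothetical classifier shown earlier; Affectiva’s actual SDK and models are proprietary and are not used here.

```python
# Sketch of frame-by-frame emotion scoring for ad testing (illustrative).
import cv2

def emotion_timeline(video_path, classify_frame, sample_every=15):
    """Sample every Nth frame and record (timestamp, emotion scores)."""
    capture = cv2.VideoCapture(video_path)
    fps = capture.get(cv2.CAP_PROP_FPS) or 30.0  # fall back if FPS is unknown
    timeline = []
    index = 0
    while True:
        ok, frame = capture.read()
        if not ok:
            break
        if index % sample_every == 0:
            timeline.append((index / fps, classify_frame(frame)))
        index += 1
    capture.release()
    return timeline

# An advertiser might re-cut scenes where "happiness" dips, e.g.:
# timeline = emotion_timeline("viewer_recording.mp4", my_classifier)
# flat = [t for t, s in timeline if s.get("happiness", 0.0) < 0.2]
```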
Revelations about algorithmic bias and discriminatory datasets in facial recognition technology have led large technology companies, including Microsoft, Amazon, and IBM, to halt sales of the technology. And it faces legal challenges regarding its use in policing in the UK. Furthermore, in the EU, a coalition of more than 40 civil society organizations has called for a complete ban on facial recognition technology.
Lapetus Solutions has developed a model that estimates an individual’s longevity, health status, and disease susceptibility from a facial photograph. The technology has been applied in the insurance industry.
Citizen Science Project
ERT has the potential to affect the lives of millions of people, yet there has been little public deliberation about how, and whether, it should be used. This is where the citizen science project comes into play.
On the interactive website (which works best on a laptop, not a phone) you can try out a private and secure ERT for yourself to see how it scans your face and interprets your emotions. You can also play games comparing human versus AI skills in emotion recognition and learn about the controversial science of emotion behind ERT.
When it comes to ERT, we need to collectively examine the controversial science of emotion built into these systems and analyze their potential for racial bias. And we need to ask ourselves: even if ERT could accurately read everyone’s inner feelings, do we want such intimate surveillance in our lives? These are questions that require everyone’s deliberation, input, and action.
Most importantly, you can contribute your perspectives and ideas to generate new knowledge about the potential impacts of ERT. As computer scientist and digital activist Joy Buolamwini puts it, “if you have a face, you have a place in the conversation.”
By Mayank Vashisht | Technology Journalist | ELE Times