Emotional AI Is No Substitute for Empathy

In 2023, emotional AI—technology that can sense and interact with human emotions—will become one of the dominant applications of machine learning. For instance, Hume AI, founded by Alan Cowen, a former Google researcher, is developing tools to measure emotions from verbal, facial, and vocal expressions. Swedish company Smart Eyes recently acquired Affectiva, the MIT Media Lab spinoff that developed the SoundNet neural network, an algorithm that classifies emotions such as anger from audio samples in less than 1.2 seconds. Even the video platform Zoom is introducing Zoom IQ, a feature that will soon provide users with real-time analysis of emotions and engagement during a virtual meeting.  

In 2023, tech companies will be releasing advanced chatbots that closely mimic human emotions to create more empathetic connections with users across banking, education, and health care. Microsoft’s chatbot Xiaoice is already successful in China, where the average user is reported to converse with “her” more than 60 times a month. It has also passed the Turing test, with users failing to recognize it as a bot for 10 minutes of conversation. Analysis from the consultancy Juniper Research projects that chatbot interactions in health care will rise by almost 167 percent from 2018, to reach 2.8 billion annual interactions in 2023. This would free up medical staff time and potentially save around $3.7 billion for health care systems around the world. 

In 2023, emotional AI will also become common in schools. In Hong Kong, some secondary schools already use an artificial intelligence program, developed by Find Solutions AI, that measures micro-movements of muscles on students’ faces and identifies a range of negative and positive emotions. Teachers are using the system to track emotional changes in students, as well as their motivation and focus, enabling them to intervene early if a pupil is losing interest. 

The problem is that the majority of emotional AI is based on flawed science. Emotional AI algorithms, even when trained on large and diverse data sets, reduce facial and tonal expressions to an emotion label without considering the social and cultural context of the person and the situation. Algorithms can recognize and report that a person is crying, for instance, but they cannot reliably deduce the reason or meaning behind the tears. Similarly, a scowling face doesn’t necessarily imply an angry person, but that’s the conclusion an algorithm will likely reach. Why? Because we all adapt our emotional displays to our social and cultural norms, so our expressions are not always a true reflection of our inner states. People often do “emotion work” to disguise their real emotions, and how they express those emotions is likely to be a learned response rather than a spontaneous one. Women, for example, often modify their emotional displays more than men, especially for emotions such as anger that carry negative social judgments, because they are expected to.
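
To make the problem concrete, here is a minimal sketch, in Python, of the mapping such systems perform: expression features in, a single emotion label out. Every name in it is illustrative rather than any vendor’s actual API; the point is structural, in that nothing about who the person is or what is happening around them ever enters the function.

```python
# A hypothetical sketch, not any vendor's real API: map facial
# "action unit" intensities straight to an emotion label, with no
# input for social or cultural context.

from dataclasses import dataclass


@dataclass
class FaceFeatures:
    brow_lowered: float  # 0.0-1.0 intensity of a lowered brow
    lips_pressed: float  # 0.0-1.0 intensity of pressed lips


def classify_emotion(face: FaceFeatures) -> str:
    """Return an emotion label from facial features alone.

    Note what is missing: no argument describes who the person is,
    where they are, or what is happening around them. A scowl of
    deep concentration and a scowl of anger look identical here.
    """
    if face.brow_lowered > 0.6 and face.lips_pressed > 0.6:
        return "anger"
    return "neutral"


# A person frowning in concentration over a hard problem is
# labeled angry, because the context never reaches the model.
concentrating = FaceFeatures(brow_lowered=0.8, lips_pressed=0.7)
print(classify_emotion(concentrating))  # -> anger
```

Real systems replace the hand-written threshold with a trained network, but the interface, and therefore the blind spot, is the same.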

As such, AI technologies that make assumptions about emotional states will likely exacerbate gender and racial inequalities in our society. For example, a 2019 UNESCO report showed the harmful impact of the gendering of AI technologies, with “feminine” voice-assistant systems designed according to stereotypes of emotional passiveness and servitude. 

Facial recognition AI can also perpetuate racial inequalities. An analysis of photos of 400 NBA players with two popular emotion-recognition programs, Face++ and Microsoft’s Face API, showed that both assigned more negative emotions on average to Black players, even when they were smiling. These results reaffirm other research showing that Black men have to project more positive emotions in the workplace, because they are stereotyped as aggressive and threatening.
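
The method behind findings like these can be sketched in a few lines: score each photo with an emotion classifier and compare the averages across groups. The sketch below is hypothetical, with a stand-in scorer and placeholder file names rather than the real Face++ or Microsoft API calls, but it shows what “more negative emotions on average” means operationally.

```python
# Hypothetical audit sketch: compare a classifier's average "anger"
# score across demographic groups for comparably smiling portraits.
# anger_score is a fake stand-in for a commercial emotion API.

from statistics import mean


def anger_score(photo: str) -> float:
    # Placeholder scores so the sketch runs end to end; a real
    # audit would call the vendor's API here.
    fake_scores = {"smiling_a.jpg": 0.12, "smiling_b.jpg": 0.31}
    return fake_scores.get(photo, 0.0)


def audit_by_group(photos_by_group: dict[str, list[str]]) -> dict[str, float]:
    """Average anger score per group; a systematic gap between groups
    for comparable expressions points to bias in the model, not the faces."""
    return {
        group: mean(anger_score(p) for p in photos)
        for group, photos in photos_by_group.items()
    }


print(audit_by_group({
    "group_a": ["smiling_a.jpg"],
    "group_b": ["smiling_b.jpg"],
}))
```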

Emotional AI technologies will become more pervasive in 2023, but if left unchallenged and unexamined, they will reinforce systemic racial and gender biases, replicate and strengthen the inequalities in the world, and further disadvantage those who are already marginalized. 


