AI Now: Emotional AI Should Be Banned


The AI Now Institute recommends that regulators ban the use of affect recognition in important decisions that impact people’s lives and access to opportunities, and that, until then, AI companies stop deploying it.

Research institute AI Now has called for emotional AI, which uses affect recognition to infer emotions from people’s facial expressions, to be regulated and barred from informing important decisions.

In its 2019 annual AI report, the institute said affect recognition is deeply flawed and should not be used to make decisions in recruitment, education, customer service, or criminal justice. As The Verge reported in July: “the belief that facial expressions reliably correspond to emotions is unfounded.”
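To make the critique concrete, here is a minimal sketch, in Python, of how a typical affect-recognition classifier is wired. The model scores, label set, and classify_expression function are illustrative assumptions, not any vendor’s actual API. The point is structural: such a system can only ever map a face to one of a fixed list of emotions, whatever the expression actually means.

```python
import numpy as np

# A fixed set of "basic emotion" labels, typical of commercial affect-recognition
# tools. This closed list is the assumption the report criticises: the system
# must answer with one of these, whether or not the expression reflects any of them.
EMOTIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise", "neutral"]

def classify_expression(logits):
    """Convert raw model scores into a single emotion label plus a confidence."""
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    idx = int(probs.argmax())
    return EMOTIONS[idx], float(probs[idx])

# Stand-in for a real model's output on one face crop (hypothetical numbers).
logits = np.array([0.2, 0.1, 0.3, 2.1, 0.4, 0.8, 1.5])

label, confidence = classify_expression(logits)
print(f"Predicted emotion: {label} ({confidence:.0%} confidence)")
```

A hiring or criminal-justice tool built on this kind of output treats the predicted label as ground truth about a person’s inner state; it is that leap from expression to emotion that the research AI Now cites calls unfounded.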


“Regulators should ban the use of affect recognition in important decisions that impact people’s lives and access to opportunities. Until then, AI companies should stop deploying it,” AI Now said in the report.

Emotion recognition is already a sizable market, worth an estimated $12 billion, and according to Market Reports World that value could rise to $90 billion by 2024. Key players are already deploying the technology: IBM, for example, recently shipped the second version of its AI robot, with built-in emotional intelligence, to the International Space Station.

As more industries look for ways to integrate artificial intelligence, there is a worry that too much power and control will be handed to these systems before their accuracy and value have been properly assessed.

In the case of affect recognition, human reviewers may find it difficult and time-consuming to check and counterbalance the system’s output, especially in underfunded public services. That could mean someone is denied a job, or sent to prison, because an algorithm misread their emotions.

David Curry

About David Curry

David is a technology writer with several years’ experience covering all aspects of IoT, from technology to networks to security.
