Microsoft stops selling emotion-reading tech, limits face recognition

New technology, such as facial recognition, brings up a lot of debate about privacy and what is ethical and what is not. Picture: Dollar Gill/Unsplash

Published Jun 22, 2022

By Paresh Dave

OAKLAND, California - Microsoft Corp said on Tuesday it would stop selling technology that guesses someone's emotion based on a facial image and would no longer provide unfettered access to facial recognition technology.

The actions reflect efforts by leading cloud providers to rein in sensitive technologies on their own as lawmakers in the United States and Europe continue to weigh comprehensive legal limits.

Since at least last year, Microsoft has been reviewing whether emotion recognition systems are rooted in science.

"These efforts raised important questions about privacy, the lack of consensus on a definition of 'emotions,' and the inability to generalise the linkage between facial expression and emotional state across use cases, regions, and demographics," Sarah Bird, principal group product manager at Microsoft's Azure AI unit, said in a blog post.

Existing customers will have one year before losing access to artificial intelligence tools that purport to infer emotion, gender, age, smile, facial hair, hair and make-up.

Last year, Alphabet Inc's Google Cloud embarked on a similar evaluation, first reported by Reuters. Google blocked 13 planned emotions from its tool for reading emotions and placed under review four existing ones, such as joy and sorrow. It was weighing a new system that would describe movements such as frowning and smiling without seeking to attach them to an emotion.

Google did not immediately respond to a request for comment on Tuesday.

Microsoft also said customers now must obtain approval to use its facial recognition services, which can enable people to log into websites or open locked doors through a face scan.

The company called on clients to avoid situations that infringe on privacy, or in which the technology might struggle, such as identifying minors, but it did not explicitly ban those uses.