Emotion artificial intelligence uses biological signals such as vocal tone, facial expressions, and data from wearable devices, along with text and patterns of computer use, to detect and predict how someone is feeling. It is already being applied in workplaces, including in hiring. Loss of privacy is just the beginning: workers worry about biased algorithms and the pressure to perform the 'right' expressions and body language for the software.
This feels like the AI equivalent of men telling female workers to smile more. I'm totally sure no bias was cooked into these algorithms. Honestly, how is this not profiling of neurodiverse individuals?
Truly dystopian