Article

Your Boss Wants to Spy on Your Inner Feelings

Tech companies now use AI to analyze your feelings in job interviews and public spaces. But the software is prone to racial, cultural and gender bias

Scientific American, 2021


Editorial Rating

9

Qualities

  • Scientific
  • Eye Opening
  • Hot Topic

Recommendation

Cameras and sensors that feed vocal tones, facial expressions and body language into elaborate “emotion AI” systems are proliferating across today’s society, used by marketers, employers, schools and medical institutions. Although the technology has existed for decades, its recent, often invisible spread raises concerns because gender, racial, cultural and other dangerous biases are baked into the data used to train these systems.

Take-Aways

  • “Emotion AI” or “affective computing” combines cameras and sensors with AI to analyze people’s attitudes and feelings.
  • Emotion AI reshapes hiring practices, business decisions and interactions with organizations.
  • Emotion AI algorithms can absorb ethnic, racial and gender biases from their training data, which skew their results.

About the Author

John McQuaid is a journalist and author. He reported this story while a fellow at the Woodrow Wilson International Center for Scholars in Washington, DC. He is currently a PhD student at the University of Maryland’s Merrill College of Journalism.