How Do You Teach a Car That a Snowman Won’t Walk Across the Road?
Article

Aeon, 2019

Editorial Rating

8

Qualities

  • Innovative
  • Engaging

Recommendation

Headlines about artificial intelligence make lofty claims about machines outperforming humans. However, the technology isn't there yet, as computer science professor and author Melanie Mitchell explains in an engaging essay for Aeon. Humans possess inherent "core knowledge" that AI simply lacks, and instilling that understanding is a central challenge in developing better systems. Those working in the AI sector should keep Mitchell's message on their radar.

Summary

AI starts out as a “blank slate” that doesn’t have the well-rounded “core knowledge” of a human being.

People understand that if a ball rolls out into the road, a child might soon follow, and that a snowman isn't going to dislodge itself from a snowbank and cross the street. AI doesn't grasp those concepts. Self-driving cars, for instance, are often rear-ended when they brake suddenly for an object that human drivers wouldn't worry about.

While AI may perform better than humans in certain capacities, such as language processing, the technology can make unexpected errors that a human ...

About the Author

Melanie Mitchell is a computer science professor at Portland State University and the Santa Fe Institute. Her books include Complexity: A Guided Tour and Artificial Intelligence: A Guide for Thinking Humans.
