A New Model and Dataset for Long-Range Memory
DeepMind, 2020
Recommendation
One of the challenges for AI systems is that, as they become more sophisticated and ingest ever larger datasets, they need ever more processing power to achieve marginally better results. Even the biggest server farms have their limits. Yet people don’t want to wait more than a fraction of a second for a response, so a major task in AI research is working out how to reduce mountains of raw data to hills of important data. The latest proposal from Google’s DeepMind is to take sleep as an inspiration for data processing. Your next eureka moment might arrive during an afternoon power nap.
Take-Aways
About the Authors
Jack Rae and Timothy Lillicrap are staff researchers at DeepMind, Google’s AI research lab in London.