dreams are there to keep you from becoming too fitted to the model of the world https://t.co/C4GIMgCWjN
— v (@pyroboyee) May 14, 2021
Hoel got the idea while considering the way computers learn. An artificial neural network is fed a dataset for training, but a problem arises when it becomes too familiar with the data. The AI’s world becomes very small, as it assumes the dataset is a complete and true representation of the real world. In reality, the world can be a chaotic, unpredictable, and messy place. This problem is known as “overfitting,” and it “leads to failures in generalization and therefore performance on novel datasets,” according to the paper.
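The gap Hoel is pointing at is easy to reproduce. Below is a minimal sketch in Python (using NumPy and scikit-learn; the data, model, and numbers are invented for illustration, not taken from the paper) in which an unconstrained model scores perfectly on its 100 training examples yet noticeably worse on novel data drawn from the same world:

```python
# Toy illustration of overfitting: a high-capacity model memorizes
# 100 noisy training points, then does worse on fresh data drawn
# the same way. All names and numbers here are illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)

# Training set: the "true" rule is simply x0 > 0, but 20% of the
# labels are flipped to mimic a messy, noisy world.
X_train = rng.normal(size=(100, 2))
y_train = (X_train[:, 0] > 0).astype(int)
y_train[rng.random(100) < 0.2] ^= 1

# An unconstrained decision tree can carve out a region around every
# mislabeled point, fitting the training data perfectly.
model = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)

# Novel data from the same distribution, labeled by the true rule.
X_test = rng.normal(size=(5000, 2))
y_test = (X_test[:, 0] > 0).astype(int)

print("train accuracy:", model.score(X_train, y_train))  # 1.0 (memorized)
print("test accuracy:", model.score(X_test, y_test))     # noticeably lower
```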
“During training, artificial neural networks are being fitted to the data,” explained Hoel in an email. “If the network is discriminating between cats and dogs, for instance, it might become fixated on some aspects of cats that are particular to those 100 images that make up the data.” For example, the cat photos might’ve been taken during the day, whereas the dog photos were taken at night.
“By adding noise to the images, or blacking out parts of them, you will improve the generalization to new and novel data sets, like images that contain both night and day,” he said. “I’m arguing that the brain probably faces this problem of learning too well, and dreams help give us exposure to the wildly different stimuli that we need to prevent getting fixated on inconsequential aspects of our lives.”
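Mechanically, this kind of corruption is a standard data-augmentation trick. Here is a minimal sketch in Python with NumPy, assuming grayscale images stored as float arrays in [0, 1]; the `augment` function and its parameters are hypothetical illustrations of the two corruptions Hoel describes, not code from the paper:

```python
# A sketch of the two augmentations described above: adding noise to
# an image and blacking out a random patch (the latter resembles the
# "cutout"/"random erasing" techniques). The function name and
# parameters are illustrative, not from Hoel's paper.
import numpy as np

rng = np.random.default_rng(0)

def augment(image, noise_std=0.1, erase_frac=0.25):
    """Corrupt an H x W image (float array in [0, 1]) so a model
    cannot latch onto incidental details of the training set."""
    # Add Gaussian noise to every pixel, then keep values in range.
    out = np.clip(image + rng.normal(0.0, noise_std, image.shape), 0.0, 1.0)

    # Black out a random rectangle covering erase_frac of each side.
    h, w = out.shape
    eh, ew = max(1, int(h * erase_frac)), max(1, int(w * erase_frac))
    top = rng.integers(0, h - eh + 1)
    left = rng.integers(0, w - ew + 1)
    out[top:top + eh, left:left + ew] = 0.0
    return out

# Example: corrupt a random 28 x 28 "image" each epoch, so the model
# sees a slightly different version of it every time.
noisy = augment(rng.random((28, 28)))
```

Because a fresh noise pattern and patch location are drawn on every call, the network never sees exactly the same training image twice, which is what discourages fixation on incidental details like lighting.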
The original research: Erik Hoel, “The overfitted brain: Dreams evolved to assist generalization,” Patterns, Volume 2, Issue 5, 100244, May 14, 2021. DOI: https://doi.org/10.1016/j.patter.2021.100244