The beginning of a tweet stream:
"The only constant in life is change"
— Kingson Man (@therealkingson) May 22, 2022
-Heraclitus, on distribution shift
Why does the performance of real-world ML systems tend to degrade over time? <🧵>
"Need is All You Need: Homeostatic Neural Networks Adapt to Concept Shift" Man, Damasio, and Nevenhttps://t.co/ooJ6RwHs6H pic.twitter.com/w6QzYEY2KX
Abstract of the linked article:
In living organisms, homeostasis is the natural regulation of internal states aimed at maintaining conditions compatible with life. Here, we introduce an artificial neural network that incorporates some homeostatic features. Its own computing substrate is placed in a needful and vulnerable relation to the very objects over which it computes. For example, a network classifying MNIST digits may receive excitatory or inhibitory effects from the digits, which alter the network’s own learning rate. Accurate recognition is desirable to the agent itself because it guides decisions to up- or down-regulate its vulnerable internal states and functionality. Counterintuitively, the addition of vulnerability to a learner confers benefits under certain conditions. Homeostatic design confers increased adaptability under concept shift, in which the relationships between labels and data change over time, and the greatest advantages are obtained under the highest rates of shift. Homeostatic learners are also superior under second-order shift, or environments with dynamically changing rates of concept shift. Our homeostatic design exposes the artificial neural network’s thinking machinery to the consequences of its own "thoughts", illustrating the advantage of putting one’s own "skin in the game" to improve fluid intelligence.
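The core mechanism in the abstract — a learner whose accuracy feeds back into its own plasticity — can be illustrated with a toy sketch. This is a minimal illustration of the idea, not the paper's actual architecture: the `homeostatic_sgd` function, the scalar `health` state, and all the constants below are my own illustrative assumptions. A 1-D logistic classifier tracks an internal "health" value; correct predictions restore it, errors deplete it, and low health (high "need") up-regulates the learning rate, so the model re-adapts quickly when the label rule flips mid-stream.

```python
import math
import random

def homeostatic_sgd(steps=2000, shift_at=1000, seed=0):
    """Toy logistic classifier with a homeostatic learning rate.
    A simplified sketch of the abstract's idea, not the paper's
    actual model or update rule."""
    rng = random.Random(seed)
    w, b = 0.0, 0.0
    health = 1.0                  # internal state, clipped to [0, 1]
    base_lr = 0.05
    errors_at_end = 0             # mistakes in the final 200 steps
    for t in range(steps):
        x = rng.uniform(-1.0, 1.0)
        y = 1 if x > 0 else 0
        if t >= shift_at:         # concept shift: the label rule flips
            y = 1 - y
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        correct = (p > 0.5) == (y == 1)
        # Homeostasis: correct answers restore health, errors deplete it.
        health = min(1.0, max(0.0, health + (0.05 if correct else -0.10)))
        # Low health ("need") up-regulates plasticity, up to 5x base rate.
        lr = base_lr * (1.0 + 4.0 * (1.0 - health))
        grad = p - y              # gradient of log loss w.r.t. the logit
        w -= lr * grad * x
        b -= lr * grad
        if t >= steps - 200 and not correct:
            errors_at_end += 1
    return errors_at_end
```

After the shift at step 1000, the error burst drives `health` to zero, the learning rate jumps, and the decision boundary flips sign within a few dozen steps — so errors in the final 200 steps stay low, mimicking the adaptability-under-shift result the abstract claims.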