
Friday, August 13, 2021

Attractor nets, from basic vertebrates to humans

I recently had a post where I tacked this on at the end [1]:

That’s what I had in mind when, over two decades ago, I played around with a notation system I called attractor nets [2]. There I was imagining a logical structure implemented over a large and varied attractor landscape. The logical structure is represented by a network formalism developed by Sydney Lamb [3]. The network is quite different from those generally used in classical symbolic systems, where nodes represent objects of various kinds and arcs represent types of relationships among those objects. In Lamb’s nets the nodes are logical operators (OR, AND) while the arcs carry the content of the net. In an attractor net each arc corresponds to a basin of attraction. The net then represents a logical structure over basins of attraction.

For some reason I’d never expressed my intention in just that way, though there’s nothing new there.
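To make the idea a bit more concrete, here is a rough sketch in Python of the sort of structure I have in mind: operator nodes in the style of Lamb, with each arc labeled by a basin of attraction. The particular names, the basin labels, and the test for whether the system has settled into a basin are illustrative assumptions on my part, nothing more; the working papers in note [2] use Lamb’s diagrammatic notation, not code.

    # A minimal sketch of an attractor net: nodes are logical operators
    # (AND, OR), and each arc corresponds to a basin of attraction, so the
    # net expresses a logical structure over basins. All names here are
    # illustrative assumptions, not anything from the working papers.

    from dataclasses import dataclass, field
    from typing import Literal

    @dataclass
    class Node:
        op: Literal["AND", "OR"]            # Lamb-style nodes are operators, not objects
        inputs: list["Arc"] = field(default_factory=list)

        def active(self) -> bool:
            values = [arc.basin_settled() for arc in self.inputs]
            return all(values) if self.op == "AND" else any(values)

    @dataclass
    class Arc:
        basin: str                          # label for a basin of attraction
        state: dict                         # current point in the underlying dynamics

        def basin_settled(self) -> bool:
            # Placeholder test: has the system settled into this arc's basin?
            return self.state.get("basin") == self.basin

    # Example: an OR node that is active when either of two basins is occupied.
    arc_a = Arc(basin="grasp", state={"basin": "grasp"})
    arc_b = Arc(basin="reach", state={"basin": "withdraw"})
    or_node = Node(op="OR", inputs=[arc_a, arc_b])
    print(or_node.active())  # True, because the "grasp" basin is occupied

The point of the sketch is only that the logic lives in the nodes while the content, the basins, lives on the arcs; how settling into a basin is actually detected is left as a placeholder.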

And so I began thinking: What is the simplest creature that requires a logical structure over the attractor landscape? I considered the possibility that language is what requires an attractor net. That would imply that all animals, from worms to great apes, have only an unadorned attractor landscape. I rejected that.

Instead I’ve decided that coordination between the sensory systems and the motor system is what necessitated an attractor net structure. In “Principles and Development of Natural Intelligence” [4] David Hays and I made the following remarks about the basic vertebrate nervous system:

The primitive vertebrate nervous system is reticular (Best, 1972; Bowsher, 1973). The only principle active at this level is the modal principle. Within a given mode, behavior is governed by on-blocks, where the conditional elements are innate releasing mechanisms and the executed programs are fixed action patterns (Lorenz, 1969). These on-blocks are executed as they are triggered by the interaction of environmental stimuli and organismic modal shifts. There is little or no autonomous chaining of these on-blocks.

Those on-blocks would require the structure provided by an attractor net. Beyond that, consider a creature navigating its way through the environment. Such paths are ‘littered’ with contingencies, and dealing with those contingencies would likely require an attractor net to weave perception and action together.
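A toy example may help here. The following Python sketch treats an on-block as a mode-gated pairing of an innate releasing mechanism with a fixed action pattern, along the lines of the passage quoted above. The modes, stimuli, and actions are hypothetical, chosen only to show how blocks fire when triggered, with no autonomous chaining from one block to the next.

    # A rough sketch of mode-gated on-blocks. The mode names, stimuli, and
    # action patterns are hypothetical examples for illustration only.

    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class OnBlock:
        mode: str                            # behavioral mode in which the block is live
        releaser: Callable[[dict], bool]     # innate releasing mechanism (condition)
        action: Callable[[], str]            # fixed action pattern (executed program)

    def run_cycle(blocks: list[OnBlock], current_mode: str, stimuli: dict) -> list[str]:
        """Fire every on-block whose mode matches and whose releaser is triggered.
        There is no chaining: each block fires, or not, on this cycle alone."""
        executed = []
        for block in blocks:
            if block.mode == current_mode and block.releaser(stimuli):
                executed.append(block.action())
        return executed

    blocks = [
        OnBlock("feeding", lambda s: s.get("prey_visible", False), lambda: "strike"),
        OnBlock("fleeing", lambda s: s.get("shadow_overhead", False), lambda: "dart_to_cover"),
    ]

    print(run_cycle(blocks, "feeding", {"prey_visible": True}))  # ['strike']
    print(run_cycle(blocks, "fleeing", {"prey_visible": True}))  # [] (wrong mode, nothing fires)

What the attractor net would add, on my view, is the structure that binds the releasing conditions and the action patterns to particular basins of attraction, so that perception and action are coordinated rather than merely co-occurring.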

Thus language is only the most sophisticated behavior requiring attractor nets. With language, the attractor net weaves the language system into cognition, thereby allowing the system to take ‘arbitrary’ walks through cognition. Those walks are what we call ‘thinking,’ in the common-sense use of the term.

References

[1] Dual-system mentation in humans and machines [updated], New Savanna, August 11, 2021, https://new-savanna.blogspot.com/2021/08/dual-system-mentation-in-humans-and.html.

[2] I never produced an account that I thought was ready for others to read. But, in addition to piles of notes, I did produce two documents intended to summarize the work for my own purposes. The first document explains Lamb’s notation, while the second is a collection of diagrams, that is, constructions.

William Benzon, Attractor Nets, Series I: Notes Toward a New Theory of Mind, Logic, and Dynamics in Relational Networks, Working Paper, 52 pp., https://www.academia.edu/9012847/Attractor_Nets_Series_I_Notes_Toward_a_New_Theory_of_Mind_Logic_and_Dynamics_in_Relational_Networks.

William Benzon, Attractor Nets 2011: Diagrams for a New Theory of Mind, Working Paper, 55 pp., https://www.academia.edu/9012810/Attractor_Nets_2011_Diagrams_for_a_New_Theory_of_Mind.

[3] Sydney Lamb, the computational linguist, believed this as well. He argues it in Pathways of the Brain, Amsterdam: John Benjamins (1998), pp. 181-182.

[4] William Benzon and David Hays, Principles and Development of Natural Intelligence, Journal of Social and Biological Structures, Vol. 11, No. 8, July 1988, 293-322, https://www.academia.edu/235116/Principles_and_Development_of_Natural_Intelligence.
