But it also allows for content-addressed memory. That’s very important, for it gives fine-grained control over the memory and planning systems. That’s the job of sentence-level syntax together with discourse structure.
“Classical” semantic or cognitive networks had a problem with coming up with just the right set of node types and arc types. David Hays dissolved the problem in his 1981 book, Cognitive Structures, by grounding cognition in an analog system modeled on William Powers’s perceptual control stack (in Behavior: The Control of Perception, 1973). The identity of a cognitive node is a function of its parameter values, where the parameters are derived from the control stack. The identity of an arc is a function of the difference in parameter values between the nodes it connects.
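The idea can be sketched in code. This is a minimal, hypothetical illustration, not Hays’s actual formalism: it assumes node parameters are simple numeric values (in Hays’s account they would be derived from a Powers-style control stack), and it takes the arc’s identity to be the component-wise difference between the two parameter vectors.

```python
# Hypothetical sketch of the parameter-based scheme described above.
# Assumption: a node's identity is just a tuple of numeric parameter values.
from dataclasses import dataclass

@dataclass(frozen=True)
class Node:
    # Parameter values, assumed (for illustration) to come from a
    # Powers-style perceptual control stack.
    params: tuple

def arc_identity(a: Node, b: Node) -> tuple:
    # The arc connecting two nodes is identified by the component-wise
    # difference in their parameter values, not by a hand-picked arc type.
    return tuple(pb - pa for pa, pb in zip(a.params, b.params))

# Two nodes that differ only in their third parameter:
dog = Node(params=(1.0, 0.5, 2.0))
animal = Node(params=(1.0, 0.5, 3.0))
print(arc_identity(dog, animal))  # (0.0, 0.0, 1.0)
```

The point of the sketch is that no separate inventory of node types and arc types is needed: both identities fall out of the parameter values themselves, which is how the type-inventory problem gets dissolved.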
Concerning Chomsky’s approach to syntax: It depends on a sharp distinction between grammatical and ungrammatical sentences. A generative grammar, in Chomsky’s theory, must account for all and only the grammatical sentences.
However, there are no explicit criteria for separating sentences into the two categories, grammatical and ungrammatical. Rather, the separation depends on the intuitions of the linguist. Naturally enough, different syntacticians have different intuitions. The problem is insoluble.
Moreover, anyone who pays close attention to real speech soon realizes that people do not (always) speak in complete, grammatically correct sentences. Real language is sloppy, but nonetheless effective. A neural net of very high dimensionality can deal with this readily enough. A purely symbolic system cannot. Augmenting the system with fuzzy logic and the like doesn’t fix the problem.
LLMs provide a very useful simulacrum of the natural language system. Since LLMs are trained on written texts, the resulting model necessarily conflates the functions of semantics and syntax/discourse. Thus they cannot achieve the flexibility and precision of the full system, where semantics and syntax/discourse are separated.