I first became interested in linguistics during my undergraduate years at Johns Hopkins University. They didn’t have a linguistics department, nor anyone who taught linguistics, but they did have a distinguished psycholinguist, James Deese. He taught a course in psycholinguistics, which I took in the fall of my sophomore year. Each week he assigned us to read a classic article in psycholinguistics; we then had to write a summary of the article. Much of our grade depended on those summaries.
That was so long ago that I no longer remember most of the articles. But I remember four. One was an article on Hanunóo color categories by Harold Conklin. Another was George Miller’s classic article, “The Magical Number Seven, Plus or Minus Two”. Then there was Chomsky’s review of B. F. Skinner’s Verbal Behavior, and Robert Lees’s review of Chomsky’s Syntactic Structures. One of the last two led me to write a summary that Deese liked, but thought was “unnecessarily original”. I was hooked, sorta.
That is, I didn’t really have any need, beyond mere curiosity, for linguistics at that point. So I just let that stuff settle into the intellectual background. It wasn’t until the early 1970s that I began reading my way through linguistics in earnest. That’s when I bought copies of both Syntactic Structures and Aspects of the Theory of Syntax. Truth be told, they were just a bit beyond me; it’s not the kind of material you can work through on your own without any prior background in linguistic analysis and theory. But I got enough to see that, as interesting as this material was (I really liked the diagrams), it wasn’t what I needed. I needed semantics, and generative grammar (that is, Chomsky’s school of linguistic thought) didn’t have it. Neither did the so-called generative semanticists; what they called semantics looked more like syntax to me. It wasn’t until I found my way to some of the early work in cognitive networks that I felt there was something there for me. By that time I was ready to head off to graduate school, in English literature, where I found a computational linguist, David Hays, I could work with.
All of this is by way of introduction to a position paper just published by Morten Christiansen and Nick Chater, “Towards an integrated science of language”. They begin:
The modern era in the study of language began in the 1950s with Noam Chomsky's invention of transformational grammar: a mathematically rigorous system of rules aiming to generate the grammatical sentences of each natural language. Transformational grammar itself underwent various important theoretical developments and soon became associated with some striking claims: that all human languages follow the same deep universal patterns; that this ‘universal grammar’ is innate and unfolds gradually during language development in the same way that a chicken grows a wing; and that evolution of language is instantaneous, perhaps arising from a sudden large-scale genetic mutation. The generative grammar project initially promised to forge important links across disciplines. Psychologists searched for traces of linguistic transformations in language-processing times; developmentalists tried to interpret child language as generative grammar ‘in flux’; engineers tried to incorporate generative grammar into their natural language systems; neuroscientists and geneticists searched for the biological roots of universal grammar; and students of language variation assessed the universality of the supposedly universal principles.
That’s more or less where things stood when I was at Hopkins.
Christiansen and Chater go on to say, however, that things began to unravel. I could see the beginnings of that in my reading, and by the time I got to David Hays in the mid-1970s, well, he had a very different view of language, one I was happy enough to adopt. The point I’m stalking, though, is that the various strands of post-Chomskyan linguistic thinking did owe a debt to Chomsky. For it was Chomsky’s early work that brought attention to the study of language and helped catalyze the cognitive sciences (as they came to be called). And it was the (perceived) inadequacies of Chomsky’s ideas that led some thinkers to pursue different, sometimes very different, lines of inquiry. Chomsky’s ideas were fertile enough that proving them wrong was a way to advance our knowledge.
Now, generative grammar is by no means dead. It’s still alive and well. But it’s no longer as dominant (in North America) as it was from the 1960s through, say, the 1980s and into the 1990s. Other ways of investigating language have been gaining force in the last two decades. Christiansen and Chater sketch one alternative:
At the heart of this emerging alternative framework are constructions, which are learned pairings of form and meaning ranging from meaningful parts of words (such as word endings, for example, ‘-s’, ‘-ing’) and words themselves (for example, ‘penguin’) to multiword sequences (for example, ‘cup of tea’) to lexical patterns and schemas (such as, ‘the X-er, the Y-er’, for example, ‘the bigger, the better’). The quasi-regular nature of such construction grammars allows them to capture both the rule-like patterns as well as the myriad of exceptions that often are excluded by fiat from the old view built on abstract rules. From this point of view, learning a language is learning the skill of using constructions to understand and produce language. So, whereas the traditional perspective viewed the child as a mini-linguist with the daunting task of deducing a formal grammar from limited input, the construction-based framework sees the child as a developing language-user, gradually honing her language-processing skills. This requires no putative universal grammar but, instead, sensitivity to multiple sources of probabilistic information available in the linguistic input: from the sound of words to their co-occurrence patterns to information from semantic and pragmatic contexts. Computational analyses of speech addressed to children have revealed that there is much more information available to the child than previously assumed. For example, word categories and meanings can partly be inferred through statistical analysis of which words and phrases occur together; and cross-linguistic analyses show that nouns and verbs tend to sound different, with subsequent experiments showing that children use such cues to help learn new words, and that adults also rely on them during sentence processing.
I have no particular brief for construction grammar [1], which has a fairly specific meaning in linguistics, but I find the general characterization in that paragraph congenial. Meaning and communication are now central to the study of language. Strangely enough, they are secondary to the generative enterprise, which centered on syntax. Where did such a peculiar idea come from?
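To make the co-occurrence point from that quoted paragraph concrete, here is a toy sketch in Python. It is my illustration, not anything from Christiansen and Chater’s paper; the miniature corpus and the crude similarity measure are made up for the example. The idea is simply that counting a word’s immediate neighbors is already enough to begin separating word categories:

```python
# Toy sketch: inferring word categories from co-occurrence statistics.
# Corpus and similarity measure are invented for illustration only.
from collections import Counter, defaultdict

corpus = (
    "the cat sees the dog . the dog sees the cat . "
    "a cat chases a bird . a dog chases a ball . "
    "the bird sees a ball ."
).split()

# Record each word's immediate left and right neighbors.
contexts = defaultdict(Counter)
for i, w in enumerate(corpus):
    if i > 0:
        contexts[w]["L:" + corpus[i - 1]] += 1
    if i < len(corpus) - 1:
        contexts[w]["R:" + corpus[i + 1]] += 1

def overlap(a, b):
    """Crude similarity: number of context types two words share."""
    return len(set(contexts[a]) & set(contexts[b]))

# Nouns pattern together (they follow 'the'/'a' and precede verbs),
# so a noun-noun pair overlaps far more than a noun-verb pair.
print(overlap("cat", "dog"), overlap("cat", "sees"))  # prints: 5 0
```

Even on this tiny corpus the noun pair shares every context type while the noun-verb pair shares none, which is the kind of probabilistic signal the paragraph above says children exploit, at vastly greater scale and with many more cues.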
[1] Per Aage Brandt has expressed reservations about constructions in remarks on Facebook, as has Avery Andrews.