Friday, July 29, 2016

Why Chomsky’s Ideas Were So Popular

I am in the process of revising “Golumbia Fails to Understand Chomsky, Computation, and Computational Linguistics” and uploading it as a downloadable PDF. The following is from the new material going into the revision.
At the beginning of the chapter on Chomsky, Golumbia speculates that Chomsky’s ideas were so popular because they filled an existing need (p. 31):
[...] despite Chomsky’s immense personal charisma and intellectual acumen, it is both accurate and necessary to see the Chomsky revolution as a discourse that needed not so much an author as an author-function – a function “tied to the legal and institutional systems that circumscribe, determine, and articulate the realm of discourses” (Foucault 1965, 130).
I believe he is correct in this, that the intellectual climate was right for Chomsky, though I’m skeptical about his analysis.

Golumbia goes on to suggest that nascent neoliberalism provided the ideological matrix in which Chomsky’s ideas flourished (p. 32). More specifically (p. 32):
Chomsky insists that the only reasonable focus for analysis of cognitive and linguistic matters is the human individual, operating largely via a specific kind of rationality [...]; second, specifically with regard to the substance both of cognition and of language, Chomsky argues that the brain is something very much like one of the most recent developments in Western technology: the computer.

What is vital to see in this development is not merely the presence of these two views but their conjunction: for while one can imagine any number of configurations according to which the brain might be a kind of computer without operating according to classical rationalist principles (a variety of recent cognitive-scientific approaches, especially connectionism [...] offer just such alternatives), there is a natural fit between the computationalist view and the rationalist one, and this fit is what proves so profoundly attractive to the neoliberal academy.
Before going on, note that parenthetical remark in the second paragraph. We will return to it in a moment.

It is one thing to argue that Chomskyan linguistics dovetails nicely with neoliberalism, but it is something else to argue that it is attractive only to neoliberalism, and it is the latter that Golumbia seems to be arguing. And there he encounters an immediate problem: Chomsky’s own politics. For Chomsky’s own “real-world politics” (my scare quotes) are quite unlike the politics Golumbia finds lurking in his linguistics. He unconvincingly glosses over this by pointing out that Chomsky’s “institutional politics are often described as exactly authoritarian, and Chomsky himself is routinely described as engaged in ‘empire building’” (p. 33). Authoritarian empire building is common enough in the academy, but what has that to do with the radical left views that Chomsky has so consistently argued in his political essays?

To be sure, Chomsky is only a single case of divergence between real-world politics and ideology, but it is an important one as Golumbia has made Chomsky himself the center of his argument for the confluence between computationalist ideology and conservative politics. If the center does not hold, is the connection between real-world politics and computationalist ideology as close as Golumbia argues?

There’s a problem with this story, which is after all a historical one. And history implies change. That parenthetical reference to connectionism as providing an alternative to “classical rationalist principles” gives us a clue about the history. While it’s not at all clear to me that connectionism is meaningfully different from and opposed to those classical rationalist principles, let’s set that aside. Connectionism has roots in the 1950s and 1960s (if not earlier), the same period in which Chomsky’s ideas broke through, but it didn’t really become popular until the 1980s, by which time neoliberalism was no longer nascent. It was visibly on the rise. Shouldn’t a visible and powerful neoliberalism have been able to suppress conceptions inconsistent with it? Shouldn’t those classical rationalist principles have become more prevalent with the rise of neoliberalism rather than retreating to the status of but one conception among many?

Connectionist ideas flourished within an intellectual space that originated in the 1950s, and Chomsky’s ideas were catalytic, but certainly not the only ones (as we’ll soon see). As a variety of thinkers began to explore that space, some of them were attracted to different ideas; at the same time, the originating ideas, often grounded in classical logic, ran into problems. Consequently this new conceptual space became populated with ideas often at odds with those that brought the space into existence.

I suggest that the primary intellectual attraction of the newly emerging computing technology was the simple fact that it gave thinkers a way to conceptualize how mind could be implemented in matter, how Descartes’ res cogitans could be realized in res extensa. Computing had been around for a long time, but not computing machines. The computing machines that emerged after World War II promised to be far more powerful and flexible than any that had come before. That is what was attractive, not nascent neoliberalism, though that neoliberalism may have been waiting in the wings and shouting encouragement.

Consider the following thoughts by George Miller, one of the founders of cognitive science (whom Golumbia does discuss). Here are some remarks about a symposium held in September 1956, which he regards as the founding event of cognitive science [1, pp. 142-143]:
The first day, 10 September, was devoted to coding theory, but it is the second day of the symposium that I take to be the moment of conception for cognitive science. The morning began with a paper by Newell and Simon on their ‘logic machine’. The second paper was from IBM: Nat Rochester and collaborators had used the largest computer then available (an IBM 704 with a 2048-word core memory) to test Donald Hebb’s neuropsychological theory of cell assemblies. Victor Yngve then gave a talk on the statistical analysis of gaps and its relation to syntax.

Noam Chomsky’s contribution used information theory as a foil for a public exposition of transformational generative grammar. [...] His 1956 paper contained the ideas that he expanded a year later in his monograph, Syntactic Structures [9], which initiated a cognitive revolution in theoretical linguistics.

To complete the second day, G.C. Szikali described some experiments on the speed of perceptual recognition, I talked about how we avoid the bottleneck created by our limited short-term memory, and Swets and Birdsall explained the significance of signal-detection theory for perceptual recognition. The symposium concluded on the following day.

I left the symposium with a conviction, more intuitive than rational, that experimental psychology, theoretical linguistics, and the computer simulation of cognitive processes were all pieces from a larger whole and that the future would see a progressive elaboration and coordination of their shared concerns.
As Miller said earlier in the article, “the cognitive counter-revolution in psychology brought the mind back into experimental psychology” (p. 142).

First, I call your attention to the variety of topics under discussion – a ‘logic machine’, cell assemblies, statistical analysis of language (most un-Chomsky-like), perceptual recognition, short-term memory, and, yes, Chomsky on syntax. Why is it that Chomsky somehow emerged as the star? I can’t believe it was only or even primarily his rationalism; they were all rationalists of one sort or another. Rather, I suggest it is because he was talking about language, not about gaps, as Yngve was, but about the syntactic tissue that holds language together. Could it be that it is language that holds the human mind together? The answer may not obviously be “yes”, but it is worth exploring, no? And that is what made Chomsky so central.

It was not so much his specific approach to language as the fact that he had a coherent and systematic approach to language. It looked like he knew how to write a grammar of thought. And that promised a new way of thinking about the mind, not as a bundle of sensations, percepts, thoughts, and (yes, in time) feelings and motives, but as a system. Computation created a bridge between mind and matter, and Chomsky showed what you could drive across that bridge, even if he was ambivalent about making the trip himself.

Finally, let me repeat that computing gave investigators a way to realize mind in matter. That is obvious enough. But given Golumbia’s confusion about the distinction between the abstract theory of computation (which Chomsky used in his investigation of syntax) and real computation taking place in some device (whether artificial, a computer, or organic, a nervous system), perhaps he simply could not see that this was enormously important in and of itself, important enough to overshadow portents of neoliberalism. That is, without the computational bridge between mind and matter, the neoliberal resonance would not have been sufficient to bring Chomsky’s ideas to the attention of a wide audience across linguistics, psychology, computer science, and philosophy. And once those ideas did take hold, the neoliberal resonance was not strong enough to prevent alternative approaches from prospering. As I write this, at a moment when the 2016 American presidential race is between a neoliberal, Hillary Clinton, and Donald Trump (I don’t know what he is, but he’s wealthy), it is by no means clear just which of Chomsky’s ideas will remain active in the next intellectual generation.

Reference

[1] George Miller, The cognitive revolution: a historical perspective, Trends in Cognitive Sciences, Vol. 7, No. 3, March 2003, pp. 141-144.
