Back when I threw in my lot with cognitivism in the early 1970s, I did so because I was excited by the idea of computation. That's what animated the early years of the “cognitive revolution”: computation made the mind thinkable in a way it hadn't been in the Dark Ages of Behaviorism. But by the 80s you could get on board with the cognitive revolution without really having to think about computation.
Once that had happened, psychologists and others were happy to think about the mind and leave computation sitting off to the side. Among other things, that's the land of cognitive metaphor, mirror neurons, and theory of mind.
By the mid-90s literary critics were getting interested in cognitive science, but with nary a hint of computation. A lot of cognitive criticism looks like old wine in new barrels. The same goes for most literary Darwinism. All that's new are the tropes, the vocabulary.
As far as I can tell, digital criticism is the only game that's producing anything really new. All of a sudden charts and diagrams are central objects of thought. They're not mere illustrations; they're the ideas themselves. And some folks have begun to ask: What IS computation, anyhow? When a dyed-in-the-wool humanist asks that question, not out of romantic Luddite opposition but out of genuine interest and open-ended curiosity, THAT's going to lead somewhere.
If you think of a singularity as a moment when change is no longer merely moving away from the old but now has the possibility of moving toward an as-yet-unknown something new, then that's where we are NOW.