Tuesday, August 5, 2014

Reading Macroanalysis 1: Framing: Hyperobjects, Objectification, and Evolution

Matthew L. Jockers. Macroanalysis: Digital Methods & Literary History. University of Illinois Press, 2013. x + 192 pp. ISBN 978-0-252-07907-8
The book arrived midway through last week, when I hadn’t even finished reading Tim Morton’s Hyperobjects, much less finished blogging about it. But that didn’t stop me from giving Macroanalysis a look-through: the table of contents, some of the figures, a bit of reading here and there. I ended up reading Chapter 9, “Influence”, first; I’d read Matt Wilkins’ review in the LA Review of Books:
It’s a nifty approach that produces a fascinatingly opaque result: Tristram Shandy, Laurence Sterne’s famously odd 18th-century bildungsroman, is judged to be the most influential member of the collection, followed by George Gissing’s unremarkable The Whirlpool (1897) and Benjamin Disraeli’s decidedly minor romance Venetia (1837). If you can make sense of this result, you’re ahead of Jockers himself, who more or less throws up his hands and ends both the chapter and the analytical portion of the book a paragraph later.
Would I be able to make sense of those results? thought I to myself as I read. Nope, I couldn’t. Better luck next time.

I then read through the first four chapters, gathered together as Part I: Foundation (“Influence” ended Part II: Analysis). OK, I’ll go along with most of that, but... I skipped over Chapter 5, “Metadata”, and dug into Chapter 6, “Style”. Hmmm, thought I to myself, if you recast the analysis in terms of cultural evolution, you might be able to frame an argument for the autonomous aesthetic realm, though Jockers frames the discussion in terms of constraints on the author. And when I went back to the “Metadata” chapter, wouldn’t you know it, I saw another opening for an evolutionary formulation.

And that’s about where I am now. I’ve read the short coda, “Orphans”, where Jockers expresses ambivalence about cultural evolution, and I’ve got two substantive chapters to go, “Nationality” (ch. 7) and “Theme” (ch. 8). But I really need to get blogging.

As the title suggests, this post is preliminary. I’m not going to say much about Jockers’ specific arguments. Rather, I want to do a bit of framing.

The Scope of the Humanities

One can hardly imagine two such different examples of contemporary humanistic thought as Macroanalysis and last week’s book, Tim Morton’s Hyperobjects. Morton is working within an Anglophone Continental discourse with roots in Hegel, Heidegger, and post-structuralist philosophy and Theory while Jockers’ methodology is grounded in humanistic computing, corpus linguistics, and social science. If you were to cross-match their bibliographies, you wouldn’t find many texts in common. Further, while Morton is trained as a literary critic, and has a lit crit job at Rice, Hyperobjects is not literary criticism. It’s philosophy and cultural criticism. Jockers is all literature.

Such is the contemporary scope of the humanities.

And yet there is a deep connection between these two works. Morton coined the term “hyperobjects” to designate very strange large-scale objects that have been broken – as in taming a wild horse – to human thought largely through the use of computational methods of one kind or another. Global warming is his paradigm example. We wouldn’t know about global warming without sophisticated data-hungry computational models. We don’t need the models to feel the heat, or flee from the water driven into the basement by Hurricane Sandy, but we need those models to link those proximal phenomena to the larger causal structure of the world.

Similarly, we don’t need massive data crunching to interpret Robinson Crusoe or David Copperfield. But what of the tens of thousands of non-canonical texts that constitute literary culture? How do we deal with them? Until a decade or so ago we didn’t. But now we’ve got thousands of digital texts and we have software that lets us analyze them thousands at a time. Whole bodies of texts have now become thinkable as coherent and analyzable objects. Given that they are massively distributed in time and space, they fit the criteria for Morton’s hyperobjects.

So, global warming is a hyperobject. But so is the collection of 3,346 19th-century British texts that Jockers works on in the “Theme” chapter. Or is the ‘collective mentality’ that produced those texts the hyperobject?

Objectification and Analysis

Now, to the line, Hartman’s Line. Hartman, of course, is Geoffrey Hartman, and the line is the one he draws between reading and semiotics/structuralism in this passage, among others, from The Fate of Reading (p. 271):
I wonder, finally, whether the very concept of reading is not in jeopardy. Pedagogically, of course, we still respond to those who call for improved reading skills; but to describe most semiological or structural analyses of poetry as a “reading” extends the term almost beyond recognition.
Jonathan Culler drew the same line in Structuralist Poetics (1975) when he asserted that linguistics is not hermeneutic. That line, I submit, was about objectification. Linguistics objectifies language; structuralism and semiotics, at least in their more technical incarnations, objectify poetry. Objectification gets in the way of reading, of interpretation.

It goes without saying that macroanalysis objectifies its subject. It could hardly do otherwise, for objectification is a necessary precondition for computational analysis. As such, we would expect Hartman to exclude macroanalysis from the kingdom of reading. Even conventional attempts to generalize over long stretches of literary history – Ian Watt’s The Rise of the Novel is Jockers’ example – stay firmly on the near side of Hartman’s line (p. 25):

This is simply close reading, selective sampling, of multiple “cases”; individual texts are digested, and then generalizations are drawn. It remains a largely qualitative approach.

Jockers is quite clear that he does not offer macroanalysis as a replacement for traditional close reading, which “will continue to reveal nuggets” (p. 9).

But, and here’s my point, traditional close reading is not the only way to go after nuggets. When Hartman was drawing his line, he was doing so with respect to analytical techniques that worked close to the text, but did so through explicit objectification. Computational analysis of a corpus of texts requires objectification, but objectification does not require computation.

If I am comfortable with digital criticism, that is in part because I long ago saw the need to objectify literary texts in the service of knowledge, knowledge of a different kind than that sought through hermeneutic methods. Thus I advocate objectification at all scales of inquiry, micro and meso as well as macro, a point I’ve argued in my brief for Computational Historicism.

One last point before taking up cultural evolution. Back in the 1950s Monroe Beardsley argued that any form of art criticism involves four activities: description, analysis, interpretation, and evaluation – a formulation which has proved to be quite popular: google “description, analysis, interpretation, evaluation” and see what you get. In traditional literary criticism description is informal. You quote passages, paraphrase them, summarize plots, apply genre classifications, and the like. Description has received little or no theoretical attention; such attention has mostly been devoted to interpretation.

In Macroanalysis Jockers devotes an enormous amount of time and effort to arriving at (mere) descriptions, which more often than not take the form of a chart or diagram. The texts must, of course, be cleaned and normalized, and the metadata extracted. There may be a tagging operation. Algorithms have to be developed, implemented, and tested. Those descriptions thus float on a large body of collective effort and technology.
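Just to make that labor concrete, here is a minimal sketch of the sort of cleaning-and-counting step such work depends on. This is not Jockers’ pipeline, and the corpus/ directory of plain-text files is purely hypothetical; the point is only that the “description” begins as something this mechanical, a reduction of each text to bare word frequencies:

```python
import re
from collections import Counter
from pathlib import Path

def normalize(raw_text):
    """Lowercase the text and strip digits and punctuation -- the 'cleaning' step."""
    text = re.sub(r"[^a-z\s]", " ", raw_text.lower())
    return re.sub(r"\s+", " ", text).strip()

def describe(path):
    """Reduce one text to relative word frequencies -- a bare, mechanical 'description'."""
    tokens = normalize(Path(path).read_text(encoding="utf-8")).split()
    counts = Counter(tokens)
    return {word: n / len(tokens) for word, n in counts.items()}

# Describe every plain-text file in a (hypothetical) corpus directory.
corpus = {p.name: describe(p) for p in Path("corpus").glob("*.txt")}
```

Everything else – the metadata, the tagging, the tested algorithms – sits on top of steps like these.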

It’s a new intellectual world. But it’s the objectification that’s important. It’s objectification that defines this world, and it’s objectification that puts this work on the far side of Hartman’s line. The computer is only a means of objectification, and it is not the only possible means.

Evolution

In his coda Jockers notes “Evolution is the word I am drawn to, and it is a word that I must ultimately eschew” (p. 171). In glossing his reservations Jockers quotes the Wikipedia entry for a book (Darwin’s Dangerous Idea) by the philosopher Daniel Dennett (pp. 171-172):
“Darwin’s discovery was that the generation of life worked algorithmically, that processes behind it work in such a way that given these processes the results that they tend toward must be so” (Wikipedia 2011a, emphasis added). The generation of life is algorithmic. What if the generation of literature were also so? Given a certain set of environmental factors–gender, nationality, genre–a certain literary result may be reliably predicted; many results may be inevitable. This is another dangerous idea, perhaps a crazy one.
If cultural evolution should work like that, well then Jockers is right to be worried, for it would mean that Edward Said’s late-career plea for an autonomous aesthetic realm was in vain (“Globalizing Literary Study”, PMLA, Vol. 116, No. 1, 2001, pp. 64-68). Literary culture would be ‘determined’ by external forces (the forces of production?) and that would be that. QED.

But is that really so? Is evolution algorithmic?

That depends on what you mean by algorithm. In the strictest sense an algorithm is an effective computational procedure that produces a result in a finite number of steps. As far as I can tell, biological evolution is not algorithmic in that sense. Just as “deconstruction” has come to have a popular sense that’s only vaguely related to the work of Derrida, “algorithm” has come to have a popular sense that’s not tightly coupled with the sense it has in mathematics and computer science. In this popular sense an algorithm is simply a procedure that proceeds according to rules, or something.
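If you want that strict sense pinned down, the textbook example is Euclid’s procedure for the greatest common divisor: a fixed rule, a finite number of steps, and exactly the same result on every run, on every machine. The snippet below is offered only as an illustration of that strict sense, nothing more:

```python
def gcd(a, b):
    """Euclid's algorithm: an effective procedure guaranteed to halt in finitely many steps."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # 21, every single time
```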

As a thought experiment, consider the Cretaceous–Paleogene extinction event, which wiped out some three quarters of all plant and animal species. It is believed to have been triggered by the impact of a massive comet or asteroid. Let’s turn time back to the moment of impact and start things running again. What are the chances that the biosphere will evolve in the rerun in exactly the same way it originally did? I’d say they’re very close to zero. What are the chances that humankind would evolve once again? There’s no way to tell. It’s a meaningless question.

I suppose one could object that this particular thought experiment is too massively open-ended. Perhaps so. But the point isn’t going to disappear if we choose a more circumscribed case. That biological evolution is lawful doesn’t make it algorithmic in any strict sense, any more than the regularity of planetary motion means that the planets are executing “Newtonian algorithms.”
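A toy simulation makes the contrast vivid. The sketch below is a crude drift model, not a claim about the biosphere: the rule governing each generation is fixed and lawful, yet “replaying the tape” from identical starting conditions ends differently on every run.

```python
import random

def rerun(generations=500, pop_size=100, start_freq=0.5):
    """One replay of the tape: a variant's frequency wandering under pure drift."""
    freq = start_freq
    for _ in range(generations):
        # The rule is fixed; the outcome of each generation is not.
        carriers = sum(random.random() < freq for _ in range(pop_size))
        freq = carriers / pop_size
        if freq in (0.0, 1.0):  # the variant is lost or takes over
            break
    return freq

print([rerun() for _ in range(5)])  # same start, five different endings
```

The process it mimics is lawful, but nothing about the rule fixes the ending in advance.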

Biological evolution is an open-ended process that is constrained and lawful, but it is not algorithmic in any useful sense. Nor is cultural evolution. The concept of cultural evolution gives us a way of thinking about how populations of people circulate populations of texts among themselves and thereby regulate their interactions with one another. No more, no less.
