Monday, December 31, 2012
Writing in The New Atlantis, Matthew Rees reconsiders Kuhn's The Structure of Scientific Revolutions on its 50th anniversary. Here are a few passages directed specifically at the social sciences, which were not much on Kuhn's mind when he wrote the book. I'm not sure I agree with Rees, but he makes interesting points.
While the physical sciences were the most prominent in the public mind when Kuhn was writing Structure in the early 1960s, today biology is in ascendance. It is striking, as Hacking notes in his introductory essay, that Kuhn does not explore whether Darwin’s revolution fits within his thesis. It is far from clear that Kuhn’s thesis can adequately account for not only Darwin’s revolution but also cell theory, Mendelian or molecular genetics, or many of the other major developments in the history of biology.... But in the half century since Kuhn wrote his book, biology has taken the place of physics as the dominant science — and so in the social sciences, the conception of society as a machine has gone out of vogue. Social scientists have increasingly turned to biology and ecology for possible analogies on which to build their social theories; organisms are supplanting machines as the guiding metaphor for social life.
Here come our friends IS and OUGHT:
While I'm obviously not on hiatus at this point, I'm still working with a rental computer. It's taking time to get data transferred from my backup drive to the new machine. The problem seems to be in my email contacts. For some reason they weren't able to recover them from the back-up, so they're trying to recover them from the machine that died.
Ain't technology wonderful?
and I'm not at all a Zep fan. But you can't deny the power of this performance, which isn't by Led Zeppelin, but was performed for them at the Kennedy Center.
Which reminds me: What about aging rockers? I heard that Mick Jagger tore it up at the Garden a couple of weeks ago. Maybe we need to rethink old age.
Sunday, December 30, 2012
So says drummer Duffy Jackson, son of bassist Chubby Jackson. Sounds like a real sweet cat. Listen to the interview. It's a pleasure.
This comment is undoubtedly naïve, a major sin in the contemporary philosophical environment. It is what it is.
The sin of correlationism is generally laid at the feet of Immanuel Kant and his Copernican revolution. Why not implicate Descartes?
He’s the one who divided the cosmos into two utterly different kinds of substance, res cogitans and res extensa. That’s where the problem lies, no? Given that these two substances are utterly different, and that our minds are constituted by one of them, the relationship between mind and the world becomes problematic. Descartes invoked God to handle that problem.
If you jettison God, then you’ve really got a problem. Correlationism is one solution. But it’s not the only one.
Why not jettison the division of the cosmos into two utterly different substances? What happens then? The relationship between humankind and the rest of the world becomes just another relationship. It may be a particularly complex one, but that’s OK. Complexity is not the same as Utterly Different.
In this regime correlationism simply disappears. Hence, you don’t even have to take the trouble to deny it or argue against it.
Back in August of 2011 I published a relatively short document outlining my sense of where literary studies should go. I’ve now revised that document, retaining the four-part structure of my assessment, but somewhat revising my sense of what those four parts are. Then I talked of 1) description, 2) the newer psychologies, 3) object-oriented ontology, and 4) digital humanities. I’ve retained 1 and 4, but 2 and 3 have become naturalist criticism and ethical criticism, respectively. This change, of course, reflects the work I’ve recently done on a pluralist metaphysics.
This revision has the merit, I believe, of being a bit closer to what is actually happening in literary studies as naturalist criticism isn’t so restrictive a rubric as the newer psychologies and ethical criticism isn’t nearly so restrictive as object-oriented ontology. But the emphasis remains as it was then:
The primary texts constitute the treasure we study. Full and accurate descriptions of those texts are the key to that treasure. Everything else is built on those descriptions.
1) Description: We need to develop richer descriptions of the texts we study. I’ve blogged about this here and there, and I note that some folks at Arcade seem to be thinking along these lines. But mostly what I’ve been doing is working at honing my descriptive skills, with texts and with films. The postscript about a handbook for Heart of Darkness is as close as I’ve come to an explicit justification for description, though my recent post, Corpus Linguistics, Literary Studies, and Description, hints at what a fuller argument might entail.
Saturday, December 29, 2012
One of my main hobbyhorses these days is description. Literary studies has to get a lot more sophisticated about description, which is mostly taken for granted and so is not done very rigorously. There isn’t even a sense that there’s something there to be rigorous about. Perhaps corpus linguistics is a way to open up that conversation.
The crucial insight is this: What makes a statement descriptive IS NOT how one arrives at it, but the role it plays in the larger intellectual enterprise.
Back in the 1950s there was this notion that the process of aesthetic criticism took the form of a pipeline that started with description, moved on to analysis, then interpretation and finally evaluation. Academic literary practice simply dropped evaluation altogether and concentrated its efforts on interpretation. There were attempts to side-step the difficulties of interpretation by asserting that one is simply describing what’s there. To this Stanley Fish has replied (“What Makes an Interpretation Acceptable?” in Is There a Text in This Class?, Harvard 1980, p. 353):
The basic gesture then, is to disavow interpretation in favor of simply presenting the text: but it actually is a gesture in which one set of interpretive principles is replaced by another that happens to claim for itself the virtue of not being an interpretation at all.
And that takes care of that.
Except that it doesn’t. Fish is correct in asserting that there’s no such thing as a theory-free description. Literary texts are rich and complicated objects. When the critic picks this or that feature for discussion, those choices are made with something in mind. They aren’t innocent.
But, as Michael Bérubé has pointed out in “There is Nothing Inside the Text, or, Why No One’s Heard of Wolfgang Iser” (in Gary Olson and Lynn Worsham, eds. Postmodern Sophistries, SUNY Press 2004, pp. 11-26) there is interpretation and there is interpretation and they’re not alike. The process by which the mind’s eye makes out letters and punctuation marks from ink smudges is interpretive, for example, but it’s rather different from throwing Marx and Freud at a text and coming up with meaning.
Thus I take it that the existence of some kind of interpretive component in any description need not imply that it is impossible to descriptively carve literary texts at their joints. And that’s one of the things that I want from description, to carve texts at their joints.
Of course, one has to know how to do that. And THAT, it would seem, is far from obvious.
Friday, December 28, 2012
This is a bonsai tree I photographed at Longwood Gardens outside Philadelphia:
The label for each such tree noted the date on which training began.
That’s what interests me, the use of the word “training.” One would not say that a blacksmith trains iron to become horseshoes, or a knife blade, or hardware fittings for doors and cabinets. Metal is not trainable in the sense that plants, animals, and humans are.
This usage seems most common for humans and animals, perhaps because training plants is not so common. Of myself I can say that I was trained by David Hays, but I would not say that I was trained by the other graduate faculty with whom I studied in graduate school. It is true that they trained me in a general way, but I worked more closely with Hays than with any of the others and acquired both a richer and a more specific set of skills from him. There are aspects of my work that are identifiably linked to David Hays, most obviously my work in cognitive networks.
When I first started looking into object-oriented ontology I found all the talk of being (sometimes capitalized as Being) a bit peculiar, and this despite the fact that I’d read a fair amount of such talk in my youth and didn’t find it peculiar then. What had happened since then that made such ordinary philosophical discourse seem odd?
And correlation, as in correlationism, that really had me spinning. I read the words and made sense of them one at a time, but the sentences didn’t hold together for me. I couldn’t grasp what was being talked about. I didn’t resonate. It just stayed there, like a lump of indigestible meat.
While talk of being and correlationism no longer seems strange, I’m still not sure what’s being talked about. I’m not even sure that the talkers themselves know. Consider this passage from Alexander Galloway’s recent cri de coeur in Critical Inquiry, The Poverty of Philosophy: Realism and Post-Fordism (vol. 39, Winter 2013, p. 354):
For Meillassoux correlationism means that knowledge of the world is always the result of a correlation between subject and object. “By ‘correlation’ we mean the idea according to which we only ever have access to the correlation between thinking and being, and never to either term considered apart from the other,” Meillassoux writes. Under the system of correlationism, subjectivity and objectivity are forever bound together.
Meillassoux’s statement naively reads as though there is being somewhere out there, like kumquats, x-rays, and elephants, and there is thinking, again somewhere out there, like galaxies, frogs, and geodes. And those things out there, being and thinking, they are terms that cannot be separately considered. What can that possibly mean?
From Walt Disney's obituary in The New York Times (December 16, 1966):
From Harvard and Yale, this stocky, industrious man who had never graduated from high school received honorary degrees. He was honored by Yale the same day as it honored Thomas Mann, the Nobel Prize-winning novelist. Prof. William Lyon Phelps of Yale said of Mr. Disney:
"He has accomplished something that has defied all the efforts and experiments of the laboratories in zoology and biology. He has given animals souls."
Presumably Prof. Phelps said that on the day Disney received his honorary degree, which, I believe, would have been in the late 1930s. That was long before animals and our attitudes toward them had become a focal concern in the humanities. Which means that his assertion about Disney didn't have THAT particular kind of weight.
I rather imagine that what he had in mind is simply that Disney (among others) made cartoons about animals that walked about on their hind legs and talked, as though they were humans. What I'm wondering is this: What's the relationship between those cartoons, then, and the present existence of animal studies?
Animal studies didn't come from nothing. It has had to draw on existing cultural resources, both immediately and directly, and indirectly as well. I'm thinking that those cartoons are among the most important of the indirect cultural resources on which animal studies draws. Without those cartoons suffused throughout the society, animal studies would be a much harder sell.
Thursday, December 27, 2012
Phil Freeman of Burning Ambulance has noted the death of Fontella Bass, RnB singer and wife of Lester Bowie, avant-garde trumpeter with the Art Ensemble of Chicago. She sang "Theme De Yoyo" on one of their earliest and best-known albums, Les Stances à Sophie, which was soundtrack music for a movie of the same name. Here's a YouTube clip from the movie:
I haven't got the foggiest idea what's going on in the movie, which I've never seen. But that album hit me like a ton of bricks when I first heard it back in, say, 1969 (when it was released) or 1970, and I've still got the vinyl. I could hear the funk in the bass line, but it was jittery in a way that funk and RnB are not. And the horn solos, wonderful!
A quick check of YouTube shows that others have covered the tune. Very interesting. It's not often that anyone covers avant-garde/free jazz tunes. But then this wasn't an ordinary free jazz tune. There is jazz that is free in the sense that it rigorously avoids recognizable rhythm, melody, and harmony. It CAN be fun to play, though I'm not so fond of listening to it. The Art Ensemble of Chicago (AEOC) was free even of THAT stricture. And that freedom left them room to make ordinary sense, or at least to float it on an extraordinary foundation.
The jitter in that bass line was avant-garde. But the core was pure funk. And so with Fontella Bass's vocal, pure RnB. But her ability to negotiate it over that bass line, pure freedom. A remarkable performance.
She's at it again, Nina Paley. She's been working on a Biblical epic under the working title Seder-Masochism. And she's working on the last part first (the last shall be first?). Here's the final scene:
While this little bloodbath is about one particular and relatively small piece of land in a particular place, a conflict that has been ongoing for thousands of years, one might well be tempted to see it as both metaphor and metonymy for all nationalist land claims.
I learned about Piaget as an undergraduate at Johns Hopkins. He had a profound influence on me. The idea of learning how the mind works by studying how it develops had intuitive appeal, and Piaget’s systematic exploration and exposition of mental mechanisms gave a sense that the mind was not just an inchoate box full of processes and miasmatic desires but that there was a complex and highly structured “device” at work.
Furthermore, Piaget 1) considered himself a structuralist and wrote a little book about it, and 2) quite explicitly argued that the mind constructs reality. He even addressed himself to the history of the mind, under the heading of genetic epistemology, through analysis of the historical development of mathematical and causal concepts.
Even then, it was apparent that, if he was a structuralist, it was a different kind of structuralism from, say, that of Lévi-Strauss and Roland Barthes. However much they were interested in the mind, Piaget had much more to say about it than they did. He did not see it as a profusion of signs variously chasing, eluding, and opposing one another. And, alas, his ideas were not of much use for critique. And THAT, I suspect, is why he never really made the team for humanist discourse, though a large, ingenious, and industrious body of psychologists has been busily at work extending, amending, revising, and critiquing his work.
Not that I think that a knowledge of Piaget alone would plug the hole I see in Latour’s thinking and that of his students, the object-oriented philosophers. But such knowledge would at least give them a sense of “thickness” to the mind, a sense that there’s something more than a bunch of signs playing hide-and-go-seek.
Here's a screenshot from the Ave Maria episode of Disney's Fantasia:
Look at those trees, and their reflection in the water. Now look at the trees, and their reflections, in this photo I just took at Longwood Gardens, once a DuPont estate, near Philadelphia, Pennsylvania:
The general resemblance is remarkable, no?
Well, interesting, but not remarkable. The general resemblance between certain forests of tall trees and the vaulted ceilings of gothic cathedrals was obvious to the men who designed those cathedrals. THAT's why they designed them that way, to embody that similarity. When Pierre DuPont's gardeners created the gardens in the first quarter of the 20th century they certainly would have had those cathedrals in mind. And Disney's artists, particularly concept artist Kay Nielsen, certainly had the same thing in mind when they designed the Ave Maria episode in the late 1930s.
Tuesday, December 25, 2012
I’m thinking about three ideas for the introduction to the PDF of the main argument in my pluralism series:
- Philosophy cannot provide the foundations for the naturalist study of anything. The specialized disciplines must necessarily be responsible for their own foundations.
- The naturalist study of human kind necessarily includes an account of why humans seek to live an ethical life, but it cannot itself provide an ethics. The metaphysician must, of course, take note of this in considering the ways of Being. I discuss this in my argument under the heading of Ethical Criticism and Unity of Being.
- Philosophy can provide the foundations for ethical discussions. Though it should feel free to call on any of the specialized human sciences for support and insight, it cannot base its arguments on them for the well-known reason that one cannot derive an OUGHT from an IS.
I suspect, maybe even fear, that the relationships between these three are tricky. But that’s OK.
Monday, December 24, 2012
Levi Bryant quotes from Michel Serres, Conversations on Science, Culture, and Time (p. 14):
Either science must develop its own intrinsic epistemology, in which case it is a question of science and not of epistemology, or else it’s a matter of external annotation– at best redundant and useless, at worst a commentary or even publicity.
And (p. 29):
Epistemology requires one to learn science in order to commentate it badly, or worse, in order to recopy it. Scientists themselves are better able to reflect on their material than the best epistemologists in the world– or at least more inventively.
He further notes that "All these things are still seen everywhere today." OK. So why does he so consistently "commentate it badly, or worse, in order to recopy it"? Is it because he summarizes science in the name of ontology and he feels he can get it wrong and it won't matter?
Sunday, December 23, 2012
The audio commentary to the 60th Anniversary Edition of Fantasia has many comments by Walt Disney, often in his own voice (rather than someone reading something he said). Of the leaves in the final segment of The Nutcracker Suite, Disney said this (emphasis mine):
Have perfect control of the leaf. Make the leaf do all the movements we want it to do. I don’t like it where it gets down and dances; we’re limiting the leaf to what a human being can do. That’s another thing. We could make a comedy out of this, but I don’t think we should. Take a floating leaf and the shapes it might assume. Like when the leaf floats down and lands on the ground and the movement of the wind. You’ve seen them in a high wind. Try to take the natural movement of the leaves and things like that being tossed around in the wind. A ballet effect.
I've noted before on New Savanna that animation seems an inherently object-oriented medium. Everything had to be drawn by hand, sticks, stones, apples, cats, and people alike. And anything and everything could be and often was brought to life.
Friday, December 21, 2012
Can 140 years of Tasmanian tradition be given new life on the West Bank of the Hudson River?
Every year for the past dozen or so years Tony Hicks has put together a brass band to play Christmas tunes, first at the Hamilton Park Ale House, and then, when Maggie opened her own place on Newark Street, Skinner’s Loft, Tony booked us there. It was a fun gig. Around 6:30 we’d line up outside on the street and play for half an hour or so. Then we’d come inside, have a beer or two, and play a couple of sets in the upstairs dining room. The staff would wear funny Christmas hats and people would sing along with the band.
Good holiday cheer.
This year, alas, for whatever reason, Maggie decided not to do it. We decided, on the contrary, that we’d give her a freebie. Not the whole gig, but outdoors on the sidewalk, we’d do that.
So, after a good half-hour devoted to finding a parking space I enter Skinner’s Loft and see Tony and Ed, another trumpet player, sitting at the bar. I join them and Tony starts telling us about the Latrobe Federal Brass Band, back in Tasmania, where he’s from. Along about the time he gets to telling us about a particularly opinionated character named Scudgy Clayton we decide it’s time to play.
So we go outside, set up our music stands, break out our horns—Ed on trumpet, Tony on euphonium (a $3000 horn he got for $50 in a pawn shop), and me on trumpet—and start playing, Hark the Herald Angels, Jingle Bells, and so forth. Before you know it an eight-year-old Vietnamese kid lays a twenty on Ed’s music stand.
Whoa! That never happened before. So we play some more while dreaming of mortgage payments and new shoes and before you know it, another dollar, and another, a quarter, and by the time we’re done, $29.25. All unexpected.
Nancy Demerdash, "Consuming Revolution: Ethics, Art, and Ambivalence in the Arab Spring," New Middle Eastern Studies, December 3, 2012.
First paragraph of the section, Graffiti, Memory and the "Political Street": The Production and Reclamation of Public Space:
Artists across the region are also taking their craft to the street, the very locus of resistance. Of course, there is a long and sustained tradition of graffiti and mural arts in the Middle East, mainly in Palestine and the Occupied Territories. But since the revolutions, these public art forms have exploded in Tunisia, Egypt, Libya and to a far lesser degree in Syria. Recognizing the Lefebvreian social and political production of urban space and its constant reconfigurations, artists identify with the urban marginals who continually negotiate their integration with and contestations to the disintegrating systems of state and bureaucratic power. Meaning, for these urban artists and everyday locals alike, is constituted from lived experiences within the spaces of revolution. Street artists have come to inscribe on public walls past memories and memorials to those struggles and lives lost in the revolutions. But for the general public, the spaces on these walls have acquired profound collective and personal meaning. Murals and graffiti panels embody concrete memories for passersby and neighborhood locals; people gather around the walls, engaging in discussions of what should be represented and how a particular piece moves them or invokes a certain memory. These art forms have transformed public spaces and streets into what Asef Bayat terms the “political street,” signifying “the collective sensibilities, shared feelings, and public judgment of ordinary people in their day-to-day utterances and practices… The Arab Street… should be seen in terms of such expression of collective sentiments in the Arab public sphere.”
Check out the images. They're superb. H/t Alexander Key.
Thursday, December 20, 2012
Steven Kotler writes an op-ed with neuroscientist James Olds. They suggest that shooting guns might be addictive:
Dopamine shows up when we take a risk—and firing a gun is always a risk. It shows up when we encounter something novel, and since guns blow things up, well, that's usually pretty novel. If you're serious about your guns and use them for target practice or hunting, well, that requires pattern recognition and this increases dopamine as well.
Are there direct correlations? Has anyone yet done a PET or MRS scan (the only ways to screen for dopamine in the brain) of people just leaving a firing range? Not that we can tell (though we’ll outline this and a few possible areas of research in a moment). We do know, from copious amounts of video game research, that first person shooter games release dopamine, and this has been linked to everything from learning and rewards to ideas about violence and harm to winning and motivation.
What does all of this really mean? It means that the reason gun violence continues to rise (and the reason gun control legislation remains so hard to pass) is because we are quite literally addicted to our guns.
Seinfeld will nurse a single joke for years, amending, abridging and reworking it incrementally, to get the thing just so. “It’s similar to calligraphy or samurai,” he says. “I want to make cricket cages. You know those Japanese cricket cages? Tiny, with the doors? That’s it for me: solitude and precision, refining a tiny thing for the sake of it.”
On the way to the Gotham Comedy Club for a surprise set:
Seinfeld likes pressure. He describes doing live comedy as “standing against a wall blindfolded, with a cigarette in your mouth, and they’re about to fire.” His objective at Gotham was piecework. “A lot of what I’ll be doing tonight are tiny things in my bits where I’m looking for a little fix, where something isn’t quite smooth,” he said. “A lot of stuff I do out of pure obsessiveness.” One bit began with the observation that “tuxedos are the universal symbol for pulling a fast one.” “That line works,” he said. “But I want to get from there to a point about how the places where you see tuxedos are not honest places — casinos, award shows, beauty pageants, the maitre d’ — all these things feel shady.” He added: “But I’ve been having trouble getting the audience to that. I’m trying to bring that to a punchline.” ... “I have this old ’57 Porsche Speedster, and the way the door closes, I’ll just sit there and listen to the sound of the latch going, cluh-CLICK-click,” Seinfeld said. “That door! I live for that door. Whatever the opposite of planned obsolescence is, that’s what I’m into.”
After the set, which got him a standing ovation:
“I’d say two-thirds of that set was garbage,” he said, matter-of-factly. “Whether it was lines coming out wrong or the rhythm being off.” He said he’d counted “probably eight” jokes that failed to get the kinds of laughs he desired. “There’s different kinds of laughs,” he explained. “It’s like a baseball lineup: this guy’s your power hitter, this guy gets on base, this guy works out walks. If everybody does their job, we’re gonna win.” I told him about the khaki guy’s spit take, and Seinfeld cracked up, calling this “a rare butterfly.” Nevertheless, “there wasn’t one moment where I was where I wanted to be. That was just a workout. I had to get it going again.”
Levi Bryant's singing another variation on the tune "Critique is Over." Sounds like he's digging himself out of a hole, perhaps one of those holes of perpetual withdrawal.
This, of course, is good.
The point is that today we need to find the will to believe a little, to affirm a little, and to commit a little.
One object at a time?
Only where we abandon our foundationalist, obsessional assumptions, our desire to have the truth before we pursue the truth, our intoxication with epistemology, will we be able to move beyond this paralysis.
How about abandoning the idea that the interpretive mode is the only mode of thought a humanist needs? It's all well and good to abandon critique, but without a richer and more robust conceptual tool kit, the affirmative conceptual structures will become tangled in their own verbal complexities. And that will lead to calls for another round of critique to clean up the mess.
No, critique isn't the problem. It's a symptom of the problem. The problem is trying to get too much conceptual mileage out of nothing more than verbal constructs. As I argue in this report on undergraduate education in the human sciences, we need to learn the USE of structural tools from linguists, mathematicians and software engineers and the USE of statistical tools from social and behavioral scientists.
Wednesday, December 19, 2012
Here's the opening paragraph:
It might be thought that classification in the special and historical sciences is occasionally atheoretical, but that in the general sciences, physics and chemistry, it is derived from Theory. But in fact one of the most exemplary cases of empirical classification that led to Theory is in these sciences: the periodic table.
Wilkins is singing my song: description and classification lead to theory. That, of course, is what happened in biology, as Wilkins mentions later. Without three or four CENTURIES of describing and classifying prior to him, Darwin wouldn't have had the basis on which to float his theory of evolution.
Mendeleev even expressly noted that he was taking a Lockean or even operationalist approach:

. . . by investigating and describing what is visible and open to direct observation by the organs of the senses, we may hope to arrive, first at hypotheses, and afterwards at theories, of what has now to be taken as the basis of our investigations. (quoted in Kultgen 1958: 180)

Subsequent to the adoption of the table by chemists, there arose a program to improve and explain the “periodic law”. As Scerri says, once scientists have a classification, they seek an underlying cause of the regularities (as Darwin did).
Literary studies needs to take a cue from biology and chemistry and get our descriptive house in order.
That's us, or rather our impact on the physical disposition of the planet. Here's how it goes:
- the Pleistocene, 2.5 million years ago to 12 thousand years ago, which saw the emergence of humans from clever apes;
- the Holocene, 12 thousand years ago to 1800 AD, from agriculture to industry and the emergence of a loosely integrated world order of human commerce and exchange;
- and now the Anthropocene, 1800 AD to the present and into the future, when industry started pumping CO2 into the atmosphere, little knowing the consequences.
The Anthropocene is mapped out in images of the earth at Globaïa. H/t Tim Morton.
Tuesday, December 18, 2012
We'll get to that fairy tale in a minute, for it embodies a deep truth about living in society. But let's first think about guns. That gun ownership has been such a controversial issue in American politics suggests that it speaks to our sense of who and what we are.
What kind of phenomenon is gun ownership? Obviously, it's a fact about human beings. Some own guns and some do not. The question becomes: Is gun ownership related to other characteristics of a person or not? It might be the case, for example, that gun owners are more likely to have blue eyes than non-gun owners. If that's the case–and there's no reason to believe it is, this is just a hypothetical example–what's that about? Is there a common causal factor behind blue eyes and gun ownership?
Polling data indicates that there IS a relationship between reported political affiliation and gun ownership: Republicans are more likely to own guns than Democrats. This has changed over time: Gun ownership has diminished considerably over the last 40 years among Democrats but NOT Republicans. What's THAT about, and is it correlated with anything else?
In 1973, about 55 percent of Republicans reported having a gun in their household against 45 percent of Democrats, according to the General Social Survey, a biennial poll of American adults.

Gun ownership has declined over the past 40 years — but almost all the decrease has come from Democrats. By 2010, according to the General Social Survey, the gun ownership rate among adults that identified as Democratic had fallen to 22 percent. But it remained at about 50 percent among Republican adults.
The poll makes clear that gun ownership is deeply embedded in political identity, and vice versa. Some other variables, such as whether a voter lives in an urban area, also strongly predict gun ownership. But the differences between the parties remain even after accounting for these characteristics.
But the differences are most apparent in suburban areas. There, 58 percent of Republican voters said there was a gun in their household, against just 27 percent of Democrats.
It seems, furthermore, that "gun ownership rates are inversely correlated with educational attainment." That is, the more education one has, the less likely one is to own a gun. Why?
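For readers who wonder how one would actually check whether gun ownership is "related to other characteristics," here is a minimal sketch of the standard tool, a chi-square test of independence on a 2x2 table of survey counts. The counts below are invented for illustration; they only loosely echo the 2010 percentages quoted above, and none of this is from the General Social Survey itself.

```python
# Sketch: chi-square test of independence for a 2x2 contingency table.
# Rows are party ID, columns are (gun in household, no gun).
# The counts are hypothetical, chosen to mimic the quoted percentages
# (~50% of Republicans, ~22% of Democrats reporting a gun in 2010).

def chi_square_2x2(table):
    """Return the chi-square statistic for a 2x2 table [[a, b], [c, d]]."""
    n = sum(sum(row) for row in table)
    stat = 0.0
    for i, row in enumerate(table):
        for j, cell in enumerate(row):
            # Expected count if ownership were independent of party.
            expected = sum(table[i]) * sum(r[j] for r in table) / n
            stat += (cell - expected) ** 2 / expected
    return stat

counts = [[500, 500],   # Republicans: 500 of 1000 report a gun
          [220, 780]]   # Democrats: 220 of 1000 report a gun

stat = chi_square_2x2(counts)
# With 1 degree of freedom, a statistic above 3.84 is significant
# at the 5% level, i.e. the two variables are very unlikely to be
# independent.
print(round(stat, 1), stat > 3.84)
```

With these made-up counts the statistic comes out far above the 3.84 threshold, which is just a formal way of saying what the quoted poll numbers already suggest: party affiliation and gun ownership are not independent. The same machinery would apply to the blue-eyes hypothetical, or to educational attainment.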
Monday, December 17, 2012
Literary History, the Future: Kemp Malone, Corpus Linguistics, Digital Archaeology, and Cultural Evolution
In scientific prognostication we have a condition analogous to a fact of archery—the farther back you draw your longbow, the farther ahead you can shoot.– Buckminster Fuller
The following remarks are rather speculative in nature, as many of my remarks tend to be. I’m sketching large conclusions on the basis of only a few anecdotes. But those conclusions aren’t really conclusions at all, not in the sense that they are based on arguments presented prior to them. I’ve been thinking about cultural evolution for years, and about the need to apply sophisticated statistical techniques to large bodies of text—really, all the texts we can get, in all languages—by way of investigating cultural evolution.
So it is no surprise that this post arrives at cultural evolution and concludes with remarks on how the human sciences will have to change their institutional ways to support that kind of research. Conceptually, I was there years ago. But now we have a younger generation of scholars who are going down this path, and it is by no means obvious that the profession is ready to support them. Sure, funding is there for “digital humanities,” so deans and department chairs can get funding and score points for successful hires. But you can’t build a new and profound intellectual enterprise on financially-driven institutional gamesmanship alone.
You need a vision, and though I’d like to be proved wrong, I don’t see that vision, certainly not on the web. That’s why I’m writing this post. Consider it a sequel to an article I published back in 1976 with my teacher and mentor, David Hays: Computational Linguistics and the Humanist. This post presupposes the conceptual framework of that article, but neither restates nor endorses its specific visionary recommendations (given in the form of a hypothetical computer program, called Prospero, for simulating the “reading” of texts).
The world has changed since then and in ways neither Hays nor I anticipated. This post reflects those changes and takes as its starting point a recent web discussion about recovering the history of literary studies by using the largely statistical techniques of corpus linguistics in a kind of digital archaeology. But like Tristram Shandy, I approach that starting point indirectly, by way of a digression.
Who’s Kemp Malone?
Back in the ancient days when I was still an undergraduate, and we tied an onion in our belts as was the style at the time, I was at an English Department function at Johns Hopkins when someone pointed to an old man and said, in hushed tones, “that’s Kemp Malone.” Who is Kemp Malone, I thought? From his Wikipedia bio:
Born in an academic family, Kemp Malone graduated from Emory College as it then was in 1907, with the ambition of mastering all the languages that impinged upon the development of Middle English. He spent several years in Germany, Denmark and Iceland. When World War I broke out he served two years in the United States Army and was discharged with the rank of Captain. Malone served as President of the Modern Language Association, and other philological associations ... and was etymology editor of the American College Dictionary, 1947.
Who’d have thought the Modern Language Association was a philological association?
The object-oriented ontologists think of objects as, in a philosophical sense, unbounded. As do I. But we differ in how we conceptualize that difference.
The object-oriented ontologists think of the object as withdrawing. Objects are always withdrawing from one another, hence they are always withdrawing from us. Which means that we, as philosophers, are always chasing after them. No sooner do we lay a philosophical glove on them than they exude a dab of philosophical grease and slip away.
Hence theirs is a flat landscape punctured by the black holes of ever withdrawing objects. They probe the holes but nothing ever comes out. Godot never shows up.
Pluralists think of objects as manifold, a plenitude of presence, abundant. Grasping objects is easy, but full comprehension is all but impossible. As soon as you break off a piece for examination the object fills the void. No matter how many pieces you examine there’s always more to see and touch.
Sunday, December 16, 2012
In casually thinking over the pluralist work, as I’ve been doing this past week, I’ve realized that I can push the exploration one more step without too much work. This post should be brief.
Let’s start with the commonsense distinction between living things and non-living things. By the kinds of arguments Jane Bennett has advanced in Vibrant Matter, that distinction seems questionable as given. Considering the strange world of quantum mechanics and the self-organizing turbulence of complex dynamics and other such things, plain old stuff seems more dynamic, more vibrant, than it seemed to, say, Descartes. What I did when, using more poetry than reason, I declared the universe to be alive was simply to invest life in the whole universe, not simply in the earth’s biosphere. In that formulation, the universe was alive even before life, as we think about it, appeared on earth, or anywhere else (if life has indeed arisen elsewhere).
The universe was, and is, abundant. It thus evolves.
Yet, if the life/non-life distinction isn’t quite what common sense makes it out to be, still, there is a distinction of some sort to be made. There is a difference between the biosphere and, say, the Moon or the Sun. What I want to say is that, when life as we call it arose on earth, the abundance that had made the cosmos as a whole a living thing, had now become invested in (incarnated in) the biosphere considered as a small component of the universe as a whole.
The Big Thing I have in mind, is of course, the one Ian Bogost invoked at the end of his post on object-oriented ontology and politics, though I’ve got rather different ideas about where it’s coming from and where it’s going.
That discussion has flared up once again and, to my mind, by far the most sophisticated discussion is taking place at Terrence Blake’s Agent Swarm. I particularly recommend the post, Badiousian Background to Galloway’s argument vs Dumbing Down of the “Controversy”, with contributions from David Golumbia, Virgilio Rivas, Blake himself, and some remarks from me as well. The discussion of BADIOU’S AND GALLOWAY’S CLONES is not quite so full, but Philip has an insightful comment to the effect that Galloway has merely asserted a bunch of connections each of which must, in fact, be argued. He’s preaching to the choir—there’s a lot of that, of course.
I find Blake’s discussions useful precisely because Blake himself, and his commenters, are familiar with a Continental literature that is now foreign to me, though I studied Continental thought early in my career. In an ideal world I’d read that literature for myself. This is not that ideal world and my time, like theirs, is limited. So it is useful for me to swim in their waters on my terms and thereby establish common themes and ideas arising in very different discourses.
At the same time, I’ve been hanging out in a discussion at Ted Underwood’s digital humanities blog, The Stone and the Shell. Underwood and Andrew Goldstone have just posted a fascinating piece on work they’ve been doing with the corpus of articles in PMLA (Publications of the Modern Language Association), What can topic models of PMLA teach us about the history of literary scholarship? In addition to Goldstone and Underwood, Jonathan Goodwin, Scott Weingart and Matt Wilkens have joined in (me too). To my rather speculative mind they seem to be on the trail of the “memetic” undercurrents of the cultural evolutionary process through which philology split into linguistics and interpretive lit crit after World War II.
For a glimpse into the deep background of that split take a look at Bernard Dionysius Geoghegan, From Information Theory to French Theory: Jakobson, Lévi-Strauss, and the Cybernetic Apparatus, which appeared in Critical Inquiry. Geoghegan looks at the period during and immediately after World War II when Jakobson, Lévi-Strauss and Lacan picked up ideas about information theory and cybernetics from American thinkers at MIT and Bell Labs. THAT line of development leads to deconstruction and post-modernism when it comes back across the Atlantic and crashes into the New Criticism in the middle and late 1960s—think of the 1966 structuralism conference at Johns Hopkins. Galloway and company are playing in and around those waters.
But another stream from those currents hits land in Boston where it becomes Chomskyan linguistics, which in turn drove the expansion of linguistics into a fully autonomous intellectual discipline. And an offshoot from that gave us computational linguistics, from which corpus linguistics emerged in the 1980s and 1990s. That’s where the topic analysis stuff comes from.
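The corpus-linguistics idea at work here can be illustrated with a toy term-frequency trend. The snippets below are invented, not actual PMLA text, and real work of the kind Goldstone and Underwood describe uses topic models (e.g. LDA) over thousands of articles; this is only a minimal sketch of tracking a term's relative frequency over time.

```python
from collections import Counter

def term_shares(corpus, terms):
    """For each (year, text) pair, compute the relative frequency of each
    term in that year's text. A toy stand-in for tracking disciplinary
    vocabulary (philology vs. linguistics) across a journal's history."""
    by_year = {}
    for year, text in corpus:
        counts = Counter(text.lower().split())
        total = sum(counts.values())
        by_year[year] = {t: counts[t] / total for t in terms}
    return by_year

# Invented snippets, not actual PMLA prose.
corpus = [
    (1930, "philology philology grammar of the old texts"),
    (1970, "criticism interpretation theory of the modern text linguistics"),
]
shares = term_shares(corpus, ["philology", "linguistics"])
print(shares[1930]["philology"] > shares[1970]["philology"])  # True
```

Even this crude counting shows the shape of the method: vocabulary shifts across decades become measurable trends, which topic models then group into co-occurring clusters.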
Saturday, December 15, 2012
In the wake of the Connecticut shootings Nate Silver (NYTimes) has an intersting column about America's "conversation" on guns as it is reported in the media. Here's the core finding:
If the news coverage is any guide, there has been a change of tone in recent years in the public conversation about guns. The two-word phrase “gun control” is being used considerably less often than it was 10 or 20 years ago. But the phrase “gun rights” is being used more often. And the Second Amendment to the United States Constitution is being invoked more frequently in the discussion.
After some interesting discussion of the data:
The change in rhetoric may reflect the increasing polarization in the debate over gun policy. “Gun control,” a relatively neutral term, has been used less and less often. But more politically charged phrases, like “gun violence” and “gun rights,” have become more common. Those who advocate greater restrictions on gun ownership may have determined that their most persuasive argument is to talk about the consequences of increased access to guns ... For opponents of stricter gun laws, the debate has increasingly become one about Constitutional protections... Their strategy may have been working. The polling evidence suggests that the public has gone from tending to back stricter gun control policies to a more ambiguous position in recent years. There may be some voters who think that the Constitution provides broad latitude to own and carry guns – even if the consequences can sometimes be tragic.
Friday, December 14, 2012
I’m sure you’ve all heard of those large lizards lurking in the New York City subway tunnels. You know, Amy gets a cute little pet lizard, feeds it for several months—lettuce, grubs, flies, whatever lizards eat—and it grows, and grows, and gets a little too big. So Mommy and Daddy flush it and it ends up Underground. Where it grows and grows until it becomes So Huge it becomes the Stuff of Legend.
In this story the role of The Lizard is played by graffiti, which, as is well known, also came up from the subways of New York City and has become the stuff of legend. Writing on the walls all over the world, six continents no less—but penguins don’t do graffiti. Too cold down there. Survival takes 110% of their time and effort.
As for Corporate Largesse, that’s played by Johnson & Johnson, the bandage and baby lotion company. At least that’s how I think of them, because that’s what I remember from my childhood, in which there was no graffiti—too early in time. But there were lizards, small ones, living near the creek over there, that one too.
Thursday, December 13, 2012
While this is not the last post in my pluralism series—for I have to write an introduction to the collected series—it’s the last substantive post. While I do not introduce any new conceptions, I do make some adjustments here and there. First I take a look at the relationship between philosophy and the other disciplines: What’s philosophy doing that they are not and cannot?
Then I review the entire system by hanging it on eight key propositions. I started that with the two propositions I adopted from Harman, to which I subsequently added two more. I’ve now thought through the entire discussion and decided that it boils down to eight propositions, including those initial two.
That analysis and reduction, that’s serious work, but at this stage it is also provisional. Asserting that these eight (or five or ten) ideas are the ones that matter, that’s a good way to focus one’s thinking. And an enterprise like this needs focus or it will fall apart. But it is also provisional, for the work has just begun.
Finally, I offer a few concluding remarks on what I see the next steps to be, next steps which I will not, however, be taking any time soon.
Philosophy Among the Disciplines
Let’s go back to where I began in From Objects to Pluralism, with a passage from Harman’s interview at ASK/TELL:
...the reason to focus on objects rather than on “language, social change, sexuality or animals” is because philosophy is obliged to be global in scope. If philosophy were to give one of these other entities a starring role, it would have to reduce the rest of the universe to them. “Language is the root of everything.” Here, you are choosing one specific kind of entity to be the root of all others, and there is no basis for this. Sociology tends to view all reality in terms of its emergence from human societies and belief-systems. Psychology treats all reality as made up primarily of mental phenomena. Physics deals with tiny physical objects and says that everything is made out of them, except that physics is useless when trying to explain things like metaphors, the Italian Renaissance, the meaning of dreams, and so forth. All these other disciplines focus on one kind of object as the root of all else in the world. Only philosophy can be a general theory of objects, describing Symbolist poetry and the interaction of cartoon characters just as easily as the slamming together of two comets in distant space.
As I said in that post, I had two reactions to this passage, and they are related.
My immediate reaction was to be skeptical of the claim that Harman and his intellectual companions had anything particularly interesting to say about cartoon characters. Why’d I think that? Because I’ve spent a great deal of time thinking and writing about cartoon characters over the last several years, and that work included descriptive work of a kind that’s not at all characteristic of these object-oriented philosophers. That is, I was in my mind claiming specialized expertise on the topic of cartoon characters, and it wasn’t at all obvious that these philosophers had anything particularly interesting to say about them. At the very least, I’d not then read them as saying anything interesting about cartoon characters, and that hasn’t changed since then, though I now know, as I didn’t then, that Latour seems to have been the one to put Popeye into the repertoire of standard examples.
And THAT perhaps ungenerous little quibble is but an instance of the larger issue, which is that of the relationship between philosophy and “all these other disciplines,” the ones that “focus on one kind of object as the root of all else in the world.” Of course, in the last half of that statement Harman is doing a bit of careless editorializing—this is, after all, an interview, not a formal publication. Not all those other disciplines claim their particular objects of interest to be the root of all else in the world. Some may—physicists likely—but most do not.
It’s a simple statement, but it clicked:
If the last fifty years in particular have witnessed a constant, slow increase in admissions of validity among worldly things, then OOO could be understood to propose: let's just take that pattern for granted and get it over with all at once.
And what I thought, immediately, was: but that’s how the intellectual world works, in the large. Everything is worthy of attention, though not everything gets it. Resources—time, attention, infrastructure of all kinds—are limited. But, in the intellectual world at large, has anything received more scrutiny than quarks and quasars?
But Ian Bogost, that’s who wrote that statement, wasn’t talking about the intellectual world at large. As the reference to OOO (object-oriented ontology) indicates, he is talking about philosophy: what do philosophers study? And, for the most part, philosophers in the 20th Century have studied human beings, their minds mostly, and their language, and even their society. Yes, the philosophy of science looms large, but it’s not about bosons and bats, it’s about how scientists think, or should think, about bosons and bats.
And so Bogost’s observation is well-taken. Why shouldn’t philosophers think about everything, and not just about the human? While I have my doubts about whether object-oriented ontology is itself a big intellectual thing, I’m more favorable to the view that there is a big intellectual thing among us, and there has been for some time—like global warming, it crept up on us before we gave it official welcome. And that big intellectual thing is re-drawing the boundaries, vectors, and destinations of inquiry. OOO is certainly playing a role in that process. Whatever’s provoked the seas, OOO’s certainly riding the waves.
Which is what Bogost said, more or less, back in June of this year:
Things are changing in philosophy, and that change is terrifying to some and liberating to others—perhaps it should be both. This conflict, if that's really what it is, is evidence of something big. We can fear it, or we can scoff at it, or we can make accusations about it. Or we can work, in whatever our medium.
Wednesday, December 12, 2012
At the Verso blog:
For the adult, it is a way to give to the child without expecting the child to be grateful to the parent. Rather, it is so the child can know that world itself could be generous. Nothing is owed in return. At least not yet. Later, the child can be let in on the secret: that we are staging a marvelous ritual about how the world itself could be experienced as bounty and plentitude, but we do so in a long loop through the generations. The gift the child will owe does not come until much later, when the child grows up, and owes a gift in turn to another child. Such long loops are what constitute the plural subject ‘we.’
Galloway has posted something of a rejoinder to his critics: The Secondary Correlation — Further Thoughts on the Realism Kerfuffle. At the very end he offers an olive branch:
Let me add, and I think I speak for everyone involved, that this debate has gone on for too long. It’s lead to a lot of bad blood. I’m friends with some of the SR/OOO cohort, and I’d like to keep it that way. This article only appeared now due to the slow timeline of academic publishing. I’m on the record and I’m willing to defend my position. But I’m also interested in moving on to think about other things.
The comments did not, alas, go well for him. Even Adam Kotsko, no friend of object-oriented ontology and speculative realism (Galloway’s targets, remember?) though perhaps not quite an enemy, admitted after having read Galloway’s article and much of the criticism:
I have to say that his claim that math has “entered history” in some totally new way in post-Fordism such that it “no longer” can be ahistorical is simply baffling... Overall, I think people were right to react negatively.
John Holbo, no friend of contemporary Continental thought, weighed in from the analytic tradition:
This argument puns on ‘correlation’. Anti-correlationists are against correlation in a narrow, technical sense. (Quoting M. from the paper itself: “By ‘correlation’ we mean the idea according to which we only ever have access to the correlation between thinking and being, and never to either term considered apart from the other.” For better or worse, and setting aside any differences amongst anti-correlationists, that’s a narrow-bore technical sense.) It clearly does not follow, then, that anti-correlationisms are conceptually committed to the preclusion or prevention of any and all correlations, even in the loosest (‘looks like’) sense. They don’t have to be opposed to things ‘looking like’ other things. Anti-correlation isn’t omni-anti-analogism. So, specifically, a realist anti-correlationism isn’t committed to the result that there shall be nothing analogous to realist anti-correlationism besides it itself. So the objection fails, I think. In short, argument by correlation (argument by analogy) is not automatically appropriate, merely because the topic is correlation.
To which Galloway himself agreed: “point taken.”
Tuesday, December 11, 2012
A cliché has it that there’s “two sides to every question.” If you’re lucky. Simple questions may have two sides. Otherwise ...
I’m playing in the discussion in and around Object Oriented Ontology as though it had many sides. For one thing, though the Central Four (Harman, Morton, Bogost, Bryant) tend to play it as though they were the Three Musketeers—One for all and all for one!—I don’t believe it, nor, for that matter, do they, really. There’s more than one philosophy there. Just how many, who can say? It depends.
Terence Blake (Agent Swarm) has his perspective. And I have mine. Even if we take the Central Four at face value—One for all and all for one!—that’s three sides right there. Alexander Galloway (whoever he is) makes four. And then there’s Latour, five. Jane Bennett, that’s six. And it just gets more complicated when we throw more thinkers into the mix.
I mean, I agree with Harman on some things, but on other things, not. Same with Morton. And Blake. Heck, despite all the energy I’ve put into criticizing Bryant I agree with him on a thing or two as well.
Intellectual configurations are complex multi-faceted beasts. I’m reasonably sure this is a general characteristic of intellectual life. But if, as the OOOists believe, there’s a Big Thing on the horizon, it’s likely to be particularly true of intellectual configurations in search of that Big Thing.
Big Things are messy. They die hard and birthing them is long and painful.
Monday, December 10, 2012
Horvath G, Farkas E, Boncz I, Blaho M, Kriska G (2012)
Cavemen Were Better at Depicting Quadruped Walking than Modern Artists: Erroneous Walking Illustrations in the Fine Arts from Prehistory to Today.
PLoS ONE 7(12): e49786. doi:10.1371/journal.pone.0049786
Abstract: The experts of animal locomotion well know the characteristics of quadruped walking since the pioneering work of Eadweard Muybridge in the 1880s. Most of the quadrupeds advance their legs in the same lateral sequence when walking, and only the timing of their supporting feet differ more or less. How did this scientific knowledge influence the correctness of quadruped walking depictions in the fine arts? Did the proportion of erroneous quadruped walking illustrations relative to their total number (i.e. error rate) decrease after Muybridge? How correctly have cavemen (upper palaeolithic Homo sapiens) illustrated the walking of their quadruped prey in prehistoric times? The aim of this work is to answer these questions. We have analyzed 1000 prehistoric and modern artistic quadruped walking depictions and determined whether they are correct or not in respect of the limb attitudes presented, assuming that the other aspects of depictions used to determine the animals gait are illustrated correctly. The error rate of modern pre-Muybridgean quadruped walking illustrations was 83.5%, much more than the error rate of 73.3% of mere chance. It decreased to 57.9% after 1887, that is in the post-Muybridgean period. Most surprisingly, the prehistoric quadruped walking depictions had the lowest error rate of 46.2%. All these differences were statistically significant. Thus, cavemen were more keenly aware of the slower motion of their prey animals and illustrated quadruped walking more precisely than later artists.
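The abstract's "statistically significant" claim can be illustrated with a two-proportion z-test. The per-group counts below are invented (the abstract reports rates for the 1000 depictions, not group sizes), so this is a sketch of the method rather than a re-analysis of the paper's data.

```python
import math

def two_prop_z(err1, n1, err2, n2):
    """Two-proportion z-test: are error rates err1/n1 and err2/n2
    plausibly the same? Returns (z statistic, two-sided p-value)."""
    p1, p2 = err1 / n1, err2 / n2
    pooled = (err1 + err2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the normal CDF, Phi(z) = 0.5*(1 + erf(z/sqrt(2)))
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical group sizes: pre-Muybridge moderns (~83.6% wrong) vs.
# prehistoric depictions (~46.2% wrong). Counts invented for illustration.
z, p = two_prop_z(err1=418, n1=500, err2=231, n2=500)
print(p < 0.05)  # True: a gap this large is far beyond chance
```

With a gap of nearly 40 percentage points between the groups, the z statistic is enormous, which is why differences of this size come out significant even under much smaller group sizes.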
H/t Tyler Cowen
One Alexander Galloway has a piece in Critical Inquiry, a top-tier lit crit journal, in which he takes certain philosophers to task. Some of those philosophers, and their allies, have gotten upset with the critique as it unceremoniously dumps them in bed with Ewww! capitalism. I’ve not read the article myself, but I have read a long post Galloway wrote that covers similar philosophical ground, A response to Graham Harman’s “Marginalia on Radical Thinking.”
Boy, did that ever kick up a fuss! 105 comments, some quite long, and who knows how many posts on other blogs. This Galloway fellow seems to have his finger on the pulse of something or another.
If it IS Galloway who wrote it. But I’m getting ahead of myself.
* * * * *
It’s early 1969 and I’ve just read The Pooh Perplex: A Freshman Casebook, by Frederick Crews. Crews satirized literary criticism by writing a bunch of articles about Winnie the Pooh, each in the style of a different critic. Though, in most cases, I didn’t recognize Crews’ target, I thought the whole exercise was wonderful.
And so I decided to try my hand at it. I wrote a learned and somewhat florid analysis of this little ditty:
In days of old when knights were bold
and rubbers weren’t invented
They wrapped a sock around their cock
thus babies were prevented.
My college’s student newspaper published it and the result was a minor one-day scandal. I’m told it was the talk of the table in the faculty dining room that day.
Sunday, December 9, 2012
Two years ago I wrote three posts (see links at the end of the post) around and about the fact that the protagonist in Miyazaki’s Porco Rosso (that’s him in the white suit above) is a man with the head of a pig. The general idea was to account for that fact without laying it off on symbolism. I still think avoiding a symbolic account is a good idea.
It’s time to take another look, this time with a comparison from a very different film, Mamoru Oshii’s Ghost in the Shell 2: Innocence (GITS2: Innocence).
But let’s not go there yet. Let’s meander a bit. Here’s a frame from Miyazaki’s Spirited Away:
The girl with her back to us is the protagonist, Chihiro, while the seated creatures are her parents. Just moments ago they were humans. Now they’re all but pigs. The next time we see them they’ll be naked, on the ground, and looking pretty much like the other pigs in the pen with them. As far as anyone can tell, they have become pigs.
How did this happen? Well, the family happened upon what appeared to be an abandoned theme park. In the course of exploring it, Chihiro’s parents smelled some food. So they followed their noses, saw the food, made an attempt to find out who was offering it for sale and, when that failed, they started to eat.
It turns out that this wasn’t an abandoned theme park at all. It was something else, a bathhouse for spirits. But Chihiro and her parents didn’t know that and by the time Chihiro had learned, her parents had become pigs.
Now, we could, I suppose, argue that becoming pigs symbolized something about her parents. But is such symbolizing doing anything above and beyond what we can actually see in their actions? No, it’s not. Those actions speak louder than any words.
Why then the transformation? (1) Well, we could moralize and say it’s punishment for being greedy. And it IS something like that, I suppose. But there’s something else: It gives Chihiro a problem. (2) Now she’s got to try to get her parents back while freeing herself from this world as well.
Not only are those two different explanations, but they’re different KINDS of explanation. The first is pitched within the logic governing actions within the story world. If you take food meant for the spirits, you get turned into a pig. If you ask the old witch for a job, she’ll give you one, but she’ll also take your name from you. THAT’s how that world works.
The second explanation is pitched at the level of story craft. The protagonist has to have challenging tasks to perform if the story is to be at all interesting. That is, whatever the story-world logic is, the story, in order to be interesting, must exploit it in a compelling way. Restoring her parents to human form is one of Chihiro’s tasks.
Getting back to Porco Rosso, about two-thirds of the way through the film we learn of the incident during which Marco Pagot acquired the head of a pig. It was during the war, and he was the lone survivor of a vicious air fight that took the life of his best friend. But the story’s vague on just how or why that incident gave him a pig’s head. It just did. The connection is not so clear as the connection between Chihiro’s parents’ actions—eating food not theirs—and the consequence—becoming pigs.
As for the second level, that of story craft, that’s really where my question lies. It’s a device. It allows Miyazaki to do something. But what?
Well, it allows him to tell THIS story. That’s what I think. But I’m not sure why.
In Evopsychopathy 3: the explanatory target, that is, whatever the hell you're trying to account for, Wilkins tells us that Darwin
spent some time trying to work out how bees had an instinct for the formation of hexagonal honey combs. Instinct was a kind of Platonic remembrance, something that evolved before you were born but which you “knew” at birth. This is the hoary old chestnut* of nature-nurture. And it was employed at length by the nascent science of ethology that was spawned by Darwin, especially in the theories of Konrad Lorenz, who argued that the synthetic a prioria of Kant (things known to be true a priori that are not true by necessity) are the evolutionary a posterioria (1996). We are born with instincts.
He goes on to say that one Danny Lehrman took that apart back in the 1950s, concluding, in Wilkins's words, that
“instincts” must develop in the right environment. Change the environment during crucial developmental phases, and you do not get the “inherited” behaviour ... There is “learned information”, or better “acquired information” from the developmental environment. What is inherited is not the behaviour, but a disposition to develop it in the right circumstances.
As far as Wilkins is concerned
Genes do not have culture on a leash, they merely bias the ways in which culture is acquired. This is not really genetic determinism, so much as genes as one factor among many (and not even the most significant) for behavioural development. And moreover, once you have identified that target of explanation correctly, you cannot justify some behaviour as “natural” therefore “justified”, since the multiplicity of causes for the shared behaviour will include culture, social organisation, availability of food during childhood, the local climate...
Saturday, December 8, 2012
Friday, December 7, 2012
At the beginning of the week, when I was thinking through my writing schedule—which had, once again, been perturbed by this and that, such as the dance competition I’d been to over the weekend—it seemed possible that I’d wrap up the main line of my pluralism series today, Friday December 7. I picked the day because it was my birthday, one of those milestone birthdays, and so a good one on which to more or less (but not completely) wrap up such a project.
And that goal seemed well within reach when I posted the penultimate installment, Facing up to Relativism: Negotiating the Commons, on Wednesday. However, I’ve decided not to do it. Oh, sure, I could jam it on through. I’ve got a fairly robust outline done and I know more or less what I want to say. But I’ve decided to hold off a day or two.
For one thing, Fridays have become a casual sort-through-things-and-see-where-we-are kind of day. Such sorting-out and stock-taking is essential to keeping several lines of activity in motion, but it’s antithetical to concentrating on any one of them. And writing that last post will require concentration.
After all, it WILL be a summing-up of a line of thinking that’s occupied me for the past year and a half, a line of thinking that’s touched base with just about everything I’ve studied and written about over the years: literature, music, cognition, the brain, culture and cultural evolution, film (cartoons in particular), and graffiti. That’s a stew that would best simmer a bit before I deliver it to the table.
* * * * *
Here’s what I want to hammer home in that final post: the connection between pluralist ontology and the ethics and aesthetics of multiculturalism. Now that I’ve made the connection (in Wednesday’s post) it seems obvious to me. But I didn’t see it coming, and that despite the fact that I have spent a great deal of time sorting out matters of culture, identity, and nation.