Thursday, January 18, 2018
I slipped over to Manhattan yesterday for a panel discussion about artificial intelligence that was held – wouldn’t you know? – at the New York Yacht Club, an honorable establishment with old money written all over it, not to mention a handsome stash of full and half-hull ship models – and was delighted with the Times Square area at night. It’s like something from the future, all bright and slithering lights. Next time I’ll take my camera.
Would the New York of Kim Stanley Robinson’s 2140 be like that? Times Square itself, of course, would be under water, but many of the tall buildings would still be sticking around, their middle and upper floors rising above the water. And that’s where a lot of the electronic signage was. That would be quite a sight, to see all those animated lights reflecting in the water.
So, I ask you: Does New York 2140 exhibit a distinctive mode of fictional being? That’s perhaps not the best way to put the question, but there’s no really good way. What I have in mind is the way Robinson combines a rich array of details about New York’s past – something Adam Roberts reminded me of on Facebook – with a richly imagined future. We’ve got the real and the imaginary combined into one seamless extended novelistic present. I could almost have said “the real past–as Robinson imagines it–and the future–as Robinson imagines it”, for the imagination is a faculty we use for everything, not just fantasy and fiction. It’s all imaginable, and some is real, some not so real.
Back in the 1960s and 1970s there was something called “the new journalism”, in which writers like Tom Wolfe, Norman Mailer, and Hunter Thompson wrote about real events using literary techniques. They gave up the conventions of journalistic objectivity and entered into the events they chronicled. At the same time E. L. Doctorow was earning praise for his “fictionalized history”. What about alternate history, for example, P. K. Dick’s The Man in the High Castle, set in a world where Japan and Germany won World War II? How does New York 2140 fit into that, whatever that is?
This, it seems to me, is something for Latour’s world of modes of existence, where each mode has its own truth conditions. What is the mode for New York 2140, versus, say, The Tale of Genji, The Decline and Fall of the Roman Empire, The Interpretation of Dreams, or On the Origin of Species, and what are the respective truth conditions for each?
For extra credit: What about Donald Trump’s Twitter stream?
Tanya Stivers et al., “Universals and cultural variation in turn-taking in conversation,” PNAS, June 30, 2009, vol. 106, no. 26, pp. 10587–10592.
Abstract: Informal verbal interaction is the core matrix for human social life. A mechanism for coordinating this basic mode of interaction is a system of turn-taking that regulates who is to speak and when. Yet relatively little is known about how this system varies across cultures. The anthropological literature reports significant cultural differences in the timing of turn-taking in ordinary conversation. We test these claims and show that in fact there are striking universals in the underlying pattern of response latency in conversation. Using a worldwide sample of 10 languages drawn from traditional indigenous communities to major world languages, we show that all of the languages tested provide clear evidence for a general avoidance of overlapping talk and a minimization of silence between conversational turns. In addition, all of the languages show the same factors explaining within-language variation in speed of response. We do, however, find differences across the languages in the average gap between turns, within a range of 250 ms from the cross-language mean. We believe that a natural sensitivity to these tempo differences leads to a subjective perception of dramatic or even fundamental differences as offered in ethnographic reports of conversational style. Our empirical evidence suggests robust human universals in this domain, where local variations are quantitative only, pointing to a single shared infrastructure for language use with likely ethological foundations.
I remember all the gnashing of teeth, the rending of garments, and the wailing of women, boys, and men when Ross Douthat got the nod as a NYTimes Op-Ed writer. I remember having a vague sympathy with this commotion, weak and vague, but no more. After all, I do not live and die on the Gray Lady Op Ed page. I suppose I read a column or three a week, and I've read a few by Douthat, even some I've been in sympathy with, which is a bit of a surprise as he writes as a conservative Catholic, which is not my part of the world. Anyhow, he's been interviewed by Tyler Cowen, the ostentatiously well-read libertarian economist at a school in northern Virginia that was once upon a time referred to as a "cow college" but is now an R1 research institution, George Mason University.
Anyhow, I'm reading my way through the interview, skipping much of it, and was struck by this exchange:
COWEN: As you know I come at all of this as very much an outsider, so let me ask a very naive question. If I look at the Catholic Church, there’s a movement, as you know, called Opus Dei. The priests of that movement, they seem to be less caught up in sex scandals. Parts of the movement seem to have some understanding of what you might broadly call conservative economics. In Spanish politics in the ’30s, ’40s, and ’50s they were actually considered a liberalizing force, so they don’t have to be seen as reactionary per se. Why aren’t they simply the good guys? They don’t come up much in your writings. I’m reading you and I think, “Where’s Opus Dei?”

DOUTHAT: I mean, I’m pro–Opus Dei overall. I think that my only . . . It seems to me sometimes that Opus Dei is a particular apostolate, right, and the particular idea of Opus Dei is that it’s not primarily supposed to be a priestly order, even though there are of course priests of Opus Dei. The central idea, and with apologies to Opus Dei members if I’m getting this at all wrong, but the central idea is that it’s a ministry. It’s an apostolate for laypeople who are at work in the business world, the journalism world, the corporate world, the communications world, and so on. And as such, I think it has an admirable and important vocation in the life of the world and the life of the church. But it seems to me in part that there is a sort of . . . There’s a kind of, not set-apartness exactly, but there’s an element of . . . Well, I think a big part of the crisis in Catholicism in the last 60 or 70 years can simply be distilled to a collapse in the sense of the importance of religious life, of consecrated life, of the priesthood, religious orders, sisters and brothers, and so on.
And it’s as easy for me to say because I did not become a priest and so it’s always easier once you haven’t become a priest to say, “Oh, well, you need more people to become priests.” But to the extent that that’s true, Opus Dei seems like it’s very well tailored in certain ways to secular society as it exists right now. But I think the ultimate revival of the church is likely to come from a slightly more radical view of the proper relation to the world — that essentially what the church needs now is the equivalent of the Franciscans, the Dominicans, the Jesuits, these kind of orders from previous eras that are sort of . . . I mean, Opus Dei asks laypeople to take vows of various kinds; celibate laypeople are part of the Opus Dei structure. And I think that there is . . . Essentially, there is just a straightforward need for a more old-fashioned model of just priests and nuns. The church needs more priests and nuns. Catholicism can’t function without priests and nuns, which doesn’t take anything away from what Opus Dei is doing and, of course, they have many vocations and many priests. But yeah, to the extent that it doesn’t get the due maybe that it deserves in my writings, that’s probably, maybe, the root of it. Again, you’re teasing out things I haven’t even begun to think about before, which is . . . There’s no particular reason why. The sacramental life of the church depends on a strong priesthood, depends on men becoming priests; it depends on religious orders and so on. And so full revival in the church would need a priestly center to it, in a way, and not just a focus on sort of apostolates and evangelization within the world. Catholicism has been caught up in the idea that this is the Age of the Laity for the last 50 or 60 years. I think the Age of the Laity has kind of been a disaster for the church in certain ways.
It's that phrase, "consecrated life". Can we have consecrated lives without the Catholic Church? Douthat would likely think that, in asking that question, I reveal that I haven't got the foggiest idea of what "consecrated life" means. Perhaps. Perhaps. Still, I ask the question: Can we have consecrated lives without the Church? Can we have consecrated lives without ... ?
And before too long Douthat is talking in praise of Watership Down, a children's book I've heard of, but not read.
...it’s such a great book and it’s a book about — essentially, it’s about a founding. It’s connected, in a sense, to the kind of things that the Straussians are always arguing about and so on. What does the founding mean, and so on? But you have a group of rabbits who go forth and encounter different models of political order, different ways of relating to humankind, that shadow over rabbit-kind at any point. You have a warren that has essentially surrendered itself to humanity and exists as a kind of breeding farm, and you have a warren that’s run as a fascist dictatorship essentially. And then you have this attempt to form a political community that is somewhere in between the two, getting back to the Hegelian synthesis and so on. And you have sort of this primal narrative where the problem is of course that they don’t have any females, and so there’s this competition, this competition for reproductive power that’s carried out between these different warrens where the rabbits from the good warren have to literally — not kidnap, because the does come willingly — but steal women from the fascist dictatorship, which maintains a ruthless control over reproduction. So there’s just a lot of fascinating stuff there, and then it’s all interspersed with storytelling. There’s the sort of rabbit folktales that Richard —

COWEN: So, narrative again.

DOUTHAT: Narrative again. — that Richard Adams came up with, that are just brilliant, about El-ahrairah, the great rabbit folk hero, and his relationship. There is actually the rudiments of a rabbit theology in Watership Down.
Rabbit theology indeed.
And then there’s even, right, there’s even a mystical element. The book begins with this rabbit Fiver, who is sort of a runt, who has visions — and the whole founding is based on various prophecies and visions that he has throughout the beginnings of this rabbit warren, that these rabbits go out and found. So he has a vision of apocalypse, so there’s an Aeneid element, clearly, where — probably he uses quotes from the Aeneid; he has quotes before every chapter — where the city falls and you have to go found a new city and there’s religious visions along the way that relate to the legitimacy of the founding.
Hey, Ross, my man. I wouldn't get too uppity about secular consecration if I were you.
Wednesday, January 17, 2018
Science fiction isn’t just thinking about the world out there. It’s also thinking about how that world might be—a particularly important exercise for those who are oppressed, because if they’re going to change the world we live in, they—and all of us—have to be able to think about a world that works differently.– Samuel Delany
As I began reading Kim Stanley Robinson’s latest book, New York 2140, I was thinking about that old cliché:
Science fiction’s not about the future, it’s about the present.
But then isn’t all fiction like that? No matter when and where it’s set, it is necessarily about the authorial present, because that’s what the author lives, day in and day out. The rest is window dressing.
That’s what I was thinking. But I was also thinking that THAT’s not why I’m reading New York 2140, not at all. It’s about NYC after the climate apocalypse, and that’s why it interests me: How do we get through it? How do we live afterward? Not, mind you, that I think KSR actually knows, not, mind you, that I somehow think KSR is a prophet. He isn’t (a prophet) and he doesn’t (know the future). But he’s a smart guy with a good imagination and really, that’s the best we can do under the circumstances, no?
And I kept thinking that as I read the book. It’s as though I was almost looking for a how-to-do-it book. I say “almost” because when you put it that baldly it seems silly and I wasn’t really thinking that. But sorta, kinda’, almost.
As I read through the book – which is both complex (lots of interacting characters) and simple (little in the way of intricate scheming, but some) – I read about the financial collapse of 2008. That’s something very real to me, as it depressed my $$ net worth and hence my wellbeing. Hurricane Sandy – again, very real, I was without power for four or five days (I forget which), but others were without power for two or more weeks–not to mention flooding and homes destroyed, and the effects ripple out from there. They’re still rippling.
By the time I got to the end I was telling myself, whoa! this isn’t about the future, it’s about the present! Financial collapse, massive debilitating storm crushing New York City, those may well happen in the future, but I’ve already lived through them. And as for a spontaneous uprising of people in protest, that’s Occupy Wall Street: I marched in that!
Trapped by a cliché!
But just what is THE PRESENT? That’s a very tricky question.
What time scale do we use to measure the present? There’s a body of research in psychology that pegs the perceptual present at about three to four seconds. That’s certainly not the appropriate scale. But what is? A year, a decade, a century? There’s a reasonable sense in which the financial crash of 2008 and the devastation of Hurricane Sandy in 2012 are nonetheless part of my present. They’re certainly in the time horizon Stanley invokes/evokes in his book. And, if I’m going to extend my present a decade into the past, then perhaps I can also extend it a decade into the future, call it 2030. That’s still over a century short of Robinson’s D-Days. But then climate change looms large in his imagination and surely we can push that back to the beginning of the carbon-spewing Industrial Revolution. Now we’re going two centuries back to the beginning of the 19th century, and that entitles us to push two centuries into the future, to the end of the 22nd century. Our imaginative present, Robinson’s novelistic present? now runs roughly from 1800 to 2200, leaving him a little wiggle room after the imaginary events he’s detailed for us.
In his penultimate chapter, attributed merely to ‘the citizen’ – who functions a bit like a Greek chorus, commenting on the action – he tells us:
Every moment is a wicked struggle of political forces, so even as the intertidal emerges from the surf like Venus, capitalism will be flattening itself like the octopus it biomimics, sliding between the glass walls of law that try to keep it contained, and no one should be surprised to find it can squeeze itself to the width of its beak, the only part of it that it can’t squish flatter, the hard part that tears our flesh when it is free to do so. No, the glass walls of justice will have to be placed together closer than the width of an octopus’s beak–now there’s a fortune cookie for you! And even then the octopus may think of some new ways to bite the world. A hinged beak, some super suckers, who knows what these people will try.
For you see, capitalism had just suffered a crushing defeat. But the book’s gone on for 604 pages at this point, so it really must come to an end – though I note that KSR’s Mars adventure extended over three volumes. But we mustn’t think that the end of the book is also the end of the causal forces it cast into wicked struggle.
So no, no, no, no! Don’t be naïve! There are no happy endings! Because there are no endings! And possibly there is no happiness either!
Though there are a few more sentences in this chapter and then, yes, there's one final chapter. It takes place in “some submarine speakeasy” called “Mezzrow’s” – named, we presume, after a mid-20th century jazz musician and scenester who hung with the cats and supplied them with joints (aka mezzes) – where we dance to West African rhythms.
And the take-home? The lesson, what does it tell us about, I suppose, radical historical change? That’s tricky. Perhaps I should write another post about that. Or perhaps not. Whatever it is, it would be about chance favoring the prepared mind and how in this case, in KSR’s New York City and the world of 2140, there were lots of minds prepared by decades upon decades of subservience to the 1% (which, we know, is actually a tiny fraction of the 1%) in which hundreds of millions managed to eke out a more or less self-sufficient existence in the tidal boondocks created by massive coastal flooding.
• • • • • S P O I L E R • • • • •
Not so long after the storm hit, displacing millions of New Yorkers, the message went out and a massive world-wide rent and mortgage strike brought capitalism to its knees.
Despite the excitement of all the new data, it’s unlikely to trigger an evolution revolution for the simple reason that science doesn’t work that way – at least, not evolutionary science. Kuhnian paradigm shifts, like Popper’s critical experiments, are closer to myths than reality. Look back at the history of evolutionary biology, and you will see nothing that resembles a revolution. Even Charles Darwin’s theory of evolution through natural selection took approximately 70 years to become widely accepted by the scientific community, and at the turn of the 20th century was viewed with considerable skepticism. Over the following decades, new ideas appeared, they were critically evaluated by the scientific community, and gradually became integrated with pre-existing knowledge. By and large, evolutionary biology was updated without experiencing great periods of ‘crisis’.
The same holds for the present. Epigenetic inheritance does not disprove genetic inheritance, but shows it to be just one of several mechanisms through which traits are inherited. I know of no biologist who wants to rip up the textbooks, or throw out natural selection. The debate in evolutionary biology concerns whether we want to extend our understanding of the causes of evolution, and whether that changes how we think about the process as a whole. In this respect, what is going on is ‘normal science’.
Why, then, are traditionally minded evolutionary biologists complaining about the misguided evolutionary radicals that lobby for paradigm shift? Why are journalists writing articles about scientists calling for a ‘revolution’ in evolutionary biology? If nobody actually wants a revolution, and scientific revolutions rarely happen anyway, what’s all the fuss about? The answer to these questions provides a fascinating insight into the sociology of evolutionary biology.
Revolution in evolution is a misattribution – a myth propagated by an unlikely alliance of conservative-minded evolutionists, creationists and the press. I don’t doubt that there are a small number of genuine, revolutionarily minded evolutionary radicals out there, but the vast majority of researchers working towards an extended evolutionary synthesis are simply ordinary, hardworking evolutionary biologists.
We all know that sensationalism sells newspapers, and articles that portend a major upheaval make for better copy. Creationists and advocates of ‘intelligent design’ also feed this impression, with propaganda that exaggerates differences of opinion among evolutionists and gives a false impression that the field of evolutionary biology is in turmoil. What’s more surprising is how commonly conservative-minded biologists play the ‘We’re under attack!’ card against their fellow evolutionists. Portraying intellectual opponents as extremist, and telling people that they are being attacked, are age-old rhetorical tricks to win debate or allegiance.
If the extended evolutionary synthesis is not a call for revolution in evolution, then what is it, and why do we need it? To answer these questions, we need to recognise what Kuhn got right – namely, that every scientific field possesses shared ways of thinking, or ‘conceptual frameworks’. Evolutionary biology is no different, and our shared values and assumptions influence what data is collected, how that data is interpreted, and what factors are built into explanations for how evolution works.
That is why pluralism in science is healthy. Lakatos stressed that alternative conceptual frameworks – what he called different ‘research programmes’ – can be valuable to the extent that they encourage new hypotheses to be generated and tested, or lead to novel insights. That is the primary function of the EES: to nurture, or even open up, new lines of enquiry, and new productive ways of thinking.
The EES, at least as my collaborators and I frame it, is best viewed as an alternative research programme for evolutionary biology. Inspired by recent findings emerging within evolutionary biology and adjacent fields, the EES starts from the assumption that developmental processes play important roles as causes of novel (and potentially beneficial) phenotypic variation, causes of differences in fitness of those variants, and causes of inheritance. In contrast to how evolution has traditionally been conceived, in the EES the burden of creativity in evolution does not rest on natural selection alone. This alternative way of thinking is being used to generate fresh hypotheses and establish new research agendas. It’s early days, but there are already signs that this research is starting to yield dividends.
If evolution is not to be explained solely in terms of changes in gene frequencies; if previously rejected mechanisms such as the inheritance of acquired characteristics turn out to be important after all; and if organisms are acknowledged to bias evolution through development, learning and other forms of plasticity – does all this mean a radically different and profoundly richer account of evolution is emerging? No one knows: but from the perspective of our adapting dog-walker, evolution is looking less like a gentle genetic stroll, and more like a frantic struggle by genes to keep up with strident developmental processes.
Tuesday, January 16, 2018
Jill Lepore has a fascinating article in The New Yorker about an IP (intellectual property) squabble over Barbie and Bratz dolls, which do billions of dollars in business. Yes, she tells us a bit about the history of copyright, which is at issue in several lawsuits she discusses. But the article also discusses sexual harassment and feminism. Here are two paragraphs near the end:
The final sentence: "Mattel owns Barbie. MGA owns Bratz. And corporations still own the imaginations of little girls."

Empowerment feminism is a cynical sham. As Margaret Talbot once noted in these pages, “To change a Bratz doll’s shoes, you have to snap off its feet at the ankles.” That is pretty much what girlhood feels like. In a 2014 study, girls between four and seven were asked about possible careers for boys and girls after playing with either Fashion Barbie, Doctor Barbie, or, as a control, Mrs. Potato Head. The girls who had played with Mrs. Potato Head were significantly more likely to answer yes to the question “Could you do this job when you grow up?” when shown a picture of the workplaces of a construction worker, a firefighter, a pilot, a doctor, and a police officer. The study had a tiny sample size, and, like most slightly nutty research in the field of social psychology, has never been replicated, or scaled up, except that, since nearly all American girls own a Barbie, the population of American girls has been the subject of the scaled-up version of that experiment for nearly six decades.

#MeToo arises from the failure of empowerment feminism. Women have uncannily similar and all too often harrowing and even devastating stories about things that have happened to them at work because men do very similar things to women; leaning in doesn’t help. There’s more copying going on, too: pornography and accounts of sexual harassment follow the same script. Nobody writes anything from scratch. Abandoning structural remedies and legislative reform for the politics of personal charm—leaning in, dressing for success, being Doctor Barbie—left women in the workplace with few choices but to shut up and lean in more and to dress better.
It’s no accident that #MeToo started in the entertainment and television-news businesses, where women are required to look as much like Barbie and Bratz dolls as possible, with the help of personal trainers, makeup artists, hair stylists, personal shoppers, and surgeons. Unfortunately, an extrajudicial crusade of public shaming of men accused of “sexual misconduct” is no solution, and a poor kind of justice, not least because it brooks no dissent, as if all that women are allowed to say about #MeToo is “Me, too!” The pull string wriggles.
Here's the illusion: pic.twitter.com/thNAPQLZlk— Steven Pinker (@sapinker) January 16, 2018
Look carefully at the coloring on the lines. On one set of pairs the coloring alternates between peaks and troughs; on the other set it alternates between the slopes. Though the wave forms are exactly the same for these pairs, they're perceived differently against the medium gray background.
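One way to see what the two coloring schemes amount to is to construct them. The sketch below is my own illustration (not code from Takahashi's paper): both schemes color the same sine wave with two alternating colors, but one places the color boundaries at the zero crossings (so each colored segment contains a single peak or trough) and the other places them at the extrema (so each colored segment is a single slope).

```python
import math

def color_index(x, boundaries_at):
    """Return 0 or 1: which of two alternating colors the point x on
    y = sin(x) receives.

    boundaries_at="zero_crossings": color changes where sin(x) = 0, so each
    colored segment contains one peak or one trough.
    boundaries_at="extrema": color changes where sin(x) peaks or bottoms out,
    so each colored segment is one rising or falling slope.
    """
    if boundaries_at == "zero_crossings":
        return int(x // math.pi) % 2                   # segments [kπ, (k+1)π)
    elif boundaries_at == "extrema":
        return int((x + math.pi / 2) // math.pi) % 2   # segments between extrema
    raise ValueError(boundaries_at)

# Same underlying wave, two segmentations: the peak at x = π/2 sits in the
# middle of a colored segment in one scheme, but exactly on a color boundary
# in the other.
peak = math.pi / 2
print(color_index(peak - 0.1, "zero_crossings"),
      color_index(peak + 0.1, "zero_crossings"))  # same color across the peak
print(color_index(peak - 0.1, "extrema"),
      color_index(peak + 0.1, "extrema"))         # color flips at the peak
```

The illusion, as I understand it, is that the second segmentation makes the identical wave read as a zigzag of straight segments, while the first reads as smoothly curved.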
Excellent new visual illusion by Kohske Takahashi: shows how the brain seeks an interpretation of the retinal input with the simplest combination of 3D shape, surface pigmentation, & illumination. //www.newsweek.com/psychology-neuroscience-optical-illusions-744675— Steven Pinker (@sapinker) January 16, 2018
Monday, January 15, 2018
I’ve been making a lot of posts over the past year or so about language, computation, and literary form, with a particular flurry in the last month or two linking Jakobson’s poetic function into the mix. This is going to be another one of those posts. I figure I’ve got to keep going over it until I feel that I’ve got it right, whatever that means.
What I’m NOT saying
I’m not saying that the human mind, or brain, is essentially computational, or digital. Those may or may not be true, but my assertion is more limited.
It is limited to language and, within language, to the process whereby word forms are linked to semantic objects and structure (informally, to meaning). Let me emphasize process. It is something the mind does rather than something the mind is – to put it rather sketchily.
Other processes may or may not be fundamentally computational – sensation, perception, movement, pattern recognition, feeling, whatever. Offhand I’d think there’s computation in the sense that word-form-binding is computation, but I also think there are non-computational processes.
Moreover, I’m NOT saying that word-form-binding can be modeled by or usefully thought of as computation. I’m saying that it IS computation. And linguistic form is computational form. Linguistic form guides the binding process.
In what sense computation?
In the sense that we say the earth is a planet that revolves around the sun, and the moon revolves around the earth. Neither of those assertions can be verified by direct perception. Direct perception tells us that the earth is stationary and that both the sun and the moon move over the earth’s surface. Where does the sun go at night? Direct observation doesn’t tell us. Where does the moon go during the day? Direct observation doesn’t tell us.
The heliocentric model of the solar system is an abstract idea. It’s based on a wide range of observations by many observers at many times and places. But not simply observations. Reasoning, physical reasoning and mathematical reasoning.
It’s a model subject to revision as needed. Thus Pluto has recently been demoted from planet status, a relatively minor matter of definition. More consequentially, the advent of complexity theory and digital simulation has allowed us to realize that, over the long term, the system is chaotic. The bodies in the system are constantly influencing one another and, consequently, orbits are gradually changing.
Well, the claim that language involves computation is like that, irreducibly so. We don’t understand the system nearly so well. And, to some extent the argument has to be a negative one: What else could it be? As far as I know there simply are no other proposals on the table.
Alan Turing defined computation in a way that’s independent of any particular physical realization, and THAT’s the kind of thing that can perform the binding task. We know that primarily because we have built artificial systems that perform the binding task in limited domains. We have no reason to think that the limitations of those systems can be attributed to computation itself.
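To make the physical-independence point concrete, here's a toy Turing-machine simulator (my own minimal sketch, not a model of binding): the machine is pure bookkeeping over symbols, and nothing in its definition says whether it's realized in silicon, neurons, or pencil and paper.

```python
def run_turing_machine(table, tape, state="start", head=0, max_steps=1000):
    """Run a Turing machine given as a transition table.

    table maps (state, symbol) -> (write_symbol, move, next_state),
    where move is -1 (left), 0 (stay), or +1 (right).
    The machine halts when it enters the state 'halt'.
    """
    tape = dict(enumerate(tape))  # sparse tape; the blank symbol is '_'
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")
        write, move, state = table[(state, symbol)]
        tape[head] = write
        head += move
    cells = [tape[i] for i in sorted(tape)]
    return "".join(cells).strip("_")

# A machine that flips every bit on the tape, then halts at the first blank.
flip = {
    ("start", "0"): ("1", +1, "start"),
    ("start", "1"): ("0", +1, "start"),
    ("start", "_"): ("_", 0, "halt"),
}
print(run_turing_machine(flip, "10110"))  # -> 01001
```

The same transition table would describe the same computation whatever the hardware; that's the sense in which the definition abstracts away from physical realization.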
In looking over my posts I realized that back in August of 2016 I’d posted, Words, Binding, and Conversation as Computation, and What’s Computation? What’s Literary Computation? YES. The back and forth of conversation strikes me as being inherently computational (read those two posts).
That, of course, involves the interaction of autonomous agents, which is not something we ordinarily think of in conjunction with computation, which has been characterized as something done by a single agent. I don’t see that as a problem, however. In fact, that may be how computation (in this sense) got started. And through a process perhaps first described by Vygotsky (in his account of language learning) the process that had been distributed across two agents becomes internalized in one.
Anaphoric reference and duality of patterning – but I’m not going to remark on these here and now. I note, however, that duality of patterning pretty much implies the binding problem. And it is, of course, related to indexing as Hays and I discussed it in Principles and development of natural intelligence.
Form on a string
Language is manifest as a string of word forms, one after the other. In the case of some but not all written language the word-forms are sharply separated from one another. They are not sharply separated in spoken form nor, I believe, in the gestures of signing. Where the string does not naturally exist in discrete forms the perceptual system must do the job of segmenting; sometimes there are failures.
In a way I want to say that computation is somehow necessitated by the fact that semantic structures (meaning) are inherently multi-dimensional and cotemporaneous, while language takes the material form of one-dimensional strings. It takes computation to go back and forth between these two – not, alas, a very felicitous formulation.
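One way to make that back-and-forth concrete is to serialize a small multi-dimensional structure – a tree – into a one-dimensional string and then parse it back. The example is mine, a toy stand-in for the vastly harder linguistic case, but it shows that even this trivial round trip requires genuine computation (recursion, a token stream, bookkeeping over positions):

```python
def to_string(tree):
    """Linearize a nested (label, children) tree into a bracketed string."""
    label, children = tree
    if not children:
        return label
    return "(" + label + " " + " ".join(to_string(c) for c in children) + ")"

def from_string(s):
    """Parse the bracketed string back into a tree. Inverse of to_string."""
    tokens = s.replace("(", " ( ").replace(")", " ) ").split()
    def parse(i):
        if tokens[i] == "(":
            label = tokens[i + 1]
            children, i = [], i + 2
            while tokens[i] != ")":
                child, i = parse(i)
                children.append(child)
            return (label, children), i + 1
        return (tokens[i], []), i + 1
    return parse(0)[0]

# A two-dimensional structure flattened to a string and recovered intact.
tree = ("S", [("NP", [("the", []), ("cat", [])]),
              ("VP", [("sat", [])])])
s = to_string(tree)
print(s)                       # (S (NP the cat) (VP sat))
assert from_string(s) == tree  # the round trip recovers the structure
```

The brackets here do work that, in natural language, has to be done by word order, morphology, and the hearer's parsing machinery; that asymmetry between structure and string is the point.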
Literary form and description
It is because literature is made of language that literary form is, like linguistic form, computational in nature. Note that the literary string may be subject to quasi-independent sources of ordering (as in verse, where sound may be ordered independently of sense).
I’m thinking – pace yesterday’s discussion of ring composition – that the routine description of literary form is possible only in the context of the explicit recognition of the computational nature of linguistic, and therefore literary, form. The ring composition literature seems to me a bit ‘spotty’ and in a way ‘opportunistic’. It’s a bunch of local accommodations and fixes without any overall system. Also, in some forms it is overly reliant on spatial metaphors and references to oral practice, both of which are beside the point (and the spatial metaphors are misleading).
Routinization requires systematic thought and description. In the case of literary form we must explicitly recognize that literary texts ARE strings. It’s not that anyone doesn’t know that, but it’s not something that’s thought about and theorized. And we must recognize that literary form works outside the bounds of conscious thought and deliberation (as, indeed, does linguistic form as well).
Literary form is necessarily about restrictions on the structure of literary strings. That’s what Jakobson’s poetic function is about. It remains to be seen whether the poetic function is absolutely general, providing a ‘complete’ account.
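As an illustration of form-as-restriction – not a claim about how critics actually work – a ring constraint on a string of section labels can be stated in a couple of lines (the labels are hypothetical):

```python
# A minimal sketch of literary form as a restriction on strings: given a
# sequence of section labels, test whether it satisfies the ring-composition
# constraint (A B C ... C' B' A', sections paired around a center).

def is_ring(sections):
    """True if the sequence of labels reads the same from both ends,
    i.e., the string is palindromic at the level of sections."""
    return sections == sections[::-1]

print(is_ring(["A", "B", "C", "B", "A"]))   # a ring
print(is_ring(["A", "B", "C", "A", "B"]))   # not a ring
```

The scholarly literature on ring composition is of course far subtler than a palindrome test; the point is only that the constraint is a restriction on the ordering of a string, and as such it is mechanically checkable.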
 William L. Benzon and David G. Hays. Principles and development of natural intelligence. Journal of Social and Biological Structures 11, 1988, pp. 293-322. https://www.academia.edu/235116/Principles_and_Development_of_Natural_Intelligence
 My 2006 article on literary morphology is my major systematic statement about literary form, Literary Morphology: Nine Propositions in a Naturalist Theory of Form, PsyArt: An Online Journal for the Psychological Study of the Arts, August 2006, Article 060608. https://www.academia.edu/235110/Literary_Morphology_Nine_Propositions_in_a_Naturalist_Theory_of_Form
Sunday, January 14, 2018
Adam Roberts, a British colleague from The Valve, has been blogging his way through the works of H. G. Wells in preparation for writing a literary biography of the man. His latest entry is on Wells's Experiment in Autobiography (1934). This is what Alan Jacobs had to say about that:
— Alan Jacobs (@ayjay) January 14, 2018
Here's a longish passage from the middle:
The question that naturally arises, here but also of course with any autobiography, is: what specific relation exists between the character at the heart of this book and the human being Herbert George Wells who lived between 1866 and 1946? It's more than a question about autobiography, actually. It touches on the fundamental structural misprision of writing as such: the priority of representation over actuality. I'm old enough to find something reassuringly deconstructive about this idea, actually: the inescapability of textuality, the counter-intuitive precession of the written version of H G Wells over the irrecoverable (irrecoverable even when he was alive!) biological version of H G Wells. And Wells himself is certainly aware of what he is doing here: crafting himself, unveiling not the echt Wells but the Wellsian persona. That canny self-awareness is one of the great strengths of the Experiment in Autobiography precisely because the persona so created does have value as a way of apprehending what was ‘really’ going on to and in the Wellsian sensorium. As Wells argues, and as I think even Derrida would have conceded, the necessarily manque de hors-texte nature of all discourse, including that discourse we use to construe of our own selves to our own selves, doesn't mean that there is no actual self to talk about. Representation distorts and exaggerates but it doesn't invent out of whole cloth. This is how Wells puts it—how he theorises his own autobiographical praxis, via Jung:

A persona, as Jung uses the word, is the private conception a man has of himself, his idea of what he wants to be and of how he wants other people to take him. It provides therefore, the standard by which he judges what he may do, what he ought to do and what is imperative upon him. Everyone has a persona. Self conduct and self explanation is impossible without one ... 
So that this presentation of a preoccupied mind devoted to an exalted and spacious task and seeking a maximum of detachment from the cares of this world and from baser needs and urgencies that distract it from that task, is not even the beginning of a statement of what I am, but only of what I most like to think I am. It is the plan to which I work, by which I prefer to work, and by which ultimately I want to judge my performance. But quite a lot of other things have happened to me, quite a lot of other stuff goes with me and it is not for the reader to accept this purely personal criterion.

A persona may be fundamentally false, as is that of many a maniac. It may be a structure of mere compensatory delusions, as is the case with many vain people. But it does not follow that if it is selected by a man out of his moods and motives, it is necessarily a work of self deception. A man who tries to behave as he conceives he should behave, may be satisfactorily honest in restraining, ignoring and disavowing many of his innate motives and dispositions. The mask, the persona, of the Happy Hypocrite became at last his true face.... A biography should be a dissection and demonstration of how a particular human being was made and worked; the directive persona system is of leading importance only when it is sufficiently consistent and developed to be the ruling theme of the story. But this is the case with my life. From quite an early age I have been predisposed towards one particular sort of work and one particular system of interests. I have found the attempt to disentangle the possible drift of life in general and of human life in particular from the confused stream of events, and the means of controlling that drift, if such are to be found, more important and interesting by far than anything else. I have had, I believe, an aptitude for it. 
The study and expression of tendency, has been for me what music is for the musician, or the advancement of his special knowledge is to the scientific investigator. My persona may be an exaggeration of one aspect of my being, but I believe that it is a ruling aspect. It may be a magnification but it is not a fantasy. A voluminous mass of work accomplished attests its reality. [Autobiography, 9-11]

What this means, in practical terms, in this book, is that Wells consistently underplays himself, produces a persona more comically inept than the public record might suggest was ‘actually’ the case. Wells was, let's not forget, someone who, almost entirely on the strength of his own energy, genius and persistence, turned himself from a nobody into one of the world's most famous, influential, and wealthy authors. He went from being a draper's apprentice with literally no prospects to being friends with Jung, Beaverbrook, Roosevelt, Marie Stopes, George Gissing, Henry James, Joseph Conrad, Dorothy Richardson and Bernard Shaw, with Roger Fry and Charlie Chaplin and Booker T Washington, a man who took tea with Prime Ministers, Presidents and Archbishops. He was a man who overcame almost wholly impermeable barriers of class and background in the country and at the time when class and background were greater impediments than almost anywhere in the world, the man who took a congeries of futurist and technological-novum tropes and made them a coherent genre called ‘science fiction’, who made prophecy respectable and helped reconfigure the political landscape of his homeland. But the Wells who writes his Autobiography softpedals all that, and instead repeatedly stresses his inadequacies. What's remarkable is that he manages to do so in a way that doesn't come across as false modesty. His modesty has the sheen of genuineness, even of a kind of baffled ingenuity. How did all this happen to me? 
he seems to be saying.

The brain upon which my experiences have been written is not a particularly good one. If there were brain-shows, as there are cat and dog shows, I doubt if it would get even a third class prize. Upon quite a number of points it would be marked below the average. In a little private school in a small town on the outskirts of London it seemed good enough, and that gave me a helpful conceit about it in early struggles where confidence was half the battle. It was a precocious brain, so that I was classified with boys older than myself right up to the end of a brief school career which closed before I was fourteen. But compared with the run of the brains I meet nowadays, it seems a poorish instrument. I won't even compare it with such cerebra as the full and subtly simple brain of Einstein, the wary, quick and flexible one of Lloyd George, the abundant and rich grey matter of G. B. Shaw, Julian Huxley's store of knowledge or my own eldest son's fine and precise instrument. But in relation to everyday people with no claim to mental distinction I still find it at a disadvantage. [Autobiography, 13]

He is disarmingly honest about the limitations of his own writing: Mankind in the Making (1902) is ‘extremely sketchy’ and its component elements ‘do not interlock’; Joan and Peter (1918) ‘is as shamelessly unfinished as a Gothic cathedral’; What Is Coming (1916) was assembled ‘in a very blind and haphazard fashion’ (he says he would prefer to ‘let this little volume decay and char and disappear and say nothing about it’) and so on. Of his experience with the Fabians, he notes: ‘on various occasions in my life it has been borne in on me, in spite of a stout internal defence, that I can be quite remarkably silly and inept; but no part of my career rankles so acutely in my memory with the conviction of bad judgment, gusty impulse and real inexcusable vanity, as that storm in the Fabian tea-cup’.
The reader feels that he's being perfectly honest in acknowledging his silliness and ineptitude.

The key to all this is (an English person is liable to say, but of course) class. Wells lives his own life on his own terms, but that life is also to an extent already overwritten by the social class into which he was born. Another way of expressing the quality of silliness or ineptitude, of comical bumptiousness, of his whole small-stature squeaky-voiced Britling-y nature, would be to say: he's a bit vulgar. Which is a thoroughly class-saturated way of putting things. What Wells never had as a person, and what his Autobiography never tries to mimic, are: breeding, refinement, elevation, suavity. Repudiating these things, and insisting on speaking the plain truth as he sees it, is the core of Wells's philosophy of life. The truth is that he wasn't even a parvenu. He was, in the crushingly snobbish English phrase, a counter-jumper. And the great merit of his Autobiography is that he owns that fact, revels in it, and so makes something potent out of it.

And that also speaks to the nature of autobiography as such. Because one of the unspoken truths of the mode is that telling your life story is, inherently, just a rather vulgar thing to be doing—a tad me! me! me!, a touch ungentlemanly or unladylike. And Wells's book owns that truth, mitigating it with wit and charm and pushing it through to suggest that the whole social hierarchy that supports such an attitude is due a bottom-up refit.
Once more, and thinking of ring composition: Why aren’t literary critics interested in describing literary form?
I keep coming back to this question. Sometimes I think I’ve got an answer, but then, gradually, it goes away. I’m not quite sure what the question is.
If literary critics didn’t talk about form and if formalism weren’t a well recognized (family of) critical position(s), the answer would be simple. But that, I’m afraid, is not the case. Form and formalism are much discussed. And one of the big discussions revolves around the questions: What IS form, anyhow? There’s no consensus answer, not even close.
What brought the subject to mind just now is that I was thinking about ring composition, a particular kind of formal structure. I bring up ring composition because it’s not form-in-general. It’s a specific kind of form. And so is more tractable.
And what I’ve been thinking about that is that it has mostly been studied in oral culture, classical texts, the Bible, and in non-Western classical literatures (the Vedas, the Quran, etc.). And, as far as I can tell (from running searches on Google Scholar and Academia.edu), those discussions remain current, though it doesn’t seem to be a scholarly hot-bed. That is, it’s studied in literatures that AREN’T US.
But also, it really isn’t studied as one kind of form in the general world of literary forms. It seems to be this oddball kind of topic unto itself. Sometimes it’s approached through spatial metaphors, which I fear is a mistake. But sometimes it’s also discussed in relation to music – I recall seeing such an article.
Why isn’t the study of literary form as sophisticated as the study of musical form (at least I think it’s sophisticated)? Literary art, after all, is a temporal art, no? But it’s difficult to think of it that way, I suspect, because words have meaning and musical pitches don’t. Those meanings and their accumulation get in the way of focusing on the temporality.
On the whole, and at the moment – I’m making this up as I type it – I’m thinking that literary form is difficult to focus on. Ring form is studied in ‘remote’ literatures precisely because they’re remote. That remoteness reduces the ‘pressure’ of meaning and allows the formal features to be treated more straightforwardly. What makes literary form difficult to focus on is the fact that, as physical objects, literary texts are strings. How do we think about the form of strings?
Where the elements on those strings are words and so have meaning, I suspect that, in the end, computation is the only way we’ve got for thinking about the form of strings of meaningful objects. I note that when I blundered into the form of “Kubla Khan”, my experience with computation played a role. I thought about computer programs and how the difference between, say, a comma and a semicolon was the difference between a program that ran and one that didn’t. Natural language isn’t so unforgiving. But form in natural language texts is broadly of a kind with form in computer language texts. Grouping of word forms matters.
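That comma-versus-semicolon point can be made concrete with a standard little exercise – a sketch, nothing specific to literary texts: a stack-based check of delimiter grouping, where a one-character change flips a string from well-formed to ill-formed.

```python
# Computer languages are unforgiving about grouping: a single missing
# delimiter is the difference between a string that parses and one that
# doesn't. Grouping is checked with a stack.

PAIRS = {")": "(", "]": "[", "}": "{"}

def well_grouped(s):
    """True if every delimiter in s opens and closes in properly
    nested order."""
    stack = []
    for ch in s:
        if ch in "([{":
            stack.append(ch)            # open a group
        elif ch in PAIRS:
            if not stack or stack.pop() != PAIRS[ch]:
                return False            # close with no matching open
    return not stack                    # any unclosed group fails

print(well_grouped("f(a[0], g(b));"))   # True: groups nest properly
print(well_grouped("f(a[0], g(b);"))    # False: one missing ')' breaks it
```

Natural language tolerates such slips; compilers don’t. But in both cases it is the grouping of forms along the string that carries the structure.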
I note that arriving at the form of “Kubla Khan” was not a straightforward process for me. On the one hand, I had no model. I wasn’t looking for form. I was looking for meaning. Once I’d blundered into the form, I didn’t know what to do with it. What I did was to go looking for the underlying mechanisms, which was as close as I could get to looking for meaning. I didn’t find them, though I certainly did find a fascinating intellectual world in computational semantics and such. Beyond that, it was a BIG DEAL for me, two decades later, to reflect back on it all and realize that what drove me was form, the formal features I’d identified in various texts.
That is, while I may have been following form for years, it took me a long time to consciously realize that and focus on it. On the whole it seems to me that, as an undergraduate at Johns Hopkins, I had internalized a fairly standard litcrit mindset, one more or less oblivious to form. But I had internalized a bit of linguistics, and perhaps a bit of computation as well, and those things came into confluence and collision in my study of “Kubla Khan”.
At the moment I’m of the opinion that there won’t be any serious study of literary form that doesn’t recognize computation as essential to language. By “serious” I mean analytic and descriptive. The descriptive work doesn’t have to start from scratch. Much existing work in poetics and narratology is relevant. What’s most important at this point, however, is the description of individual texts in varying levels of detail.
The concept of computation authorizes the study of form. Without that authorization there can be no such study. It’s not simply a matter of authority, though it IS that as well. It’s about intellectual focus.
The ring composition literature is interesting and important, but it’s a fluke, a fortunate fluke.
* * * * *
 See, for example, my recent post, Jakobson’s Poetic Function and Textual Closure, December 25, 2017, https://new-savanna.blogspot.com/2017/12/jakobsons-poetic-function-and-text.html
Saturday, January 13, 2018
Friday, January 12, 2018
Ted Underwood was scheduled to speak at the 2017 MLA convention at a session on “Varieties of Digital Humanities”. However, his flight to New York got cancelled, so he posted his presentation, “A broader purpose”, online at his blog, The Stone and the Shell. He began by distinguishing his interest, cultural analytics – “using numbers to understand cultural history” – from digital humanities in general, pointing out that cultural analytics has a long pedigree, with roots predating and independent of the recent digital humanities.
His general point is that cultural analytics cannot be contained within the humanities, nor can it adequately be taught as a unit in a course on the digital humanities. What is required is a “sequence of courses that guides them through basic principles (of statistical inference as well as historical interpretation)”. Consequently,
I think the courses that can really open doors to cultural analytics are found, right now, in the social sciences. That’s why I recently moved half of my teaching to a School of Information Sciences. There, you find a curricular path that covers statistics and programming along with social questions about technology. I don’t think it’s an accident that you also find better gender and ethnic diversity among people using numbers in the social sciences. Methods get distributed more equally within a discipline that actually teaches the methods. So I recommend fusing cultural analytics with social science partly because it immediately makes this field more diverse. I’m not offering that as a sufficient answer to problems of access. I welcome other answers too. But I am suggesting that social-scientific methods are a necessary part of access. We cannot lower barriers to entry by continuing to pretend that cultural analytics is just the humanities, plus some user-friendly digital tools. That amounts to a trompe-l’oeil door.
Yes. To put it bluntly: “To use numbers wisely, students need preparation that an English major doesn’t provide.”
Though here I do have a caveat. In talking about the social sciences Underwood emphasizes statistics, as though that were the defining characteristic of the social sciences. He neglects theory – yes, there is (lower case) theory in the social sciences – and experimental design, which, in effect, is where theory meets statistics.
During my undergraduate years at Johns Hopkins I took a course in social theory taught by Arthur Stinchcombe, one of those courses intended for both advanced undergraduates and entering graduate students. I remember we had to read a big fat book of essays by Robert Merton which included his classic 1949 article, “On Sociological Theories of the Middle Range”. And we had to write a term paper in which we, 1) picked some ‘middle range’ phenomenon (or smaller) for investigation, 2) proposed two or three possible explanations, and 3) devised two or three empirical tests, or experiments, that would discriminate between the proposed explanations. The interesting thing about the assignment is that the phenomenon we chose could be real OR imaginary. Stinchcombe didn’t care which. He was interested in our ability to come up with testable explanations and derive empirical observations against which those explanations could be tested. That too – theorizing that has testable consequences – is important in the social sciences.
Setting that aside, notice that Underwood talks of the undergraduate English major. That’s where the preparation must be done, at the undergraduate level. He concludes:
Instead of imagining cultural analytics as a subfield of DH, I would almost call it an emerging way to integrate the different aspects of a liberal education. People who want to tackle that challenge are going to have to work across departments to some extent: it’s not a project that an English department could contain. But it is nevertheless an important opportunity for literary scholars, since it’s a place where our work becomes central to the broader purposes of the university as a whole.
Forget the “almost”. It IS “an emerging way to integrate the different aspects of a liberal education.” On the one hand we’ve got the analysis and description of literary texts in the broader context of literary history. On the other hand we’ve got general social theory, experimental method, and statistical analysis. The combination might actually lead to powerful new modes of thought and exploration.
But how many graduate programs in literature would be imaginative and generous enough to educate students with such undergraduate preparation?
Thursday, January 11, 2018
In the wake of assertions that Oprah, you know, should, um, err, run for president, I’ve been thinking about the United Kingdom (aka Great Britain). It’s a constitutional monarchy. The monarch is head of state, but has no responsibilities for governance, while the prime minister is head of the government. The monarch symbolizes the nation while the prime minister gets things done. The American presidency combines those two functions.
Trump, however, has not been particularly adept at getting things done. He has no prior experience in government, is not at all suited to the demands of the presidency, and apparently didn’t really want the job in the first place. He is, in effect, a largely symbolic head of state – though a majority of the voters did not choose him (but then, the British monarch isn’t elected either) – who talks about making America great again while leaving governance, if not to chance, then to a poorly coordinated congeries of underlings of varying degrees of incompetence. That is, under Trump the functions of head of state and head of governance are being pried apart, with governance, alas, being largely headless.
And now, along comes Oprah with a stirring speech at the Golden Globes. Now she’s being put forward as a candidate for the presidency. It seems to me that, in effect, she’s being put forward as a candidate for a head of state position that’s only loosely coupled with head of governance. THAT’s what the Trump presidency is doing. Decoupling the two functions.
Oprah is a more attractive person than Trump. But we have no particularly good reason to think she would be effective at governance. Would she appoint more competent staff? Probably. Would her nominations to high-level posts be more competent? Likely so, if only because she will have been thinking about it and working on it.
But still, is this what we want, a separation between head of state and head of governance, with the latter function being basically catch as catch can? More to the point, is this what we’re stuck with given the current media environment? Has the carefully calibrated system of checks and balances been broken, permanently?
Wednesday, January 10, 2018
We’ve heard about Oprah the entrepreneur, Oprah the celebrity, Oprah the champion of holistic medicine and the enabler of anti-vaccine paranoia, even Oprah the neoliberal (don’t ask). But though she is entrepreneurial and rich, Oprah is not Jeff Bezos; though she is famous, she is not the Rock; though she has elevated various dubious approaches to wellness, she is not Gwyneth Paltrow.

Instead, her essential celebrity is much closer to the celebrity of Pope Francis or Billy Graham. She is a preacher, a spiritual guru, a religious teacher, an apostle and a prophetess. Indeed, to the extent that there is a specifically American religion, a faith tradition all our own, Oprah has made herself its pope. [...]

But in between secularism and traditionalism lies the most American approach to matters of faith: a religious individualism that blurs the line between the God out there and the God Within, a gnostic spirituality that constantly promises access to a secret and personalized wisdom, a gospel of health and wealth that insists that the true spiritual adept will find both happiness and money, a do-it-yourself form of faith that encourages syncretism and relativism and the pursuit of “your truth” (to borrow one of Oprah’s Golden Globes phrases) in defiance of the dogmatic and the skeptical alike. [...]
because the divide between blue-state spirituality and red-state spirituality is much more porous than other divisions in our balkanized society, and the appeal of the spiritual worldview cuts across partisan lines and racial divides. (Health-and-wealth theology is a rare pan-ethnic religious movement, as popular among blacks and Hispanics as among Americans with Joel Osteen’s skin tone, and when Oprah touts something like “The Secret,” the power-of-spiritual-thinking tract from the author Rhonda Byrne, she’s offering a theology that’s just Osteen without Jesus.) Indeed, it may be the strongest force holding our metaphysically divided country together, the soft, squishy, unifying center that keeps secularists and traditionalists from replaying the Spanish Civil War.
Tuesday, January 9, 2018
Ted Hamilton reviews Tim Morton's Humankind in the Los Angeles Review of Books. Here's Morton's recent career in a paragraph:
Humankind is the latest in a barrage of brief volumes that has transformed Morton from a literary critic with an interest in food and Romantic poetry to a globe-trotting public intellectual elaborating a new ontology for the Anthropocene. Ecology without Nature (2007) and The Ecological Thought (2010) developed the argument that Nature with a capital “N” inhibits true ecological awareness. Hyperobjects (2013) introduced the titular concept of massively extended, empirically elusive objects, such as “global warming,” that demand new ways of thinking about human action and subjectivity. In Dark Ecology, released last year, Morton expanded these claims into a deep-time story of how everything went wrong with the invention of agriculture. In a stylistic analogue of the syncretic spirit of these works, Morton’s writing has become increasingly breezy in its references to object-oriented ontology, Buddhism, and My Bloody Valentine, often in the same sentence. Humankind picks up right where Dark Ecology left off, arguing — among many other things, and through a bewildering array of asides — that our best chance for solidarity with nonhuman beings is fixing the “bug” of anthropocentrism in Marx.
Followed by this:
Don’t be put off by, or expect too much from, the Marx. While Humankind is broadly Marxist in orientation, and while Morton ultimately locates the proper place for Anthropocene politics in an interspecies — nay, interbeing — communism, the book is hardly Marxian in tone or method. Morton spends more time with quantum theory than he does with class-consciousness. This is because, for him, our ecological crisis, signified most clearly by the hyperobject of global warming, begins not with James Watt’s steam engine or the logic of surplus value, but with the separation between humans and nonhumans that occurred with the Neolithic Revolution.
So, what's Morton up to?
Now he’s staging an intervention against our addiction to a whole way of thinking and feeling: the logic of explosive holism born of the Severing, or what he calls “agrilogistics.” Agrilogistics is a bad program that we’ve left running so long that it’s produced the Sixth Mass Extinction Event. It’s what’s behind patriarchy, racism, and dead polar bears. It’s the basis of anthropocentrism and — as Morton argues in complicated but compelling fashion — Kantian correlationism, another enemy. The analogy to code [“agrilogistics”?] isn’t just a cute way of flattening the ontology of humans and computer viruses: just like the anthropocentric “bug” in Marxism, ideas are real, material forces, and if we don’t learn to think correctly, we’ll never come to act correctly. In fact, rigidly distinguishing thought from action is another part of the problem.
And so it goes.
Why read Morton, then? Surely not in search of a roadmap for revolution. Like most theory, whether of the capital-T or lower-case variety (and this is decidedly lower-case), Humankind is meant to prompt political projects, not to guide them. There are some fair contributions in this direction, a bit more concrete than what Morton has offered before: embrace a politics of pleasure, don’t buy into the obsession with scarcity and efficiency (a legacy of agrilogistics), and double down on moments of cross-species kinship like the Cecil the lion controversy. Of course, this still leaves us wondering what kind of solidarity we should be aiming for, and what politics for nonhumans means. Are we going to stand at the barricades with other objects? And will the barricades be part of our rebel army, too?

The best image of Morton is not a rabble-rouser on the streets but a lecturer with a flair for showmanship and arrogance, capable of synthesizing disparate sources into an argument for why we must think differently or perish.
I'll buy that. And that leads to a bit of waggishness:
The upshot of Morton’s capaciousness is that this invitation is open to philosophers, eco-critics, activists, artists, psychologists, and historians, and perhaps even to the tomato sauce that I spilled on my copy of the book, temporarily occluding some of the verbal content while illuminating the previously withdrawn capacity of the pages to absorb liquid. I hadn’t physically related to a book in that way in a long time. It reminded me of the Ziploc bag with which I preserved my copy of The Neverending Story in third grade.
And that as well.
Monday, January 8, 2018
Most Western video games about the Middle East involve killing, shooting, and war. This game from Iran, however, is about the beauty of Islamic art. And its core gameplay mechanic is wildly creative. Thanks @bahrami_ for letting me interview you! pic.twitter.com/pNV9iVxCAt— Yara Elmjouie (@yelmjouie) January 7, 2018
Sunday, January 7, 2018
I've been reading Kim Stanley Robinson's New York 2140. Big book, 613 pages. I'm on page 399.
The title tells you what the book's about. There have been two major "pulses"–he calls them–between now and then, with the result that the sea level has risen 50 feet. Lower Manhattan is now a network of canals threaded among tall buildings. Finance rules. National governments seem much pared away.
All well and good.
But it seems that Robinson's been reading OOO. Concerning the gray-world financial system (319): “It grew in the dark, it’s a stack, a hyperobject, an accidental megastructure.” Has he read Tim Morton, or just someone who's read Morton? And of course Morton is all about anthropogenic climate change which, obviously, is central to the book, though in the past.
A bit later (399): “Have you ever noticed that our building is a kind of actor network that can do things? We got the cloud star, the lawyer, the building expert, the building itself, the police detective, the money man...add the getaway driver and it’s a fucking heist movie!” “Actor network”, that’s Latour, and Robinson certainly does treat that particular building as a Latourian actor.
The scale goes: the Met Life tower in lower Manhattan, where all the central characters live; New York City; the world.
Saturday, January 6, 2018
A tweet stream from this year's MLA Convention, ongoing in New York City.
Still thinking about the "Varieties of DH" panel yesterday #mla18 #s347. Glaring absence for me was mention of the labor scenarios that resulted in the past 10 years because of DH. (+)— Alex Gil (@elotroalex) January 6, 2018
As I've said before, what makes DH different as "a project" than Cultural Analytics (ala @Ted_Underwood) for ex. is the collapse of certain divisions of labor, and the creation of a labor class where "numbers" work, design, etc (+)— Alex Gil (@elotroalex) January 6, 2018
gets assigned to library service work (with its gendered connotations). Same happened to bibliographical work, one could argue. This new class of workers is threatening and threatened because they work at the seams (+)— Alex Gil (@elotroalex) January 6, 2018
of both institutional divisions, but also the very mechanisms of knowledge production. Take @omeka work for example. That gives newbie students and fac a first glimpse at how archives are actually made (+)— Alex Gil (@elotroalex) January 6, 2018
but the software itself is still considered NOT the main library repository technology, the work not really part of "The Library." Like Omeka, cultural analytics itself is outsourced to the liminal spaces (+)— Alex Gil (@elotroalex) January 6, 2018
eventually the divisions of labor may reify again, but for now DH (or DS as it's called now. lol) remains both the promise and the threat of a new scholarly record and recording operation. Onward. (.)— Alex Gil (@elotroalex) January 6, 2018
I think you're right that this marks a difference from CA. Because we do more arguing than "building," we don't tend to rely on local support staff to the same extent. We do rely on grad/undergrad students to gather data, and also very heavily on @hathitrust.— Ted Underwood (@Ted_Underwood) January 6, 2018
The two models require very different kinds of institutional/grant support, and that does need to be discussed more candidly. One of the reasons I chafe under the "DH" label is that it tends to foster an assumption that everyone is building a project/archive/website.— Ted Underwood (@Ted_Underwood) January 6, 2018
Friday, January 5, 2018
At the beginning of "Drugs Du Jour" (Aeon), Cody Delistraty points out that Aldous Huxley was opposed to psychoactive drugs in the middle of his career, but that he changed his mind in 1955 after he'd taken LSD for the first time.
What explains Huxley’s changed perspective – from seeing drugs as an instrument of dictatorial control to a way to escape from political-cultural repression? Indeed, in the grander picture, why are drugs universally despised at one time, then embraced by intellectuals and cultural influencers at another? Why do we have an almost decadal vogue for one drug or another, with popular drugs such as cocaine all but disappearing only to pop up again decades later? Above all, how are drugs used to affirm or tear down cultural boundaries? The answers colour nearly every aspect of modern history.

Drug use offers a starkly efficient window into the cultures in which we live. Over the past century, popularity has shifted between certain drugs – from cocaine and heroin in the 1920s and ’30s, to LSD and barbiturates in the 1950s and ’60s, to ecstasy and (more) cocaine in the 1980s, to today’s cognitive- and productivity-enhancing drugs, such as Adderall, Modafinil and their more serious kin. If Huxley’s progression is to be followed, the drugs we take at a given time can largely be ascribed to an era’s culture. We use – and invent – the drugs that suit our culture’s needs.

The drugs chosen to pattern our culture over the past century have simultaneously helped to define what each generation has most desired and found most lacking in itself. The drugs du jour thus point towards a cultural question that needs an answer, whether that’s a thirst for spiritual transcendence, or for productivity, fun, exceptionalism or freedom. In this way, the drugs we take act as a reflection of our deepest desires and our inadequacies, the very feelings that create the cultures in which we live.
But while drugs can both answer cultural questions and create entirely new cultures, there is no simple explanation for why one happens rather than the other. If rave culture is created by ecstasy, does that mean ecstasy is also ‘answering’ a cultural question; or was ecstasy simply there and rave culture blossomed around it? The line of causality is easily blurred. [...] ‘Every time a drug is invented that interacts with the brains and minds of users, it changes the very object of the study: the people who are using,’ says Henry Cowles, assistant professor of the history of medicine at Yale. On this reading, the idea that drugs create culture is true, to an extent, but it is likewise true that cultures can shift and leave a vacuum of unresolved desires and questions that drugs are often able to fill.
...if drugs can create and underscore cultural limitations, then drugs and their makers can tailor-make entire socio-cultural demographics (eg, ‘the depressed housewife’ or ‘the hedonistic, cocaine-snorting Wall Street trader’). Crucially, this creation of cultural categories applies to everyone, meaning that even those not using the popularised drugs of a given era are beholden to their cultural effects. The causality is muddy, but what is clear is that it swings back and forth: drugs both ‘answer’ cultural questions and allow for cultures to be created around themselves.

Looking at the culture of today, perhaps the biggest question answered by drugs are issues of focus and productivity – a consequence of the modern ‘attention economy’, as termed by the Nobel Prize-winning economist Herbert Alexander Simon.
From time-limited intervention in acute episodes to long-term use:
Critically, it is the way in which we now take drugs that shows the shift in the notion of the ‘self’. So-called ‘magic-bullet drugs’ – one-off, limited-course drugs designed to treat highly targeted problems – have given way to ‘maintenance drugs’ – eg, antidepressants and anti-anxiety pills that must be taken in perpetuity.

‘This is a big shift from the old model,’ says Cowles. ‘It used to be: “I am Henry. I am ill in some way. A pill can help me get back to being Henry, and then I’m off it.” Whereas now: “I am only Henry when I’m on my meds.” Between 1980, 2000, and now, the proportion of people on that kind of maintenance pill with no end in sight is just going to keep going up and up.’
Note that in the 18th and 19th centuries opium, usually taken as a tincture known as laudanum, was the maintenance drug of choice in Europe and the West. That drug gave us "Kubla Khan".