Sunday, July 23, 2017
Psychiatry, the study and prevention of mental disorders, is currently undergoing a quiet revolution. For decades, even centuries, this discipline has been based largely on subjective observation. Large-scale studies have been hampered by the difficulty of objectively assessing human behavior and comparing it with a well-established norm. Just as tricky, there are few well-founded models of neural circuitry or brain biochemistry, and it is difficult to link this science with real-world behavior.

That has begun to change thanks to the emerging discipline of computational psychiatry, which uses powerful data analysis, machine learning, and artificial intelligence to tease apart the underlying factors behind extreme and unusual behaviors.

Computational psychiatry has suddenly made it possible to mine data from long-standing observations and link it to mathematical theories of cognition. It’s also become possible to develop computer-based experiments that carefully control environments so that specific behaviors can be studied in detail.
The article then goes on to discuss research reported in:
Sarah K Fineberg (MD PhD), Dylan Stahl (BA), Philip Corlett (PhD), Computational Psychiatry in Borderline Personality Disorder, Current Behavioral Neuroscience Reports, March 2017, Vol 4, Issue 1, pp31-40: arXiv:1707.03354v1 [q-bio.NC]
Purpose of review: We review the literature on the use and potential use of computational psychiatry methods in Borderline Personality Disorder.

Recent findings: Computational approaches have been used in psychiatry to increase our understanding of the molecular, circuit, and behavioral basis of mental illness. This is of particular interest in BPD, where the collection of ecologically valid data, especially in interpersonal settings, is becoming more common and more often subject to quantification. Methods that test learning and memory in social contexts, collect data from real-world settings, and relate behavior to molecular and circuit networks are yielding data of particular interest.

Summary: Research in BPD should focus on collaborative efforts to design and interpret experiments with direct relevance to core BPD symptoms and potential for translation to the clinic.
Tuesday, July 18, 2017
In his new study, Burridge presents a deliberately minimal model of language change, which focuses on explaining dialect distribution solely in terms of topographical features and speaker interaction. The model assumes the existence of multiple linguistic variants for multiple linguistic variables, which effectively define different dialects. In determining whether a given speaker adopts a specific variant, the model does not consider “social value” factors. Instead, it assumes that speakers interact predominantly with people living in their local environment (defined by some radius around their home), and that they will conform to the speech patterns of the majority of people in that geographic vicinity. Such local linguistic alignment favors the emergence of distinct dialect areas, with dialect boundaries tending to shorten in length in a way that mimics how surface tension minimizes the surface area of a water droplet (see Fig. 1). In a region with uniform population density, this language-based surface tension will cause the boundary between two dialects to form straight lines. Densely populated areas, however, interfere with boundary straightening by repelling boundaries and effectively creating new dialect areas around themselves. Furthermore, topography can have an imprint on dialect spatial distributions. In systems with irregular perimeters, Burridge shows that boundary lines tend to migrate to places where they emerge perpendicular from the edge of the system, such as indentations in coastlines.

Original research HERE (PDF).
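Burridge's published model is considerably richer than this (continuous space, multiple variables, memory of past interactions), but the core local-conformity mechanism can be sketched in a few lines. The grid size, interaction radius, and synchronous update rule below are my own simplifications, chosen only to illustrate how majority alignment shortens a dialect boundary:

```python
import random

def interface_length(grid):
    """Count neighbouring cell pairs that disagree -- a discrete
    stand-in for the total length of dialect boundary."""
    n = len(grid)
    disagreements = 0
    for i in range(n):
        for j in range(n):
            if j + 1 < n and grid[i][j] != grid[i][j + 1]:
                disagreements += 1
            if i + 1 < n and grid[i][j] != grid[i + 1][j]:
                disagreements += 1
    return disagreements

def step(grid, radius=1):
    """One sweep: every speaker adopts the variant used by the
    majority of speakers within `radius` of their home."""
    n = len(grid)
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            votes, count = 0, 0
            for di in range(-radius, radius + 1):
                for dj in range(-radius, radius + 1):
                    if 0 <= i + di < n and 0 <= j + dj < n:
                        votes += grid[i + di][j + dj]
                        count += 1
            new[i][j] = 1 if 2 * votes > count else 0
    return new

random.seed(0)
n = 20
# A ragged north-south boundary between variant 0 and variant 1.
grid = [[1 if j >= n // 2 + random.choice((-2, -1, 0, 1, 2)) else 0
         for j in range(n)] for i in range(n)]
before = interface_length(grid)
for _ in range(10):
    grid = step(grid)
after = interface_length(grid)
# Majority conformity shortens the boundary: the discrete analogue
# of surface tension straightening a droplet's edge.
```

Running the sketch, the ragged initial boundary smooths toward a straight north-south line, so the `interface_length` measure falls. Modeling the effects Burridge reports for dense populations or irregular coastlines would require weighting the votes by population or masking cells, which this sketch omits.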
Monday, July 17, 2017
I checked in at Academia.edu today and found another article by medievalist Stephen Nichols. I've not finished it, but wanted to blog a passage or two anyhow.
Stephen G. Nichols, Dynamic Reading of Medieval Manuscripts, Florilegium, vol. 32 (2015): 19-57. DOI: 10.3138/flor.32.002. Download at Academia.edu: https://www.academia.edu/33907842/Nichols_Dynamic_Reading_flor_32_002
Here's the abstract:
Abstract: Digital manuscript and text representation provides such a wealth of information that it is now possible to see the incessant versioning of works like the Roman de la Rose. Using Rose manuscripts of the Bibliothèque municipale de Lyon MS 763 and BM de Dijon MS 525 as examples and drawing on Aristotelian concepts such as energeia, dynamis, and entelecheia, the copiously illustrated article demonstrates how pluripotent circulation allows for “dynamic reading” of such manuscript texts, which takes into consideration the interplay between image, text, and the context of other texts transmitted in the same manuscript.
What caught my attention was his statement about the unexpected impact of digital technology. It made it possible, for the first time, to examine a number of different codices of the same title and to compare them. And THAT led to a sea-change in understanding of what a text is. The normative concept of the Urtext as the author's original version is in trouble. What happens to the so-called critical edition? Thus (p. 22):
that the critical edition represents a construct based on selected evidence is neither exceptional nor particularly shocking. More problematic is the fact that expediency decrees that manuscript mass be accorded short shrift. Not all manuscripts are equal in this scenario. Indeed, the purpose of manuscript selection—the choice by the editor of a small number of manuscripts deemed reliable — lay precisely in minimizing the number of manuscripts. The more versions an editor could eliminate as defective or uninteresting, the greater the probability that one had located the few copies closest to an original or early version of a work. The select copies could then be closely scrutinized for variant readings. And ‘variant’ meant precisely that: readings of lines or passages differing from what the editor determined to be the normative text. It was in reaction to such a restrictive treatment of manuscript variation that New Philology emerged. Initially, we argued that manuscript copies bore witness to a dialectical process of transmission where individual versions might have the same historical authority as that represented by the critical edition.
And so (pp. 24-25):
Perhaps the most startling question posed by the specular confrontation of manuscripts concerns the status of textuality itself. With unerring perspicuity, Jacqueline Cerquiglini-Toulet pinpoints the issue by asking the simple, but trenchant question: “what, exactly, is ‘a text’ in the Middle Ages, and how do we locate it in a manuscript culture where each codex is unique? [. . .] More radically still,” she continues, “we might legitimately ask just where we’re supposed to find the text in the manuscript. How does it come to instantiate itself materially as object? And how is its literary identity realized?”

If such questions seem disorienting, it is because they underline how much print editions of medieval works have shaped our expectations. We have grown accustomed to finding the ‘text’ of a medieval work before our eyes whenever we open an edition. In the critical edition, the text is a given; that is why the work is called ‘textual scholarship.’ The editor works hard to establish a text on the basis of painstaking study of the manuscripts that he or she determines to be authoritative. The point, of course, is to circumscribe or close off the text, preventing it from continuing to generate additions or variants. As we know, that is a modern practice grounded in concepts of scientific text editing.

But as Jacqueline Cerquiglini-Toulet observes, the very concept of a definitive text, a text incapable of generating new versions, is an illusion propagated by its own methodology. Authentic medieval texts, she observes, are never closed, nor, I would add, would their mode of transmission allow them to remain static. And, as a corollary, she observes: “Where are the boundaries?” How do we “identify the borders of a text”? She means that the manuscript folio has a very different ecology from the page of a printed edition. Textual space on a folio is not exclusive, but shared with other systems of representation, or — why not?
— other kinds of ‘texts.’ These include rubrics, miniature paintings, decorated or historiated initials, bas-de-page images, marginal glosses, decorative programmes, and so on. In other words, the medieval manuscript page is not simply complex but, above all, an inter-artistic space navigated by visual cues.
We are far from the world of "distant reading" a large corpus of texts and thereby beginning to see patterns in literary history that had been but dimly envisioned before. But the change is equally profound. For example (26-27):
To understand the astonishing virtuosity and variety we find in manuscript versions of the ‘same’ work — such as the Roman de la Rose, for example, for which we have some 250 extant manuscripts produced between the end of the thirteenth and the beginning of the sixteenth century — we need to identify immanent factors responsible for generating multiple versions of a given work throughout the period. Here again, digital manuscript study offers reasons to move beyond conventional explanations.

Whereas increased manuscript production might intuitively be explained by such external causes as rising literacy among the merchant and artisan classes and the growth in the number of booksellers, the great variation we see in manuscripts, even those contemporaneous with one another, suggests the possibility of inherent forces of variation at work. Put another way, whereas the increase in literacy and leisure certainly contributed to the growing market for manuscripts to which Parisian booksellers responded, the efficient cause generating multiple manuscripts of a given work lay in the nature of the manuscript matrix itself.

It is not by chance that versions of a given work vary. Literary prestige derived in part from a work’s ability to renew itself from generation to generation by a dynamic process of differential repetition.
And so it goes. And we bring in Aristotle (p. 30): "But whereas we might think of striving for perfection as linear and directed, Aristotle sees it as continuous and open-ended." Is Nichols going to be arguing, then, that the production of version after version is a "striving for perfection" that extends through a population of scribes and readers? I suppose that's what I'll find out as I continue reading.
Thus, p. 32: "In other words, manuscripts are, by their very nature as eidos, ergon, and energeia, predisposed towards actualizing the works they convey not as invariant but as versions in an ever-evolving process of representation. Against those who would see manuscript copies as regressions from an authoritative original to ever fainter avatars of that primal moment, we must recall Aristotle’s notion of form as atemporal actuality. "
* * * * *
Here's an earlier post about Nichols: Mutable stability in the transmission of medieval texts. And here's a post about the three texts of Hamlet that's relevant: Journey into Shakespeare, a tedious adventure – Will the real Hamlet stand up?
I am currently excavating the forgotten early history of computer story generation. More info here: https://t.co/drKKyMlmE7. pic.twitter.com/OMZ8ltvhYz— James Ryan (@xfoml) June 7, 2017
And so he's been digging up all sorts of interesting things, not just computer storytelling. Here's some recent stuff he's dug up.
the invention of the camera raised copyright issues that seem weird today—it was not clear whether the camera or the photographer was author pic.twitter.com/OlfaIEHxww— James Ryan (@xfoml) July 17, 2017
in 1971, 1000 feet of computer-generated poetry was dropped from a helicopter onto an experimental arts center in Burbank, California pic.twitter.com/IKjduyOcjw— James Ryan (@xfoml) July 14, 2017
a 1962 issue of the Librascope company newsletter featured an "interview" with AUTO-BEATNIK, the computer poet developed by employees there pic.twitter.com/CkLnliYEVf— James Ryan (@xfoml) July 14, 2017
Sunday, July 16, 2017
Friday, July 14, 2017
I first became aware of Lawfare through a wonderful March 3 post by Benjamin Wittes and Quinta Jurecic, What Happens When We Don’t Believe the President’s Oath? It seems that a lot of people discovered Lawfare about the same time and its readership has blossomed until
Today is the day and this morning is the morning during which @lawfareblog's 2017 traffic will pass that of the site's whole prior history. pic.twitter.com/cq2cAflZd3— Benjamin Wittes (@benjaminwittes) July 14, 2017
Obviously it is the Presidency of Donald Trump that made Lawfare's commentary so salient. Trump's bull-in-a-china-shop style begged for informed legal analysis, and Lawfare was there to provide it.
Congratulations Ben Wittes, Robert Chesney, Jack Goldsmith and the rest of the team!
Writing in the LA Review of Books, Bruce Robbins reviews Joseph North, Literary Criticism: A Concise Political History (Harvard 2017). An interesting review of what sounds like an interesting book. Robbins reads the recent politics of lit crit as conservative rather than radical, which is how such criticism styles itself; and once more we get universals.
The broad strokes of his narrative are familiar enough, at least to literature professors. As everyone knows, the radicals of 1968, when they turned their attention to the university, insisted that academic attention be paid to race, gender, sexuality, colonialism, and other measures of historically inflicted injury. In literary criticism, these were contexts that had been missing from the everyday practice of interpretation. Moving into the ’70s and ’80s, it became obvious to much or most of the discipline that to read a work of past literature without asking what sort of society the work emerged from was as reprehensible, in its way, as ignoring those who were currently suffering injustice all around you. This is how close reading, little by little, went out of fashion — a momentous shift that, like so much else that later came to be associated with the ’60s, I was somehow living through but not really registering.

Most of the academics who advocated for historicism thought of themselves as radicalizing an apolitical or even crypto-conservative discipline. In North’s view, though, this gets the story backward. The politicization of the discipline that seemed to follow the eclipse of close reading was actually its depoliticization. In the period that began in the late 1970s “and continues through to the present,” North writes, “the project of ‘criticism’ was rejected as necessarily elitist, dehistoricizing, depoliticizing, and so forth; the idea of the ‘aesthetic’ was rejected as necessarily Kantian, idealist, and universalizing.” Yet it was in fact quite wrong to reject the project of criticism as if its motivating concept, the aesthetic, could only ever be thought through in idealist terms. What was being elided here was the fact that modern disciplinary criticism had been founded on an aesthetics of just the opposite kind.
In our own period, this historical amnesia has allowed a programmatic retreat from the critical project of intervening in the culture, back toward the project of analyzing the culture, without any mandate for intervention.

The newer style of interpretation recognized context, oppression, and injustice, yes, but it also masked a movement away from “criticism” and toward what North calls “scholarship.” Criticism, as he sees it, aspires to intervene in social life. Scholarship, as he sees it, is knowledge-production that has no such aspiration. Scholarship gets off on interpreting the world but can’t be bothered to do anything non-scholarly to change it. Since close reading, as North sees it, was a way of changing the world, if only reader by reader, what looked like a lurch to the left was actually a subtle move to the right.

For North, the production of analytic knowledge about the past, whatever its political motives, amounts to complacent non-interference. It’s a way of comfortably inhabiting a present that we ought to see, ethically speaking, as unfit for human habitation, hence requiring us to get up from our desks to do something about it.
OK, so all the political critics have been hoisted on their own petards, as it were. A call for revolution uttered from the comfort of one’s study is no call at all. Let’s just leave that alone.
Thursday, July 13, 2017
Walter Murch is perhaps best-known for his work on Apocalypse Now, where he did the sound design (for which he won an Oscar) and much of the editing. This is a passage from an interview about his craft and his career that he did with Emily Buder in 2015:
To be an editor, you have to be the kind of person who can be in a room for 16 hours at a time. You are working alone a lot of the time, but there are also times when you’re working with a director in the room. You have to be able to accommodate that. For feature-length pictures, it’s like running a marathon. You have to pace yourself over a year. When I’m considering a film, that’s in the back of my mind. You have to really like the project. Also, you are frequently away from home. You go where the director is. I was working in Argentina for a year, a number of years ago. Before that, I was in Romania, and before that I was in London, and then after that about 2 years ago I was in New York for a year. If you’re married, you have to find ways of coping with that and that’s a whole chapter unto itself.

At the end of the film, it can be very disorienting when the work is suddenly finished. This is not exclusive to film editing; I’m sure it’s true of many other areas of human activity. Soldiers have this problem, actors who are acting in a play when the play is suddenly over, it’s like you’ve been cut loose: “Now what?!” This was never explained to me at film school. So when it first happened, I felt something was wrong with me. It’s the equivalent of a kind of seasickness; if you’ve never been on a ship before and somebody warns you about it, it’s okay. You’ll still feel just as sick, but you won’t feel like killing yourself. This is not that intense, but it is that kind of disorientation. And it passes, but it takes anywhere from two to six weeks to go away. During that time I would be very reluctant to try to decide what to do next. It’s like a love affair where you don’t want to bounce from one relationship to another; that’s dangerous. So, you should just let that project fade away and get back to normal, and then you can decide what to do next. We frequently don’t have the luxury of that, but that’s a goal.
That seems like a kind of mourning. When you work that long and with that intensity, you become attached to the film. When it's over, you've got to detach yourself. That requires something very like mourning.
Wednesday, July 12, 2017
Just a quick take: We know that prior to becoming President Donald Trump was doing business in Russia. We now know that the Trump campaign – DJ Jr., Kushner, & Manafort – had a conversation with well-connected Russians about dirt on Hillary Clinton. We don’t yet know whether or not anything illegal has been done – expert opinion seems divided. But at the very least, it’s unseemly. Is this how to make America great again, by collaborating with a nation that, not so long ago, was America’s fiercest rival?
But is this about nations, or just about the oligarchs and plutocrats that run them? We know that any self-respecting Russian oligarch is going to have an apartment in London or New York, perhaps Singapore or Dubai. The Chinese too. And folks in Jersey City, across the Hudson from Manhattan, have been getting exercised at son-in-law Jared’s sister dangling EB-5 visas before potential Chinese investors in their projects.
It’s looking like “Make America Great Again” is just the brand name under which a loose transnational gaggle of oligarchs manipulates politics in the USofA.
Meanwhile, I keep reading these articles about the waning of the nation-state as a vehicle for governance. The most recent of these talk about how states and cities in America are going around the federal government on climate change. That is to say, on this issue, they’ve decided to conduct their own foreign policy, and foreign policy, we know, has traditionally been the prerogative of the nation-state. That’s why nation-states exist, to conduct foreign affairs.
What’s it all mean?
Monday, July 10, 2017
Ted Underwood on Intellectual Genealogies: Distant Reading is Social-Science, Not Digital Humanities [#DH]
Ted Underwood, “A Genealogy of Distant Reading”, DHQ Vol. 11, No. 2, 2017:
Abstract: It has recently become common to describe all empirical approaches to literature as subfields of digital humanities. This essay argues that distant reading has a largely distinct genealogy stretching back many decades before the advent of the internet – a genealogy that is not for the most part centrally concerned with computers. It would be better to understand this field as a conversation between literary studies and social science, initiated by scholars like Raymond Williams and Janice Radway, and moving slowly toward an explicitly experimental method. Candor about the social-scientific dimension of distant reading is needed now, in order to refocus a research agenda that can drift into diffuse exploration of digital tools. Clarity on this topic might also reduce miscommunication between distant readers and digital humanists.
Rather than attempt to summarize it myself, I’ll present a set of tweets by Alan Liu, starting with this:
Made 41 annotations in my copy of @tedunderwood's new, crucial DHQ piece, "A Genealogy of Distant Reading" https://t.co/CIuXj7uAdH.— Alan Liu (@alanyliu) July 10, 2017
Liu continues with a long series of tweets, which I’ll present as quotes without the Twitter format. Along the way I will present brief comments of my own, thus inserting my own concerns into the argument.
Here are my top 13 quotes--a kind of thirteen ways of looking at distant reading, cited by paragraph number. (As it were: "Among twenty snowy mountains of texts, The only moving thing Was the eye of the distant reader"):

¶5: "The questions posed by distant readers were originally framed by scholars (like Raymond Williams and Janice Radway) who worked on the boundary between literary history and social science."

¶10: "these projects … pose broad historical questions about literature, and answer them by studying samples of social or textual evidence. I want to highlight the underlying project of experimenting on samples, and the premise that samples … have to be constructed"

¶21: "The crucial underlying similarity between [Radway & Moretti's] works, which has made both of them durably productive models for other scholars, is simply the decision to organize critical inquiry as an experiment."

¶22: "Distant reading is a historical science, and it will need to draw on something like Carol Cleland’s definition of scientific method, which embraces not only future-oriented interventions, but any systematic test that seeks 'to protect … from misleading confirmations.'"

¶22: "Literary historians who use numbers will have to somehow combine rigor with simplicity, and prune back a thicket of fiddly details that would be fatal to our reason for caring about the subject."

¶24: "I try not to join any debate about the representativeness of different samples until I have seen some evidence that the debate makes a difference to the historical question under discussion…. [S]amples are provisional, purpose-built things. They are not canons. It makes no sense to argue about their representativeness in the abstract, before a question is defined."

¶27: "Instead of interpreting distant reading as a normative argument about the discipline, it would be better to judge it simply by asking whether the blind spot it identified is turning out to contain anything interesting."

¶28: "Consensus about new evidence emerges very slowly: inventing an air-pump doesn’t immediately convince readers that vacuums exist…. But at this point, there is no doubt in my mind that literary scholarship turned out to have a blind spot. Many important patterns in literary history are still poorly understood, because they weren’t easily grasped at the scale of individual reading."
From an essay by Walter Benn Michaels, The Myth of ‘Cultural Appropriation’:
The logic is on vivid display in a TV ad for Ancestry.com featuring a woman named Kim who pays her money, gets her DNA scan, and is thrilled to discover that she’s 23-percent Native American. Now, she says, while standing in front of some culturally appropriate pottery, "I want to know more about my Native American heritage." If the choice of Southwest-style cultural artifacts seems a little arbitrary, that’s because, as the Ancestry.com website warns you, the technology isn’t yet advanced enough to tell you whether you’re part Navajo or part Sioux. But, of course, that arbitrariness is less puzzling than the deployment of any artifacts at all.

The point of Kim’s surprise is that she has no Native American cultural connection whatsoever; the point of those pots is that they become culturally appropriate only when they’re revealed to be genetically appropriate.

As befits an ad, Kim’s story is a happy one. But it could have gone differently. The genetic transmission of an appreciation for Navajo pottery could just as easily have turned out to be a genetically traumatic relation to the catastrophe of the Long Walk. What if Sam Durant had gotten himself an Ancestry.com saliva test and discovered that he, too, was part Native American? The bad news: Thirty-eight of his ancestors had been unjustly hanged; the good news: their hanging was part of his story after all.
Later, writing about sociologist Alice Goffman, who'd done fieldwork in a black neighborhood in Philadelphia:
Even when the experiences really are shared — when something actually did happen to us — we don’t think that autobiographical accounts of people’s own experiences are necessarily more true than other people’s accounts of those same experiences, or that only we have a right to tell our stories. No one thinks that either Goffman or the men she wrote about are the final authorities on their lives. My version of my life is just my version; no one is under any obligation to agree with it, much less refrain from offering his or her own.

So even our own stories don’t belong to us — no stories belong to anyone. Rather, we’re all in the position of historians, trying to figure out what actually happened. Interestingly, even if the logic of their position would seem to require it, the defenders of a racialized past haven’t been all that interested in confining historians to what are supposed to be their own stories. Maybe that’s because history (at least if it isn’t cultural) makes it harder to draw the needed lines. You obviously can’t understand the political economy of Jim Crow without understanding the actions of both white and black people. And you can’t understand the actions of those white and black people without reading the work of historians like (the white) Judith Stein and (the black) Adolph Reed.
Now THAT's an interesting argument.

The students at elite American universities come overwhelmingly from the upper class. The job of the faculty is to help them rise within (or at least not fall out of) that class. And one of the particular responsibilities of the humanities and social-science faculty is to help make sure that the students who take our courses come out not just richer than everyone else but also more virtuous. (It’s like adding insult to injury, but the opposite.)

Identity crimes — both the phantasmatic ones, like cultural theft, and the real ones, like racism and sexism — are perfect for this purpose, since, unlike the downward redistribution of wealth, opposing them leaves the class structure intact. [...]

The problem is not that rich people can’t feel poor people’s pain; you don’t have to be the victim of inequality to want to eliminate inequality. And the problem is not that the story of the poor doesn’t belong to the rich; the relevant question about our stories is not whether they reveal someone’s privilege but whether they’re true. The problem is that the whole idea of cultural identity is incoherent, and that the dramas of appropriation it makes possible provide an increasingly economically stratified society with a model of social justice that addresses everything except that economic stratification.
Sunday, July 9, 2017
Moral philosophers in the analytic tradition like to run thought experiments of a kind known as the trolley problem:
There is a runaway trolley barreling down the railway tracks. Ahead, on the tracks, there are five people tied up and unable to move. The trolley is headed straight for them. You are standing some distance off in the train yard, next to a lever. If you pull this lever, the trolley will switch to a different set of tracks. However, you notice that there is one person on the side track. You have two options:

1. Do nothing, and the trolley kills the five people on the main track.
2. Pull the lever, diverting the trolley onto the side track where it will kill one person.
A recent movie, Eye in the Sky, posed a problem with a similar form. Instead of five people tied to a track we have five terrorists having a meeting inside a private house in Nairobi. Instead of a person tied to a sidetrack we have an innocent young girl selling bread on the street outside that same house. An explosion that kills the terrorists will likely kill the girl as well. Do we do it?
That’s the basic situation. In fact, things are more complicated. Three of the terrorists hold high-level leadership roles in the organization. The other two are suicide bombers who have just donned explosive vests. Presumably when the meeting is over they are going to public places to kill themselves, along with tens if not hundreds of others. So we can’t wait for the girl to leave. But, of course, we don’t really know about the timing of things.
As for “we,” that means several levels of military and civilian leadership in Britain and America, plus the remote pilot who flies the drone and who is the one who actually executes the order to bomb the house. The drama lies in the back-and-forth decision-making and buck-passing running in counterpoint with events on the ground.
The house is bombed and the girl dies, but only by seconds. If she’d been a bit quicker, if the bomb had been released half a minute later, she’d have lived while the terrorists would still have been killed.
It's a good film.
Joshua Sokol interviews Jessica Flack at the Santa Fe Institute. She says:
Collective computation is about how adaptive systems solve problems. All systems are about extracting energy and doing work, and physical systems in particular are about that. When you move to adaptive systems, you’ve got the additional influence of information processing, which we think allows a system to extract energy more efficiently even though it has to expend a little extra energy to do the information processing. Components of adaptive systems look out at the world, and they try to discover the regularities. It’s a noisy process.

Unlike in computer science where you have a program you have written, which has to produce a desired output, in adaptive systems this is a process that is being refined over evolutionary or learning time. The system produces an output, and it might be a good output for the environment or it might not. And then over time it hopefully gets better and better.
For example, the human brain:
The human brain contains roughly 86 billion neurons, making our brains the ultimate collectives. Every decision we make can be thought of as the outcome of a neural collective computation. In the case of our study, which was led by my colleague Bryan Daniels, the data we analyzed were collected during an experiment by Bill Newsome’s group at Stanford from macaques who had to decide whether a group of dots moving across a screen was traveling left or right. Data on neural firing patterns were recorded while the monkey was performing this task. We found that as the monkey initially processes the data, a few single neurons have strong opinions about what the decision should be. But this is not enough: If we want to anticipate what the monkey will decide, we have to poll many neurons to get a good prediction of the monkey’s decision. Then, as the decision point approaches, this pattern shifts. The neurons start to agree, and eventually each one on its own is maximally predictive.

We have this principle of collective computation that seems to involve these two phases. The neurons go out and semi-independently collect information about the noisy input, and that’s like neural crowdsourcing. Then they come together and come to some consensus about what the decision should be. And this principle of information accumulation and consensus applies to some monkey societies also.
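Flack’s two-phase picture, semi-independent noisy sampling followed by consensus, is easy to see in a toy simulation. The sketch below is purely illustrative (the parameters and the model are invented for this post and have nothing to do with the actual Newsome recordings): many noisy units accumulate evidence for a binary decision, and a majority vote across units is accurate well before any single unit is.

```python
import random

random.seed(0)

def simulate(true_dir=1, n_units=200, n_steps=50, noise=5.0):
    """Toy model: each unit independently accumulates noisy evidence
    for a binary decision (true_dir is +1 or -1)."""
    evidence = [0.0] * n_units
    single_acc = []    # fraction of individual units voting correctly
    poll_correct = []  # whether the majority vote is correct
    for _ in range(n_steps):
        for i in range(n_units):
            evidence[i] += true_dir + random.gauss(0.0, noise)
        votes = [1 if e > 0 else -1 for e in evidence]
        single_acc.append(sum(v == true_dir for v in votes) / n_units)
        majority = 1 if sum(votes) > 0 else -1
        poll_correct.append(majority == true_dir)
    return single_acc, poll_correct

single, poll = simulate()
print(f"single-unit accuracy, early: {single[0]:.2f}")
print(f"single-unit accuracy, late:  {single[-1]:.2f}")
```

Early on a typical unit is barely better than chance, so you have to poll the crowd to predict the decision; by the end each unit has accumulated enough evidence to be individually predictive. That is the "crowdsourcing, then consensus" pattern from the quote.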
Saturday, July 8, 2017
Ryan Cordell has an interesting post, Objectivity and Distant Reading, in which he comments on Objectivity (2010) by Lorraine Daston and Peter Galison:
Objectivity attempts to trace the emergence of scientific objectivity as a concept, ideal, and moral framework for researchers during the nineteenth century. In particular, the book focuses on shifting ideas about scientific images during the period. In the eighteenth and early nineteenth centuries, Daston and Galison argue, the scientific ideal was “truth-to-nature,” in which particular examples are primarily useful for the ways in which they reflect and help construct an ideal type: not this leaf, specifically, but this type of leaf. Under this regime scientific illustrations did not attempt to reconstruct individual, imperfect specimens, but instead to generalize from specimens and portray a perfect type.

Objectivity shows how, as the nineteenth century progressed and new image technologies such as photography shifted the possibilities for scientific imagery, truth-to-nature fell out of favor, while objectivity rose to prominence.
And that's what interests me, the focus on images, and the rise of photography:
In debates about the virtues of illustration versus photography, for instance, illustration was touted as superior to the relative primitivism of photography—technologies such as drawing and engraving simply allowed finer detail than blurry nineteenth century photography could. Nevertheless photography increasingly dominated scientific images because it was seen as less susceptible to manipulation, less dependent on the imagination of the artist (or, indeed, of the scientist).
Images, of course, are clearly distinct from the prose in which they are (often) set. Images are a form of objectification, though it takes more than objectification to yield objectivity.
Cordell then goes on to discuss computational criticism (aka distant reading), where "computation is invoked as a solution to problems of will that are quite familiar from decades of humanistic scholarship." Computational critics
might argue that methods such as distant reading or macroanalysis seek to bypass the human will that constructed such canons through a kind of mechanical objectivity. While human beings choose what to focus on for all kinds of reasons, many of them suspect, the computer will look for patterns unencumbered by any of those reasons. The machine is less susceptible to the social, political, or identity manipulations of canon formation.
Interesting stuff. I've got two comments:
1) Consider one of my touchstone passages by Sydney Lamb, a linguist of Chomsky’s generation but of a very different intellectual temperament. Lamb cut his intellectual teeth on computer models of language processes and was concerned about the neural plausibility of such models. In his major systematic statement, Pathways of the Brain: The Neurocognitive Basis of Language (John Benjamins, 1999), he remarked on the importance of visual notation (p. 274): “... it is precisely because we are talking about ordinary language that we need to adopt a notation as different from ordinary language as possible, to keep us from getting lost in confusion between the object of description and the means of description.” That is, we need the visual notation in order to objectify language mechanisms.
Note that I think of objectification (in the sense immediately above) as a prerequisite for objectivity, but it is by no means a guarantee of it. That requires empirical evidence. A computer model will give us objectification, but no more.
2) Tyler Cowen has an interesting and wide-ranging interview with Jill Lepore in which she notes that Frederick Douglass was the most widely photographed man of 19th century America: "In the 1860s, he writes all these essays about photography in which he argues that photography is the most democratic art. And he means portrait photography. And that no white man will ever make a true likeness of a black man because he’s been represented in caricature — the kind of runaway slave ad with the guy, the little figure, silhouette of the black figure carrying a sack."
Friday, July 7, 2017
But the Paris decision may also reshape the world for the better, or at least the very different. Consider: A few days after Trump’s Rose Garden reveal, California Governor Jerry Brown was in China, conducting what looked a lot like an official state visit. He posed with pandas, attended banquets—and sat down for a one-on-one meeting with President Xi Jinping, which produced a series of agreements on climate cooperation between China and California. (Trump’s secretary of energy, Rick Perry, was in Beijing the same week: no pandas, no sit-down with Xi.) It was almost as if California were another country. Call it a nation-state—a nation-state that has talked about launching its own satellites to monitor melting polar ice. A nation-state that has joined New York and a dozen others in a climate alliance to announce they will meet the targets set in the Paris accord on their own. A nation-state that already holds joint auctions with Quebec in its carbon cap-and-trade program. A nation-state that is convening hundreds of other “subnational actors” from around the world next year to pledge to keep the rise in global temperature below 2 degrees Celsius.
It’s ironic that global warming might be the wedge issue for the rise of “subnationalism.” After all, if you ever wanted an argument for world government, climate change provides it. But the United Nations has been trying to stop global warming since the days when we called it the greenhouse effect. And national governments, hijacked by the fossil fuel industry, have intervened again and again to obstruct any progress: The Kyoto treaty more or less collapsed, as did the Copenhagen talks. Paris “succeeded,” but only if you squint: The world’s nations vowed to keep the planet’s temperature increase to under 2 degrees Celsius, but their promises actually add up to a world that will grow 3.5 degrees hotter. The real hope was that the accord would spur private investment in renewable energy: And as the price of solar panels plummeted, in fact, China and India started to exceed their pledges.

Even that modest progress alarmed what energy expert Michael Klare calls the Big Three carbon powers: the United States, Saudi Arabia, and Russia. (Trump’s foreign policy looks more coherent, by the way, when viewed through this prism.) The United States has now pulled out of Paris, and an aide to Vladimir Putin has said the withdrawal makes it “perfectly evident” the pact is now “unworkable.”

So what’s a state like California to do? It can’t ignore climate change, which threatens its very existence. [...]

If you want to know who is serious about forging a new path on global warming, ignore all the airy proclamations about meeting the Paris targets—and instead pay attention to the cities and states making the very real and measurable pledge to go 100 percent renewable. California’s senate just passed such a commitment by a 2–1 margin. More dramatically, the day after Trump said he had been elected to serve “Pittsburgh, not Paris,” Mayor Bill Peduto announced that Pittsburgh will run entirely on clean energy by 2035.
“If you are a mayor and not preparing for the impacts of climate change,” Peduto said, “you aren’t doing your job.” All told, 27 cities in 17 states have pledged to go 100 percent renewable—a move that puts them at direct odds with federal policy. Call them “climate sanctuaries.” San Francisco, Boulder, and Burlington won’t surprise you—but Atlanta and Salt Lake City and San Diego have done the same.
Thursday, July 6, 2017
For as long as I can remember such things – back to my undergraduate years in the 1960s – humanists have been defending themselves and their work against all comers: politicians, scientists of all kinds, and disgruntled letter writers. And always the defense comes down to this: we provide a holistic and integrated view of what it is to be human in a world that is, well, just what IS the world like anyhow?
It’s a mug’s game and I refuse to play it. I was trained in the human sciences: hermeneutics AND cognitive science, history AND social science, and I’ve played jazz and rhythm and blues in seedy nightclubs, ritzy weddings, and outdoors before thousands. It’s all good. It’s all come into play as I’ve investigated the human mind through music and literature.
In this essay I look at literature. First I consider literary form as displayed in ring form texts. Then I review the historical problem posed by Shakespeare and the rise of the European novel. My general point is that we need all our conceptual resources to deal with these problems. But let’s begin with an analogy: how do we understand, say, a cathedral?
The Cathedral Problem
Cathedrals are made of stone blocks, mortar, pieces of stained glass, lead strips, metal fittings, wooden beams and boards, and so forth. You can go through a cathedral and count and label every block and locate them on a (3D) map. You can do the same for the doors and cabinets, the plumbing, heating fixtures, and wiring, and so forth. You will now, in some sense, have described the cathedral. But you won't have captured its design. That’s difficult, and those who focus on it often use vague language, not because they like vagueness, but because, at the moment, that’s all that’s available.
And so it goes with literature and newer psychologies: cognitive science, evolutionary psychology, and neuroscience. My humanist colleagues keep hearing that they should get on board with the cognitive revolution and the decade of the brain. But it all sounds like trying to explain a cathedral by counting the building blocks, measuring the pitch of the roof, and analyzing the refractive properties of pieces of colored glass.
The advice may be well meant, but it isn’t terribly useful. It takes our attention away from the problem – how the whole shebang works – and asks us to settle for a pile of things we already know. Almost.
Ring Forms in Literature
I first learned of ring form in an article published in PMLA – the oldest literary journal published in the United States – back in 1976: “Measure and Symmetry in Literature” by R. G. Peterson. The idea is a simple one, that some texts, or parts of texts, are symmetrically arranged about a center point:
A, B … X … B’, A’
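Stated mechanically, the test for this kind of symmetry is simple. Here is a minimal sketch (the section labels are hypothetical placeholders, not drawn from any particular text): a sequence of labels is ring-composed if the sections pair off symmetrically around a single central element.

```python
def is_ring(sections):
    """Check whether a sequence of section labels is symmetric
    about a single central element: A, B, ..., X, ..., B, A."""
    if len(sections) % 2 == 0:
        return False  # ring composition needs a unique center
    mid = len(sections) // 2
    left = sections[:mid]
    right = sections[mid + 1:]
    return left == right[::-1]  # right half mirrors the left

print(is_ring(["A", "B", "X", "B", "A"]))  # True
print(is_ring(["A", "B", "X", "A", "B"]))  # False
```

The hard part, of course, is not checking the pattern but deciding what counts as a section and when two sections genuinely echo one another, and no simple check captures that.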
He produced many examples, from the Iliad through Shakespeare’s Hamlet to the “Author’s Prologue” to Dylan Thomas’s Collected Poems. But my interests, like those of most literary critics, were elsewhere, and so I merely noted the article and went on about my business.
I was reminded of this work some years ago when I entered into correspondence with the late Mary Douglas, a British anthropologist who rose to academic stardom – such as it was back in ancient times – after the 1966 publication of Purity and Danger: An Analysis of Concepts of Pollution and Taboo. She spent the last decade of her career immersed in the arcana of classical and Biblical studies, publishing monographs on the Book of Leviticus and the Book of Numbers and, in 2007, Thinking in Circles: An Essay on Ring Composition, based on a series of lectures she had delivered at Yale. Among other things, she argues that such forms aren’t special to the ancient world, that they continue in modern times – she offers Sterne’s Tristram Shandy as an example.
She opens her 10th chapter by referring to Roman Jakobson, one of the pioneering linguists of the 20th century, who believed, on the basis of extensive study, that such patterns reflect “a faculty inherent in the relation among language, grammar, and brain.” But why are such patterns so very difficult to recognize if they are so natural to us?
Wednesday, July 5, 2017
Will the citizens of New York decide to amend the state’s constitution to give the state greater independence from the federal government? From the NYTimes:
Every 20 years, New Yorkers have the chance to vote whether they want to hold a constitutional convention to amend, tweak or otherwise improve the founding document of the state.
For the past half-century, voters have demurred. This year, however, academics, good-government groups and others believe the outcome of the ballot question in November may be different. And — perhaps no surprise — it has something to do with the current occupant of the White House.
“Trump’s election emphasizes how valuable it is for states to chart their own course,” said Peter J. Galie, author of “Ordered Liberty: A Constitutional History of New York” and a professor of political science at Canisius College in Buffalo. “We can put a right to clean air and water in our Constitution. If we want to add more labor protections, we can do it. That’s the beauty of federalism.”
What about New York City separating from the rest of the state?
John Bergener Jr., a retiree who lives outside Albany, would like to see the separation of New York City from the rest of the state. As chairman of Divide NYS Caucus, a political committee, he believes a constitutional convention is the best mechanism to achieve that.
Upstate areas, he said, have suffered economically from excessive business regulations and unfunded mandates. His vision — and he claims thousands of supporters — calls for two or three autonomous regions, each with its own regional governor and legislature. (The upstate region, north of the lower Hudson Valley, would be called New Amsterdam.) A statewide governor would be titular, with the same “powers as the queen of England.”