
Friday, September 30, 2016

Hunting the wild myth, cultural phylogeny

Julien d'Huy, Scientists Trace Society’s Myths to Primordial Origins, Scientific American, 29 September 2016; here's the opening:
The Greek version of a familiar myth starts with Artemis, goddess of the hunt and fierce protectress of innocent young women. Artemis demands that Callisto, “the most beautiful,” and her other handmaidens take a vow of chastity. Zeus tricks Callisto into giving up her virginity, and she gives birth to a son, Arcas. Zeus’ jealous wife, Hera, turns Callisto into a bear and banishes her to the mountains. Meanwhile Arcas grows up to become a hunter and one day happens on a bear that greets him with outstretched arms. Not recognizing his mother, he takes aim with his spear, but Zeus comes to the rescue. He transforms Callisto into the constellation Ursa Major, or “great bear,” and places Arcas nearby as Ursa Minor, the “little bear.”

As the Iroquois of the northeastern U.S. tell it, three hunters pursue a bear; the blood of the wounded animal colors the leaves of the autumnal forest. The bear then climbs a mountain and leaps into the sky. The hunters and the animal become the constellation Ursa Major. Among the Chukchi, a Siberian people, the constellation Orion is a hunter who pursues a reindeer, Cassiopeia. Among the Finno-Ugric tribes of Siberia, the pursued animal is an elk and takes the form of Ursa Major.

Although the animals and the constellations may differ, the basic structure of the story does not. These sagas all belong to a family of myths known as the Cosmic Hunt that spread far and wide in Africa, Europe, Asia and the Americas among people who lived more than 15,000 years ago. Every version of the Cosmic Hunt shares a core story line—a man or an animal pursues or kills one or more animals, and the creatures are changed into constellations.

Folklorists, anthropologists, ethnologists and linguists have long puzzled over why complex mythical stories that surface in cultures widely separated in space and time are strikingly similar. In recent years a promising scientific approach to comparative mythology has emerged in which researchers apply conceptual tools that biologists use to decipher the evolution of living species. In the hands of those who analyze myths, the method, known as phylogenetic analysis, consists of connecting successive versions of a mythical story and constructing a family tree that traces the evolution of the myth over time.
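
Just to make the borrowed biological machinery concrete, here is a toy sketch in Python of the kind of clustering that sits behind such family trees. The motif matrix is invented for illustration (it is not d'Huy's data or method), but the logic is the same: code each variant of the Cosmic Hunt for the presence or absence of motifs, compute pairwise distances, and let a clustering algorithm propose the tree.

# A toy "Cosmic Hunt" phylogeny: cluster myth variants by shared motifs.
# The 0/1 motif matrix is invented for illustration only; real analyses
# code dozens of motifs from published corpora.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.cluster.hierarchy import dendrogram, linkage

variants = ["Greek", "Iroquois", "Chukchi", "Finno-Ugric"]
# Columns: prey is a bear | pursuer is human | prey becomes Ursa Major |
#          pursuer placed in sky | blood colors autumn leaves
X = np.array([
    [1, 1, 1, 1, 0],   # Greek (Callisto and Arcas)
    [1, 1, 1, 1, 1],   # Iroquois
    [0, 1, 0, 0, 0],   # Chukchi (reindeer; Orion and Cassiopeia)
    [0, 1, 1, 0, 0],   # Finno-Ugric (elk; Ursa Major)
])

# Jaccard distance on shared motifs, then average-linkage clustering:
# the resulting tree is a rough analogue of a biological phylogeny.
tree = linkage(pdist(X, metric="jaccard"), method="average")
dn = dendrogram(tree, labels=variants, no_plot=True)
print(dn["ivl"])   # leaf order of the tree
print(tree)        # merge table: which variants join, and at what distance

Real studies use many more variants and motifs, and more careful tree methods, but the family-tree move is already visible in this little example.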

“Pardon these anti-humanistic intrusions, Madam” [Mark Rose's diagrams]

Mark Rose’s slender volume, Shakespearean Design, has been on my mind recently, as I’ve been considering doing a book on ring-composition. Rose devotes a chapter to Hamlet and all but argues that it’s a ring-composition. But that–ring-composition–is not what this is about.

This is about an incidental remark in the Preface, something of an apology (p. viii):
A critic attempting to talk concretely about Shakespearean structure has two choices. He can create an artificial language of his own, which has the advantage of precision; or he can make do with whatever words seem most useful at each stage in the argument, which has the advantage of comprehensibility. In general, I have chosen the latter course.

The little charts and diagrams may initially give a false impression. I included these  charts only reluctantly, deciding that, inelegant as they are, they provide an economical way of making certain matters clear. The numbers, usually line totals, sprinkled throughout may also give a false impression of exactness. I indicate line totals only to give a rough idea of the general proportions of a particular scene or segment.
Here are two of those offending diagrams, from the Hamlet chapter, pages 97 and 103 respectively; you can see the line counts in parentheses:

[Diagram: Rose 97]

[Diagram: Rose 113]

What’s the fuss about? And there aren’t many of them. These are simple diagrams and, yes, without them, Rose’s accounts would be more difficult to understand. Indeed, without them, a reader would be tempted to sketch their own diagrams on convenient scraps of paper.

I figure Rose’s misgivings about the numbers are about humanistic ideology: we don’t do numbers. His willies about the diagrams may be that as well, but I think there’s something more there. The diagrams themselves are problematic simply because they ARE diagrams. They intrude too deeply into the inner workings of the humanistic mind. It’s like throwing a spanner into a machine; it gums up the works.

It’s one thing to have pictures in an illustrated edition of, say, Hamlet, pictures depicting a scene in the play. That’s fine, for it’s consistent with the narrative flow. And it’s fine to have illustrations in, say, an article about the Elizabethan theatre, where you need to depict the stage layout or the relationship between the stage and the seating. Such illustrations are consistent with the ongoing flow of thought.

Those diagrams, however regrettably necessary, are different. It’s not that they’re inconsistent with the flow of thought. They’re not. They’re essential to it. But they indicate that this kind of thinking is not quite kosher. Why not? Why do these simple diagrams intrude on the humanistic mind while the more elaborate images of the kind mentioned in the previous paragraph are fine?

I think it’s because the humanistic mind has somehow become fundamentally discursive–see my post, Humanistic Thought as Prose-Centric Thought. Illustrations are not problematic, because they depict whatever it is you’re thinking about. Rose’s diagrams, simple as they are, are different. Yes, they depict structure in Shakespeare’s plays, but it’s structure that is not apprehended visually. What you see on the page is one word after another, page after page. You never see scenes juxtaposed. What you see on the stage is talk after talk, action after action, but you don’t see a tableau of three or five scenes arrayed as in a multi-panel altarpiece. No, Rose’s diagrams aren’t images you think about, they’re images you are to think with, and that’s the problem. That’s what makes them intrusive.

Just why this is so is not at all clear to me. But I’d bet it has to do with the brain. The cerebral cortex consists of some hundreds of loosely distinct functional areas, each specialized for a different mental operation. I’d warrant that thinking with a diagram calls on a configuration of these functional areas that’s not normally part of the configuration used in humanistic thought. This is a matter of behavioral mode, as I’ve discussed it here and there. And that’s beyond the scope of this little note.

Thursday, September 29, 2016

Describing literary form, a quick note

It seems there’s growing interest in (mere) description among literary critics. I, of course, am interested in description as well. Have been for a long time.

Particularly the description of form.

What I’m wondering is if that’s ALL there is to describe.

Can you describe meaning? You can translate it in various ways, summary, paraphrase, and interpretation chief among them. But I don’t see how you can actually DESCRIBE meaning.

Oh, you can describe semantic structure. But that’s a rather different thing. To do that you need a model in which to represent (hypothesized) semantic structure.

But if literary form is all you can describe, does that mean that the whole text is, in some way, form?

Has the nation-state become obsolete?

Nation states cause some of our biggest problems, from civil war to climate inaction. Science suggests there are better ways to run a planet
Try, for a moment, to envisage a world without countries. Imagine a map not divided into neat, coloured patches, each with clear borders, governments, laws. Try to describe anything our society does – trade, travel, science, sport, maintaining peace and security – without mentioning countries. Try to describe yourself: you have a right to at least one nationality, and the right to change it, but not the right to have none.

Those coloured patches on the map may be democracies, dictatorships or too chaotic to be either, but virtually all claim to be one thing: a nation state, the sovereign territory of a “people” or nation who are entitled to self-determination within a self-governing state. So says the United Nations, which now numbers 193 of them.

And more and more peoples want their own state, from Scots voting for independence to jihadis declaring a new state in the Middle East. Many of the big news stories of the day, from conflicts in Gaza and Ukraine to rows over immigration and membership of the European Union, are linked to nation states in some way.

Even as our economies globalise, nation states remain the planet’s premier political institution. Large votes for nationalist parties in this year’s EU elections prove nationalism remains alive – even as the EU tries to transcend it.

Yet there is a growing feeling among economists, political scientists and even national governments that the nation state is not necessarily the best scale on which to run our affairs. We must manage vital matters like food supply and climate on a global scale, yet national agendas repeatedly trump the global good. At a smaller scale, city and regional administrations often seem to serve people better than national governments.

How, then, should we organise ourselves? Is the nation state a natural, inevitable institution? Or is it a dangerous anachronism in a globalised world?

These are not normally scientific questions – but that is changing. Complexity theorists, social scientists and historians are addressing them using new techniques, and the answers are not always what you might expect. Far from timeless, the nation state is a recent phenomenon. And as complexity keeps rising, it is already mutating into novel political structures. Get set for neo-medievalism.

Election Special: The Blues House

Since it's a Presidential election year, it's time to bring this out again: Dizzy Gillespie's stump speech from his 1964 Presidential run. I wonder what he would have thought about the outgoing President, Barack Hussein Obama?

You wanna make government a barrel of fun? Vote Dizzy! Vote Dizzy! Your Politics Oughtta Be A Groovier Thing? Vote Dizzy! Vote Dizzy! ... Nobody knows the trouble I've seen, nobody knows the sorrows.
* * * * *

Which is not at all the same as the House of Blues. No, the Blues House is what the White House would have been if John Birks Gillespie had been elected President back in 1964, when he ran for the office. John Birks Gillespie, of course, was better known as Dizzy. He was from Cheraw, South Carolina, and was one of the finest trumpeters and most important jazz musicians of the 20th Century.

His Presidential run was at one and the same time not entirely serious and completely and utterly serious. A certain amount of irony was involved, which is perhaps why the lyrics to the theme song were set to “Salt Peanuts” - a tune Diz would one day perform in the White House with President Jimmy Carter.

He developed a standard stump speech which eventually made its way into his autobiography, To Be or Not to Bop (Doubleday, 1979, pp. 457-458). It's full of jazz references that will be obscure to those who don't know the music, and various contemporary references are likely to be lost as well. Though I never heard Gillespie give this speech, I've heard him speak on several musical occasions and his comic timing was superb. That is utterly lost in this transcription, though those familiar with his vocal patterns can - in some small measure - supply them as they read his words. Here they are.


When I am elected President of the United States, my first executive order will be to change the name of the White House! To the Blues House.

Income tax must be abolished, and we plan to legalize 'numbers' - you know, the same way they brought jazz into the concert halls and made it respectable. We refuse to be influenced by the warnings of one NAACP official who claims that making this particular aspect of big business legal would upset the nation's economy disastrously.

One of the ways we can cut down governmental expenditures is to disband the FBI and have the Senate Internal Security Committee investigate everything under white sheets for un-American activities. Understand, we won't take no 'sheet' off anybody!

All U.S. Attorneys and judges in the South will be our people so we can get some redress. 'One Man-One Vote' - that's our motto. We might even disenfranchise women and let them run the country. They'll do it anyhow.

The Army and Navy will be combined so no promoter can take too big a cut off the top of the 'double-gig' setup they have now.

The National Labor Relations Board will rule that people applying for jobs have to wear sheets over their heads so bosses won't know what they are until after they've been hired. The sheets, of course, will all be colored!

We're going to recall every U.S. ambassador except Chester Bowles and give the assignments to jazz musicians because they really 'know where it is.'

Wednesday, September 28, 2016

Sharing Experience: Computation, Form, and Meaning in the Work of Literature

I’ve uploaded another document: Sharing Experience: Computation, Form, and Meaning in the Work of Literature. You can download it from Academia.edu:


It’s considerably revised from a text I’d uploaded a month ago: Form, Event, and Text in an Age of Computation. You might also look at my post, Obama’s Affective Trajectory in His Eulogy for Clementa Pinckney, which could have been included in the article, but I’m up against a maximum word count as I am submitting the article for publication. You might also look at the post, Words, Binding, and Conversation as Computation, which figured heavily in my rethinking.

Here’s the abstract of the new article, followed by the TOC and the introduction:

Abstract

It is by virtue of its form that a literary work constrains meaning so that it can be a vehicle for sharing experience. Form is thus an intermediary in Latour’s sense, while meaning is a mediator. Using fragments of a cognitive network model for Shakespeare’s Sonnet 129 we can distinguish between (1) the mind/brain cognitive system, (2) the text considered merely as a string of signifiers, and (3) the path one computes through (1) under constraints imposed by (2). As a text, Obama’s Eulogy for Clementa Pinckney is a ring-composition; as a performance, the central section is clearly marked by audience response. Recent work on synchronization of movement and neural activity across communicating individuals affords insight into the physical substrate of intersubjectivity. The ring-form description is juxtaposed to the performative meaning identified by Glenn Loury and John McWhorter.

Contents

Introduction: Speculative Engineering 2
Form: Macpherson & Attridge to Latour 3
Computational Semantics: Network and Text 6
Obama’s Pinckney Eulogy as Text 10
Obama’s Pinckney Eulogy as Performance 13
Meaning, History, and Attachment 18
Coda: Form and Sharability in the Private Text 20

Introduction: Speculative Engineering

The conjunction of computation and literature is not so strange as it once was, not in this era of digital humanities. But my sense of the conjunction differs from that of computational critics. They regard computation as a reservoir of tools to be employed in investigating texts, typically a large corpus of texts. That is fine [1].

Digital critics, however, have little interest in computation as a process one enacts while reading a text, the sense that interests me. As the psychologist Ulric Neisser pointed out four decades ago, it was computation that drove the so-called cognitive revolution [2]. Much of the work in cognitive science is conducted in a vocabulary derived from computing and, in many cases, involves computer simulations. Prior to the computer metaphor we populated the mind with sensations, perceptions, concepts, ideas, feelings, drives, desires, signs, Freudian hydraulics, and so forth, but we had no explicit accounts of how these things worked, of how perceptions gave way to concepts, or how desire led to action. The computer metaphor gave us conceptual tools for constructing models with differentiated components and processes meshing like, well, clockwork. Moreover, so far as I know, computation of one kind or another provides the only working models we have for language processes.

My purpose in this essay is to recover the concept of computation for thinking about literary processes. For this purpose it is unnecessary either to believe or to deny that the brain (with its mind) is a digital computer. There is an obvious sense in which it is not a digital computer: brains are parts of living organisms; digital computers are not. Beyond that, the issue is a philosophical quagmire. I propose only that the idea of computation is a useful heuristic: it helps us think about and systematically describe literary form in ways we haven’t done before.
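
To show what I mean by that heuristic, here is a toy sketch, invented for this post and far simpler than the Sonnet 129 network discussed below: a tiny semantic network stands in for the cognitive system, a string of tokens stands in for the text, and reading is the path the string obliges you to trace through the network.

# Toy illustration of "text as constraint on computation": the network is a
# stand-in for the cognitive system, the token string is the text, and reading
# is the path traced through the network under the text's constraints.
# All nodes and links are invented for illustration.
semantic_net = {
    "lust":   {"action", "shame", "desire"},
    "action": {"lust", "regret"},
    "shame":  {"regret", "lust"},
    "desire": {"action", "lust"},
    "regret": {"shame"},
}

def trace_path(network, tokens):
    """Follow the token string through the network, one signifier at a time."""
    path = [tokens[0]]
    for tok in tokens[1:]:
        if tok in network.get(path[-1], set()):
            path.append(tok)          # the next word is reachable from here
        else:
            path.append("?" + tok)    # constraint violated: no direct link
    return path

print(trace_path(semantic_net, ["lust", "action", "regret", "shame", "lust"]))
# ['lust', 'action', 'regret', 'shame', 'lust']

The point of the toy is simply that the same network supports many paths; it is the text, the mere string of signifiers, that constrains which path gets computed.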

Though it might appear that I advocate a scientific approach to literary criticism, that is misleading. Speculative engineering is a better characterization. Engineering is about design and construction, perhaps even Latourian composition [3]. Think of it as reverse-engineering: we’ve got the finished result (a performance, a script) and we examine it to determine how it was made [4]. It is speculative because it must be; our ignorance is too great. The speculative engineer builds a bridge from here to there and only then can we find out if the bridge is able to support sustained investigation.

Caveat emptor: This bridge is of complex construction. I start with form, move to computation, with Shakespeare’s Sonnet 129 as my example, and then to President Obama’s Eulogy for Clementa Pinckney. After describing its structure (ring-composition) I consider the performance situation in which Obama delivered it, arguing that those present constituted a single physical system for sharing experience. I conclude by discussing meaning, history, and attachment.

References

[1] William Benzon, “The Only Game in Town: Digital Criticism Comes of Age,” 3 Quarks Daily, May 5, 2014, http://www.3quarksdaily.com/3quarksdaily/2014/05/the-only-game-in-town-digital-criticism-comes-of-age.html

[2] Ulric Neisser, Cognition and Reality: Principles and Implications of Cognitive Psychology (San Francisco: W. H. Freeman, 1976), 5-6.

[3] Bruno Latour, “An Attempt at a ‘Compositionist Manifesto’,” New Literary History 41 (2010), 471-490.

[4] For example, see Steven Pinker, How the Mind Works (New York: W. W. Norton & Company, 1997), 21 ff.

Tuesday, September 27, 2016

My Early Jazz Education 6: Dave Dysert

I started taking trumpet lessons in the fourth grade. These were group lessons, taught at school. As I recall, I was grouped with two clarinetists; I even think they were my good friends, Jackie Barto and Billy Cover, but I’m not sure of that. Why with two clarinetists? Because the clarinet, like the trumpet, is a B-flat instrument. At some point, after weeks, more likely months, I began to get behind. Don’t know why; didn’t practice, most likely.

And then my parents decided I should take private lessons, likely prompted by the teacher at school. So some guy came to the house for my lessons. Don’t remember a thing about him except that he was blind. Nor do I recall how long I took lessons with him, but by the time I was in the sixth grade, I believe, I was taking lessons with Dave Dysert.

His principal instrument was the piano, but he was trained in lots of instruments, as was the norm. He gave lessons out of a studio in his basement, Saturday mornings I believe. But it doesn’t matter much.

The lessons lasted a half hour and followed the same format. At the beginning of the lesson I’d play through the material I’d been practicing for the last week. This was usually a page of exercises of one kind or another and some little tune–to make things interesting. Mr. Dysert would comment as appropriate and then he’d select the material I was to practice the following week and I’d play through it. He’d make helpful comments as I hacked my way through the material.

I was supposed to practice half-an-hour a day. And I did so, but reluctantly, very reluctantly. I forget just how my parents got me to do this, but they did. And I did, sorta.

Then one day when I was 13, I believe, Mr. Dysert couldn’t take it anymore and read me the riot act. I was stunned. I’ve long since forgotten just what he said, but not his anger. I was wasting my time and his, he told me. I had talent, more than most of his students. When I read through the material for the next week’s lesson, I played it better than most students did after they’d been practicing for a week. That surprised me–must’ve been a whole lotta’ hacking going on in the studio is all I can say.

Obama’s Affective Trajectory in His Eulogy for Clementa Pinckney

This occurred to me while I was completing the draft of “Sharing Experience: Computation, Form, and Meaning in the Work of Literature,” originally entitled “Form, Event, and Text: Literary Study in an Age of Computation.” These are crude initial thoughts. I don’t know whether I believe them. You can download a PDF of this post at Academia.edu: https://www.academia.edu/28726228/Obamas_Affective_Trajectory_in_His_Eulogy_for_Clementa_Pinckney

Introduction: The Mechanism of Ring-Composition

I have argued that President Obama’s Eulogy for Clementa Pinckney exhibits ring composition, as follows [1]:
(1) Prologue (paragraphs 1-5)
(2) Pinckney & Church (¶6-16)
(3) Nation (¶17-20)
(X) Violation and Grace (¶21-27)
(3’) Nation (¶28-39)
(2’) Pinckney & Families (¶40-44)
(1’) Closing (¶45-48)
Such structures have a central section that is flanked by a symmetrical arrangement of units such that the first one is echoed/complemented/completed by the last, the second by the penultimate, and so forth. No one doubts the existence of such structures in small scale texts (sentences, paragraphs or stanzas) where the arrangement is typically known as chiasmus.
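
The pattern can be stated mechanically. Here is a minimal sketch that checks whether an ordered list of section labels has that ring shape, pairing the first with the last, the second with the penultimate, and so on around a central pivot. It is purely formal; it says nothing about whether the echoes between paired sections are real.

# Minimal check for ring form: sections pair off symmetrically around a center.
# The labels follow the eulogy outline above.
def is_ring(sections):
    """True if section i is mirrored by its counterpart from the end."""
    n = len(sections)
    if n % 2 == 0:                      # a ring needs a single central section
        return False
    return all(sections[-(i + 1)] == sections[i] + "'" for i in range(n // 2))

eulogy = ["1", "2", "3", "X", "3'", "2'", "1'"]
print(is_ring(eulogy))                  # True
print(is_ring(["1", "2", "X", "1'"]))   # False: even length, no central section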

Large-scale deployment, in this case a text of 3000 words, is more problematic. Is the structure real or is it the product of the critic’s imagination? If it is real, is it the product of conscious deliberation? If not, how could such an arrangement have just happened? Any sort of arrangement is possible if the writer consciously conceives of it, but we have little or no evidence of conscious deliberation for these texts. In the case of the Pinckney eulogy, so far as I know, Obama has said nothing about conscious deliberation [2]. So, if he didn’t consciously and deliberately create this design, where did it come from?

These notes are directed at that question.

Grace and Love

Let’s take a look at the central section, where grace enters the eulogy. Here are five of the seven paragraphs:
23.) He didn't know he was being used by God. (Applause.) Blinded by hatred, the alleged killer could not see the grace surrounding Reverend Pinckney and that Bible study group – the light of love that shone as they opened the church doors and invited a stranger to join in their prayer circle. The alleged killer could have never anticipated the way the families of the fallen would respond when they saw him in court – in the midst of unspeakable grief, with words of forgiveness. He couldn't imagine that. (Applause.)

24.) The alleged killer could not imagine how the city of Charleston, under the good and wise leadership of Mayor Riley – (applause) – how the state of South Carolina, how the United States of America would respond – not merely with revulsion at his evil act, but with big-hearted generosity and, more importantly, with a thoughtful introspection and self-examination that we so rarely see in public life.

25.) Blinded by hatred, he failed to comprehend what Reverend Pinckney so well understood – the power of God's grace. (Applause.)

26.) This whole week, I've been reflecting on this idea of grace. (Applause.) The grace of the families who lost loved ones. The grace that Reverend Pinckney would preach about in his sermons. The grace described in one of my favorite hymnals – the one we all know: Amazing grace, how sweet the sound that saved a wretch like me. (Applause.) I once was lost, but now I'm found; was blind but now I see. (Applause.)

27.) According to the Christian tradition, grace is not earned. Grace is not merited. It's not something we deserve. Rather, grace is the free and benevolent favor of God – (applause) – as manifested in the salvation of sinners and the bestowal of blessings. Grace.
In particular, note Obama’s references to the forgiveness the families of the slain showed to the unnamed killer (Dylann Roof) and his assertion, in paragraph 27, that “grace is the free and benevolent favor of God.”

That strikes me as being like a mother’s love for a child, and that love is mediated by the attachment system, as analyzed by John Bowlby and his students [3]. In an essay on Coleridge’s “This Lime-Tree Bower My Prison” I show how Coleridge activated the attachment system in tracing the relations between himself, his friends, and the natural world [4]. Obama, I believe, is doing the same thing in this eulogy.

Now consider these three paragraphs, the last before the eulogy’s final phase, where Obama sings “Amazing Grace” and rings the names of those who’d been slain:
42.) Reverend Pinckney once said, "Across the South, we have a deep appreciation of history – we haven't always had a deep appreciation of each other's history." (Applause.) What is true in the South is true for America. Clem understood that justice grows out of recognition of ourselves in each other. [...] He knew that the path of grace involves an open mind – but, more importantly, an open heart.

43.) That's what I've felt this week – an open heart. That, more than any particular policy or analysis, is what's called upon right now, I think – what a friend of mine, the writer Marilyn Robinson, calls “that reservoir of goodness, beyond, and of another kind, that we are able to do each other in the ordinary cause of things.”

44.) That reservoir of goodness. If we can find that grace, anything is possible. (Applause.) If we can tap that grace, everything can change. (Applause.)
Now he’s telling us what he feels, that he has “an open heart.” He is no longer talking about what happened a few days ago, nor about God’s relation to humans, but about himself, here and now, and about what we must all do, now and in the future.

Conceptual Metaphor Wiki Online

Available HERE as of August 2016.

* * * * *

MetaNet Metaphor Wiki

Overview

The ongoing objective of the MetaNet Project is to systematize metaphor analysis in a computational way. As part of this work, MetaNet has developed formal representations of metaphors as mappings from one domain (the Source domain) to another (the Target domain). Both Source and Target domains are represented as frames, which are schematic representations of different kinds of experiences, objects, and events. Using this formalization, MetaNet has built a large repository of networks of interrelated metaphors, as well as networks of semantic frames that act as source and target domains of metaphors.

Contents

This MetaNet wiki contains a searchable, publicly-accessible subset of the full MetaNet metaphor and frame repository; this beta-version will be revised and expanded on an ongoing basis.
  • The list of currently available metaphors can be found here: Metaphors
  • The list of currently available frames can be found here: Frames
  • Definitions of commonly used terms, and descriptions of the different fields found on the metaphor and frame pages can be found here: Glossary
By selecting an individual frame or metaphor from this list, you can view its internal structure, along with its specified relations to other frames and metaphors within the network. Because this is only a subset of the full repository, some of the links on these pages (links in red) are not live links.

While none of the pages are editable, viewers are invited to leave comments using the 'Discussion' tab for a given page.

Further information

The MetaNet Metaphor Wiki is currently housed at the International Computer Science Institute in Berkeley, California.

Further information about the MetaNet project, as well as links to selected publications can be accessed via the MetaNet webpage: https://metanet.icsi.berkeley.edu/metanet/
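
The formalization described in the Overview above is easy to mock up. Here is a minimal sketch (mine, not MetaNet's schema or code) of a metaphor as a mapping between two frames, each frame a bundle of roles. The frame and role names are the stock LIFE IS A JOURNEY example, not entries copied from the wiki.

# Sketch of metaphor as frame-to-frame mapping: a Source frame, a Target frame,
# and role-to-role correspondences. Names are illustrative, not MetaNet entries.
from dataclasses import dataclass, field

@dataclass
class Frame:
    name: str
    roles: set = field(default_factory=set)

@dataclass
class Metaphor:
    name: str
    source: Frame
    target: Frame
    mappings: dict          # source role -> target role

    def check(self):
        """Every mapped role must actually belong to its frame."""
        return (set(self.mappings) <= self.source.roles and
                set(self.mappings.values()) <= self.target.roles)

journey = Frame("Journey", {"traveler", "destination", "obstacle"})
life = Frame("Life", {"person", "purpose", "difficulty"})

life_is_a_journey = Metaphor(
    name="LIFE IS A JOURNEY",
    source=journey,
    target=life,
    mappings={"traveler": "person", "destination": "purpose", "obstacle": "difficulty"},
)
print(life_is_a_journey.check())   # True: the mapping respects both frames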

The end of cinema, NOT!

David Bordwell has a column about that film journalism favorite, the article proclaiming the end of cinema. He observes, quite rightly, that it's become a cliché unrelated to reality. The whole article is worth reading, but check this out:
In talking about “our” cinema, I’ve been too glib, though this angle fits with an assumption of the death-knoll critics (“Movies as We Know Them”). Of course, Jacobs, Raftery, and Burr all acknowledge that Hollywood isn’t making movies just for us; it’s a world industry. People elsewhere (many recently arrived in the local equivalent of the middle class) seem keen to participate in American popular culture, with fashion, music, TV, and websites. Hollywood entertainment, lame as it often is, is part of being cosmopolitan.

Still, maybe it’s time to admit that we don’t own Hollywood. Maybe we never did, but it seems clear that with globalization “our” popular cinema is becoming something else–not exactly “theirs,” but not wholly ours either. Now You See Me 2 may have attracted only mild interest here: little cultural chitchat, except maybe among magicians, and $65 million box office (less than Lt. Robin Crusoe, USN). But it garnered $266 million internationally. Nearly a hundred million of that came from China, perhaps partly owing to long stretches set in Macau and short stretches featuring Jay Chou Kit-lun. And the director was Asian-American Jon M. Chu. ...

This won’t stop. One of the most astonishing and puzzling facts of contemporary cinema gets almost no press, maybe because it contravenes the death-of-film narrative. Over the last ten years, there has been a huge rise in the number of feature films.

In 2001, the world produced about 3800 features annually. The number passed 4000 in 2002, passed 5000 in 2007, and passed 6000 in 2011. In 2014, IHS estimates, over 7300 feature films were made in the world. There are now fifteen countries that produce over 100 features a year. As a result, only 18% of the world’s features come from North America. The boom took place despite the rise of home video, cable, satellite, DVD, Blu-ray, VOD, and streaming. And it happened despite the fact that American blockbusters rule nearly every national market. This may be a bubble, or it may be genuine growth. In any case, we ought to investigate the reasons that a great many people around the world stubbornly persist in making two-hour films. They don’t appear to care if we sense a summer slump.

Two Current Flicks: Kubo and the Two Strings, The Magnificent Seven

On Thursday I saw Kubo and the Two Strings. I saw The Magnificent Seven yesterday (Friday). They are very different films, obviously. Seven is live-action and a remake of a remake. Kubo is stop-motion animation and utterly original. I liked them both, though the Tomatometer (@ Rotten Tomatoes, natch) puts Kubo at 97% and Seven at 64%, which is fair.



The Magnificent Seven looks good; Denzel is fine in the lead. It’s your standard action flick, with guts, grit, special effects bursting all over the damn place and gorgeous wide shots of the West. But it’s no Mad Max: Fury Road. And it’s probably not as good as the Yul Brynner The Magnificent Seven (1960), its immediate model, or the 1954 Seven Samurai by Akira Kurosawa. I’ve seen both, but that was so long ago that I can’t claim to make a live comparison with the current film.

I was particularly paying attention to the music. Like many, I’m sure, I was wondering if there would be anything on the soundtrack as catchy as the soaring Elmer Bernstein theme that we all know so well, with that driving rhythmic riff behind a theme that doesn’t soar so much as it sweeps the horizon. There isn’t, but there is a theme that is obviously modeled after Bernstein’s–and they surely know that we recognize that–and, interestingly, Bernstein’s rhythm riff actually appears in this film, several times. And then, at the very end, as they roll the credits, we get Bernstein’s theme. Clever.



Kubo and the Two Strings is the best-looking film I’ve seen since, well, Mad Max: Fury Road. Yes, better looking than Pixar, which looks plastic and fruity by comparison. LAIKA has combined stop-motion animation with now-traditional 3D CGI to achieve a look that is haunting, glowing, and restful (if not serene) as the occasion requires.

The story is an adventure quest. Young Kubo lives with his (declining) mother in a cliff-top cave near a village. During the day he goes to the village square where he tells stories while strumming his shamisen as paper forms itself into origami figures that act out a story. Well, perhaps it’s THE story, one he tells over and over. It’s the story of his father, who died so long ago that Kubo has little memory of him.

And then, one day, he stays in town after dark–something his mother told him never to do. And so his quest begins. He is under attack and the only way he can save himself is by finding the armor he tells about in his story. The story is perhaps a bit complex, and the metaphysical shape of this world is a bit obscure (the evil is there, but it’s not clear why), but in the end it’s a riff out of the old ouroboros, a snake swallowing its tail.

Here’s how they did it:


Friday, September 23, 2016

Chicago's Millennium Park the Summer It Opened (2004)

I bought my first camera, a Canon PowerShot A75, so I could take these photos.

[Photos: IMG_0238, IMG_0232, IMG_0375, IMG_0368, IMG_0301]

In psychology, the times they are a changin'

Andrew Gelman has a long and very interesting post on the replication crisis in psychology, including a chronology of the major events that goes back to the 1960s (the passage is full of hyperlinks in the original):
1960s-1970s: Paul Meehl argues that the standard paradigm of experimental psychology doesn’t work, that “a zealous and clever investigator can slowly wend his way through a tenuous nomological network, performing a long series of related experiments which appear to the uncritical reader as a fine example of ‘an integrated research program,’ without ever once refuting or corroborating so much as a single strand of the network.”

Psychologists all knew who Paul Meehl was, but they pretty much ignored his warnings. For example, Robert Rosenthal wrote an influential paper on the “file drawer problem” but if anything this distracts from the larger problems of the find-statistical-significance-any-way-you-can-and-declare-victory paradigm.

1960s: Jacob Cohen studies statistical power, spreading the idea that design and data collection are central to good research in psychology, and culminating in his book, Statistical Power Analysis for the Behavioral Sciences. The research community incorporates Cohen’s methods and terminology into its practice but sidesteps the most important issue by drastically overestimating real-world effect sizes....
2011: Various episodes of scientific misconduct hit the news. Diederik Stapel is kicked out of the psychology department at Tilburg University and Marc Hauser leaves the psychology department at Harvard. These and other episodes bring attention to the Retraction Watch blog. I see a connection between scientific fraud, sloppiness, and plain old incompetence: in all cases I see researchers who are true believers in their hypotheses, which in turn are vague enough to support any evidence thrown at them. Recall Clarke’s Law.

2012: Gregory Francis publishes “Too good to be true,” leading off a series of papers arguing that repeated statistically significant results (that is, standard practice in published psychology papers) can be a sign of selection bias. PubPeer starts up.

2013: Katherine Button, John Ioannidis, Claire Mokrysz, Brian Nosek, Jonathan Flint, Emma Robinson, and Marcus Munafo publish the article, “Power failure: Why small sample size undermines the reliability of neuroscience,” which closes the loop from Cohen’s power analysis to Meehl’s more general despair, with the connection being selection and overestimates of effect sizes....

Also, the replication movement gains steam and a series of high-profile failed replications come out. First there’s the entirely unsurprising lack of replication of Bem’s ESP work—Bem himself wrote a paper claiming successful replication, but his meta-analysis included various studies that were not replications at all—and then came the unsuccessful replications of embodied cognition, ego depletion, and various other respected findings from social psychology.

2015: Many different concerns with research quality and the scientific publication process converge in the “power pose” research of Dana Carney, Amy Cuddy, and Andy Yap, which received adoring media coverage but which suffered from the now-familiar problems of massive uncontrolled researcher degrees of freedom (see this discussion by Uri Simonsohn), and which failed to reappear in a replication attempt by Eva Ranehill, Anna Dreber, Magnus Johannesson, Susanne Leiberg, Sunhae Sul, and Roberto Weber....

2016: Brian Nosek and others organize a large collaborative replication project. Lots of prominent studies don’t replicate. The replication project gets lots of attention among scientists and in the news, moving psychology, and maybe scientific research, down a notch when it comes to public trust. There are some rearguard attempts to pooh-pooh the failed replication but they are not convincing.

Late 2016: We have now reached the “emperor has no clothes” phase. When seemingly solid findings in social psychology turn out not to replicate, we’re no longer surprised.
Gelman notes, however, that though the problems had been noticed long ago, it wasn't until quite recently that the discipline started to see a crisis in its foundations. The rest of the post looks at two things: 1) the way the epistemological and methodological foundations of psychology have changed, and 2) how the rise of social media has taken the discussion outside the formal literature, which is to some extent controlled by the old guard.

On the first:
A paradigm that should’ve been dead back in the 1960s when Meehl was writing on all this, but which in the wake of Simonsohn, Button et al., Nosek et al., is certainly dead today. It’s the paradigm of the open-ended theory, of publication in top journals and promotion in the popular and business press, based on “p less than .05” results obtained using abundant researcher degrees of freedom. It’s the paradigm of the theory that in the words of sociologist Jeremy Freese, is “more vampirical than empirical—unable to be killed by mere data.” It’s the paradigm followed by Roy Baumeister and John Bargh, two prominent social psychologists who were on the wrong end of some replication failures and just can’t handle it.

I’m not saying that none of Fiske’s work would replicate or that most of it won’t replicate or even that a third of it won’t replicate. I have no idea; I’ve done no survey. I’m saying that the approach to research demonstrated by Fiske in her response to criticism of that work of hers is a style that, ten years ago, was standard in psychology but is not so much anymore. So again, her discomfort with the modern world is understandable.
On the second:
Fiske is annoyed with social media, and I can understand that. She’s sitting at the top of traditional media. She can publish an article in the APS Observer and get all this discussion without having to go through peer review; she has the power to approve articles for the prestigious Proceedings of the National Academy of Sciences; work by herself and her colleagues is featured in national newspapers, TV, radio, and even Ted talks, or so I’ve heard. Top-down media are Susan Fiske’s friend. Social media, though, she has no control over. That must be frustrating, and as a successful practitioner of traditional media myself (yes, I too have published in scholarly journals), I too can get annoyed when newcomers circumvent the traditional channels of publication. People such as Fiske and myself spend our professional lives building up a small fortune of coin in the form of publications and citations, and it’s painful to see that devalued, or to think that there’s another sort of scrip in circulation that can buy things that our old-school money cannot.

But let’s forget about careers for a moment and instead talk science.

When it comes to pointing out errors in published work, social media have been necessary. There just has been no reasonable alternative. Yes, it’s sometimes possible to publish peer-reviewed letters in journals criticizing published work, but it can be a huge amount of effort. Journals and authors often apply massive resistance to bury criticisms.
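
The power-and-selection point that runs from Meehl and Cohen to Button et al. is easy to see in a small simulation (mine, not Gelman's): when the true effect is small and the samples are underpowered, the minority of studies that happen to clear p < .05 report effect sizes far larger than the truth.

# Low power plus selection on significance inflates published effect sizes.
# Illustrative numbers only; not Gelman's code.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
true_effect, n_per_group, n_studies = 0.2, 20, 5000

published = []
for _ in range(n_studies):
    treated = rng.normal(true_effect, 1.0, n_per_group)
    control = rng.normal(0.0, 1.0, n_per_group)
    t, p = stats.ttest_ind(treated, control)
    if p < 0.05:                              # only "significant" studies count
        published.append(treated.mean() - control.mean())

print("true effect:            ", true_effect)
print("share reaching p < .05: ", round(len(published) / n_studies, 2))
print("mean published effect:  ", round(float(np.mean(published)), 2))  # much larger than 0.2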
H/t Alex Tabarrok.

Thursday, September 22, 2016

23 Big Macs for 2016: More special sauce for the elite few

Back in 2013 I did a series of articles on the MacArthur Fellows Program (collected as The Genius Chronicles) arguing that the Academy of Big Mac (aka the MacArthur Foundation) was copping out by giving the majority of its awards to people who don’t really need them because they had secure employment at prestigious institutions. If they wanted to be true to their original mandate, to seek out those not normally graced by the award fairies, they should avoid those institutions entirely. But they don’t listen. How could they? They’ve got to feed the vanity of the elite institutions on which they depend for advice, personnel, and approval.

Back in 1992 Anne Matthews wrote a full-dress review of the program for The New York Times Magazine, “The MacArthur Truffle Hunt,” in which she observed: “Officials at other foundations note the MacArthur fellows program has never really decided if its job is to reward creativity or to stimulate it, if it wants to be an American Nobel Prize or a fairy godmother to talents unappreciated by mainstream society.” Their solution seems to have been to aim for the Nobels while appearing to be a fairy godmother. So they favor those firmly entrenched in elite institutions, spawning ground for Nobels, but who have not yet reached the highest levels in those institutions, though some of them are pretty high indeed, with named professorships.

They’ve just announced their class of 2016 and they’re following true to form: 23 awards, of which 13 go to people who have lifetime employment at good universities. That’s 57%. They may well be fine and innovative people, probably are, but why not give those awards to people who work temp gigs, fast-food or low to mid-level office gigs, any kind of make-do gig, to support their creative efforts in the evenings and on weekends? Why not? Because it’s too hard to find them, requires too much imagination and a taste for risk, that’s why.

Waffle Tallies

Here’s the Big Mac “waffle” tally (awards to people with secure gigs) for the last four years:

2013: 63%
2014: 52%
2015: 54%
2016: 57%



MacArthur Fellowships: Let the Geniuses Free – This is the original post in the series and tallies the winners for 2013.

Wednesday, September 21, 2016

Emotional arousal of a drama increases social bonding

Royal Society Open Science

Emotional arousal when watching drama increases pain threshold and social bonding

R. I. M. Dunbar, Ben Teasdale, Jackie Thompson, Felix Budelmann, Sophie Duncan, Evert van Emde Boas, Laurie Maguire
Published 21 September 2016. DOI: 10.1098/rsos.160288

Abstract

Fiction, whether in the form of storytelling or plays, has a particular attraction for us: we repeatedly return to it and are willing to invest money and time in doing so. Why this is so is an evolutionary enigma that has been surprisingly underexplored. We hypothesize that emotionally arousing drama, in particular, triggers the same neurobiological mechanism (the endorphin system, reflected in increased pain thresholds) that underpins anthropoid primate and human social bonding. We show that, compared to subjects who watch an emotionally neutral film, subjects who watch an emotionally arousing film have increased pain thresholds and an increased sense of group bonding.

Introduction

Fiction, in the form of both storytelling and drama, is an important feature of human society, common to all cultures. Though widely studied in the humanities, the reasons why we become so engrossed in fiction, and the likely functions for this, have attracted very little attention from either psychologists or behavioural biologists. Yet, it is evident that people are willing to spend a great deal of time, and often money, to be entertained in this way, whether casually in social contexts or formally in the theatre or cinema, often incurring significant costs when doing so. Storytelling forms a major component of evening conversations around the campfire in hunter–gatherer societies [1]. One important function is that it enables us to pass on, in the form of origin stories or a corpus of commonly held folktales and folk knowledge, the cultural ideologies that create a sense of community. Shared knowledge forms part of the mechanism that binds friends [2–5] as well as communities [6,7].

As important as these cognitive aspects of storytelling may be for community bonding, they do not explain why we are willing to return again and again to be entertained by storytellers and dramatists. One plausible explanation for our enjoyment of comedy might be that comedy makes us laugh, and laughter activates the endorphin system [8–11], thereby providing a sense of reward and pleasure. Endorphins act as analgesics and increase tolerance of pain [12], being responsible for well-known phenomena like the ‘runner's high’ [13]. As a result, comedy that makes us laugh out loud results in an increase in pain threshold [8–10]. But why should we be just as engaged by emotionally stirring plots that ‘reduce us to tears’ (i.e. tragedies)? One possibility is that the emotional arousal triggered by such stories also activates the endorphin system, because the same areas of the brain that support or respond to physical pain are also involved in psychological pain [14–17]. There is now an extensive literature suggesting that social rejection or viewing emotionally valenced pictures, and even just musically induced mood change, elevate pain thresholds, thereby seeming to allow subjects to attenuate their responses to negative emotional experiences [18–22]. There is even some suggestion that watching a dramatic film increases pain threshold, albeit with small samples and somewhat mixed results [23,24].

While the cognitive component of social bonding is important in maintaining relationships through time in humans, primate social relationships and the bonding of social groups in humans, it is also underpinned by a psychopharmacological mechanism in what is effectively a dual mechanism process [25]. Endorphins, while part of the brain's pain management system [12,26–29], also play a central role in social bonding in anthropoid primates [30–33]. This latter effect is mediated through the afferent c-tactile neural system [34] by the light stroking that occurs during social grooming, and PET imaging has confirmed that this behaviour activates the endorphin system in humans [35]. It seems that a number of other social activities, including laughter [8], singing [36] and dancing [37], also activate this system and, through this, enhance the sense of bonding to the other individuals present.

We used live audiences to test the hypothesis that emotionally arousing film drama triggers an endorphin response (indexed by change in pain threshold) and, at the same time, increases the sense of belonging to the group (social bonding).

Tuesday, September 20, 2016

Why I Love Disney’s Nutcracker Suite

Another post from 2013, this one about one of my all-time favorites. I watched this many times as a child. It fascinated me on black-and-white TV. It entrances me in color.
* * * * *

The Nutcracker Suite episode in Walt Disney's Fantasia is one of my favorite episodes on film and, I believe, one of the most beautiful films ever made. Whatever beauty is, this film has got it. I've written about its formal elegance, a ring form which is the same forwards as backwards. But that would mean little if it didn’t look good.

And it does look good.

The following frame is, well, “typical” isn’t quite the word I want, perhaps “illustrative.” Yes, the following frame is illustrative:


It’s a close-up of pine needles during a snow fall. The twig is maybe two, three, four inches long. The whole segment consists of such close-ups: intimate observations of the natural world. But also, this frame is monochromatic. While The Nutcracker Suite episode of Fantasia displays a full palette, there are many sequences where the palette is subtle, rather than using highly-saturated “cartoony” colors. In this frame, it’s all blues.

This next frame, in contrast, is in browns and greys, with some green and gold.


Here we see whites and grey-greens cut with green needles:


Of this “milkweed ballet” Disney said: “If you have a pod, and the fairies touch it, all the seeds fly out almost as if they’re alive . . . I think there’s something beautiful in those seeds ballet-ing through the air; I like to use them because we can get off the ground and have our ballet in air” (Culhane, p. 62). Their flight leads to another almost monochromatic scene:


A Brief for Description

An old one, from 2013, riding my favorite hobbyhorse (one of them, at any rate): description.
* * * * *

Writing at Edge, Ursula Martin praises description:
Once upon a time such observation, description and illustration was the bread and butter of professional and amateur scientists. My eight volume flora, on heavy paper with lovely illustrations that are now collectors' items, was well-thumbed by the original owner, a nineteenth century lady of leisure. It claims to be written for the "unscientific", but the content differs from a modern flora only by the inclusion of quantities of folklore, anecdotes and literary references.

Darwin's books and letters are full of careful descriptions. The amateur struggling with a field guide may take comfort reading how he frets over the difference between a stem with two leaves and a leaf with two leaflets. Darwin seems to have had a soft spot for fumitories, giving wonderfully detailed descriptions of the different varieties, whether and under what conditions they attracted insects, and how the geometry and flexibility of the different parts of the flower affected how pollen was carried off by visiting bees. He was looking for mechanisms that ensured evolutionary variability by making it likely that bees would occasionally transfer pollen from one flower to another, giving rise to occasional crosses—analysis later reflected in the Origin of Species....

No amount of image analysis or data mining can yet take the place of the attention and precision practiced by Darwin and thousands of other professional and amateur naturalists and ecologists.

Monday, September 19, 2016

Martha Mills: Defending civil and voting rights in Mississippi @3QD

My friend, Martha A. Mills, is a very distinguished trial attorney and judge. Early in her career she worked in Mississippi and later Illinois as a civil rights attorney. She tangled with Grand Imperial Wizards, an Exalted Cyclops or two, good old boys on their worst behavior, and won some and lost some. She also directed a choir, was city attorney in Fayette, tried to explain “Sock it to me, baby!” to a racist judge, sweated the Mississippi bar exam, and took kids to swim in the pool at the Sun ‘N Sands Motel, prompting the locals to triple the dose of chlorine. She’s just published a memoir of those years, Lawyer, Activist, Judge: Fighting for Civil and Voting Rights in Mississippi and Illinois (2015). I’ve reviewed it around the corner at 3 Quarks Daily.

The first case she tried involved Joseph Smith, president of the Holmes County NAACP. He was accused of running a red light. It was his four witnesses against the ticketing highway patrolman. The case was tried before a justice of the peace, who had no legal training (Mississippi doesn’t require it of JPs). Here’s how that went (112-113).

* * * * *

When we got to the town hall, Joseph Smith, myself, and the four witnesses were told to sit down and wait a few minutes. A police officer came over and asked if it was okay if he gave the oaths to the witnesses, as the JP did not know how. I said it was fine. The trial started with the officer intoning “Hear Ye, Hear Ye” and all that (just like an old British movie) and swearing in the witnesses. And then the JP looked at me and at the highway patrolman who, in addition to having written the ticket, was also acting as prosecutor.

“What am I supposed to do next?”

I answered, “The normal procedure would be for the state to present its case first, and then us.”

“That sounds fine, carry on,” he smiled.

The highway patrolman went on to tell his story–adding that he did not give the ticket because of race or anything like that.

I then put on our witnesses who gave uncontradicted testimony that they knew Smith and his car, were right in the vicinity where they could see everything perfectly, and they saw Smith come to a complete stop behind the stoplight. Smith, of course, personally denied running the stoplight. At that point, both the highway patrolman and I said we were finished. The JP and the patrolman got up and started to walk off, discussing the case.

I overheard the JP, “Now son, how do you think I ought to decide this here case?”

Upon hearing that I followed them, “Your honor, this is all highly improper. I have to be present at any conferences you have about this case!”

“That’s fine,” both men nodded at me, but it did not temper their conversation at all.

After some argument between us, the highway patrolman said if I did not think his case was strong enough, he would put on another witness. The witness was the police officer who had administered the oaths. He testified that he was in the vicinity of the violation but that he did not see whether Smith stopped or not. That added evidence seemed to convince the JP, and he gave Smith a fine. We immediately posted an appeal bond. I felt like I was in a Gilbert and Sullivan operetta. It was an unbelievable farce.

Saturday, September 17, 2016

Consciousness, once more around the merry-go-round

HYPOTHESIS & THEORY ARTICLE

Front. Psychol., 03 September 2013 | http://dx.doi.org/10.3389/fpsyg.2013.00574

The wild ways of conscious will: what we do, how we do it, and why it has meaning

  • Director, Institute for Prospective Cognition, Department of Psychology, Illinois State University, Normal, IL, USA
ABSTRACT: It is becoming increasingly mainstream to claim that conscious will is an illusion. This assertion is based on a host of findings that indicate conscious will does not share an efficient-cause relationship with actions. As an alternative, the present paper will propose that conscious will is not about causing actions, but rather, about constraining action systems toward producing outcomes. In addition, it will be proposed that we generate and sustain multiple outcomes simultaneously because the multi-scale dynamics by which we do so are, themselves, self-sustaining. Finally, it will be proposed that self-sustaining dynamics entail meaning (i.e., conscious content) because they naturally and necessarily constitute embodiments of context.
While the present paper addresses the relationship between consciousness and action control, its ultimate goal is to propose that terms such as “action” and “consciousness” are scientifically inadequate and, in the end, may have to be replaced in a scientific account of what we do, how we do it, and why it has meaning. This is because, as I will argue, the current conceptual framework used in cognitive science (e.g., perception, cognition, action, attention, intention, and consciousness) is not capable of addressing the complex array of causal regularities that have been discovered in cognitive science over the past 30 years.
In addition, the current conceptual framework has yet to give rise to a scientific conception of how we do what we do that renders the phenomenon of “consciousness” a necessary aspect of the causal story. That is, consciousness is described as either identical with the physical (i.e., identity theory), emergent from the physical (i.e., emergentism), as an informational property of causal relations (i.e., functionalism), or as an aspect of reality other than the physical (i.e., double-aspect theory and property dualism). In all of these positions, consciousness is not a logically necessary aspect of the causal story. That is, the scientific, causal description of how we do what we do is able to disregard consciousness as a causal factor.
While the notion that consciousness might not be logically necessary is certainly popular, one might also take it to indicate the need for an approach to “how we do what we do” that renders consciousness causal (i.e., non-epiphenomenal). In what follows, I present Wild Systems Theory as an approach to causality and consciousness that renders the latter logically necessary. To be sure, by the time this has been explicated, the term “consciousness” will mean something different from what is referred to via constructs such as Access Consciousness, Metacognition, and Phenomenal Consciousness (Block, 1995, 2001; Cleeremans, 2005).

* * * * *

Full article HERE.

More on Chomsky

From a short note by Michael Covington, "Has Chomsky been blown out of the water?":
Chomsky’s first major contribution to linguistics was item 1 in the outline, mechanisms to describe syntax precisely. Before he came on the scene, linguists widely accepted the behaviorist dictum that science can only refer to observed behavior. Some took this so far as to forbid the study of meaning (which is only observable within the mind), which greatly impeded the study of sentence structure. Chomsky argued cogently that abstract models are as appropriate in linguistics as in physics. He allowed syntactic theory to be abstract enough to do its job.
Besides the tree structures that are now familiar, Chomsky introduced transformations, which are rules that turn one tree structure into another. At first these were used to turn one kind of sentence into another, such as declaratives into questions. They made the grammar more concise; after you accounted for one kind of sentence, the transformation gave you another kind of sentence with no extra work. 
Very soon, transformations took on a different function, to build observed sentences from more abstract structures that are not observed. This made it possible to simplify the grammar and make it more general. 
Since Chomsky's early work linguists have developed many methods for precisely describing syntax, some inspired by Chomsky, some not. More than anyone else, he got that ball rolling.
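
To make the notion of a transformation concrete, here is a minimal sketch of my own, in Python (a toy illustration only, not Chomsky's formalism and not anything Covington proposes): a sentence is represented as a labeled tree, and a single rule derives a yes/no question from a declarative by fronting the auxiliary.

from copy import deepcopy

# A tree is a (label, children) pair; a leaf is just a word string.
declarative = ("S", [
    ("NP", ["the", "cat"]),
    ("AUX", ["will"]),
    ("VP", ["chase", "the", "dog"]),
])

def question_transformation(tree):
    # Subject-aux inversion: move the auxiliary to the front of the sentence.
    _label, children = deepcopy(tree)
    aux = next(child for child in children if child[0] == "AUX")
    children.remove(aux)
    return ("SQ", [aux] + children)

def words(tree):
    # Flatten a tree (or a bare word) back into a list of words.
    if isinstance(tree, str):
        return [tree]
    _label, children = tree
    return [w for child in children for w in words(child)]

print(" ".join(words(declarative)))                           # the cat will chase the dog
print(" ".join(words(question_transformation(declarative))))  # will the cat chase the dog

The point is simply the shape of the claim: an explicit rule maps one structure onto another, so a grammar that accounts for declaratives yields the corresponding questions almost for free.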

What's in doubt is his postulation of universal grammar:
He proposed that the principles of grammar are the same in all languages; the brain is pre-programmed for universal grammar. To learn a particular language, all you have to learn is the vocabulary and a set of “parameters” that establish how the grammar rules play out in that particular language. That is how children manage to learn to talk so quickly and efficiently, without being taught, from hearing incomplete examples of speech. [...] 
The second is advances in cognitive psychology. Back in 1957, there was almost no cognitive psychology. Chomsky was rebelling against Skinnerian behaviorism. Most psychologists at the time advocated an impossibly simple theory of how the mind works. Against that backdrop, he had to postulate inborn universal grammar in order to do everything that simple stimulus-response associations couldn’t do. Nowadays we know that the human mind has complex, powerful, abstract capabilities in many areas, not just language, and new possibilities are opening up for explaining language from general mechanisms of thinking and learning.
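
Covington's mention of "parameters" can be glossed with another toy sketch of my own (an illustration only, not an actual principles-and-parameters grammar): one shared combination rule, plus a single head-direction parameter whose setting yields English-like or Japanese-like order.

def phrase(head, complement, head_initial=True):
    # Combine a head with its complement; the parameter fixes the order.
    return f"{head} {complement}" if head_initial else f"{complement} {head}"

# English-like setting: verbs and prepositions precede their complements.
print(phrase("eat", "sushi", head_initial=True))   # eat sushi
print(phrase("in", "Tokyo", head_initial=True))    # in Tokyo

# Japanese-like setting: the same rule with the parameter flipped.
print(phrase("eat", "sushi", head_initial=False))  # sushi eat (cf. "sushi o taberu")
print(phrase("in", "Tokyo", head_initial=False))   # Tokyo in (cf. "Tokyo de")

On this picture the child does not learn word order construction by construction; a few parameter settings, once fixed, propagate through the whole grammar.
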
This judgment seems right to me:
What is pervasive is the expectation that other linguists will investigate questions that Chomsky raised, whether to prove him right, prove him wrong, or follow the evidence in some other direction entirely. That’s how science is done. This expectation is warranted because Chomsky has raised undeniably important questions.

Friday, September 16, 2016

Adam Smith on Music and Dance

Courtesy of Colwyn Trevarthen:

After the pleasures which arise from gratification of the bodily appetites, there seems to be none more natural to man than Music and Dancing. In the progress of art and improvement they are, perhaps, the first and earliest pleasures of his own invention; for those which arise from the gratification of the bodily appetites cannot be said to be his own invention.
Adam Smith ([1777] 1982: 187)

Time and measure are to instrumental Music what order and method are to discourse; they break it into proper parts and divisions, by which we are enabled both to remember better what has gone before, and frequently to foresee somewhat of what is to come after: we frequently foresee the return of a period which we know must correspond to another which we remember to have gone before; and according to the saying of an ancient philosopher and musician, the enjoyment of Music arises partly from memory and partly from foresight.
Adam Smith ([1777] 1982: 204)

Smith, A. ([1777] 1982). Of the Nature of that Imitation which takes place in what are called the Imitative Arts. In W. P. D. Wightman & J. C. Bryce (eds.) with Dugald Stewart’s account of Adam Smith (ed. I. S. Ross), D. D. Raphael & A. S. Skinner (General eds.), Essays on Philosophical Subjects (pp. 176–213). Indianapolis: Liberty Fund.

Thursday, September 15, 2016

Malick's "Voyage of Time": was Disney there first in "Fantasia"?



Richard Brody in The New Yorker, "Terrence Malick's Metaphysical Journey into Nature":
It’s a sort of vast and visually overwhelming nature documentary, albeit with brief acted sequences, and, as such, it’s an easy film to parody and to mock—say, as the Terrence Malick Science-Wonder Visual Encyclopedia, or “The Tree of Life” with the funny bits cut out. But that’s true of any intensely serious work of art. “Voyage of Time” inhabits a rarefied plane of thought, detached from the practicalities of daily life, that leaves it open to a facile and utterly unjustified dismissal, given the breathtaking intensity of its stylistic unity and the immediate, firsthand force of its philosophical reflections.

“Voyage of Time” is, as its title suggests, a sort of cinematic cosmogony, a lyrical collage that looks at a broad spectrum of natural phenomena artistically and imaginatively. In “The Tree of Life,” macroscopic and telescopic images as well as C.G.I. reconstructions of prehistoric times recapture the wonder of a childhood contemplation of science, as inspired by the popular-science books of Malick’s own youth. “Voyage of Time,” working with similar images, recomposes them to an altogether different, yet crucially related, purpose: it seeks the very source of that wonder not in the child’s imagination but in the essence of nature, in the fundamental building blocks of the universe itself.
Sounds like what Disney was up to in (parts of) Fantasia. See my working paper: Walt Disney's Fantasia: The Cosmos and the Mind in Two Hours.

And: "For that matter, it’s often thrillingly difficult, even impossible, to tell what’s intergalactic and what’s intracellular, what’s infinitesimal and what’s immense, what’s biological and what’s astronomical." Check out Eames, Powers of Ten.

McWhorter on Shakespeare: Should he be rewritten in modern English?

McWhorter has argued that Shakespeare's language is so difficult that it should be "adjusted" into modern English for modern readers and theatre-goers. I'm sympathetic. Yesterday I started watching the Zeffirelli movie version of Hamlet, with Mel Gibson in the title role and Glenn Close as Gertrude, and at times the language just lost me. Here's a podcast where he discusses the subject with John Lynch.

Here's a post at The New Republic where McWhorter makes his case. I quoted passages from that post in an old post at The Valve and it generated a bit of discussion, including a comment from Kent Richmond, who has rendered five plays into modern English.