Tuesday, December 29, 2020

Ray Bradbury and the emergence of science fiction in mid-century America

 Sam Weller and Dana Gioia discuss the impact of Ray Bradbury in the Los Angeles Review of Books.

Weller: Is there any way to measure Ray’s impact on popular culture?

Gioia: Let me offer one perspective. If you compiled a list in 1950 of the biggest grossing movies ever made, it would have contained no science fiction films and only one fantasy film, The Wizard of Oz. In Hollywood, science fiction films were low-budget stuff for kids. The mainstream market was, broadly speaking, “realistic” — romances, comedies, historical epics, dramas, war films, and adventure stories.

If you look at a similar list today, all but three of the top films — Titanic and two Fast and Furious sequels — are science fiction or fantasy. That is 94 percent of the hits. That means in a 70-year period, American popular culture (and to a great degree world popular culture) went from “realism” to fantasy and science fiction. The kids’ stuff became everybody’s stuff. How did that happen? There were many significant factors, but there is no doubt that Ray Bradbury was the most influential writer involved.

It’s interesting you say this because you don’t seem to be afraid — some critics don’t want to connect popular culture or mass culture with literature or with high intellectual arts. You seem to say that Bradbury is one of those people who brought these two things to the crossroads.

In my academic training, I was inculcated in the tradition of the psychological and social realist novel, the so-called “Great Tradition.” This was an extraordinary literary lineage — Austen, Eliot, Dickens, Conrad, James, Cather, Hemingway, not to mention Balzac, Stendhal, Flaubert, Tolstoy, Chekhov, and Dostoyevsky. The realist novel was one of the great achievements of Western literature. It provided a powerful means to articulate and understand personal and social relations of enormous complexity. Three cheers for realism! Maybe even four.

But there are different modes of storytelling. The most primitive is myth, where natural forces become personified in narratives. The next historical development was romance. In romantic narratives, we have the world not as it is but as we wish or fear it to be. This was the mode of medieval and Renaissance narratives. (Centuries later it also became the mode of science fiction, fantasy, horror, Gothic romance, and old-school mysteries.)

Realism is the mode that emerged last. Although the realist novel quickly became the dominant narrative form, its popularity only dates back about 400 years. The realist novel had a particular power that made it very attractive. The realist mode allowed one to see the world simultaneously from the inside and the outside. It compared — usually with a great deal of irony — the subjective experience of characters and the exterior world that surrounds them. Great novels mediate these two realities with tremendous finesse.

But realism is not the only way to tell a story, and the romantic mode never vanished. Even some of the realist masters, such as Dostoyevsky, Dickens, and Balzac, found themselves exploring the mode of romance to represent certain human possibilities. Romance remained very strong in American literature with some of our most original writers — Edgar Allan Poe and Nathaniel Hawthorne. But it never became academically respectable. It smacked of popular or children’s literature. As a senior at Stanford, I had to ask permission to add Mary Shelley to my reading list! (Another student asked to read H. P. Lovecraft and got a stern lecture.)

What books are you thinking about here? What do you consider Bradbury’s best period?

Sam, you’ll probably disagree with me — but I think Bradbury’s best work was mostly done in a 10-year period in the early part of his career. In one remarkable decade he wrote: The Martian Chronicles (1950), The Illustrated Man (1951), The Golden Apples of the Sun (1953), Fahrenheit 451 (1953), The October Country (1955), Dandelion Wine (1957), and A Medicine for Melancholy (1959). The books came one right after the other, and he created a new mode of speculative fiction.

The culture immediately recognized his achievement. Suddenly, major mainstream journals published his fiction, and producers adapted his work for movies, radio, and TV. Millions of readers, who would not have read pulp fiction, came to his work. He also became the first science fiction author to attract a large female readership. ...

For 10 years, he was Joe DiMaggio. Every time he went to bat, there was a good chance he would hit the ball, sometimes out of the park. It’s significant that Ray’s great hitting streak came in the 1950s, a period of national optimism. Despite the anxiety, darkness, and anger in his work, Bradbury always wrote in a spirit of hope and reconciliation. He never believed humanity was beyond redemption. Perhaps as America shifted into the late 1960s and beyond, he lost touch with the culture.

Friday, December 25, 2020

What's up with GPT-3, neural nets, and machine learning?

This is a long rambling discussion, over three hours. But useful. Pro, con, down the middle, you name it. Dip in and out as you see fit.
 
Connor Leahy, for example, is a true believer in machine learning and the like, but he's right on two counts: 1) GPT-3, after all, does something, and we need to take it seriously, and 2) it may even indicate something about the universe. I'm not sure he can give a useful explanation of the second point, but I've taken my own crack at it in GPT-3: Waterloo or Rubicon? Here be Dragons, Version 2, which began as a series of blog posts at New Savanna. A bit later, Scarfe and Duggar (I think it is) have a discussion of the distinction between pattern matching and reasoning that's worth a listen. They should take a look at this paper from the Olden Days:
Yevick, Miriam Lipschutz (1975) Holographic or Fourier logic. Pattern Recognition 7: 197-213.
https://sci-hub.tw/10.1016/0031-3203(75)90005-9

Abstract: A tentative model of a system whose objects are patterns on transparencies and whose primitive operations are those of holography is presented. A formalism is developed in which a variety of operations is expressed in terms of two primitives: recording the hologram and filtering. Some elements of a holographic algebra of sets are given. Some distinctive concepts of a holographic logic are examined, such as holographic identity, equality, containment and “association”. It is argued that a logic in which objects are defined by their “associations” is more akin to visual apprehension than description in terms of sequential strings of symbols.
Yes, it was published in 1975, which is ancient times in the world of artificial intelligence. It seems to me that what Yevick called holographic logic is similar in spirit, and even in mathematics in some respects, to current work on neural networks, while, in contrast, ordinary logic is, as the abstract has it, “description in terms of sequential strings of symbols.” That gives us a starting point for thinking about the contrast between pattern matching and reasoning. I say a bit more about Yevick's work, and what David Hays and I made of it, in this post, Showdown at the AI Corral, or: What kinds of mental structures are constructable by current ML/neural-net methods? [& Miriam Yevick 1975].
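To make that contrast concrete, here is a minimal toy sketch in Python (my own illustration, not Yevick's holographic formalism and not anyone's actual system): the same classification done once by association, that is, by graded similarity to stored patterns, in the spirit of neural nets, and once by a sequential string of symbolic rules. The features, prototypes, and rules are invented for the purpose.

import numpy as np

# Pattern-matching style: objects are identified by their associations,
# i.e. by graded similarity to stored prototypes (made-up feature vectors).
prototypes = {
    "cat":  np.array([0.9, 0.8, 0.1]),
    "dog":  np.array([0.3, 0.9, 0.9]),
    "fish": np.array([0.0, 0.1, 0.0]),
}

def classify_by_association(x):
    # choose the prototype with the highest cosine similarity to x
    sim = lambda a, b: a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)
    return max(prototypes, key=lambda k: sim(prototypes[k], x))

# Symbolic style: "description in terms of sequential strings of symbols,"
# here a chain of discrete rules applied one after another.
def classify_by_rules(facts):
    if "has_whiskers" in facts and "meows" in facts:
        return "cat"
    if "barks" in facts:
        return "dog"
    return "unknown"

print(classify_by_association(np.array([0.85, 0.75, 0.05])))  # -> cat
print(classify_by_rules({"has_whiskers", "meows"}))           # -> cat

The point of the toy is only the shape of the two computations: the first measures resemblance all at once across the whole pattern, while the second walks through discrete symbols step by step.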

Overall the video confirms my belief that we don't have a useful framework in which to think about minds, machines, and intelligence. We're constantly being surprised, and don't really know why anything works. Contrast this, for example, with the framework we have for thinking about manned expeditions to Mars. Elon Musk notwithstanding, we may not venture there for decades, who knows? But we can think about it in a detailed, systematic, and coherent way. We can't do that with machine intelligence.

Festive Table – Look but don't touch [Longwood Gardens 2012]

IMGP2531

IMGP2527

Thursday, December 24, 2020

City in the distance

DeepMind's MuZero

Abstract of linked article:

Constructing agents with planning capabilities has long been one of the main challenges in the pursuit of artificial intelligence. Tree-based planning methods have enjoyed huge success in challenging domains, such as chess and Go, where a perfect simulator is available. However, in real-world problems, the dynamics governing the environment are often complex and unknown. Here we present the MuZero algorithm, which, by combining a tree-based search with a learned model, achieves superhuman performance in a range of challenging and visually complex domains, without any knowledge of their underlying dynamics. The MuZero algorithm learns an iterable model that produces predictions relevant to planning: the action-selection policy, the value function and the reward. When evaluated on 57 different Atari games—the canonical video game environment for testing artificial intelligence techniques, in which model-based planning approaches have historically struggled—the MuZero algorithm achieved state-of-the-art performance. When evaluated on Go, chess and shogi—canonical environments for high-performance planning—the MuZero algorithm matched, without any knowledge of the game dynamics, the superhuman performance of the AlphaZero algorithm that was supplied with the rules of the game.
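To make the abstract a little more concrete, here is a schematic Python sketch of the three learned functions it describes and of how planning uses them. This is my paraphrase of the published description, not DeepMind's code; the class name, the stubbed-out networks, and the single-rollout "planner" are placeholders, and the real system learns these functions with deep networks and plans with Monte Carlo tree search over many simulated trajectories.

import numpy as np

class MuZeroModelSketch:
    # Stand-ins for the three learned networks the paper describes:
    # h (representation), g (dynamics), f (prediction).

    def represent(self, observation):
        # h: real observation -> abstract hidden state
        return np.zeros(8)

    def dynamics(self, state, action):
        # g: (hidden state, action) -> next hidden state, predicted reward
        return state, 0.0

    def predict(self, state):
        # f: hidden state -> action-selection policy, value estimate
        return np.ones(4) / 4, 0.0

def evaluate_action_sequence(model, observation, actions):
    # Score one candidate plan using the learned model alone (no simulator,
    # no knowledge of the true game rules). MuZero itself runs Monte Carlo
    # tree search over many such latent-space rollouts.
    state = model.represent(observation)
    total_reward = 0.0
    for a in actions:
        state, reward = model.dynamics(state, a)
        total_reward += reward
    _, value = model.predict(state)
    return total_reward + value  # predicted rewards plus a value bootstrap

print(evaluate_action_sequence(MuZeroModelSketch(), observation=None, actions=[0, 1, 2]))

The design point the abstract emphasizes is that rewards, values, and policies are predicted from a learned hidden state rather than read off a perfect simulator, which is what lets the same planning machinery work for Atari as well as for Go, chess, and shogi.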

Mapping characters in the Gospels [digital humanities]

Sunday, December 20, 2020

Themo [old]

Seinfeld likes the Marx Brothers [so do I]

The NYTimes just ran a piece where Jerry Seinfeld talks about his current reading: “So the book that’s on my stand right now that I’m really, really enjoying is called Four of the Three Musketeers, by Robert Bader, and it’s a very, very long, detailed history of the Marx Brothers.” He goes on to remark:

This Bader — it’s a ridiculous book. I don’t know what this guy was thinking, that he would spend this much time and do this much work on the Marx Brothers. This is not an old book, it’s from like 2016. But he did the most incredible job of research. And he’s also a very good writer. And you know, to me, the history of the Marx Brothers is kind of the invention of comedy, not just as a substance but as a business. And also, if I can put this the right way: They emerged as Jews in New York City, just kind of coming out and going, “By the way, we’re better at this than anybody.” And ever since then you can trace — I mean, you can trace comedy obviously back to the Greeks — but in terms of what you see now, in the world, the Tigris and Euphrates to me is the Marx Brothers and vaudeville in the teens and early ’20s of the last century.

I share Seinfeld’s admiration and affection for the Marx Brothers. Years ago in my late teens I read an autobiography by Groucho, Groucho and Me I believe it was called, and one by Harpo, Harpo Speaks. Both of course talked about their early days in vaudeville, which I found fascinating. I remember that Harpo also talked about playing golf. Groucho may have talked about golf as well, but it’s Harpo I remember, in particular how, for whatever reason, he played a round in the nude.

I’d grown up watching Groucho on TV. He had a quiz show called “You Bet Your Life.” The quizzing wasn’t very serious – the money was peanuts. It was a vehicle for Groucho’s wit. My father loved the show. He loved language and Groucho was a master of word play, puns in particular. I associate the term “ad lib” with my father’s remarks about Groucho. It was only later that I’d associate the term with jazz performance.

Father also loved their movies, but I didn’t get to see any of them until I went to college. Seinfeld mentions their first film:

The first Marx Brothers movie was “The Cocoanuts,” which was originally a play that they did. And talkies had just happened, and a couple of years later Hollywood was looking around for anybody that could talk on film, and obviously grabbed the Marx Brothers. And so they made this movie. Nobody really knew how to make a movie; they just kind of shot the play, onstage. And the brothers hated it. They hated it so much, they wanted to buy it back so it wouldn’t be released. Then it was a gi-monster hit and it made a fortune. That’s so much fun to read, stuff like that. People do things and they hate them and the public loves them, and then they have to change their thinking on it.

I’ve seen it, at least once in college, and probably once or twice on TV. I’ve also seen Horse Feathers, Duck Soup and A Night at the Opera (I own that on DVD). I’ve probably seen the others, but I don’t have specific memories. Aside from general antics, each film would have a set piece where Chico played the piano and Harpo played the harp.

My favorite scene from the films is one from Duck Soup where Groucho and Harpo mirror one another. Harpo is disguised as Groucho. The mirroring starts at 2:04, but you should watch from the beginning to get some idea of the context.

Groucho is in the foreground and Harpo is the mirror image. He gives the game away at c. 3:41 when he has the wrong hat behind him – it’s his standard “Harpo” hat. Chico joins them in a three-way question.

In response to a question, Seinfeld remarks that while he’d like to see the Marx Brothers adapted for TV or movies:

But I don’t know if people would be interested. And you could never do it. You could never recreate these guys. Remember they did that Three Stooges movie? My manager, George Shapiro, says trying to get someone to act like a comedian is like trying to get them to act like a baseball player. It’s almost impossible. There’s so many tiny polished movements they have that the best actors struggle to replicate.

What about the effort it took for Groucho and Harpo to imitate one another?

Monday, December 14, 2020

Heron, reflections, shadings

Jerry Seinfeld on his career and craft [Progress in harnessing the mind]

Skip the first 7:35; the host, Tim Ferriss, reads advertisements for his sponsors. The rest (except for another commercial interruption at about 30:50) is a fascinating conversation with Jerry Seinfeld, who's just published a new book about his career and craft, Is This Anything? One of the most interesting lines of discussion is about how Seinfeld 'trains' his brain to write. Yes, he's a stand-up comedian performing for a live audience, but he creates his material through writing.

He repeatedly talks of the brain as a dog that needs to be trained. The training must be systematic, and the system(s) must be simple. He first talks about writing at about 12:30 or so, though the dog metaphor doesn't show up until somewhat later.

First of all, he emphasizes that writing is difficult, very difficult. There are two phases (c. 15:00 or so): 1) free-form creation, 2) polish and construction. Writing is 95% rewriting. Once he's polished a bit to the point where it sounds pleasing to his (inner) ear, he takes it on stage. He registers audience response and uses it to guide more rewriting. At c. 17:42:

Creating, fixing, jettisoning, it's extremely occupying, it's never boring, it's... the frustration, I'm so used to it at this point I don't even notice it. And it's just work time.... I like the way athletes talk about "I gotta' get my work in."

A bit later he uses the phrase, "the systemization of the brain and creative endeavor." That's clearly something Seinfeld has thought about a lot, and over his whole career. And then at 18:34: "So basically it's on stage and off stage, the desk and then the stage, and then back to the desk, and then back to the stage, and that's endless." So we've got two desk phases, creative and rewriting, and then back and forth between desk and stage. Seinfeld then goes on to talk about being cranky and irritable. That's the source of those little insights around which he builds his bits.

Skipping over various material, including his TV show: in his younger years he looked into (c. 30:00) "yoga, Zen, a little Scientology, Transcendental Meditation, Buddhism, I read a lot of stuff...I was looking for a working philosophy," and from that he created his own "operating system," his term. "It's very pragmatic. It's not faith-based in any way."

Seinfeld then starts talking about his daughter, who has a "creative gift" (c. 33:57): "When you have a creative gift, it's like someone just gave you a horse. Now you have to learn how to ride it...You either learn to ride this thing, or it's going to kill you." And now we get back to writing (c. 33:41):

If you're going to write, make yourself a writing session. What's the writing session? I'm going to work on this problem. Well, how long are you going to work on it? Don't just sit down with an open-ended "I'm gonna work on this problem." That's a ridiculous torture to put on a human being's head... You've got to control what your brain can take... You have to have an end time to your writing session. If you're gonna sit down at a desk with a problem and do nothin' else, you gotta' get a reward for that. And the reward is the alarm goes off and you're done. You get up and walk away and go have some cookies and milk...That's the beginning of a system.

He goes on and gets around to exercise, mentioning a book, Body for Life, by Bill Phillips, which he praises for the way it systematizes exercise. And now we come to it, the dog (c. 38:10): "You gotta' treat your brain like a dog you just got. You got it; it's so stupid. The mind is infinite in wisdom. The brain is a stupid little dog that is easily trained....Do not confuse the mind with the brain."

Though it is obvious enough, notice Seinfeld's distinction between mind and brain. Though he no doubt knows that the mind is somehow in the brain, he doesn't talk about it. He talks about the brain. And he talks about training it as though one trains one's body, or as one trains a stallion or a "stupid little dog." This training is being executed by an operating system. His instructions about writing remind me of the routine of Anthony Trollope, the Victorian novelist, who set himself to writing 250 words every 15 minutes for three hours a day, between 5:30 and 8:30 each morning. Following this routine he wrote more than a novel a year for 30 years.

Let's return to Seinfeld. He tells us that we should never tell anyone about what we wrote on the day we wrote it. Why? Because they might criticize it and that would rob you of the satisfaction of having done your writing for the day. That satisfaction is very important. It's your reward. (c. 40:34): "The key to being a good writer is to treat yourself like a baby, very extremely nurturing and loving, and then switch over to Lou Gossett in Officer and a Gentleman and just be a harsh prick ball-busting son-of-a-bitch about that is just not good enough." You switch back and forth between these two modes, which Seinfeld likens to two "quadrants" of the brain.

Ferriss then asks him about performing and whether or not, when he's just finished a (killer) set, he asks for feedback from other comics. Before Ferriss has finished, Seinfeld remarks that "you just got feedback" (the audience's reaction); you don't need anything more: "You don't have to ask anyone anything."

About performance, Seinfeld goes on (c. 42:44):

There's no greater reward than that state of mind you're in when that set is working. If you can extricate yourself from your Self, which is the goal in all sports and performance arts, if you get out of your mind, and are able to just function ... there is no greater reward. But, you know, if you want to have an ice cream sundae, go ahead. It's going to pale in comparison.

We're now only about halfway through the interview, but it's time for me to wrap this up. Feel free to listen to the whole thing.

It seems to me that Seinfeld is talking about behavioral modes, in the sense that I've written about them in many posts here at New Savanna. The idea of a behavioral mode comes from Warren McCulloch, and it refers to a distinct pattern of global brain activity that supports a particular kind of behavior. McCulloch was interested in such things as hunting, sleeping, exploration, courtship, and so forth. Seinfeld talks about 1) creativity (generating ideas), 2) critiquing, editing, and rewriting, and 3) performing. His operating system moves the brain (a stupid little dog) from one mode to another. Notice his emphasis on boundaries. The writing session has a definite beginning and a definite ending. So does a performance. The trick is to learn how to move the brain from one mode to another.

Finally, I note that being able to talk and write about the mind-brain in these terms, for a general audience and without mystification, counts as human progress as surely as DeepMind's recent breakthrough in protein folding. Why? Because, ultimately, all human behavior, and hence all progress, depends on the mind. Learning to harness the mind's activities is the deepest and most difficult task before us. Without that, little else can happen.

* * * * *

Addendum: Somewhat later in the discussion Seinfeld raises the subject of depression, noting that he still gets depressed, though saying nothing about how often, for how long, or how severely. Ferriss notes that he too gets depressed. Seinfeld mentions that about 20 years ago he read that depression seems to accompany creativity; that gave him a sense of relief. He does not, however, believe that creativity comes out of depression (as he said before, it comes from irritability and crankiness). You might want to take a look at this post from August, Perhaps these conditions aren't mental disorders at all (anxiety, depression, PTSD).

Sunday, December 13, 2020

Red Leaves

Star Trek Deep Space Nine, Politics and Matter [Media Notes 52]

I’ve watched all of the Star Trek franchise except for the animated series and the most recent ones: Discovery, Short Treks, Picard, and Lower Decks. I’m currently well into season six of a re-watch of Deep Space Nine, which is, for me, the most interesting of the Trek series I’ve watched.

This is a speculative post about the myth-logic (think of Lévi-Strauss) operating in the series and, in particular, is a query about the relationship between Odo and the political nature of DS9.

Unlike the other series in the franchise, DS9 centers on a single world: the station, Deep Space Nine; Bajor, the planet it orbits; and the nearby wormhole. The other series all move from place to place in the galaxy. To be sure, DS9 moves about as well, and there are many episodes that aren’t set on either the station or Bajor, but the station is the central locus. More or less as a consequence, I believe, DS9 was able to develop longer story arcs, with the final two seasons forming an extended story about an inter-world war among the various peoples and empires of the series. Thus the series had a more political focus, though the politics tended to be relatively simple power politics, ultimately realized through armed combat.

The first two Trek series, the original series and Next Generation, each featured a character dominated by intellect, Spock and Data respectively. Voyager had a similar character, the holographic doctor. DS9 lacks such a character. To be sure, Doctor Bashir was very intelligent as a consequence of genetic enhancement in childhood (something we didn’t learn about until, what, the fifth season?), but he carried his intelligence differently than Spock or Data. For example, he didn’t long to be human the way Data did.

But DS9 does have Constable Odo, who is trying to figure out who and what he is. What he is is a Changeling, a creature who has no intrinsic form and is able to take on a variety of different forms, not only of other humanoids, but of animals and even inanimate objects. He is in some sense the opposite of Spock and Data. They are Mind, while he is Matter.

As the series moves along we learn that there are other Changelings; indeed, there is a planet of them. And those Changelings are, in some mysterious fashion, the heart of the Dominion, an empire located in the Gamma Quadrant on the far side of the wormhole. First we learn of the Jem’Hadar, drug-addicted warriors. Then we learn of the Vorta, who command the Jem’Hadar. And finally we learn of the Founders, who run the Dominion. These Founders are Changelings, like Odo. His relationship with them will turn out to be important in the dynamics of the war that dominates the final seasons of the series.

Here’s my query, as best I can formulate it: Within the scope of the myth-logic operating in the franchise, what is the nature of the relationship between 1) the opposition between Odo (as Matter), on the one hand, and Spock, Data, and the Doctor (as Mind), on the other, and 2) the episodic and relatively light politics of those three series, on the one hand, and the extended stories and political interactions of DS9, on the other? Is it one of mere contingency, or is there an element of necessity, of myth-logic, in that relationship? I suspect the latter, but I don’t really know, nor am I prepared to argue it at this point.

Addendum: Think of Odo in relation to the emergence of Object-Oriented Ontology.

Thursday, December 10, 2020

The end of the music business [as we know it]

Monday, December 7, 2020

October moon

To the Moon! Jacob Collier @3QD

Several times during my undergraduate years I had experienced something you might call “the true thought is the afterthought”: I would write a paper, turn it in, and only then would I come to understand what I’d been writing about, what I’d been seeking. So it is with my current piece at 3 Quarks Daily: Jacob Collier, a 21st Century Mozart? – https://3quarksdaily.com/3quarksdaily/2020/12/jacob-collier-a-21st-century-mozart.html.

After giving the piece its provocative title I said nothing about that title until the very end, where I did say something, but not much, certainly not as much as I’d had in mind when planning the piece. But even what I’d planned would have missed the point, which is a subtle one.

Why even suggest such a comparison – Jacob Collier and Wolfgang Amadeus Mozart – when I know, and stated, that it’s pointless on the face of it? I was certainly playing to a widespread mentality that likes to rank and order things, all kinds of things, certainly including artistic accomplishment, as a way of measuring excellence. In Collier’s case we do not and cannot know – which I more or less said in the article.

What I did not say is why, nonetheless, I felt more or less compelled to offer the comparison in the first place. Mozart, more than any other classical composer, has become, rightly or wrongly, a figure for prodigiousness. That is what Collier may well be, seems to be, about. What seems so remarkable about Collier is the combination of musical sophistication and skill with (relative) accessibility and popularity. As one musician – I forget who – remarked in a video, musicians with the kind of intricate sophistication Collier exhibits do not get nominated for Album of the Year, as Collier just has. How’d that happen?

This comparison should not be understood as a device of logic and reason. It functions as a device of metaphor and indirection.

When I think of the Collier phenomenon, if you will, the music, but also its reception, the variety of musicians who think well of him, I conclude that I have no way of judging him. I’ve not seen anything quite like this phenomenon and so, in consequence, offer up this absurd history-breaking comparison about which I cannot be serious: Mozart | Collier.

I suppose the comparison could as easily have been with Beethoven or Bach or, for that matter, Armstrong, Ellington, or perhaps the Beatles. Why not? Except that those names are not so thoroughly absorbed into figurative usage as “Mozart.” Someone in one of the videos I watched suggested that Collier is a generational musician. Perhaps that is it. Which is to say, that is the scope of the question. It will be a while before a verdict becomes apparent.

* * * * *

Let us return to earth and listen to the arrangement which earned Collier the 2020 Grammy for an instrumental or a cappella arrangement, Moon River:

The music is almost static for the first minute and twenty seconds. Tones come and go as faces appear and vanish in the video. What a strange almost meditative way to open an arrangement.

Collier had asked a hundred fifty-one people to make short videos of themselves singing “moon.” He then collaged those “moons” into the opening minute and twenty seconds. Here’s a list of them:

Suzie Collier, Sophie Collier, Ella Collier, Ben Bloomberg, Herbie Hancock, Quincy Jones, Eric Whitacre, Hans Zimmer, Steve Vai, Ty Dolla $ign, Chris Martin, Charlie Puth, Lianne La Havas, Tori Kelly, David Crosby, Chris Thile, Daniel Caesar, Kimbra, Laura Mvula, MARO, Cory Henry, dodie, Becca Stevens, Jack Conte, Nataly Dawn, Oumou Sangare, Jules Buckley, Jamie Cullum, Tank, Beardyman, Genevieve Artadi, Sam Wilkes, Greg Phillinganes, Michael League, Hamid El Kasri, Avery Wilson, Jojo, Jonah Nilsson, Tom Misch, Darwin Deez, June Lee, Kathryn Tickell, Merrill Garbus, Nikki Yanofsky, Sam Amidon, Alvin Chea, Claude McKnight, Mark Kibble, Khristian Dentley, David Thomas, Joel Kibble, Andrea Haines, El Cockerham, Blake Morgan, Barney Smith, Chris Wardle, Jonathan Pacey, Rob Clark, Sam Dressel, Barak Schmool, Pedro Martins, Jake Sherman, Jonathan Dove, Brian Mayton, Fred Harris, Nicola Hadley, Steve Mulligan, Clyde Lawrence, Gracie Lawrence, Sumner Becker, Jordan Cohen, Thomas Gould, Gareth Lockrane, Gwilym Simcock, Jason Rebello, James Maddren, Nick Smart, Pete Churchill, Tom Cawley, Umar Hossain, Mischa Stevens, Jose Ortega, Claudio Somigli, Alessandro Melchior, Christian Euman, Rob Mullarkey, Adam Fell, Michael LaTorre, Michael Peha, James Wright, Noah Simon, Matthew Celia, Rocky Borders, Josh Helfferich, Robert Watts, Ewa Zbyszynska, Arend Liefkes, Jasper van Rosmalen, Murk Jiskoot, Ruben Margarita, Aleigha Durand, Allayna O'Quinn, André Smith, Asya Bookal, Briana Marshall, Catherina Lagredelle, Celine Sylvester, Chad Lupoe, Chesroleeysia B, Cleavon Davis, Cole Henry, Danielle Cornwall, Haley Flemons, Holland Sampson, Jason Max Ferdinand, Jourdan Bardo, JP Scavella, Kashaé Whyte, Keviez Wilson, Kobe Brown, Kristin Hall, Leonard Brown, Lincoln Liburd, Louis Cleare, Maia Foster, Malik George, Malik McHayle, Marissa Wright, Matthew Cordner, Mykel Inez, Naomi Parchment, Natrickie Louissaint, Patricia Williams, Roddley Point-du-Jour, Samella Carryl, Terell Francis-Clarke, Zaren Bennett, Leo Janssen

The first three on the list are his mother and two sisters, respectively. Beyond that it’s a miscellaneous collection of people Collier knows. Some are well known musicians. I recognize a few names and they span an interesting range of musical kinds. The range is no doubt larger than I know. Many of the people are just people Collier knows, just friends. The list itself makes few distinctions.

Here’s a video where Collier explains how he constructed the video. He devotes the first 12 minutes to that opening collage.

Sunday, December 6, 2020

Glenn Loury & James Heckman discuss parenting, schooling, race, and inequality

As you may know, Glenn Loury is an economist at Brown. Heckman is a Nobel Laureate in economics at the University of Chicago. The first 19 minutes consists of chitchat about Chicago; Glenn grew up there and Heckman's spent most of his life there. It's interesting, but skippable. The rest of the discussion is fascinating:

19:32 How do you improve a human being?
27:29 What "The Bell Curve" missed about human development
36:12 How teaching and empowering parents positively impacts children
44:00 The taboo of family-focused anti-poverty efforts
54:17 What is the source of implicit bias?
1:03:55 James: Politicians on both sides cultivate and exploit racial turmoil

Saturday, December 5, 2020

On the cultural contrast between computer science and the humanities, and why CS is in a position to win the hearts and minds of digital humanists

The rest of VanZandt's tweet stream:

(2/n) Hot take time.

CS depts, on the whole, are WAY better than HUMA depts at undergrad community building. Group projects as the standard, hackathons, huge undergrad TA+RA culture, active (and numerous) clubs, programming contests, undergrad conference culture, ... 3:15 PM · Dec 5, 2020·Twitter for Android

(3/n) HUMA depts do some of these things, and some of them well, but they're still largely outclassed.

HUMA has huge structural disadvantages in this "competition." Solitary research culture? That's a KILLING BLOW. How do you build community when collaboration isn't the norm?

(4/n) HUMA also has methodological disadvantages: CS students all speak a coherent foundational language of algorithms, time/space efficiency, software craftsmanship, not to mention programming languages.

In HUMA? What we share is surely important, but...

(5/n) ...it largely fails to be articulable and leverageable for community-building among, say, sophomores.

Grad students eventually learn to speak enough of our discipline's varied critical and methodological dialects, but that's far too late.

(6/n) To return to the CS+HUMA majors unaware of HDS:

They have two communities vying for their scarce time, and it's something less than a fair fight. So at places where HDS is HUMA-based, CS+HUMA majors can end up too under-engaged by HUMA to learn of and get involved in HDS.

(7/n) HUMA depts are more than capable of cultivating wonderfully engaging communities, and they oftentimes succeed in doing so to truly magical effect. But the structural advantages CS enjoys are so great that even deeply committed HUMA depts sometimes struggle

(n/n) I loved my undergrad English dept, and I loved being an English major. Spiritually, I'm English major. But the CS dept made it so easy to find community and belonging. English had my heart, but CS had my time and my presence.

FWIW, the English Department at SUNY Buffalo was an extraordinary place when I was there in the mid-1970s, and was able to create a strong sense of community within the department for various reasons, including graduate student participation in department governance and several special programs within the department representing disciplinary specializations (philosophy, psychology, society). Still, my best experience was in the computational linguistics research group David Hays ran in the Linguistics Department. I suspect, however, that that had as much to do with Hays himself as with the institutional culture of linguistics and computation.

Friday, December 4, 2020

"My Favorite Things", from Broadway hit to jazz classic [in a revolutionary new mode]

This is an excellent video that follows "My Favorite Things" from its origins in a mid-century Broadway musical, The Sound of Music, to a hit album by John Coltrane. In the middle of the video Adam Neely explains how Coltrane was able to exploit the song's unusual form – AAA'B rather than the standard AABA – to his own musical ends, making it a classic of modal jazz.

Incidentally, this video makes an interesting contrast with the hyperharmonic virtuosity Jacob Collier displays in his treatment of "Twinkle, Twinkle, Little Star" and "The Christmas Song."

Jacob Collier working out an arrangement of "Twinkle, Twinkle, Little Star"

This is a long video, almost three and a half hours, but it's also fascinating on three counts: 1) you see a skilled musician tinker around, refining an arrangement, 2) you get insight into Collier's harmonic imagination, and 3) in particular, you get to hear his use of microtones. Bonus: Listen to Collier's bass lines.

Tuesday, December 1, 2020

Medieval depictions of the moon

Sunday, November 29, 2020

Will Jacob Collier win the Grammy for Best Album? [Oh those kids]

I surely don’t know. He’s already won four Grammies, all for arranging (Wikipedia).

Here’s what he has to say about the nomination, from his Facebook page:

The last 24 hours have turned my world upside down. To find myself Grammy-nominated for Album Of The Year is nearly impossible to wrap my head around...an unthinkable, unspeakable, epic honour. Three years ago, when I set about dreaming up a self-produced quadruple album, I barely imagined it would be possible to create, let alone that it would carry me into such unfathomable territory as this! To be considered alongside such legendary nominees for this award is a bizarre and tremendous privilege. I am forever grateful to Mahalia Music, Ty Dolla $ign, Daniel Caesar, Jessie Reyez, Kiana Ledé, T-Pain, Kimbra, Tank and the Bangas, Tori Kelly & Rapsody for bringing such magic to the album... to the wonderful Ben Bloomberg, Emily Lazar, and Chris Allgood for helping me shape it sonically... and to all at the Recording Academy who voted for me. This is going to take some time to fathom... but in the meantime, I’m sending so much love to you all from here in London, and looking forward to wherever the path may lead us.

I’m just barely familiar with his music. I’ve seen perhaps a half-dozen to a dozen of his YouTube clips – he got his start on YouTube in 2012. They’re interesting and impressive, and I’ll be watching more. I note that Herbie "Watermelon Man" Hancock thinks he’s a genius and that Quincy "Thriller" Jones has signed on as his manager. I’ve read, and have heard from a friend, that he gives an impressive live show. No doubt.

Still....

Here’s his most recent video, an arrangement of Mel Tormé’s “The Christmas Song” (“Chestnuts Roasting on an Open Fire”), a song I know fairly well and like a lot:

If you’re interested in the musical particulars, here’s a transcription by Jason Fisher:

Impressive? Surely. Tremendous knowledge of music, great skill in execution, and mastery of technology. Very pretty. But compelling music? I’m not sure. Do I like it? Yes.

As I said in an email to a friend:

I’ve listened to between a half-dozen and a dozen of his videos. Impressive stuff. But they all seem a bit the same. When you throw all that STUFF at a tune, what emerges is mostly the stuff and all the tunes become racks on which to hang the stuff.

But he’s still young. Not even 30. Maybe he’ll pile up more and more stuff and then figure out how to get rid of most of it. Then we’ll see. I figure I’ll keep tabs on him but I’m not going to buy his albums, not yet. I’d like to hear what he can do with a simple blues. And since he’s a vocalist, I want to hear what he can sang, if you get my drift.

* * * * *

Collier explains harmony on five levels, from a seven-year-old to Herbie Hancock:

Composer David Bruce comments on Collier's microtonal virtuosity:

* * * * *

An interesting article about his background: “I’m the eldest of three children – Sophie and Ella are my two younger sisters and they are amazing, we sing Bach chorales together as family – it’s just so much fun.”

Saturday, November 28, 2020

The dissolution of a comprehensive epistemic regime in America and the world [the center doesn't hold]

David Brooks has an interesting opinion piece, The Rotting of the Republican Mind, NYTimes, Nov. 26, 2020. He declines to blame the internet, for "Why would the internet have corrupted Republicans so much more than Democrats, the global right more than the global left?" He goes on to note

... a remarkable essay that Jonathan Rauch wrote for National Affairs in 2018 called “The Constitution of Knowledge.” Rauch pointed out that every society has an epistemic regime, a marketplace of ideas where people collectively hammer out what’s real. In democratic, nontheocratic societies, this regime is a decentralized ecosystem of academics, clergy members, teachers, journalists and others who disagree about a lot but agree on a shared system of rules for weighing evidence and building knowledge.

This ecosystem, Rauch wrote, operates as a funnel. It allows a wide volume of ideas to get floated, but only a narrow group of ideas survive collective scrutiny. “We let alt-truth talk,” Rauch said, “but we don’t let it write textbooks, receive tenure, bypass peer review, set the research agenda, dominate the front pages, give expert testimony or dictate the flow of public dollars.”

That's the core of what I find interesting, simply that such an epistemic regime exists. He'll go on to argue that it has broken down. Note that, in acknowledging this regime, Brooks is deep in "the social construction of reality" territory.

Over the past decades the information age has created a lot more people who make their living working with ideas, who are professional members of this epistemic process. The information economy has increasingly rewarded them with money and status. It has increasingly concentrated them in ever more prosperous metro areas.

While these cities have been prospering, places where fewer people have college degrees have been spiraling down: flatter incomes, decimated families, dissolved communities. In 1972, people without college degrees were nearly as happy as those with college degrees. Now those without a degree are far more unhappy about their lives.

People need a secure order to feel safe. Deprived of that, people legitimately feel cynicism and distrust, alienation and anomie. This precarity has created, in nation after nation, intense populist backlashes against the highly educated folks who have migrated to the cities and accrued significant economic, cultural and political power.

And so, Brooks argues, new regimes arise, ones that oppose the dominant regime and provide alternative stories: "Paradoxically, conspiracy theories have become the most effective community bonding mechanisms of the 21st century."

Under Trump, the Republican identity is defined not by a set of policy beliefs but by a paranoid mind-set. He and his media allies simply ignore the rules of the epistemic regime and have set up a rival trolling regime. The internet is an ideal medium for untested information to get around traditional gatekeepers, but it is an accelerant of the paranoia, not its source. Distrust and precarity, caused by economic, cultural and spiritual threat, are the source.

Bergen Arches, double view

Tribalism in the current AI wars, a tweet stream

The rest of the tweet stream:

The intensity of the #AI battles varies depending on many factors.

They can boil around concrete examples of algorithms.

They even can explode when some concrete people either second or criticize them. It's either white or black, it's difficult to find gray tones in between. 2/

In what follows, I'll be adding some examples of what I've been observing in #AI discussions in recent years. 3/

Some of the arguments are highly biased, extremely controversial, and tremendously speculative of what #AI "could be" or "will be doing" vs. what it "can actually do."

That ignites the #hype. 4/

Other arguments are well-intended in principle, but they are also ill-formulated or not justified well.

Others are very constructive, but not taken as such by the supposed opponents.

Some are full of hope that we will construct better #AI systems in the future. 5/

See, for example:

When GPT-3 fails or doesn't deliver the correct answer: "Oh, well, that one was not correct."

When it does: "OMG, brilliant! Look at this, it's incredible! Just amazingly awesome! That's #AGI or on the path to it! Long live GPT-3!" 6/

There is a hidden fear to criticize anything related to #AI in general and GPT-3 in particular because "what could 'the cooler guys' think of me if I do it! So better to not say anything about an arguably evident truth because I could risk not being considered 'cool' anymore." 7/

Or the sick propensity for rejecting/downplaying anything that resembles symbolic #AI.

It's repulsive the level of arrogance with which some folks express themselves, elevating sub-symbolic #AI to the highest realms or "only" path on the way to "truly" intelligent artefacts. 8/

I've seen huge inconsistencies when people analyze what #AI, say GPT-3, actually cannot do vs. when they imagine what it "may probably be doing behind the scenes", "attributed behaviours," what it "could supposedly be happening."

Magic bullet powers.

Anthropomorphism, too. 9/

Thursday, November 26, 2020

Population, pandemics, Malthus, and progress

Down in the Arches

Has European politics been asleep at the wheel since WWII?

Leopold Aschenbrenner, Europe's Political Stupor, For Our Posterity, 23 Nov. 2020:

After WWII, the political was banished from the European Continent. It had caused too much harm in European hands. Lively debate was subdued, and technocratic administrators took charge. Europeans were left to project their fantasies of a real political debate on America. And so a cross-Atlantic homogeneity has taken root, with the American Left’s cultural dominance in the U.S. extending to Europe.

But a homogenous West means a stagnant West. As the ideals of classical liberalism are once-again being challenged, we need new ideas and a diversity of approaches to reinvigorate and reinvent liberalism. It might be time to reassert the political in Europe and wake the Continent from its stupor.

Consider the case of Germany:

What is the cause of this extraordinary European obsession with American politics? I think it has to do with an underlying, perhaps subconscious, yearning for democracy—not in the nominal sense of having elections, but in the more visceral sense, the sense that the body politic’s destiny lies in the citizen’s hands.

On the surface, German conventional wisdom decries the political divisions in the U.S.; it trumpets the supposed moral superiority of the German way over the American health care system or American foreign policy; it holds German democracy to be infinitely superior to American democracy (which, if you believe German media coverage, is on the verge of collapse and paralleled only by the Weimar Republic in 1933). But what this arrogance masks—and perhaps is deliberately intended to obscure—is the underlying reality of European “politics”: namely, that it is bereft of politics.

For the German voter has basically no say over his country’s fate. Sure, he may cast a vote in an election for parliament. But in the end, the same centrist parties seem to hold a majority in parliament, the same centrist parties form a coalition government, and the same party leaders remain in charge, making policy mostly through backroom deals rubber-stamped by the parliament. Besides relatively minor policy tweaks, the elections don’t seem to matter much.

And for all the German media’s handwringing about a “peaceful transfer of power” in the U.S., most Germans under, say, 30, have never witnessed a transfer of power in Germany! It’s always been Merkel. And really, the guy before her—even though he was from the opposing political camp—wasn’t all that distinguishable. [...]

The contrast to the recent American presidential elections could not be starker. There was a crystal-clear choice offered to voters. And the election was ultimately decided by a fraction of a percent. Every vote really mattered. Voters could reasonably believe that the course of world history was in their hands.

That the citizens had this real choice is the other side of the often-decried political division. Yes, a wide-open, lively politics can yield someone like Trump—but it can also yield someone like Obama. Someone like him, with a father from Kenya and promising hope and change, would likely have no chance of rising the ranks of German politics.

 The price of peace?

Perhaps, then, the Western monoculture is the price we pay for peace. This is worth taking seriously. But the Germany of today is not the Germany of the early 20th century; the Europe of today is not the Europe of the early 20th century. The Continent has been reshaped along liberal lines. It is now a stalwart of the ideals of liberty and peaceful coexistence.

The banishment of the political was intended to subdue the impulses of nationalism and demagoguery. But if the European mainstream continues to deny the citizenry a true democratic debate, that may well pave the way for an authoritarian strongman who promises the citizens renewed control of their nation’s destiny. We already see inklings of this in Poland, Hungary, France, the Brexit vote to “take back control,” and a resurgent far-right in Germany that is blasting open the previously narrow confines of political debate. The Continent is ripe for awakening. If liberalism does not lead this charge, illiberal authoritarianism will. (German politics in particular is open for disruption, in my opinion, a subject which I hope to return to in a later post.)

The end of physics as we have come to know it? [And the new physics?]

 Posted at Not Even Wrong, Nov. 24, 2020:

In a remarkable article entitled Contemplating the End of Physics posted today at Quanta magazine, Robbert Dijkgraaf (the director of the IAS) more or less announces the arrival of the scenario that John Horgan predicted for physics back in 1996. Horgan argued that physics was reaching the end of its ability to progress by finding new fundamental laws. Research trying to find new fundamental constituents of the universe and new laws governing them was destined to reach an endpoint where no more progress was possible. This is pretty much how Dijkgraaf now sees the field going forward:

Confronted with the endless number of physical systems we could fabricate out of the currently known fundamental pieces of the universe, I begin to imagine an upside-down view of physics. Instead of studying a natural phenomenon, and subsequently discovering a law of nature, one could first design a new law and then reverse engineer a system that actually displays the phenomena described by the law. For example, physics has moved far beyond the simple phases of matter of high school courses — solid, liquid, gas. Many potential “exotic” phases, made possible by the bizarre consequences of quantum mechanics, have been cataloged in theoretical explorations, and we can now start realizing these possibilities in the lab with specially designed materials.

All of this is part of a much larger shift in the very scope of science, from studying what is to what could be. In the 20th century, scientists sought out the building blocks of reality: the molecules, atoms and elementary particles out of which all matter is made; the cells, proteins and genes that make life possible; the bits, algorithms and networks that form the foundation of information and intelligence, both human and artificial. This century, instead, we will begin to explore all there is to be made with these building blocks.

In brief, as far as physics goes, elementary particle physics is over, from now on it’s pretty much just going to be condensed matter physics, where there at least is an infinity of potential effective field theory models to play with.

Be sure to read the comments for some push-back on Dijkgraaf.

Sunday, November 22, 2020

Tool innovation and childhood


From the article, Toddlers, Tools, and Tech: The Cognitive Ontogenesis of Innovation, Trends in Cognitive Sciences, 19 November 2020, by Bruce Rawlings and Cristine H. Legare, https://doi.org/10.1016/j.tics.2020.10.006:

Highlights: Human culture is unparalleled in technological complexity, yet most children fail simple tool innovation challenges.

We explain how multiple cognitive mechanisms, including causal reasoning, problem solving, creativity, executive functions, and social learning work in concert to scaffold the development of tool innovation over childhood.

We describe the role these mechanisms play in three core steps of tool innovation: recognizing the problem, generating solutions, and the social transmission of innovations.

Using commonly used measures of children’s tool innovation as examples, we detail the role each of these mechanisms plays in the development of tool innovation.

We show how understanding the cognitive ontogeny of innovation will help us understand cognitive and cultural evolution.

Introduction: The development of tool innovation presents a paradox. How do humans have such diverse and complex technology, ranging from smartphones to aircraft, and yet young children find even simple tool innovation challenges, such as fashioning a hook to retrieve a basket from a tube, remarkably difficult? We propose that the solution to this paradox is the cognitive ontogenesis of tool innovation. Using a common measure of children’s tool innovation, we describe how multiple cognitive mechanisms work in concert at each step of its process: recognizing the problem, generating appropriate solutions, and the social transmission of innovations. We discuss what the ontogeny of this skill tells us about cognitive and cultural evolution and provide recommendations for future research.

Retreating chairs

Wednesday, November 18, 2020

Trump, reality-free politics, and the reinvention of America [?]

Bruno Maçães, How Trump Almost Broke the Bounds of Reality, NYTimes, Nov. 12, 2020.

What Mr. Trump promised was the power to create imaginary worlds and the freedom to unleash a selfish and extravagant fantasy life, free of the constraints of political correctness or even good manners, the limits imposed by climate change and the international rules tying America to the ground. This extreme form of freedom — call it hyperfreedom — appealed to Greenwich, Conn., financiers no less than to West Virginia coal miners. It was also, as we found out in the election, attractive to some minorities.

In the traditional way to think about freedom, we want to limit or even eliminate obstacles to individual choice, but ultimately we must deal with reality. Mr. Trump’s example is to take it an extra step: Why not be free from reality as well? Indeed, this may be the ultimate goal of contemporary America: a society that is pure fantasy life, free from reality.

What Joe Biden seemed to understand before everyone else was that the fantasy was about to collapse, and voters weren’t ready for an alternative liberal fiction. The main binary in American politics now may not be between left and right, but between fiction and reality. At some point, fictions must be revealed as no more than fictions — and they must be switched off. [...]

In this view, Mr. Biden is the kill switch. He promised to remove Mr. Trump and switch the channel to something less risky.

After the election, a verdict is being widely shared: Mr. Trump may leave, but Trumpism is here to stay. This may be true, but it won’t be in the way people think.

What survived the election was not Trumpism as a policy platform but the fantasy politics of the last four years. Those are as powerful and addictive as ever, but they will look very different once the current executive producer has left the job.

The return to reality is but one stage in developing new fantasies. It is a way to wipe clean the canvas before departing again in search of new adventures. The search could well be resumed on the left, where there are also many powerful instincts to fight against the limits imposed by reality.

Compositionality [language notes]

Monday, November 16, 2020

35 years of Moore's Law [chip design]

Wednesday, November 11, 2020

Sky, leaves, heron, reflections

The Trump administration's refusal to recognize the vote is an attack on democracy

Writing in Lawfare, Nov. 11, 2020, Benjamin Wittes asks, How Hard is it to Overturn an American Election? He explains that, while the vote count seems secure and definitive, the Trump administration's refusal to concede the result is dangerous. It should not be dismissed as mere sour grapes. It is an attempt to undermine the legitimacy of the democratic process and, ultimately, of the government.

And so it has come to this: the president of the United States is trying to overturn the results of a national election he unambiguously lost with a combination of petulant whining, spiteful and flailing executive action, and magic.

No, it’s not ultimately going to work, at least not if working is defined as allowing President Trump to maintain power in the face of expressed voter will.

But it is working better than I would have believed possible: in undermining confidence in American democratic processes, in damaging President-elect Joe Biden’s ability to govern in the short term, and in raising questions in the minds of the faithful as to whether Trump’s defeat was real.

Wittes concludes:

In short, the harm here is almost certainly not the threat to the fact of the transition of power. Every day that goes by, more votes come in, and states get closer to certification of results that have their own logic and momentum and will lead to Biden’s taking the oath of office on January 20.

The harm here, rather, has several dimensions.

First, it is a harm to the orderly transition of power. Merely raising the spectre of not honoring the results of an election, merely inducing democratic anxiety such that as serious-minded a person as Bill Kristol could write a piece like the one quoted above, is a democratic harm. Denying information to the Biden transition makes it harder to govern coming in. Conveying uncertainty to foreign actors is dangerous; it invites misunderstanding, and misunderstanding can be deadly.

Second, like so many aspects of Trump’s presidency, Trump’s response to the election has unsettled public expectations of what it means to lose a presidential election in the United States, what a patriotic transition looks like and what an outgoing president owes to his successor—and to the public. This is that old, pesky issue of norms. And once again, Trump is showing that presidential behaviors taken for granted are actually voluntary acts on the part of successive holders of the office—things that presidents do because they’re the things presidents have always done. [...]

Third, the president’s behavior will undermine trust among many people in the integrity of the election. It already has. An astonishing 70 percent of Republicans polled by Morning Consult report not believing the election was free and fair. Sustained campaigns to undermine trust run by entire political movements tend to work. And Biden will suffer from a lost perception of legitimacy among a major segment of the electorate as a result of this one.

Finally, fourth, there’s the chance that I’m wrong that Biden’s prevailing in the election’s aftermath—that the automatic processes I have described are just a little bit less automatic than I think they are. There’s the chance that Republicans, having dug themselves into the Trump hole, don’t stop digging when the results are certified, that they don’t quite know how to back down. There’s the chance that state legislatures are a little more aggressively partisan than I imagine, or that a few courts go off the deep end.

There’s a chance, in other words, that things spin out of control.