Tuesday, November 20, 2018

Mind-Culture Coevolution: Major Transitions in the Development of Human Culture and Society

I've now turned this into a PDF file which you can download here: https://www.academia.edu/37815917/Mind-Culture_Coevolution_Major_Transitions_in_the_Development_of_Human_Culture_and_Society

* * * * *
This is revised from the introduction to a website I put up in the old days of web 1.0, all in hand-coded HTML. Where I’ve since uploaded downloadable versions of the documents I’ve used those links in this revised introduction, but you’re welcome to access the online versions from the old introduction. Note: This version supersedes an older post at New Savanna.
Mind and Culture

A central phenomenon of the human presence on earth is that, over the long term, we have gained ever more capacity to understand and manipulate the physical world and, though some would debate this, the human worlds of psyche and society. The major purpose of the theory which the late David Hays and I have developed (and which I continue to develop) is to understand the mental structures and processes underlying that increased capacity. While more conventional students of history and of cultural evolution have much to say about what happened and when and what was influenced by what else, few have much to say about the conceptual and affective mechanisms in which these increased capacities are embedded. That is the story we have been endeavoring to tell.

Our theory is thus about processes in the human mind. Those processes evolve in tandem with culture. They require culture for their support while they enable culture through their capacities. In particular, we believe that the genetic elements of culture are to be found in the external world, in the properties of artifacts and behaviors, not inside human heads. Hays first articulated this idea in his book on the evolution of technology and I have developed it in my papers Culture as an Evolutionary Arena and Culture's Evolutionary Landscape, in my book on music, Beethoven's Anvil: Music in Mind and Culture, and in various posts at New Savanna and one for the National Humanities Center, which I have aggregated into four working papers.
This puts our work at odds with some students of cultural evolution, especially those who identify with memetics, who tend to think of culture's genetic elements as residing in nervous systems.

We have aspired to a system of thought in which the mechanisms of mind and feeling have discernible form and specificity rather than being the airy nothings of philosophical wish and theological hope. We would be happy to see computer simulations of the mechanisms we've been proposing. Unfortunately, neither the computational art nor our thinking has been up to this task. But that, together with the neuropsychologist's workbench, is the arena in which these matters must eventually find representation, investigation, and, a long way down the line, resolution. The point is that, however vague our ideas about mechanisms currently may be, it is our conviction that the phenomenon under investigation, culture and its implementation in the human brain, is not vague and formless, nor is it any longer beyond our ken.

For a glossary of terms, see the page Cultural Evolution Terms.

Major Transitions

The story we tell is one of cultural paradigms existing at four levels of sophistication, which we call ranks. In the terminology of current evolutionary biology, these ranks represent major transitions in cultural life. Rank 1 paradigms emerged when the first humans appeared on the savannas of Africa speaking language as we currently know it. Those paradigms structured the lives of the primitive societies which emerged perhaps 50,000 to 100,000 years ago. Around 5,000 to 10,000 years ago Rank 2 paradigms emerged in relatively large, stable human societies with people subsisting on systematic agriculture, living in walled cities, and reading written texts. Rank 3 paradigms first emerged in Europe during the Renaissance and gave European cultures the capacity to dominate, in a sense to create, world history over the last 500 years. This century has begun to see the emergence of Rank 4 paradigms.

How is Facebook going to regulate hate speech on its platform? An independent Supreme Court of Facebook?

Mark Zuckerberg has suggested that an independent oversight body should "determine the boundaries of acceptable speech on the platform". This raises a host of issues, most centrally:
What standards, past decisions and values will it consider when evaluating, for example, whether a particular post is “hate speech”?

This is not an easy question. Indeed, the difficulty of answering that question seems to be one of the reasons Zuckerberg wanted such an independent body in the first place. In March 2018, Zuckerberg told Recode, “I feel fundamentally uncomfortable sitting here in California at an office, making content policy decisions for people around the world. … [T]hings like where is the line on hate speech? I mean, who chose me to be the person that [decides]?” No doubt his unease with this situation was only furthered when he sparked off controversy by suggesting in a later interview that he didn’t think Holocaust deniers should be removed from Facebook—a perfect example of the difficulty Facebook faces. The U.S. has a famously expansive interpretation of free speech, and the court ruling that the First Amendment protected the right of Nazis to march in Skokie is remembered as one of the “truly great victories” in American legal history. By contrast, Holocaust denial is a crime in Germany. Putting aside the wisdom of either position, how should Facebook—a global platform connecting over two billion monthly users—respect conflicting standards of free speech, of which the example of Holocaust denial is only one?

Unfortunately, Zuckerberg’s Nov. 15 Facebook post suggests he hasn’t given this issue enough attention. His post itself suggests several, sometimes contradictory, options. When he writes of the forthcoming independent body, he says, “How do we ensure their independence from Facebook, but also their commitment to the principles they must uphold?”—implying that the values in question are Facebook’s. These are embodied in the company’s Community Standards—which, along with its internal guidelines, are the rules that determine what content is allowed on the platform and which the 30,000 content reviewers use to make individual calls. Given that these are the rules the first-instance decision-maker will be applying, it makes sense that the tribunal should also be guided by them. This is consistent with Facebook’s goal that the standards “apply around the world to all types of content.”

But in his post, Zuckerberg also notes that “services must respect local content laws.” So will the Supreme Court of Facebook be charged with interpreting this local law? In deciding whether a post was justifiably taken down, will it interpret Thailand’s Lèse-Majesté laws prohibiting criticism of the Thai monarchy? Will it try and interpret the sometimes differing decisions of German regional courts on what is hate speech under German law?

Zuckerberg also suggests that “it's important for society to agree on how to reduce [harmful content] to a minimum—and where the lines should be drawn between free expression and safety.” If it’s society that decides the lines for free expression, how will the independent body determine what society’s views are? Will it take polls? If so, will those polls be national, regional or global? Will Facebook take into consideration national voting ages? Furthermore, doesn’t leaving the decisions to “society” risk undermining protection of minorities?

These options by no means exhaust the possibilities raised by Zuckerberg’s proposal in his post.
Though Zuckerberg appears to be seriously pursuing the idea, currently his conception of the independent body is more soundbite than substance. When he says that the SCOF [Supreme Court of Facebook] will “ultimately make the final judgment call on what should be acceptable speech in a community that reflects the social norms and values of people all around the world,” he sets an impossible goal. There is no homogenous global community whose norms can be reflected in the decisions of a single body deciding contentious issues. But that doesn’t mean the proposed body cannot be an important development in online governance, creating a venue for appeal and redress, transparency and dialogue, and through which the idea of free speech in the online global community develops a greater substantive meaning than simply “whatever the platform says it is.”

How the independent body is set up will determine whether it furthers or hinders rights to freedom of expression and due process. There is a rich literature in comparative law showing that decisions of institutional design can have significant impacts not only on outcomes but the entire stability and legitimacy of a governance structure.

Take me to your leader


Even modest musical training enhances cognitive function, especially executive ability

Most of those enhanced abilities were limited to what they referred to as "music experts"—people who started training early in life, and kept at it for at least a decade. But one very important skillset, "executive functioning," was also bolstered for lightly trained amateur players.

This suggests that even limited training and practice can provide significant cognitive benefits.

In recent years, many studies have concluded that musical training enhances brain function. The goal of this new research was to confirm that link using the National Institutes of Health's Toolbox Cognition Battery, a standardized set of tests that measure the key cognitive functions that together constitute fluid intelligence.

These include focus, processing speed, working memory (the ability to temporarily retain information and use it to learn, reason, or make informed decisions), and executive function (the ability to plan, organize, and accomplish goals).

The participants were 72 college undergraduates, who were grouped into three categories: Musical experts (people who began formal training at age 10 or younger, and kept up their practice for at least a decade); musical amateurs (those with at least one year of musical training); and non-musicians.

Combining the results of all the tests, "musicians with extensive experience scored significantly higher than non-musicians and less-trained musicians," the researchers write in the journal Psychology of Music. Specifically, they did better on four of the five cognitive skills that the tests measured.

These included attention ("ensemble performance requires the ability to focus on one's own part without being distracted by other parts," the researchers note); working memory (presumably strengthened by the process of memorizing music); and processing speed (which is enhanced by learning to react rapidly to the demands of the music, as well as to those of collaborators).

The results of the executive-function test, which involved rapidly sorting pictures by shape and color, were arguably the most intriguing, in that modestly trained musicians performed significantly better than non-musicians (although not as well as highly trained musicians).

If that finding is confirmed using a larger sample, "then as a society, we should be interested in universal musical education, perhaps starting in elementary and pre-school-aged children," the researchers argue.

Monday, November 19, 2018

TALENT SEARCH: Tyler Cowen on the value of sole proprietor pop-up philanthropic shops [plus a widely shared blind spot in his thinking and a plug for a NASA administrator]

A week or so ago Tyler Cowen ran up a post on his philanthropic method, The philosophy and practicality of Emergent Ventures. He notes that traditional philanthropies have large staffs “which means relatively conservative, consensus-oriented proposals emerge at the end of the process.” Moreover “the high fixed costs of processing any request discriminate against very small proposals” and such foundations tend to become “captured by their staffs”, who tend to be treated as valued proxies for the foundation’s audience and thus further increase the conservative insularity of the decision process.

A sole proprietor pop-up philanthropic shop

All of which makes sense to me. In contrast, his approach in Emergent Ventures is quite different. He has no staff. Though he may seek advice from others, he makes all the decisions. The process is quick and cheap and the “arrangement also can promise donors 100% transmission of their money to recipients, or close to that.”

And so:
The solo evaluator — if he or she has the right skills of temperament and judgment — can take risks with the proposals, unencumbered by the need to cover fixed costs and keep “the foundation” up and running. Think of it as a “pop-up foundation,” akin to a pop-up restaurant, and you know who is the chef in the kitchen. It is analogous to a Singaporean food stall, namely with low fixed costs, small staff, and the chef’s ability to impose his or her own vision on the food.
Once a fixed sum of money is given away, and the mission of the project (beneficial social change) has been furthered, “the foundation” goes away. No one is laid off. Rather than crying over a vanquished institutional empire and laid off friends/co-workers, the solo evaluator in fact has a chance to get back to personally profitable work. It was “lean and mean” all along, except it wasn’t mean.
What’s not to like?

He goes on to suggest: “In my view, at least two percent of philanthropy should be run this way, and right now in the foundation world it is about zero percent.” He adds: “The ideal scaling is that other, competing ‘chefs’ set up their own pop-up foundations.” YES to all of this.

In particular, what I like about this last suggestion is that it speaks to a blind spot in Cowen’s perception of what he’s up to, a perception that seems almost universally shared by people in the philanthropy business. What is that blind spot? Simple, that what they’re looking for is an attribute of individuals.

Talent, whatever that is, may well be an attribute of individuals. But, to the extent that Cowen is trying to increase innovation by identifying individuals whose work will be widely valued and used, he is in fact looking for a GOOD FIT between individual talent and social need and capacity. Let me repeat that in slightly different terms. Cowen is looking for individuals with a talent that has the capacity to fill a socio-cultural need. What’s missing from his formulation is explicit recognition of that fit, of the importance of socio-cultural context in determining whether or not individual talent will flourish.

Many sowers, many seeds

I’ll say a bit more about that later, but first I want to explain why his suggestion of limited-term sole-proprietor pop-up philanthropic shops speaks to that blindness. It’s simple, really. We are in an era of tremendous social and cultural change. It’s pretty clear, at least to many of us, that the future cannot be extrapolated from the past. Something new and different is required. But just what that is, just what will work, no one really knows, though many have opinions. In particular, we don’t know what potentials are latent in the world today.

In that situation it makes sense to sow many seeds widely and quickly. Who should do the sowing? Talented people who are in touch with other talented people. Each of these people will have their own sense of the needs and potentials of the current cultural moment, each will have their own vision about the proper fit between talent and cultural opportunity. But don’t give any one of them too much philanthropic capacity. By endowing many talented people with limited philanthropic capacity you guarantee the placement of many bets over a wide range of future possibilities and potentialities.
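The portfolio logic here can be made concrete with a toy simulation. Every number in it is invented for illustration; the point is only that, holding per-project odds fixed, many small bets are far more likely to produce at least one hit than a single big one.

```python
import random

random.seed(42)

def chance_of_a_hit(n_grants, p_success=0.02, trials=10000):
    """Estimate the probability that at least one of n_grants
    independent long-shot projects succeeds."""
    hits = 0
    for _ in range(trials):
        if any(random.random() < p_success for _ in range(n_grants)):
            hits += 1
    return hits / trials

# One big bet vs. fifty small ones at the same per-project odds.
print(chance_of_a_hit(1))    # about 0.02
print(chance_of_a_hit(50))   # about 0.64, i.e. 1 - 0.98**50
```

With fifty sowers rather than one, some seed lands on fertile ground roughly two times in three, rather than one time in fifty.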

Trees on trees



How am I doing at regulating my action at Academia.edu?

On Oct. 31 I posted this chart in a post that was otherwise about Donald Trump and his tweeting activity:

Academia 10-31-18 10AM

The chart depicts action on my page at Academia.edu, my main repository of articles and working papers. The green (upper) line shows how many paper views I got each day while the brown (lower) line shows downloads.

The point of the chart is that there are two periods of markedly increased activity, activity caused by my deliberate and conscious action. I then asked: “What are the chances that I can keep it going? Off hand, not good. I just don’t have that much of what seems to be the right material to keep it going.” That is, I knew what I did to drive the numbers up but I wasn’t prepared to keep doing that.

This chart depicts my action as of November 16 at 10:32 PM:

Aca 11-16-18 1032p

That certainly looks like I’ve managed to keep the action up for roughly the past month. To be sure, the action is spiky but it does appear that, on average, interest is up.
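One crude way to check the "on average, interest is up" impression, rather than eyeballing spikes, is to smooth the daily counts with a trailing average. This is just a sketch; the view counts below are hypothetical, not my actual Academia.edu numbers.

```python
def moving_average(series, window=7):
    """Smooth a daily count series with a simple trailing window,
    so spiky data can be read as an average level of interest."""
    out = []
    for i in range(len(series)):
        chunk = series[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# Hypothetical daily view counts: quiet baseline, a spike, then a
# somewhat elevated plateau.
views = [20, 25, 18, 22, 90, 85, 30, 40, 55, 60, 50, 45]
print(moving_average(views))
```

If the smoothed line sits above the pre-spike baseline once the spike has passed, average interest really is up and not just spiky.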

Here’s a chart I grabbed at 5:52 this morning, November 19, 2018:

Academia 11-19-18 552a

We’ve got a new spike there at the right edge of the chart. While it’s low in relation to the action for Nov 4 and 5, the high point of the chart (and I forget just what I did to produce that), it’s high in relation to the chart as a whole and, in particular, it’s in the range of the mid-September action that started this roll (and which is now trailing off the left edge of the graph).

I know exactly what I did this time. I sent the following note to the Humanist Discussion Group:
I don’t know how old you are, Jim (if I may), but I’ll be 71 in a few weeks and was publishing on literature and computation in the mid-1970s:

"Cognitive Networks and Literary Semantics", MLN 91: 1976, 952-982. Here I used a computational model to examine the semantic structure of Shakespeare’s sonnet 129. There’s a downloadable version of that article here: https://www.academia.edu/235111/Cognitive_Networks_and_Literary_Semantics

That same year David Hays and I published this, “Computational Linguistics and the Humanist”, Computers and the Humanities, Vol. 10, 1976, pp. 265-274. There we proposed something we called Prospero, a computer system for reading Shakespeare in some interesting way. Alas, the more we learn about both computing and about the mind/brain, the more distant Prospero seems, but still, it’s worth thinking about. You can download that here: https://www.academia.edu/1334653/Computational_Linguistics_and_the_Humanist


Bill Benzon
That recent spike consists mostly of interest in those two papers, which is much stronger on views than actual downloads (as is generally the case).

I have no idea what will happen today. Of course, I do expect the numbers to rise above the floor, after all it’s 6 AM here on the East Coast of the USA, but just how high they’ll go, I still don’t know. Nor will I hazard a prediction about the future. But I’m pretty sure that if I want to keep the numbers up, I’ve got to upload a paper to the “sweet zone” pretty soon, and I don’t have anything that’s quite ready for upload.

We’ll see.

Two Tweets of the Day – group differences vs. how people think about them

First there's this:

Then Geoffrey Miller (evolutionary psychologist) asked him to do it with real data. Vaisey replied with this:

H/t Language Log.

Sunday, November 18, 2018

From the old neighborhood




Innovation, stagnation, and the construction of ideas and conceptual systems

Patrick Collison and Michael Nielsen recently published an article that’s been getting a lot of attention in this neck of the woods, “Science Is Getting Less Bang for Its Buck” (The Atlantic, Nov 16, 2018). The purpose of this post is to set their idea in the context of ideas about cognitive evolution in culture that David Hays and I have developed.

I’ve addressed the issue of stagnation in some previous posts. There is a short post from 2014, “Why has progress slowed down?”, where I talk of Roman numerals and their limitations. More recently there is an appendix to “Notes Toward a Naturalist Cultural History (with a new addendum on the paradoxical stagnation of our era)”. All of this speculation takes place, as I’ve indicated, within the general account of the cultural evolution of successive “ranks” of cognitive systems that David Hays and I developed between the mid-1970s and 1990s. I’ve written a general overview of that work HERE; the fundamental paper is “The Evolution of Cognition” (Journal of Social and Behavioral Structures 13(4), 297-320, 1990).

First I comment on a metaphor Collison and Nielsen introduce, that of geographic exploration. Then I offer a metaphor of my own, that of physical construction. I conclude by extending that metaphor to the construction of conceptual systems.

The metaphor of exploration

In developing their argument they offer a metaphor which I’ve used to a similar end:
Suppose we think of science—the exploration of nature—as similar to the exploration of a new continent. In the early days, little is known. Explorers set out and discover major new features with ease. But gradually they fill in knowledge of the new continent. To make significant discoveries explorers must go to ever-more-remote areas, under ever-more-difficult conditions. Exploration gets harder. In this view, science is a limited frontier, requiring ever more effort to “fill in the map.” One day the map will be near-complete, and science will largely be exhausted. In this view, any increase in the difficulty of discovery is intrinsic to the structure of scientific knowledge itself.
However, I would stop with “Exploration gets harder” and elaborate a bit on what happens once initial exploration is complete: pioneering settlers move in, communities are established, the land becomes more densely settled, and so forth. That’s a matter of detail. The real issue, though, is to move beyond the metaphor and talk directly about ideas and innovation.

I want to build up to that. But for the moment let’s continue with Collison and Nielsen. They continue with this paragraph:
An archetype for this point of view comes from fundamental physics, where many people have been entranced by the search for a “theory of everything,” a theory explaining all the fundamental particles and forces we see in the world. We can only discover such a theory once. And if you think that’s the primary goal of science, then it is indeed a limited frontier.
Are we discovering theories or constructing them? “Discovering” implies that they’re out there independent of us and all we have to do is take the right path, turn the right corner, and there it will be, the theory. “Construction” places the emphasis on our activities, our tools, materials, and concepts, whatever it is we use in constructing theories. But these need not be opposed ideas. To extend the metaphor of exploration, it was impossible for us to discover the geography of the Moon’s far side until we’d constructed the means of investigating it. In this case discovery and construction go hand-in-hand. Without the proper tools, discovery is impossible.

They go on:
But there’s a different point of view, a point of view in which science is an endless frontier, where there are always new phenomena to be discovered, and major new questions to be answered. The possibility of an endless frontier is a consequence of an idea known as emergence. Consider, for example, water. It’s one thing to have equations describing the way a single molecule of water behaves. It’s quite another to understand why rainbows form in the sky, or the crashing of ocean waves, or the origins of the dirty snowballs in space that we call comets. All these are “water,” but at different levels of complexity. Each emerges out of the basic equations describing water, but who would ever have suspected from those equations something so intricate as a rainbow or the crashing of waves?
But how do you get from the equations for a single molecule of water to the equations for the crashing of ocean waves? I’m guessing that the mathematics is by no means self-evident, that quite a bit of construction is necessary. Where do the construction techniques come from?

The metaphor of construction

If you give a competent engineer a set of plans and a pile of materials, she should be able to determine whether or not the device can be built with those materials. Any number of things can be built with a given set of materials, and any given device can be constructed in various ways. But the possibilities are not endless. There must be a match between the materials and the device.

This is obvious enough in the case of material devices, whether they be relatively simple things like axes and clay pots, mechanical devices like a watch or a steam engine, or buildings of all shapes and sizes. I contend that the same is true for ideas of all kinds, but we have but a poor understanding of how ideas are constructed. So let’s continue with the physical world for just a bit.

What do you need to build a skyscraper? Well, of course, there are skyscrapers of all kinds and sizes. But it seems unlikely that we could construct even a small 10-story building out of adobe, or even out of wood – at least wood in its natural state as opposed to the various engineered wood materials that are now being made. And you can’t place a skyscraper just anyplace. The ground must be able to support the weight.

But it’s not just about materials and construction techniques. You also need elevators. They don’t play a role in holding the building up, but they make tall buildings functionally useful. When a building gets beyond six, seven, or eight stories or so, stairs become impractical. Not only does it take too much time to go up and down stairs in a tall building, but climbing stairs is physically challenging.

Twenty years ago I had a room on the 14th floor of a building. There was an emergency that required evacuation. Coming down 14 flights of stairs wasn’t bad but, for some reason, the elevators were unavailable when we were allowed back in. Climbing those 14 flights was a challenge. At the time I was an out-of-shape middle-aged man; had I been in shape the climb wouldn’t have been so bad. Twenty years later I’m 50 pounds heavier and I’m not sure I could do the climb at all, at least not without stopping so often that it would take over an hour. Elevators eliminate that problem, one that simply doesn’t exist for lower buildings.

What other problems do skyscrapers present that don’t exist for lower buildings?

My larger point, though, is that conceptual systems are like these physical systems. They consist of parts of various kinds combined in various ways to perform functions. We just don’t know much about the nature of the parts and how they go together. But we know something.

Mercantilism, arithmetic, logarithms, and the clockwork universe

The Wikipedia tells me that mercantilism “was dominant in modernized parts of Europe from the 16th to the 18th centuries.” Fine. Wikipedia tells us that mercantilism “promotes Government regulation of a nation’s economy for the purpose of augmenting state power at the expense of rival national powers. Mercantilism includes a national economic policy aimed at accumulating monetary reserves through a positive balance of trade, especially of finished goods. Historically, such policies frequently led to war and also motivated colonial expansion.” Again, fine.

What made mercantilism possible? Lots of things I presume. For example:
Mercantilism developed at a time of transition for the European economy. Isolated feudal estates were being replaced by centralized nation-states as the focus of power. Technological changes in shipping and the growth of urban centres led to a rapid increase in international trade. Mercantilism focused on how this trade could best aid the states. Another important change was the introduction of double-entry bookkeeping and modern accounting. This accounting made extremely clear the inflow and outflow of trade, contributing to the close scrutiny given to the balance of trade. Of course, the impact of the discovery of America cannot be ignored... New markets and new mines propelled foreign trade to previously inconceivable volumes, resulting in “the great upward movement in prices” and an increase in “the volume of merchant activity itself”.
There’s a lot of stuff in that one paragraph. I note the importance of double-entry bookkeeping. You can’t run a complex mercantile economy if you can’t keep track of your money.
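The core discipline of double-entry bookkeeping is easy to show in miniature: every transaction posts matching debits and credits, so the books can always be checked for balance. This is a bare sketch, not real accounting software; the account names and amounts are invented.

```python
from collections import defaultdict

class Ledger:
    """Minimal double-entry ledger: each transaction moves value
    between two accounts, recorded as equal and opposite entries."""
    def __init__(self):
        self.accounts = defaultdict(float)

    def post(self, debit_account, credit_account, amount):
        self.accounts[debit_account] += amount
        self.accounts[credit_account] -= amount

    def trial_balance(self):
        # In a consistent double-entry ledger this sum is always zero.
        return sum(self.accounts.values())

books = Ledger()
books.post("inventory", "cash", 300.0)   # buy goods
books.post("cash", "sales", 450.0)       # sell them
print(books.accounts["cash"])            # 150.0: inflow and outflow both tracked
print(books.trial_balance())             # 0.0
```

It is exactly that always-zero trial balance that "made extremely clear the inflow and outflow of trade": an unbalanced book announces its own error.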

I note as well the discovery of America. Would it have been possible to exploit that discovery in a world where all calculation was done using Roman numerals? I suggest that it would have been very difficult.
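To see why, consider that Roman numerals have no place value, so there is no columnar algorithm for arithmetic; in practice a clerk would convert to something positional (an abacus, or Hindu-Arabic numerals), do the work there, and convert back. A sketch, with arbitrary figures:

```python
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n):
    """Convert a positive integer to canonical Roman numerals."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

def from_roman(s):
    """Convert a canonical Roman numeral back to an integer,
    consuming symbols greedily from largest to smallest."""
    n, i = 0, 0
    for value, symbol in ROMAN:
        while s.startswith(symbol, i):
            n += value
            i += len(symbol)
    return n

# Even simple commercial addition forces a detour through place value:
print(to_roman(from_roman("MDCCCXLVIII") + from_roman("CLXIX")))  # MMXVII
```

Positional notation is itself a piece of conceptual construction material; without it, the calculational superstructure of long-distance trade is very hard to build.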


Logarithms, common logarithms.
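Common (base-10) logarithms were the great labor-saving device of early modern calculation: they turn multiplication into addition, which is what made printed log tables so valuable to navigators and merchants. A sketch of the procedure, with the table lookup simulated by rounding to table precision:

```python
import math

def multiply_by_logs(x, y, places=4):
    """Multiply the way a clerk with a four-place table of common
    logarithms would: look up log10 of each factor, add the logs,
    and take the antilog of the sum."""
    lx = round(math.log10(x), places)   # "look up" log of x
    ly = round(math.log10(y), places)   # "look up" log of y
    return 10 ** (lx + ly)              # antilog of the sum

# Within table precision of the exact product 358 * 41.7 = 14928.6:
print(multiply_by_logs(358.0, 41.7))
```

One addition replaces a long multiplication; with a table in hand, the hard part of the computation has been done once, in advance, for everyone.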

If we're to survive and thrive with a population of 10B+ then wizards and prophets must collaborate

Saturday, November 17, 2018

In the center




Critical point dynamics in whole-brain neuronal activity

Ponce-Alvarez et al., Whole-Brain Neuronal Activity Displays Crackling Noise Dynamics, Neuron (2018), https://doi.org/10.1016/j.neuron.2018.10.045.
  • Zebrafish whole-brain activity displays scale-invariant neuronal avalanches 
  • These scale-invariant avalanches are suggestive of critical phenomena 
  • Sensory inputs and self-generated behaviors deviate the dynamics from criticality 
  • Blocking gap junctions disrupts criticality and deteriorates sensory processing

Previous studies suggest that the brain operates at a critical point in which phases of order and disorder coexist, producing emergent patterned dynamics at all scales and optimizing several brain functions. Here, we combined light-sheet microscopy with GCaMP zebrafish larvae to study whole-brain dynamics in vivo at near single-cell resolution. We show that spontaneous activity propagates in the brain’s three-dimensional space, generating scale-invariant neuronal avalanches with time courses and recurrence times that exhibit statistical self-similarity at different magnitude, temporal, and frequency scales. This suggests that the nervous system operates close to a non-equilibrium phase transition, where a large repertoire of spatial, temporal, and interactive modes can be supported. Finally, we show that gap junctions contribute to the maintenance of criticality and that, during interactions with the environment (sensory inputs and self-generated behaviors), the system is transiently displaced to a more ordered regime, conceivably to limit the potential sensory representations and motor outcomes.
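The flavor of "operating at a critical point" can be conveyed with a toy branching process (my illustration of the general idea, not the model used in the paper). At a branching ratio of 1 the process is critical and avalanche sizes become heavy-tailed; below 1, activity fizzles out quickly.

```python
import random

random.seed(0)

def avalanche_size(branching_ratio, max_size=100_000):
    """One avalanche in a toy branching process: each active unit
    spawns up to two descendants, each with probability
    branching_ratio / 2, so the mean offspring count equals the
    branching ratio. The avalanche runs until activity dies out
    (or a size cap is hit)."""
    active, size = 1, 1
    while active and size < max_size:
        children = sum(
            1
            for _ in range(active)
            for _ in range(2)
            if random.random() < branching_ratio / 2
        )
        active = children
        size += children
    return size

subcritical = [avalanche_size(0.5) for _ in range(2000)]
critical = [avalanche_size(1.0) for _ in range(2000)]
print(max(subcritical))  # stays small: activity dies out quickly
print(max(critical))     # heavy tail: occasional huge avalanches
```

Plotting the critical sizes on log-log axes would show the roughly straight line (power law) that is the signature of scale-invariant avalanches reported in the paper.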

What's up in Disney World? For the ultimate "out of this world" vacation...

One of the most anticipated additions to Disney World is not a ride. It’s a hotel — one that will have no windows.

The still-unnamed property, code-named Project Hubble by Disney Imagineering, will simulate what it might be like to sleep on board a luxury “Star Wars” starship as it zooms through the galaxy. Every window will be a video screen offering a “space view.” Guests will be encouraged to dress in “Star Wars” costumes.

The hotel reflects a push by Disney to provide more immersive and personalized experiences. Now even your hotel stay becomes an attraction that is “unique to Disney, that you cannot get down the street,” Mr. Chapek said.

He is applying the same thinking to Disney World transportation. Instead of relying on lumbering buses to get around, for instance, people can now use the Lyft app to hail a polka-dotted S.U.V. called a Minnie Van.

Friday, November 16, 2018

Biological vs. cultural evolution (language and music)

Dan Everett recently posted this tweet:

I responded with the following string of tweets:
On the one hand we have the emergence of language as a phenomenon in biological evolution. That's clear enough. But we also have language change over longish time frames. Can we, should we, think of that as an evolutionary process as well? 1/X

I think so, but, yes, an argument is needed. More than can be put in tweets. But, it's about accounting: What entity is the recipient of, the target of, the evolutionary dynamic? In biological evolution it is, depending on POV, the phenotypic ... 2/X

individual, or (if you are a Dawkinsian) the gene. Dual inheritance theory is about biological evolution, where phenotypic individuals benefit from genetic inheritance and social learning. (Can benefits of social learning be toted up at the level of the gene?). 3/X

Dawkins' idea of the meme is that culture operates in an evolutionary domain where the evolutionary process benefits, not biological phenotypes or genes, but cultural entities he called memes. I think his insight is correct, but the explication of 4/X

meme has been thoroughly and badly botched (due in part to the indefatigable industry of Dan Dennett). I've published an article about music where I attempt to set things straight (more or less). 5/5
Here's that article: “Rhythm Changes” Notes on Some Genetic Elements in Musical Culture, https://www.academia.edu/23287434/_Rhythm_Changes_Notes_on_Some_Genetic_Elements_in_Musical_Culture.
Abstract: An entity known as Rhythm Changes is analyzed as a genetic entity in musical culture. Because it functions to coordinate the activities of musicians who are playing together it can be called a coordinator. It is a complex coordinator in that it is organized on five or six levels, each of which contains coordinators that function in other musical contexts. Musicians do not acquire (that is, learn) such a coordinator through “transfer” from one brain to another. Rather, they learn to construct it from publicly available performance materials. This particular entity is derived from George Gershwin’s tune “I Got Rhythm” and is the harmonic trajectory of that tune. But it only attained independent musical status after about two decades of performances. Being a coordinator is thus not intrinsic to the entity itself, but is rather a function of how it comes to be used in the musical system. Recent argument suggests that biological genes are like this as well.

Friday Fotos: Some varieties of light

Ontology notes: What's a "thing"?

Conceptual ontology has been an interest of mine since graduate school. Many ontologies will start with "thing" at the root of the ontology, which seems logical enough. After all, isn't everything some kind of thing? Here's a note I wrote to Language Log's Mark Liberman a decade or so ago:
Dear Mark Liberman,

A few days ago I happened upon "Language Log" and, in particular, I read some of your snippets on ontology. In one of them you assert:

"There's an interesting question to be asked about why people persist in assuming that the world is generally linnaean -- why mostly-hierarchical ontologies are so stubbornly popular -- in the face of several thousand years of small successes and large failures. I have a theory about this, which this post is too short to contain :-) ... It has to do with evolutionary psychology and the advantage of linnaean ontologies for natural kinds -- that's for another post."

Evpsych aside, there's something peculiar going on. From an informatic point of view, the nice thing about Linnaean structures is that they allow for relatively compact "storage" of information because they allow for inheritance.[1] If you don't know that dogs, cats, rats, and cattle are all mammals, then you've got to store their common features (e.g. general body plan) separately for each. If you know, however, that they are all mammals, then you can store the common features with mammal and allow those features to be inherited by each type of mammal.

But there's something else going on. Language has words for very general classes of things, such as objects, events, attributes, states, and most general of all, things. Whatever else something might be, it is, most assuredly, a thing. So it becomes easy and tempting to start a hierarchy with "thing" at the root and then basic kinds of thing one level down, e.g. "object," "event," "state," and "attribute." Then we might start on the next level and have, say, physical and abstract objects, and physical and abstract events, and physical and abstract . . . And this is beginning to look rather strange.

But what kind of inheritance do we have in this kind of structure? What is it that objects, events, attributes, and states inherit from thing other than thinghood? What is it that physical and abstract objects inherit from objects other than objecthood? Somewhere down there we're going to come to "living thing" and then we can have Linnaean inheritance from there down. But it's not at all clear to me what's going on between that point and "thing" up there at the root of the tree.

It's a most curious business.
My dictionary informs me that it is "Old English, of Germanic origin; related to German Ding. Early senses included ‘meeting’ and ‘matter, concern’ as well as ‘inanimate object’." And beyond that?

It almost feels like a grammatical function or capability has been turned into a noun and given a name. What is a thing? Well, if it can play certain roles in sentences, then it's a thing. As a grammatical category we refer to it as a noun; as an object or phenomenon in the world we refer to it as a thing.
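The storage argument in the letter above (common features stored once with "mammal" and inherited by each kind of mammal) is exactly how class inheritance works in programming languages. A minimal sketch, with illustrative class names and features of my own choosing:

```python
# Shared mammalian features are stored once, on the parent class,
# rather than duplicated for every kind of mammal.
class Mammal:
    body_plan = "four limbs, fur, vertebrate"
    reproduction = "live birth"

class Dog(Mammal):
    sound = "bark"

class Cat(Mammal):
    sound = "meow"

# Dog and Cat inherit the common features without restating them.
print(Dog.body_plan)      # looked up on Mammal
print(Cat.reproduction)   # likewise inherited
```

The puzzle in the letter is what the analogous move buys you at the very top of the tree, where "thing" has nothing to pass down but thinghood.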

* * * * *

[1] Though I don't have the text in front of me at the moment, I recall that somewhere in The Order of Things Foucault talks about how informatic inheritance was one reason for creating Linnaean taxonomic trees.

History of the American Revolution, Japanese style from 1861 [Tweet of the Day]

Thursday, November 15, 2018

Near the remains of the old Van Leer chocolate factory, 10 years ago

Jill Lepore on disruption

Evan Goldstein interviews Jill Lepore about her latest book, These Truths (W.W. Norton), a history of America. Here's some remarks on disruption:
Q. The last chapter of These Truths is titled "America, Disrupted," and it traces the rise of ideas from the tech world, like innovation. You point out that innovation was traditionally seen as something to be wary of.

A. It’s true that the last chapter is about disruptive innovation, but it’s also true that the book starts with the history of writing as a technology. Reading "America, Disrupted" in isolation might seem like I have some beef with Silicon Valley — which may or may not be the case — but reading that chapter after the 15 that come before makes it clear that what I have is a deep and abiding interest in technology and communication.

Innovation as an idea in America is historically a negative thing. Innovation in politics is what is to be condemned: To experiment recklessly with a political arrangement is fatal to our domestic tranquillity. So there’s a lot of anti-innovation language around the founding, especially because Republicanism — Jeffersonianism — is considered excessively innovative. Innovation doesn’t assume its modern sense until the 1930s, and then only in a specialized literature.

Disruption has a totally different history. It’s a way to avoid the word "progress," which, even when it’s secularized, still implies some kind of moral progress. Disruption emerges in the 1990s as progress without any obligation to notions of goodness. And so "disruptive innovation," which became the buzzword of change in every realm in the first years of the 21st century, including higher education, is basically destroying things because we can and because there can be money made doing so. Before the 1990s, something that was disruptive was like the kid in the class throwing chalk. And that’s what disruptive innovation turned out to really mean. A little less disruptive innovation is called for.

Q. Your first big volley on this topic came in The New Yorker in 2014. Does the innovation mind-set continue to hold such sway within higher education?

A. I think there’s quite a bit more caution now than when I wrote that essay. That was the high point of heedlessness, when the big thing to be celebrated was blowing up the newspapers. The reason I wrote the essay, after a great deal of unwillingness, was because The New York Times had produced an internal report that was a brief for how the Times needed to become more like BuzzFeed. I thought it was completely bananas. Institutions that mattered to public culture were being dismantled, and institutions in which how we know what we know can be arbitrated — journalism, the academy — were being destroyed.

Q. You mentioned having been unwilling for a long time to write about disruptive innovation. Why?

A. I was super hesitant because it involved writing about the work of a member of the faculty to which I belong [Clayton M. Christensen], even if the business school is quite a distance intellectually from here.

Also, when I first read all the work, I thought: This is bunk. This doesn’t seem serious enough for me to spend time on. So part of my hesitation was, like, people really buy this stuff? Months passed and then that New York Times report came out and I realized that people buy this so much that the New York Times is remaking itself in the image of this theory.

Fear of death as a driving force in human (cultural) life

Jonathan Jong, Faith and the Fear of Death, The New Atlantis.
The line primus in orbe deos fecit timor — “fear first made gods in the world” — appears in at least two Latin poems in the first century. Earlier it was expressed with great aplomb in Lucretius’s poem On the Nature of Things. For Lucretius, as for many thinkers since, what terrifies us is nature — the fickleness of seed and season, the wrath of storm and sea. At least since Freud, however, the fear of death, or cessation of the self, has been a more common theoretical fascination — “Man’s tomb is the sole birthplace of the gods,” according to Ludwig Feuerbach. I picked up the idea from a group of psychologists working on what they called “terror management theory,” an attempt to explain human behavior in terms of responses to the fear of death. They in turn had picked the idea up from Ernest Becker, an American cultural anthropologist working in the Sixties and early Seventies.

Becker’s book The Denial of Death won the 1974 Pulitzer Prize for General Nonfiction just two months after he died of cancer, aged forty-nine. The book advanced the theory that the knowledge and fear of death is humanity’s central driving force, underlying civilization and all human achievement. According to Becker, we are unique among animals in our awareness of our mortality. This knowledge leads us to construct systems of values — theological, moral, political, cultural, scientific — through which we can deny our finitude. All endeavors within these systems are attempts to obtain immortality, whether literal or symbolic.

The terror management theorists turned Becker’s sweeping analysis into a scientific theory amenable to empirical testing. One experiment in a 1989 study involved twenty-two municipal court judges who were asked to set bail in the case of a hypothetical woman charged with prostitution. The judges were given identical prosecutor’s notes describing the case, but half of the judges, randomly selected, also received instructions to imagine and write about what dying would be like and how these thoughts about death made them feel. The other half were spared any prompted thoughts about mortality. While the judges in the neutral condition set bail at an average of $50, the judges who were asked to contemplate death set bail at $455, over nine times higher. The researchers concluded that this showed that thinking about death made the judges more punitive against someone accused of violating a moral norm, confirming the idea that strengthening moral norms is part of what we do when we are anxious about our finitude.

Since this study, hundreds of further experiments have explored the much broader effects that thinking about death has on our desire to achieve some form of immortality. For example, studies have demonstrated that thinking about death increases our desire to have children and even to name our children after ourselves. It also increases our desire for fame, including the desire to have stars named after us (the astronomical objects, not celebrities). It seems clear from these studies that we want to live on through our offspring and others’ memories of us. This would all be fairly innocuous, except that the vast majority of the research has also shown that thoughts about death can lead us to be more nationalistic, xenophobic, homophobic, ageist, and otherwise prejudiced about those different from us. Confronted with our mortality, we dig our heels in and defend our own communities over and against others.
H/t 3QD.

Wednesday, November 14, 2018

Is there a meaningful technical difference between a language and a dialect (of some language)?

What’s the difference between a language and a dialect? Is there some kind of technical distinction, the way there is between a quasar and a pulsar, or between a rabbit and a hare? Faced with the question, linguists like to repeat the grand old observation of the linguist and Yiddishist Max Weinreich, that “a language is a dialect with an army and a navy.”

But surely the difference is deeper than a snappy aphorism suggests. The very fact that “language” and “dialect” persist as separate concepts implies that linguists can make tidy distinctions for speech varieties worldwide. But in fact, there is no objective difference between the two: Any attempt you make to impose that kind of order on reality falls apart in the face of real evidence.
From his final two paragraphs:
Or, yes, the written dialect will have its words collected in dictionaries. The Oxford English Dictionary does have more words than Archi and Endegen do; the existence of print has allowed English-speakers to curate many of their words instead of letting them come and go with time. But words are only part of what makes human speech: You have to know how to put them together, and knowing how to handle Archi’s words (or Endegen’s) requires its own level of sophistication.

So, what’s the difference between a language and a dialect? In popular usage, a language is written in addition to being spoken, while a dialect is just spoken. But in the scientific sense, the world is buzzing with a cacophony of qualitatively equal “dialects,” often shading into one another like colors (and often mixing, too), all demonstrating how magnificently complicated human speech can be. If either the terms “language” or “dialect” have any objective use, the best anyone can do is to say that there is no such thing as a “language”: Dialects are all there is.
I've not read the whole article (I skipped to the end), but I'm biased in McWhorter's direction. In fact, I believe that all that exists (physically) are individual idiolects. But constant communication among individuals assures that these idiolects are mutually intelligible to a considerable extent, yielding dialects.

Temporal processing in human memory

Qiaoli Huang, Jianrong Jia, Qiming Han, Huan Luo, Fast-backward replay of sequentially memorized items in humans, eLife 2018;7:e35164, DOI: 10.7554/eLife.35164.


Storing temporal sequences of events (i.e., sequence memory) is fundamental to many cognitive functions. However, it is unknown how the sequence order information is maintained and represented in working memory and its behavioral significance, particularly in human subjects. We recorded electroencephalography (EEG) in combination with a temporal response function (TRF) method to dissociate item-specific neuronal reactivations. We demonstrate that serially remembered items are successively reactivated during memory retention. The sequential replay displays two interesting properties compared to the actual sequence. First, the item-by-item reactivation is compressed within a 200 – 400 ms window, suggesting that external events are associated within a plasticity-relevant window to facilitate memory consolidation. Second, the replay is in a temporally reversed order and is strongly related to the recency effect in behavior. This fast-backward replay, previously revealed in rat hippocampus and demonstrated here in human cortical activities, might constitute a general neural mechanism for sequence memory and learning.

eLife digest

Have you ever played the ‘Memory Maze Challenge’ game, or its predecessor from the 1980s, ‘Simon’? Players must memorize a sequence of colored lights, and then reproduce the sequence by tapping the colors on a pad. The sequence becomes longer with each trial, making the task more and more difficult. One wrong response and the game is over.

Storing and retrieving sequences is key to many cognitive processes, from following speech to hitting a tennis ball to recalling what you did last week. Such tasks require memorizing the order in which items occur as well as the items themselves. But how do we hold this information in memory? Huang et al. reveal the answer by using scalp electrodes to record the brain activity of healthy volunteers as they memorize and then recall a sequence.

Memorizing, or encoding, each of the items in the sequence triggered a distinct pattern of brain activity. As the volunteers held the sequence in memory, their brains replayed these activity patterns one after the other. But this replay showed two non-intuitive features. First, it was speeded up relative to the original encoding. In fact, the brain compressed the entire sequence into about 200 to 400 milliseconds. Second, the brain replayed the sequence backwards. The activity pattern corresponding to the last item was replayed first, while that corresponding to the first item was replayed last. This ‘fast-backward’ replay may explain why we tend to recall items at the end of a list better than those in the middle, a phenomenon known as the recency effect.

The results of Huang et al. suggest that when we hold a list of items in memory, the brain does not replay the list in its original form, like an echo. Instead, the brain restructures and reorganizes the list, compressing and reversing it. This process, which is also seen in rodents, helps the brain to incorporate the list of items into existing neuronal networks for memory storage.

UFOs over the Hudson River


Monday, November 12, 2018

Networks and success in art

S. P. Fraiberger et al., Quantifying reputation and success in art, Science 10.1126/science.aau7224 (2018).
Abstract: In areas of human activity where performance is difficult to quantify in an objective fashion, reputation and networks of influence play a key role in determining access to resources and rewards. To understand the role of these factors, we reconstructed the exhibition history of half a million artists, mapping out the coexhibition network that captures the movement of art between institutions. Centrality within this network captured institutional prestige, allowing us to explore the career trajectory of individual artists in terms of access to coveted institutions. Early access to prestigious central institutions offered life-long access to high-prestige venues and reduced dropout rate. By contrast, starting at the network periphery resulted in a high dropout rate, limiting access to central institutions. A Markov model predicts the career trajectory of individual artists and documents the strong path and history dependence of valuation in art.
From the article: 
Fig. 1. Coexhibition network. Force-directed layout of the order τ = ∞ coexhibition network, whose nodes are institutions (galleries, museums). Node size is proportional to each institution’s eigenvector centrality. Nodes are connected if they both exhibited the same artist, with link weights being equal to the number of artists’ coexhibitions. Node colors encode the region in which institutions are located. Links are of the same colors as their end nodes, or gray when end nodes have different colors. For visualization purposes, we only show the 12,238 nodes corresponding to institutions with more than 10 exhibits; we pruned the links by keeping the most statistically significant links (20) (supplementary text S2.2). We implemented community detection on the pruned network (21), identifying 122 communities (supplementary text S2.3). We highlighted five of them, the full community breakdown being shown in fig. S3. We also show the names of the most prestigious institution for each community.
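The eigenvector centrality that sizes the nodes in that figure can be computed by simple power iteration. A minimal sketch on a toy coexhibition network; the adjacency matrix here is invented for illustration, not taken from the paper's data:

```python
import numpy as np

# Toy weighted adjacency matrix: entry [i, j] = number of artists
# co-exhibited by institutions i and j (invented values).
A = np.array([
    [0, 3, 1, 0],
    [3, 0, 2, 1],
    [1, 2, 0, 0],
    [0, 1, 0, 0],
], dtype=float)

# Power iteration: repeatedly apply A and renormalize; the vector
# converges to the leading eigenvector, whose entries are the
# nodes' eigenvector centralities.
v = np.ones(A.shape[0])
for _ in range(100):
    v = A @ v
    v = v / np.linalg.norm(v)

print(v)  # institution 1, the best connected, scores highest
```

The intuition matches the paper's use of it: a node is central to the degree that it is connected to other central nodes, which is why centrality in the coexhibition network serves as a proxy for institutional prestige.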

An old Green Villain mural, GVM021 [#GVM021]




Music Evolving

Eita Nakamura, Kunihiko Kaneko, Statistical Evolutionary Laws in Music Styles, arXiv:1809.05832v1 [physics.soc-ph].
If a cultural feature is transmitted over generations and exposed to stochastic selection when spreading in a population, its evolution may be governed by statistical laws, as in the case of genetic evolution. Music exhibits steady changes of styles over time, with new characteristics developing from traditions. Recent studies have found trends in the evolution of music styles, but little is known about quantitative laws and theories. Here we analyze Western classical music data and find statistical evolutionary laws. For example, distributions of the frequencies of some rare musical events (e.g. dissonant intervals) exhibit steady increase in the mean and standard deviation as well as constancy of their ratio. We then study an evolutionary model where creators learn their data-generation models from past data and generate new data that will be socially selected by evaluators according to novelty and typicality. The model reproduces the observed statistical laws and its predictions are in good agreement with real data. We conclude that some trends in music culture can be formulated as statistical evolutionary laws and explained by the evolutionary model incorporating statistical learning and the novelty-typicality bias.
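A toy version of the creator-evaluator loop the abstract describes might look like the following. The learning rule, the selection rule, and all the numbers are my own simplifications; reproducing the paper's statistical laws (growing mean and standard deviation with a constant ratio) would require their actual model:

```python
import random

random.seed(0)  # reproducible toy run

# One generation: creators estimate the rare-event rate (e.g. of
# dissonant intervals) from the previous generation's pieces, then
# compose new pieces; evaluators keep candidates that balance
# typicality with a slight bias toward novelty.
def generation(pieces, notes_per_piece=200, novelty_bias=1.05):
    # statistical learning: fit the event rate from past data
    rate = sum(pieces) / (len(pieces) * notes_per_piece)
    candidates = [
        sum(random.random() < rate * novelty_bias
            for _ in range(notes_per_piece))
        for _ in range(len(pieces) * 2)
    ]
    # crude typicality selection: keep a middle band, discarding
    # the most atypical candidates at the low extreme
    candidates.sort()
    k = len(pieces) // 2
    return candidates[k:k + len(pieces)]

pieces = [10] * 50  # initial corpus: 10 rare events per piece
for _ in range(20):
    pieces = generation(pieces)

print(sum(pieces) / len(pieces))  # mean drifts upward over generations
```

Even this crude loop shows the qualitative mechanism: statistical learning anchors each generation to tradition, while the novelty bias produces a steady directional drift in the feature's frequency.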

Placebos work [annals of mind and body]

Gary Greenberg, New York Times Magazine, What if the Placebo Effect Isn’t a Trick?, November 7, 2018.
And after a quarter-century of hard work, they have abundant evidence to prove it. Give people a sugar pill, they have shown, and those patients — especially if they have one of the chronic, stress-related conditions that register the strongest placebo effects and if the treatment is delivered by someone in whom they have confidence — will improve. Tell someone a normal milkshake is a diet beverage, and his gut will respond as if the drink were low fat. Take athletes to the top of the Alps, put them on exercise machines and hook them to an oxygen tank, and they will perform better than when they are breathing room air — even if room air is all that’s in the tank. Wake a patient from surgery and tell him you’ve done an arthroscopic repair, and his knee gets better even if all you did was knock him out and put a couple of incisions in his skin. Give a drug a fancy name, and it works better than if you don’t.

You don’t even have to deceive the patients. You can hand a patient with irritable bowel syndrome a sugar pill, identify it as such and tell her that sugar pills are known to be effective when used as placebos, and she will get better, especially if you take the time to deliver that message with warmth and close attention. Depression, back pain, chemotherapy-related malaise, migraine, post-traumatic stress disorder: The list of conditions that respond to placebos — as well as they do to drugs, with some patients — is long and growing.
And it may not be a 'mere' psychological trick:
Aided by functional magnetic resonance imaging (f.M.R.I.) and other precise surveillance techniques, Kaptchuk and his colleagues have begun to elucidate an ensemble of biochemical processes that may finally account for how placebos work and why they are more effective for some people, and some disorders, than others. The molecules, in other words, appear to be emerging. And their emergence may reveal fundamental flaws in the way we understand the body’s healing mechanisms, and the way we evaluate whether more standard medical interventions in those processes work, or don’t. Long a useful foil for medical science, the placebo effect might soon represent a more fundamental challenge to it.
More in the article, which also goes into the history of the placebo effect.

The Beatles' 'White Album' has turned 50

“The Beatles” is as much a concept album as “Sgt. Pepper,” and the concept is, again, right in the title: a top-to-bottom reinvention of the band as pure abstraction, the two discs, like stone tablets, delivering a new order. (“By packaging 30 new songs in a plain white jacket, so sparsely decorated as to suggest censorship,” Richard Goldstein wrote in his New York Times review, “the Beatles ask us to drop our preconceptions about their ‘evolution’ and to hark back.”) The songs progress through a spectral, mystical, and romantic dimension, the soundscape itself becoming fluid and associative. The Beatles’ ability to conjure orchestras and horns and sound effects and choirs out of thin air imbues the tracks with a dream logic. The juxtaposition of order and disorder, of the ragged and the smooth, of the sublime and the mundane, of the meticulously arranged and the carelessly misplayed, provides what the critic John Harris called “the sense of a world moving beyond rational explanation.” The music seemed to absorb the panic and violence of 1968, the “year of the barricades.” As the Sunday Times critic commented, “Musically, there is beauty, horror, surprise, chaos, order; and that is the world, and that is what the Beatles are on about: created by, creating for, their age.”
The end:
As with 1968’s other impenetrable conversation piece, Stanley Kubrick’s “2001: A Space Odyssey,” the White Album skates off the edge of reality and into the abyss with “Revolution 9,” a Fluxus-inspired montage, beginning with a recording engineer testing the studio’s No. 9 input and ending with what Charles Manson admiringly described as “the sounds of the end of the world.” Only when that nightmare is consummated, closing in screams and roaring flame, can the album’s initial globalism return. Accompanied by George Martin’s orchestra, Ringo’s sweet delivery of Lennon’s final lullaby ends with a whispered “good night” to “everybody, everywhere,” the Beatles apparently floating over the Earth like Kubrick’s “2001” Star Child returned from his journey “beyond the infinite,” or like Apollo 8’s Frank Borman, who, six weeks later, read a Christmas prayer from orbit, prompting one grateful woman to send NASA a telegram to tell Borman that he “saved 1968.”

The remixing and remastering of this new anniversary edition illustrate how constrained the Beatles were by the nineteen-sixties technology that limited the recordings to eight or even four tracks, which had to be “mixed down,” losing clarity each time, in order to add more music. Rebuilt digitally, the album’s enormous soundscape is finally complete: the progressive generational muddiness is gone, revealing the dry snap of Ringo’s snare and Harrison’s full-throated gentle weeping and the thunderous effervescence of McCartney’s bass runs and Lennon’s halting intakes of breath. We can fully hear, at last, what they were trying to do. The formal, holistic creation is complete—unavoidable razor-blade splices and editing errors (exposed by earlier CD editions) are now gone, replaced by the smooth, clean bite of the digital transfers, a final “stripping away” that elevates the material to the Platonic form for which it was conceived.

Bergen Arches @3QD

I’ve got another article at 3 Quarks Daily: Bergen Arches: Living for the City.

This arched door graced the Transportation Building at the 1893 Chicago World’s Fair. The building was designed by Louis Sullivan, who also designed one of the world’s first skyscrapers, the Wainwright Building in St. Louis, Missouri in 1891.

A little over a decade later the Erie Railroad began blasting away at the trap rock of Bergen Hill in New Jersey, giving us the Erie Cut, known as the Bergen Arches after the structures that carry traffic over it. The Cut bisected Jersey City, dividing the Heights neighborhood from the rest of the city.

We’re standing under the Palisades Avenue bridge, the eastern portal to the Bergen Arches, and looking up and out toward the East. Where’s the city?

The Cut was completed in 1910 and trains went through it for roughly the next 50 years. But by that time Jersey City’s days as a port city had ended and rail transport to the bank of the Hudson River was no longer necessary. The Arches were abandoned and closed off.

My 3QD article tells more or less the rest of that story, with a look toward the future.

Sunday, November 11, 2018

Hello, I'm coming to you from the future

Open Source & community-owned infrastructure for the new Academy (the old one is moribund)

A little peek into a long thread:

Saturday, November 10, 2018

The role of the bedroom in the invention of privacy in the West

Lisa Hilton reviews (Lauren Elkin's English translation of) Michelle Perrot, The Bedroom in TLS.
“Before the bedroom there was the room, before that almost nothing”, Perrot announces. She chooses to leapfrog over much medieval history and begin with a description of the king’s bedroom at Versailles under Louis XIV – an account for which she relies heavily on the memoirs of the duc de Saint-Simon, who spoke to the Sun King twice. Emmanuel Le Roy Ladurie is more thorough as an interpreter of the stultifying etiquette of the Versailles system, and Nancy Mitford much funnier; there is no fresh research or information in Perrot’s summary, which is at best indecisive as to the significance of the royal bedroom: “the king’s chamber guards its mysteries”. That said, we learn that Perrot intends to trace the origin of the desire for a “room of one’s own”, which mark of individualization is apparently less universal than it might appear. The Japanese had no notion of privacy, we are told, which might have come as news to the ukiyo-e artists of the seventeenth century; nonetheless, Perrot is broadly correct in her account of the evolution of private sleeping space as depending on the movement from curtained or boxed beds to separate chambers during the same period in Europe. This is about as far as the tracing of origins goes – Perrot then makes a brief detour into the communal apartments of Eastern Europe before the collapse of communism (though anyone interested in the psychological consequences of this would do better to read Orlando Figes), before announcing confidently that the conjugal bedroom became customary for the middle classes in the West after 1840, in imitation of Queen Victoria and Prince Albert, who married that year.