Monday, September 23, 2019
Ingrid Robeyns asks that question at Crooked Timber, The most blasphemous idea in contemporary discourse?, Sept. 21, 2019:
I have no idea how he found it, but George Monbiot read an (open access) academic article that I wrote, with the title “What, if Anything, is Wrong with Extreme Wealth?” In this paper I outline some arguments for the view that there should be an upper limit to how much income and wealth a person can hold, which I called (economic) limitarianism. Monbiot endorses limitarianism, saying that it is inevitable if we want to safeguard life on Earth.
As Monbiot’s piece rightly points out, there are many reasons to believe that there should be a cap on how much money we can have. Having too much money is statistically highly likely to lead to taking much more than one’s fair share from the atmosphere’s greenhouse gasses absorbing capacity and other ecological commons; it is a threat to genuine democracy; it is harmful to the psychological wellbeing of the children of the rich, and to the capacity of the rich to act autonomously when it concerns moral questions (which includes the reduced capacity for empathy of the rich); and, as I’ve argued in a short Dutch book on the topic that I published earlier this year, extreme wealth is hardly ever (if ever at all) deserved. And if those reasons weren’t enough, one can still add the line of Peter Singer and the effective altruists that excess money would have much greater moral and prudential value if it were spent on genuine needs, rather than on frivolous wants.
Monbiot wrote: “This call for a levelling down is perhaps the most blasphemous idea in contemporary discourse.”
I agree that mainstream capitalist societies operate on the assumption that the sky is the limit. But it is important to point out that the idea that there should be a cap on how much we can have, is not at all new. Historically, thinkers from many corners of the world and writing in very different times, have either given reasons why no-one should become excessively rich, or have proposed economic institutions that would have as an effect that no-one would become superrich (I suppose Marx would be in that latter category). Matthias Kramm and I have joint research on this that I’ll happily post on this blog once it is published. But to give a flavour of the range of support for the view that there should be upper limits, here are three very different sources. (I’ll leave out any comments on Socrates and Plato, since John and Belle are the obvious experts on those thinkers).
And so forth.
I posted the following comment:
A book that has influenced my thinking quite a bit is Christopher Boehm's Hierarchy in the Forest: The Evolution of Egalitarian Behavior (1999). Boehm is interested in accounting for the apparent egalitarian behavior of hunter-gatherer bands, the most basic form of human social organization. While individuals can assume a leadership role for specific occasions, e.g. a hunt, there are no permanent leaders in such bands. Boehm does not argue that such bands are egalitarian utopias; on the contrary, primitive egalitarianism is uneasy and fraught with tension. But it is real. Boehm finds this puzzling because, in all likelihood, our immediate primate ancestors had well-developed status hierarchies. Boehm ends up adopting the notion that the hierarchical behavioral patterns of our primate heritage are overlain, but not eradicated or replaced, by a more recent egalitarian social regime. Other than suggesting that this more recent regime is genetic, Boehm has little to say about it.

What I like about this is the idea that our social behavior is mediated by (at least) two behavioral systems, which are organized on very different principles: hierarchy and dominance vs. equality and anarchy (in the sense of self-organizing w/out orders from above). So let's accept that as a premise. That is in our 'nature'. I'm also going to postulate that our 'nature' has no way of giving priority to one of these systems. Rather, that is something that is done by 'culture' according to local social circumstances.

In this view, one of the things we're working out over the course of history, then, is the relationship between these two systems. The (phylogenetically older) hierarchical system is perfectly happy with extreme wealth because the resulting inequality is consistent with it. But the (phylogenetically newer) system doesn't like it at all. I don't see any inherently 'right' way to resolve this interaction, but I note that neither system is going to disappear.
Both 'make demands' on our behavior. So, it's all well and good for the economists to tell us that a rising tide floats all boats. But there's going to be a point where the peasants in the little rafts and zodiacs are going to be angry with the plutocrats and oligarchs in their megayachts sailing around the sea like they own it.

* * * * *

We can see this two-systems dynamic on display in Shakespeare. Consider Much Ado About Nothing. We've got two couples. Claudio and Hero interact through the hierarchical system. How does Claudio pursue Hero? Without speaking to Hero at all, he approaches his military commander to broach the matter with her father. Her father accepts on her behalf, all without conferring with her. Beatrice and Benedick, on the other hand, confront one another as equals, and one of the joys of this play is their wit combats. While both are aristocrats (as are all the principals in Shakespeare's plays), neither is rigidly fixed in the aristocracy. And so the play moves back and forth between the stories of these two couples. Of course, the play has a happy ending; both couples are to be married. But that ending has required the interaction of both of these plot lines.
If I might indulge a current hobby horse, I've been playing with the idea that language is the simplest thing humans do that requires a computational account. From this premise it follows, for example, that however the minds/brains of chimpanzees, dogs, bees, ants, or C. elegans work, it's not through computation. Something else is going on, complex dynamics, for example. OK.
I'm thinking that all these bumps, hesitations, fillers, whatever, of conversation betray the inner workings of these mechanisms. We've got, say, a dynamical system implementing a computational process, speech. And it doesn't always go smoothly. The right word or phrase isn't always available; it's not like they're all queued up just waiting to be entered into the speech stream. So the system has to hunt around looking for them. That is, we're listening to and making sense of our own speech via the auditory system even as the motor system is placing words into the speech stream.
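Nothing in the post proposes a formal model, but the idea can be sketched as a toy: an incremental production pipeline in which lexical retrieval sometimes lags, and the output stream picks up a filler while the "search" continues. Everything here (the miss rate, the filler token, the function names) is my own hypothetical illustration, not a claim about how the brain actually does it.

```python
# Toy sketch: speech production as an incremental pipeline.
# When the next word isn't immediately "available," the speaker
# emits a filler and keeps hunting. All parameters are invented
# for illustration only.

import random

random.seed(3)  # reproducible toy run

def retrieve(word, miss_rate=0.3):
    """Simulate lexical lookup: occasionally the word isn't ready yet."""
    attempts = 1
    while random.random() < miss_rate:
        attempts += 1  # keep hunting for the word
    return attempts

def speak(intended):
    """Place words into the output stream, with a filler per failed lookup."""
    stream = []
    for word in intended:
        attempts = retrieve(word)
        stream.extend(["uh"] * (attempts - 1))
        stream.append(word)
    return stream

print(speak(["the", "right", "word", "is", "not", "always", "available"]))
```

The point of the sketch is only that fillers fall out naturally once you model retrieval as something that can miss: strip the "uh"s from the output and the intended utterance is intact underneath.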
Now, when we write, we can clean things up so it appears perfect. The language computer can parse those sentences readily (that is, map words and phrases onto semantic structures) and it all makes sense. But we all know that writing can often be quite difficult. We have to do quite a bit of reworking to produce computationally fluid prose.
Saturday, September 21, 2019
Hetty Roessingh, Why cursive handwriting needs to make a school comeback, The Conversation, August 23, 2019:
Beyond a nostalgia for the pre-digital age, there are good reasons why cursive handwriting needs to make a comeback. As a researcher who has studied the relationship of handwriting to literacy, along with other scholars, I've found that developing fluency in printing and handwriting so that it comes automatically matters for literacy outcomes. Handwriting is also an elegant testimony to the human capacity for written literacy and an inspiring symbol of the unique power of the human voice. [...]

But touching a "d" on the keyboard, for example, does not create the internal model of a "d" that printing does. [...]

Evolving research in the neurosciences underscores the importance of developing automatic skills in relation to what educational psychologists call the cognitive load.

Lessons learned from sports or the performing arts highlight the importance of establishing neuronal connections that promote fluid movement. With reading and writing, too, the keys to unlocking creativity or interpretation of story elements are also related to being able to write automatically.
Markham Heid, Bring Back Handwriting: It's Good for Your Brain, Elemental, Sept. 12, 2019:
Psychologists have long understood that personal, emotion-focused writing can help people recognize and come to terms with their feelings. Since the 1980s, studies have found that “the writing cure,” which normally involves writing about one’s feelings every day for 15 to 30 minutes, can lead to measurable physical and mental health benefits. These benefits include everything from lower stress and fewer depression symptoms to improved immune function. And there’s evidence that handwriting may better facilitate this form of therapy than typing.

A commonly cited 1999 study in the Journal of Traumatic Stress found that writing about a stressful life experience by hand, as opposed to typing about it, led to higher levels of self-disclosure and translated to greater therapeutic benefits. It’s possible that these findings may not hold up among people today, many of whom grew up with computers and are more accustomed to expressing themselves via typed text. But experts who study handwriting say there’s reason to believe something is lost when people abandon the pen for the keyboard.

“When we write a letter of the alphabet, we form it component stroke by component stroke, and that process of production involves pathways in the brain that go near or through parts that manage emotion,” says Virginia Berninger, a professor emerita of education at the University of Washington. Hitting a fully formed letter on a keyboard is a very different sort of task — one that doesn’t involve these same brain pathways. “It’s possible that there’s not the same connection to the emotional part of the brain” when people type, as opposed to writing in longhand, Berninger says.

Writing by hand may also improve a person’s memory for new information. A 2017 study in the journal Frontiers in Psychology found that brain regions associated with learning are more active when people completed a task by hand, as opposed to on a keyboard.
The authors of that study say writing by hand may promote “deep encoding” of new information in ways that keyboard writing does not. And other researchers have argued that writing by hand promotes learning and cognitive development in ways keyboard writing can’t match.

The fact that handwriting is a slower process than typing may be another perk, at least in some contexts. A 2014 study in the journal Psychological Science found that students who took notes in longhand tested higher on measures of learning and comprehension than students who took notes on laptops.

“The primary advantage of longhand notes was that it slowed people down,” says Daniel Oppenheimer, co-author of the study and a professor of psychology at Carnegie Mellon University. While the students who typed could take down what they heard word for word, “people who took longhand notes could not write fast enough to take verbatim notes — instead they were forced to rephrase the content in their own words,” Oppenheimer says. “To do that, people had to think deeply about the material and actually understand the arguments. This helped them learn the material better.”
Language Log's Victor Mair concurs and goes on to offer some observations about the writing habits of personal acquaintances, noting:
Recently, however, I have discovered a third type of writer, one that fascinates me greatly. They have those tablets with a cover that you flip back and you're ready to write on it. They can compose or call up a text on the tablet and they can handwrite on it too. What really blows me away is when they draw lines and circles around different parts of the text or add handwritten notes to it — in different colors for emphasis or to signify categories of meaning! If they don't like what they wrote, they can effortlessly erase it. I love to watch them work dexterously; they seem to have the best of both worlds: typing and handwriting. As such, their thinking, creation, and analysis operate in multiple modes — and it shows when they scintillatingly start talking about what they've been writing.
He concludes with a list of posts on writing, most, but not all, of which are about writing Chinese characters.
Friday, September 20, 2019
Bumping this to the top just to remind myself of these questions.
1. What is the target/beneficiary of the evolutionary dynamic?
The direct beneficiary is a cultural entity, which is what Dawkins had in mind when he posited the existence of memes. Of course human beings, biological entities, have to benefit indirectly and in the large; otherwise cultural evolution would have no biological (adaptive) value.
2. Replication (copying) or (re)construction.
I suppose both, sorta. The gene-like entities, which I’m now calling coordinators, are replicated, or something like it (think of phonemes in language as a central example). But the phenotype-like entities, which I’m calling cultural beings, are reconstructed of course. (See my page, Cultural Evolution Terms, for a bit more about these.)
3. Is there a meaningful distinction comparable to the biological distinction between phenotype and genotype?
Of course. The genetic entities (coordinators of various kinds) consist of observable features of objects and events. The phenotypic entities, called cultural beings, exist in the coordinated minds (linked through various processes of communication and participation) of a population of humans. If the humans like a given cultural being it will persist in the population, and so will the coordinators on which its construction depends. Otherwise it will die, and any coordinators that are unique to it will die as well.
And then there is a fourth, over-arching question:
What can an evolutionary account of cultural history tell us that isn’t captured in a pile of narratives of the more standard kind?
I would think so, but that requires an argument. I’ve not yet created one, though others may well have done so. For example, I think that Matthew Jockers’s depiction of influence in 19th century Anglophone novels is best explained by an evolutionary account rather than a blither of individual narratives:
I've discussed that figure several times, most recently in Notes toward a theory of the corpus, Part 1: History [#DH].
Have Republicans decided it's do or die, that if they can't hold on to power now, they'll never win again?
Steven Levitsky and Daniel Ziblatt, Why Republicans Play Dirty, NYTimes, Sept. 20, 2019:
Why is the Republican Party playing dirty? Republican leaders are not driven by an intrinsic or ideological contempt for democracy. They are driven by fear.
Democracy requires that parties know how to lose. Politicians who fail to win elections must be willing to accept defeat, go home, and get ready to play again the next day. This norm of gracious losing is essential to a healthy democracy.
But for parties to accept losing, two conditions must hold. First, they must feel secure that losing today will not bring ruinous consequences; and second, they must believe they have a reasonable chance of winning again in the future. When party leaders fear they cannot win future elections, or that defeat poses an existential threat to themselves or their constituents, the stakes rise. Their time horizons shorten. They throw tomorrow to the wind and seek to win at any cost today. In short, desperation leads politicians to play dirty.
Take German conservatives before World War I. They were haunted by the prospect of extending equal voting rights to the working class. They viewed equal (male) suffrage as a menace not only to their own electoral prospects but also to the survival of the aristocratic order. One Conservative leader called full and equal suffrage an “attack on the laws of civilization.” So German conservatives played dirty, engaging in rampant election manipulation and outright repression in the late 19th and early 20th centuries.
In the United States, Southern Democrats reacted in a similar manner to the Reconstruction-era enfranchisement of African-Americans.
The only way out of this situation is for the Republican Party to become more diverse. A stunning 90 percent of House Republicans are white men, even though white men are a third of the electorate. Only when Republicans can compete seriously for younger, urban and nonwhite voters will their fear of losing — and of a multiracial America — subside.
Such a transformation is less far-fetched than it may appear right now; indeed, the Republican National Committee recommended it in 2013. But parties only change when their strategies bring costly defeat. So Republicans must fail — badly — at the polls.
American democracy faces a Catch-22: Republicans won’t abandon their white identity bunker strategy until they lose, but at the same time that strategy has made them so averse to losing they are willing to bend the rules to avoid this fate. There is no easy exit. Republican leaders must either stand up to their base and broaden their appeal or they must suffer an electoral thrashing so severe that they are compelled to do so.
Thursday, September 19, 2019
Evelyn Douek, Why Facebook’s 'Values' Update Matters, Lawfare, Sept. 16, 2019:
Mark Zuckerberg, Facebook’s founder and CEO, has released periodic “manifestos” in the form of blog posts laying out his vision for the company—or, as Zuckerberg prefers to call it, the “Community.” But this latest update written by Monika Bickert, Facebook’s vice president of global policy management, is far more substantive than mere corporate buzzwords. It may have a significant impact for the platform and, therefore, for online speech.
Under the heading “Expression,” the new values set out Facebook’s vision for the platform and its rules:
The goal of our Community Standards is to create a place for expression and give people voice. Building community and bringing the world closer together depends on people’s ability to share diverse views, experiences, ideas and information. We want people to be able to talk openly about the issues that matter to them, even if some may disagree or find them objectionable. In some cases, we allow content which would otherwise go against our Community Standards—if it is newsworthy and in the public interest. We do this only after weighing the public interest value against the risk of harm, and we look to international human rights standards to make these judgments.

The update then goes on to note that there are four values that may justify limiting expression: authenticity, safety, privacy and dignity.
There is a lot to unpack in this very short post. A few things are especially worth noting: the prioritization of “voice” as the overarching value, the understanding that the purpose of this voice is to “build community and bring the world closer together” and the explicit incorporation of international human rights standards.
The Oversight Board:
Bickert’s post does not give any clues as to the reason for the update. But the post comes as Facebook is finalizing its plans for an independent Oversight Board, which will be able to review and overrule Facebook’s content moderation decisions. When Facebook released its Draft Charter for the new board, it noted that the board would base its decisions on Facebook’s Community Standards and “a set of values, which will include concepts like voice, safety, equity, dignity, equality and privacy.” As I wrote at the time, “This sounds good, but a list of priorities that includes everything prioritizes nothing.” Facebook had to make difficult choices about what the point of Facebook is in order to guide the board in cases of ambiguity. The quiet update to its values last week represents this important step.
As Facebook readies itself to outsource ultimate decisions about its rules to an independent, external body, these values represent both a set of instructions to the board about the ambit of its role, as well as a commitment to bind Facebook to the mast of the implications of these values expressly laid down.
Of course, none of the values Facebook has set out are technically binding. Facebook could theoretically change its values the day after it gets an Oversight Board decision it doesn’t like. [...]
But the point of the Oversight Board experiment is to garner greater public legitimacy for Facebook’s content moderation decisions through a commitment to transparency and explanation of Facebook’s decision-making. The board’s existence is fundamentally a bet that this kind of legitimacy matters to users’ perceptions of the company and their decisions on whether to keep using the platform—as well as to regulators pondering what to do about the tech giants. [...] it will be the public’s job, too, to hold the company and its new Oversight Board to a fair and justifiable reading of what these commitments entail.
Wednesday, September 18, 2019
William Langewiesche has a long article in the NYTimes Magazine about the malfunctions in the Boeing 737 MAX. It contains an example of rote learning that I'm parking here just to remind myself of the phenomenon.
At the start, civil aviation in China was a mess, with one of the highest accident rates in the world.
Dave Carbaugh, the former Boeing test pilot, spent his first 10 years with the company traveling the globe to teach customers how to fly its airplanes. He mentioned the challenge of training pilots in Asia. “Those were the rote pilots,” he said, “the guys standing up in the back of a sim. They saw a runaway trim. They saw where and how it was handled in the curriculum — always on Sim Ride No. 3. And so on their Sim Ride No. 3, they handled it correctly, because they knew exactly when it was coming and what was going to happen. But did they get exposed anywhere else? Or did they discuss the issues involved? No. It was just a rote exercise. This is Step No. 25 of learning to fly a 737. Period.” I asked about China specifically. He said: “The Chinese? They were probably the worst.” He spent every other month in China for years. He said: “They saw flying from Beijing to Tianjin as 1,352 steps to do. Yet if they flew from Beijing to Guangzhou, it was 1,550 steps. And they didn’t connect the two. It would get so rote that they just wouldn’t deviate. I remember flying with a captain who would never divert no matter how many problems I gave him. I asked him, ‘How come?’ He said, ‘Because the checklist doesn’t say to divert.’”
That changed over time. With the support of the Chinese government, which went so far as to delegate some regulatory functions to foreigners like Carbaugh, the manufacturers were able to instill a rigorous approach to safety in a small cadre of pilots and managers, who in turn were able to instill it in others. The effort was made not out of the goodness of the manufacturers’ hearts, but out of calculations related to risk and self-preservation. It is widely seen to have been a success. Today the Chinese airlines are some of the safest in the world.
Tyler Rogoway, The Strike On Saudi Oil Facilities Was Unprecedented And It Underscores Far Greater Issues, The Drive, Sept. 19, 2019:
That brings us to my next point, one you probably also thought to yourself when this happened—this was an unprecedented attack. Welcome to the murky world of unmanned warfare that I have been warning about for many years. I almost take this issue personally because people used to blow it off or even snicker at it. Now all the predictions I wish were wrong are coming true and at an alarming pace.

The Department of Defense was ridiculously asleep at the wheel regarding this threat and is now scrambling to play catchup. Anyone who says differently is straight-up lying. It's well established what non-state actors can already do with relatively low-end unmanned aircraft technology—Houthi rebels alone have been using suicide drones for two and a half years—just imagine what a peer state will be able to do in the very near future. Instead of a mass of individual suicide drones layered in with other weapons, like cruise missiles, attacking a target simultaneously, imagine a swarm that is fully networked and works cooperatively to best achieve their mission goals, including jamming or killing air defenses in order for the swarm to make it to its final destination. America's adversaries are all too aware of this game-changing potential and the lack of defenses to counter it in any robust manner.

Here's a cold hard reality that most people just don't understand, including many defense sector pundits—air defense systems, no matter how advanced and deeply integrated, aren't magic. They have major limitations, especially considering most primarily rely on ground-based sensors.

The cold hard truth is that counter-unmanned aircraft and counter-cruise missile capabilities are not 'sexy' to develop, field, and maintain operationally, but it will increasingly become absolutely essential to divert more funds in this direction. And no, I am not talking about some guys running around with wonky, sci-fi looking electronic warfare rifles. I am saying dense and layered counter-UAS capabilities will be required to even counter domestic threats in the years to come, especially against VIPs and critical infrastructure.

Satellite imagery and GPS:

We live in an age where everyone has access to high-resolution satellite imagery of nearly any point on the globe. This is something that was unthinkable even following the end of the Cold War. A single individual now has the capabilities that entire government intelligence agencies were built to produce, all on their smartphone or laptop computer. And it's entirely free!

GPS is even more of a revolutionary capability. Its incredible pinpoint accuracy really has become more concerning since the hobby drone industry exploded and now components to control drones via GPS are somewhat off-the-shelf in nature and are supplied from manufacturers around the globe. With these two things combined, a bad actor has both the targeting intelligence and the precision targeting capabilities available for a minuscule fraction of what they cost in the past and without any major barriers of entry.

H/t Tyler Cowen.
It is sometimes useful to reflect on the fact that, aurally, the speech stream is continuous, not segmented. The segmentation is something we impose on the stream through cognitive mechanisms – that, I argue, is the computational foundation of language. Thus early forms of writing often consisted of a continuous stream of characters, with no segmentation into separate words. Victor Mair has a post at Language Log that speaks to this, The challenging importance of spacing in Korean:
Who'da thunk it? – spacing is the most difficult aspect of Korean writing. One might have thought it would be a simple task, that word spacing / separation is innate for all speakers of a given language. Apparently that is not so.

Be sure to read the comments.
In Hanyu Pinyin, it is called fēncí liánxiě 分詞連寫 ("word division; parsing"). Of course, it has its problems, but we do have rules to guide us, viz., zhèngcífǎ 正詞法 ("orthography").
This morning in my "Language, Script, and Society in China" course, I embarked on a discussion of the difference between zì 字 ("character") and cí 詞 ("word"). Although this seems like a simple, straightforward question, it is always one of the most difficult topics encountered in the course — especially for students of Chinese background. It took me a whole semester to get the idea across to the 72 very smart students in my language studies class at the University of Hong Kong in 2002-2003. Even at the conclusion of the semester, there were still some of the students who just couldn't comprehend the distinction.
Addendum: In fact, I'll reprint one of the comments in full. Victor Mair, who started the thread, posts this on behalf of an unnamed colleague:
Spacing–word division–assumes shared knowledge among users of what constitutes a language's words. This is not a trivial matter, and Korean linguists, lexicographers and publishers have been working the issue for decades.

This too is relevant to the issue of computation in the mind. And so: I've just been thinking about this. And I'm wondering if the problem isn't similar to the problem that adolescent and post-adolescent second language learners have with pronunciation. I don't know what the current literature says about that, but in the past I've seen it attributed to a lack of neuro-plasticity. I don't find that terribly convincing. My intuition – and it's no more than that – is that the problem is more like conscious access. For some reason conscious access to (something in) the aural-motor channel has been, if not lost, somewhat degraded.
The basic problem, as one of the commentators intimates, is that words, like (morpho)phonemic spelling, are an artifact of writing. They are not a given to be plucked from someone's brain. Orthography takes it upon itself to regularize (adjudicate) the intuitions users have about what constitutes the lexical units of their language, which are far from uniform and constantly shifting. Korean lacked that tradition and is catching up, although in a sense all written languages that use word division are continuously "catching up." I don't see it as a major problem, or a problem at all.
What I do find problematic in Asian languages is fluid "standards" for sentence representation, namely, where the period goes. This is not an issue (for me) in Korean, probably because the language does use word division, which enforces a discipline on writers that carries beyond the identification of (agreement on) word boundaries to one's whole approach to sentence structure. Chinese sentences–the text between periods–are often by western standards two sentences, five sentences, or partial sentences. Japanese writers also seem to have more liberty in this regard than a westerner would expect. Vietnamese sentences, in earlier novels at least, end or don't end seemingly at whim. And I question if Tibetans even have the concept of "sentence."
I've been out of this field for too long so my thinking may be dated. But there may be psycholinguistic issues at play here that merit serious study.
Could the same thing be going on in the transfer of segmentation from the aural-motor channel to the visuo-orthographic?
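The claim running through all of this is that word boundaries are not in the signal; a mechanism has to impose them. A toy greedy longest-match segmenter makes the point concretely: give it an unsegmented character stream and a lexicon, and boundaries appear as an artifact of the matching procedure. The tiny LEXICON below is entirely hypothetical, chosen just to make the example work; real segmenters (for Chinese, Korean, or speech) are far more sophisticated.

```python
# Toy illustration: a greedy longest-match segmenter. The boundaries
# it produces come from the procedure and the lexicon, not from
# anything present in the unsegmented stream itself.

LEXICON = {"the", "there", "here", "in", "rain", "spain"}

def segment(stream, lexicon, max_len=6):
    """Impose word boundaries on a continuous character stream."""
    words = []
    i = 0
    while i < len(stream):
        # Try the longest candidate first, shrinking until a match.
        for j in range(min(len(stream), i + max_len), i, -1):
            if stream[i:j] in lexicon:
                words.append(stream[i:j])
                i = j
                break
        else:
            # No lexical match: pass the character through unanalyzed.
            words.append(stream[i])
            i += 1
    return words

print(segment("therein", LEXICON))         # ['there', 'in']
print(segment("theraininspain", LEXICON))  # ['the', 'rain', 'in', 'spain']
```

Note that "therein" could also have been cut as the + rein; the greedy procedure simply never considers that parse. Which segmentation you get is a fact about the mechanism and its lexicon, which is the colleague's point about orthography adjudicating among shifting intuitions.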
Roy Scranton, Climate Change Is Not World War, NYTimes, Sept. 18, 2019.
When Representative Alexandria Ocasio-Cortez of New York and Senator Edward Markey of Massachusetts introduced their Green New Deal proposal in February, they chose language loaded with nostalgia for one of the country’s most transformative historical moments, urging the country to undertake “a new national, social, industrial and economic mobilization on a scale not seen since World War II and the New Deal era.” [...] Two years later, Bill McKibben wrote an article arguing that climate change was actually World War III, and that the only way to keep from losing this war would be “to mobilize on the same scale as we did for the last world war.”

Yet much of this rhetoric involves little or no understanding of what national mobilization actually meant for Americans living through World War II. As a result, the sacrifices and struggles of the 1940s have begun to seem like a romantic story of collective heroism, when they were in fact a time of rage, fear, grief and social disorder. Countless Americans experienced firsthand the terror and excitement of mortal violence, and nearly everyone saw himself caught up in an existential struggle for the future of the planet.
Scranton then quickly runs through the changes:
...30 million Americans were uprooted from their homes [...] 16 million service members among them were stripped of their civilian identities and then shuttled through a vast national bureaucracy in the greatest experiment in social mixing and mass indoctrination in American history. [...] More than 400,000 were killed, and 670,000 more were wounded.
Women entered the workforce; a million+ African-Americans served in segregated military units, others migrated north; race riots; industry retooled for war; "free speech and labor organizing were curtailed"; internment camps for Japanese Americans; mass media was consumed by war propaganda.
Total mobilization during World War II also led to the birth of what President Dwight D. Eisenhower would in 1961 define as the “military-industrial complex.” Annual military spending (adjusted for inflation) skyrocketed from less than $10 billion before the war to nearly $1 trillion during it, and except for a brief dip between the end of World War II and the Korean War, has never sunk below $300 billion, whether the United States was at war or not. The country now spends more on its military budget than the next seven nations combined, and maintains the largest number of military bases on foreign soil of any country.

Such is the legacy of America’s mobilization during World War II, which inaugurated a long-term transformation in American politics, permanently shifting power from the legislative branch to the executive, and gave birth to the national security state, the nuclear arms race, and a culture of militarism. As the journalist Fred Cook wrote in 1962, “No break with the traditions of America’s past has been so complete, so drastic, as the one that has resulted in the growth of the military-industrial complex.”
Climate change is different:
First, climate change is not a war. There is no clear enemy to mobilize against, and thus no way to ignite the kind of hatred that moved Americans against Japan during World War II. No clear enemy also means no clear victory. [...]

Second, as opposed to World War II, when national mobilization meant a flood of government money that truly did lift all boats, the transformations required to address climate change would have real economic losers. [...]

Third, mobilization during World War II was a national mobilization against foreign enemies, while what’s required today is a global mobilization against an international economic system: carbon-fueled capitalism. It took President Franklin D. Roosevelt years of political groundwork and a foreign attack to get the United States into World War II. What kind of work over how many years would it take to unify and mobilize the entire industrialized world — against itself?

[And] ... the fact is that climate change is just one of several progressive concerns. [...]

Finally, national climate mobilization would have cascading unforeseen consequences, perhaps even contradicting its original goals, just like America’s total mobilization during World War II. [...]

Nevertheless, total mobilization may be our only hope. [...]
Tuesday, September 17, 2019
Tyler Cowen's Emergent Ventures recently had an Unconference to celebrate its one-year anniversary. Craig Palsson attended and then went immediately to a standard academic conference. He compares the two.
The Unconference was designed to forge new connections. Conferences are advertised as a way to create connections, but they usually don’t create a good environment for it. Sessions are organized by a series of papers, and you typically attend the session related to your work. If I work on financial panics, I go to the session with related papers, and so do all of the people in my field. Over the three days, I discover that I’m usually with the same people, because we have similar interests, and I never interact with most people at the conference. [...]

Hierarchy (vs. anarchy?):
Contrast this with the Unconference. It started with a reception, which is typical of such events. But then at dinner we sat at tables by birthday, inserting some randomness into our conversation partners. Then halfway through dinner, Tyler shuffled the seating arrangements so we would have new conversation partners. Then during the Unconference sessions, participants were randomly assigned to groups for 45 minute discussions. Generally there was an understanding that people were welcome to walk into a conversation and join. [...]
The Unconference did not have a hierarchy of achievement. Anyone who has been to a Conference understands there are hierarchies. On the final day of the Conference, I’m eating breakfast, and two others are at the same table with me. I don’t know them, so I start a conversation. But then one of the most prominent people at the Conference sits at the table. Unashamed, I abandon the first conversation and focus solely on the important person. If this person knows and likes me, that could benefit my career, so I’m not going to waste an opportunity.

I’m writing these observations the day after the breakfast, and I’ve already forgotten the names of the two people I started the conversation with.

But at the Unconference, I did not see a hierarchy. There was an implicit assumption that everyone was working on something interesting, and therefore everyone had something to contribute. Indeed, one of the notes I made in my journal was a comment made by an 18 year old participant. This is the aspect of the Unconference that I think will be hardest to transfer to a Conference.

H/t Tyler Cowen.
Addendum: This random connection thing, I wonder, does that come from the nascent world view Cowen has been trying out: Toward a theory of random, concentrated breakthroughs (2.28.2019):
I don’t (yet?) agree with what is to follow, but it is a model of the world I have been trying to flesh out, if only for the sake of curiosity. Here are the main premises:
1. For a big breakthrough in some area to come, many different favorable inputs had to come together. So the Florentine Renaissance required the discovery of the right artistic materials at the right time (e.g., good tempera, then oil paint), prosperity in Florence, guilds and nobles interested in competing for status with artistic commissions, relative freedom of expression, and so on.
2. To some extent, but not completely, the arrival of those varied inputs is random. Big breakthroughs are thus hard to predict and also hard to control.
Richard C. Bush, How Hong Kong Got to This Point, Lawfare, September 17, 2019.
An Imperfect, But Workable, Hybrid
To understand the current situation, it’s necessary to understand the political system that China designed for Hong Kong as it prepared to regain sovereignty over the territory in 1997. This political system is embodied in the Hong Kong Basic Law. It’s worth keeping in mind a distinction between the protection of civil and political rights and the institutions that pick a society’s leaders. In liberal, electoral democracies, rights and elections work together and reinforce each other. But some “hybrid” systems have one and not the other. Hong Kong is one of those systems.
In the China-U.K. Joint Declaration of 1984, which laid out the reversion plan for Hong Kong, and in the Basic Law that elaborated the plan, Beijing pledged that Hong Kong people would enjoy the rights contained in the International Covenant on Civil and Political Rights. The text of the International Covenant became a Hong Kong ordinance. Furthermore, Beijing pledged that the rule of law would apply in Hong Kong, in part to protect those rights, and that there would be an independent judiciary. The legal system was the common law system, not a Chinese-style, rule-by-the-Communist-Party system.
In short, when it came to civil and political rights, Hong Kong basically had a liberal order. This was a precious asset that cannot be over-emphasized. Even today, it should be valued by all citizens of Hong Kong, because they all benefit from its protection. [...]
On the issue of how Hong Kong leaders are elected, the set-up was more complicated and less satisfying for those who desired popular, democratic rule. After reversion, only some of the members of the Legislative Council were selected in popular elections; ultimately that share rose to half. The rest were selected in functional constituencies that reflected various economic and professional sectors (bankers, lawyers, real estate companies, manufacturers, educators, etc.). The majority of these constituencies were pro-Beijing and most of them were picked by a small number of voters. Moreover, the chief executive was picked by an election committee comprised mainly of pro-Beijing people. The upshot: Beijing had engineered how senior elective positions were filled to ensure that it maintained significant control and to block political forces it did not like from gaining power.
In part because of the design of this system, there has been a high concentration of economic and political power in Hong Kong. It has one of the highest Gini coefficients—measuring inequality—in the world (53.9). A relatively small number of families and companies control a lot of the wealth and a lot of the political power. Not surprisingly, the public had a high level of alienation against the establishment because of the unequal distribution of wealth and power. One way to rectify the situation was to get more democracy. [...]
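Since Bush leans on Hong Kong's Gini coefficient as a one-number summary of that concentration, it may help to see how the statistic is computed. A minimal sketch in Python (the income figures are invented for illustration; note that a Gini of 0 means perfect equality and values near 1 mean extreme concentration, so the 53.9 quoted above is on a 0 to 100 scale):

```python
def gini(incomes):
    """Gini coefficient of a list of non-negative incomes (0 = perfect equality)."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # standard closed form for sorted data: weighted cumulative sum
    cum = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * cum) / (n * total) - (n + 1) / n

print(gini([10, 10, 10, 10]))  # perfectly equal: 0.0
print(gini([1, 1, 1, 97]))     # one person holds nearly everything: near the n=4 maximum
```

The first group, where everyone earns the same, scores 0; the second, where one member holds 97% of the income, scores about 0.72, close to the 0.75 ceiling for a group of four.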
Protests on the Rise
In the second decade after Hong Kong’s reversion to China, two important changes occurred. The first went relatively unnoticed at the time but proved to be consequential. That was that some people in the pan-Democratic, anti-government camp became unhappy with the rules concerning public assembly and began engaging in political action that was unpredictable, relatively disruptive, technically illegal, and sometimes violent. The number of such incidents grew steadily from the middle of the 2000s. It was mainly young people who conducted these new-style protests.
The second change was a decision by Beijing reforming the electoral procedures for the Chief Executive and Legislative Council. It was willing to allow all registered voters to vote for the chief executive, rather than the 800 members of the Election Committee, but there was a catch: It insisted that a clone of the election committee be the body that would nominate the candidates (not, for example, political parties). The nominating committee’s members were predominantly allies of Beijing and not representative of Hong Kong society as a whole. The conclusion that Hong Kong democrats drew was that control was still China’s priority and that any election result would still not reflect the will of the majority.
Things went downhill from there. Then:
The 2019 Extradition Law
But the most serious challenge to the rule of law was the extradition law, which, if passed, would have allowed China to request, with little or no justification, the transfer of individuals in Hong Kong to the mainland in order to subject them to the Chinese legal system. It remains unclear whether Hong Kong’s Chief Executive Carrie Lam was the author of this proposal, as she says, or whether Beijing put her up to it.
Whatever the case, it turned into a political disaster for the government, because Hong Kong people quickly mounted strong resistance to the draft law.
In conclusion, chill out:
The need for a “cooling-off period” in the protests and demonstrations—and for self-restraint—is urgent. China will celebrate the 70th anniversary of the founding of the People’s Republic on October 1. For that celebration to take place while protests continue would create great embarrassment for the Chinese leadership. That may be exactly what some in Hong Kong want, but the risks for Hong Kong of causing that loss of face are profound. China is not going away. It is Hong Kong’s sovereign. To live successfully with that sovereign and to restore a high degree of autonomy under current circumstances requires Hong Kong to pick its fights carefully.
Gregory Barber, Artificial Intelligence Confronts a 'Reproducibility' Crisis, Wired, 9.16.19:
When Facebook attempted to replicate AlphaGo, the system developed by Alphabet’s DeepMind to master the ancient game of Go, the researchers appeared exhausted by the task. The vast computational requirements—millions of experiments running on thousands of devices over days—combined with unavailable code, made the system “very difficult, if not impossible, to reproduce, study, improve upon, and extend,” they wrote in a paper published in May. (The Facebook team ultimately succeeded.)
The problem is widespread:
Neural networks, the technique that’s given us Go-mastering bots and text generators that craft classical Chinese poetry, are often called black boxes because of the mysteries of how they work. Getting them to perform well can be like an art, involving subtle tweaks that go unreported in publications. The networks also are growing larger and more complex, with huge data sets and massive computing arrays that make replicating and studying those models expensive, if not impossible for all but the best-funded labs.

“Is that even research anymore?” asks Anna Rogers, a machine-learning researcher at the University of Massachusetts. “It’s not clear if you’re demonstrating the superiority of your model or your budget.” [...]

It’s one thing to marvel at the eloquence of a new text generator or the “superhuman” agility of a videogame-playing bot. But even the most sophisticated researchers have little sense of how they work. Replicating those AI models is important not just for identifying new avenues of research, but also as a way to investigate algorithms as they augment, and in some cases supplant, human decision-making—everything from who stays in jail and for how long to who is approved for a mortgage.
Reproducibility is hard:
“Starting where someone left off is such a pain because we never fully describe the experimental setup,” says Jesse Dodge, an AI2 researcher who coauthored the research. “People can’t reproduce what we did if we don’t talk about what we did.” It’s a surprise, he adds, when people report even basic details about how a system was built. A survey of reinforcement learning papers last year found only about half included code.

Sometimes, basic information is missing because it’s proprietary—an issue especially for industry labs. But it’s more often a sign of the field’s failure to keep up with changing methods, Dodge says. A decade ago, it was more straightforward to see what a researcher changed to improve their results. Neural networks, by comparison, are finicky; getting the best results often involves tuning thousands of little knobs, what Dodge calls a form of “black magic.” Picking the best model often requires a large number of experiments. The magic gets expensive, fast.
What's the point?
The point of reproducibility, according to Dodge, isn’t to replicate the results exactly. That would be nearly impossible given the natural randomness in neural networks and variations in hardware and code. Instead, the idea is to offer a road map to reach the same conclusions as the original research, especially when that involves deciding which machine-learning system is best for a particular task.
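Dodge's "road map" idea, reporting enough of the setup that others can reach the same conclusions, can be made concrete even in a toy setting. Here is a minimal sketch in Python using only the standard library (the config fields and the stand-in "experiment" are invented for illustration, not taken from the article): seed every source of randomness from the configuration and log the full configuration next to the result, so that anyone holding the config can repeat the run.

```python
import json
import random

def run_experiment(config):
    # Seed every source of randomness from the config so the run is repeatable.
    random.seed(config["seed"])
    # Stand-in for a training run: the "score" depends only on the config.
    samples = [random.gauss(0, 1) for _ in range(config["n_samples"])]
    score = sum(samples) / len(samples)
    # Log the full setup next to the result, not just the headline number.
    return {"config": config, "score": score}

config = {"seed": 42, "n_samples": 1000, "model": "toy-baseline"}
first = run_experiment(config)
second = run_experiment(config)
assert first["score"] == second["score"]  # same setup, same conclusion
print(json.dumps(first["config"]))
```

Real neural-network training adds hardware nondeterminism and framework-level randomness that a single seed does not cover, which is exactly why Dodge frames the goal as reaching the same conclusions rather than bit-identical numbers.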
Monday, September 16, 2019
For some decades now I’ve been a bit puzzled by Shakespeare’s status. I don’t doubt that he’s good, but THAT good? Really? I mean, apparently he’s so good that the Klingons claim him for their own, and they’re not even real. No, Shakespeare IS good, but there’s more to his reputation than mere technical or aesthetic excellence–or is it that there’s more to aesthetic excellence than craft, skill, and grace? 
When someone asserts that Usain Bolt is the greatest sprinter of all time, I know what they mean. He’s won many international competitions and he’s the world record holder in the 100- and 200-meter sprints and the 4 x 100 meter relay. Excellence in running is easily measured. When someone says Bolt is a faster sprinter than, say, Carl Lewis, I know what they mean. Bolt ran the 100 meters in 9.58 seconds, Lewis in 9.86. Bolt is 0.28 seconds faster than Lewis. Simple, clear, objective.
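That objectivity can be made literal: the whole comparison reduces to a few lines of arithmetic (the times are the record marks quoted above).

```python
bolt, lewis = 9.58, 9.86     # 100 m times in seconds
margin = lewis - bolt         # Bolt's margin over Lewis
bolt_speed = 100 / bolt       # average speed in m/s
lewis_speed = 100 / lewis

print(round(margin, 2), round(bolt_speed, 2), round(lewis_speed, 2))
# → 0.28 10.44 10.14
```

There is no analogous computation that outputs "Shakespeare beats Tagore by 0.28 units," which is the point of the contrast.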
To assert that Shakespeare is the world’s best writer is to assert, not only that he’s merely better than, but that he is far and away better than, say, Rabindranath Tagore, George Eliot, Alexander Pushkin, Honoré de Balzac, Miguel de Cervantes, Dante Alighieri, Murasaki Shikibu, Sophocles, and so forth through a long list – and we haven’t even gotten to the Klingon masters. By what measure is Shakespeare head and shoulders above those many others? Oh, I’m sure learned reasons can be given; Harold Bloom has likely given three-quarters of them at one time or another. But none of them includes a simple objective measure like time to sprint 100 meters.
It’s not that simple, not at all. For one thing, it’s a subjective matter. The fact that this judgment is subjective does not, of course, mean we cannot talk and reason about it. Intersubjectivity is real, after all. We’ve been talking and reasoning about it for centuries and the upshot is that Shakespeare is the best.
But something else is going on, something other than mere subjectivity.
The founder effect
What do I mean by founder effect? It’s a term from evolutionary biology. Oleg Sobchuk explains it this way [pp. 93-94]:
Imagine a group of monkeys living in an imaginary Unhappy valley. The valley is called Unhappy, because there are too many monkeys and too little food. In the search for more bananas and oranges, a group of four monkeys (who happen to be shorter than the rest of their group – just by chance) takes a risky trip to an unknown land, possibly full of predators. However, they get lucky: they find another valley with lots of bananas and no monkey competitors. So, these four settle in the Happy valley. They give birth to many children, all of which share the genes of this initial small group. As a result, most of the monkeys in the new colony are short – like their four ancestors. And – it is important to stress – they are short not because this trait is adaptive (i.e., it was not selected for), but simply due to chance. It just happened that the founders were short – by chance.
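The chance-sampling mechanism in Sobchuk's parable is easy to simulate. A toy sketch in Python (all the numbers are arbitrary): draw a large Unhappy-valley population, let four randomly chosen founders emigrate, and watch the new colony's trait distribution track the founders rather than the old population.

```python
import random

random.seed(1)  # fixed seed so the illustration is repeatable

# Unhappy valley: 1,000 monkeys with heights around a mean of 100 (arbitrary units)
population = [random.gauss(100, 10) for _ in range(1000)]

founders = random.sample(population, 4)   # four emigrants, chosen purely by chance
founder_mean = sum(founders) / len(founders)

# Happy valley: descendants inherit the founders' trait, with some noise
descendants = [random.gauss(founder_mean, 2) for _ in range(1000)]

pop_mean = sum(population) / len(population)
new_mean = sum(descendants) / len(descendants)
print(round(pop_mean, 1), round(founder_mean, 1), round(new_mean, 1))
# The colony's mean tracks the founders' mean, whatever the old valley's mean was:
# the trait was fixed by sampling accident, not by selection.
```

Re-running with different seeds gives different founder groups, and the colony faithfully inherits whichever accident occurred, which is Sobchuk's point about traits that were never selected for.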
But what does that have to do with Shakespeare?
Sobchuk goes on to point out [p. 96]:
The contemporary literary field – Modern European literature – began to take shape in the late eighteenth and early nineteenth centuries. The main reason for this was the invention of the modern nation state: a radically new way to organize societies (Hobsbawm 1992). And at the avant-garde of nationalism were Germans. They did not have a united state, and they felt the need to obtain it more than anyone else. A state united by a single nation, a single language, and... a single literature. German intellectuals started thinking about their literary canon earlier than the intellectuals in other countries. They started inventing the canon.
And they decided to center it on Shakespeare. Why? Sobchuk continues:
An appeal to the senses, feelings, and imagination – this is what Shakespeare’s plays could provide for readers tired of the over-formalized and rational classicist drama. [...] However, it also happened that this was a decisive place and time in the history of European literature. It was the moment when, along with the formation of the modern nation state, modern national literature was formed, having a canon of “geniuses” at its center. Shakespeare was not the founder of modern literature, but he was chosen by the founders.
To repeat Sobchuk, we have the emergence of a new form of political organization, the nation state, accompanied by the institutionalization of a literary canon to support that political form. Shakespeare was chosen to anchor that canon.
But there is more to Shakespeare’s suitability for this role than “an appeal to the senses, feelings, and imagination”.
Shakespeare and the invention of the modern
Let us turn for a moment to Harold Bloom:
Western psychology is much more a Shakespearean invention than a Biblical invention, let alone, obviously, a Homeric, or Sophoclean, or even Platonic, never mind a Cartesian or Jungian invention. It’s not just that Shakespeare gives us most of our representations of cognition as such; I’m not so sure he doesn’t largely invent what we think of as cognition. I remember saying something like this to a seminar consisting of professional teachers of Shakespeare and one of them got very indignant and said, You are confusing Shakespeare with God. I don’t see why one shouldn’t, as it were. Most of what we know about how to represent cognition and personality in language was permanently altered by Shakespeare.
Frankly, I think that’s a bit overwrought – “I’m not so sure he doesn’t largely invent what we think of as cognition.” But I do think Bloom is looking in the right place.
Sabine Hossenfelder, Mind the Gap Between Science and Religion, Nautilus, Sept. 16, 2019.
Have you heard that we may be living in a computer simulation? Or that our universe is only one of infinitely many parallel worlds in which you live every possible variation of your life? Or that the laws of nature derive from a beautiful, higher-dimensional theory that is super-symmetric and explains, supposedly, everything?
And finally, if you are really asking whether our universe has been programmed by a superior intelligence, that’s just a badly concealed form of religion. Since this hypothesis is untestable inside the supposed simulation, it’s not scientific. This is not to say it is in conflict with science. You can believe it, if you want to. But believing in an omnipotent Programmer is not science—it’s tech-bro monotheism. And without that Programmer, the simulation hypothesis is just a modern-day version of the 18th century clockwork universe, a sign of our limited imagination more than anything else.
It’s a similar story with all those copies of yourself in parallel worlds.
Danger! Will Robinson:
This blurring of the line between science and religion is not innocuous. Resources—both financial and human—that go into elucidating details of untestable ideas are not available for research that could lead to much-needed progress. [...] We have fought hard for secularism, and we don’t want religious leaders to meddle in scientific debate. Scientists, likewise, should respect the limits of their discipline.
Do you want to know how music works? Watch this clip of Hiromi Uehara taking a piano lesson.
Here are some quick and crude observations from an email I just sent to a friend:
A younger Hiromi takes a lesson. Alas, not in any language I can understand. But at least you get to see a lot of supple fingers. And the left-hand gesture at 10:22. This is getting really interesting. But you have to watch their whole bodies, no?
You know about the lizard brain, right? Well, within the lizard brain there’s the worm’s brain (the reticular activating system, RAS). How does the RAS modulate whole body motion and the “long line” of the performance? There must be some kind of hierarchy of motor activation, from whole body/trunk, to arms, to fingers, whatever.
And it’s interesting how easily they slip in and out of intense playing mode.
So much to observe. So much to understand.
And the moments they play together. You notice how sometimes the teacher only tracks Hiromi on a note here and there, not every note in the passage.
Sunday, September 15, 2019
In U.S. Presidential races, the popular-vote winner will lose 40% of elections decided by 2 million votes or less. Electoral College "inversions" have been likely since the 1800s, from @MikeGeruso, Dean Spears, and Ishaana Talesara https://t.co/mCxLHGut39 pic.twitter.com/6NmnRvyiVj — NBER (@nberpubs) September 15, 2019
Bumping this to the top of the queue. I'm thinking about classification of cultural items and it's useful to be reminded that "language" is an informal notion that doesn't hold up under careful analysis. That's what this post is about.

Yesterday I put up a post (A Note on Memes and Historical Linguistics) in which I argued that, when historical linguists chart relationships between things they call “languages”, what they’re actually charting is mostly relationships among phonological systems. Though they talk about languages, as we ordinarily use the term, that’s not what they actually look at. In particular, they ignore horizontal transfer of words and concepts between languages.
Consider the English language, which is classified as a Germanic language. As such, it is different from French, which is a Romance language, though of course both Romance and Germanic languages are Indo-European. However, in the 11th Century CE the Norman French invaded Britain and they stuck around, profoundly influencing language and culture in Britain, especially the part that’s come to be known as England. Because of their focus on phonology, historical linguists don’t register this event and its consequences. The considerable French influence on English simply doesn’t count because it affected the vocabulary, but not the phonology.
Well, the historical linguists aren’t the only ones who have a peculiar view of their subject matter. That kind of peculiar vision is widespread.
Let’s take a look at a passage from Sydney Lamb’s Pathways of the Brain (John Benjamins 1999). He begins by talking about Roman Jakobson, one of the great linguists of the previous century:
Born in Russia, he lived in Czechoslovakia and Sweden before coming to the United States, where he became a professor of Slavic Linguistics at Harvard. Using the term language in a way it is commonly used (but which gets in the way of a proper understanding of the situation), we could say that he spoke six languages quite fluently: Russian, Czech, German, English, Swedish, and French, and he had varying amounts of skill in a number of others. But each of them except Russian was spoken with a thick accent. It was said of him that, “He speaks six languages, all of them in Russian”. This phenomenon, quite common except in that most multilinguals don’t control as many ‘languages’, actually provides excellent evidence in support of the conclusion that from a cognitive point of view, the ‘language’ is not a unit at all.
Think about that. “Language” is a noun, nouns are said to represent persons, places, or things – as I recall from some classroom long ago and far away. Language isn’t a person or a place, so it must be a thing. And the generic thing, if it makes any sense at all to talk of such, is a self-contained ‘substance’ (to borrow a word from philosophy), demarcated from the rest of the world. It is, well, it’s a thing, like a ball, you can grab it in your metaphorical hand and turn it around as you inspect it.
I love these guys, and I love this particular clip, crude, rude, loud, and bombastic though it is. And funny too.
But I particularly like the passage that starts at about 3:23 and continues through 3:49, where the trumpets play in harmony, to about 4:14, where they're switching gears to move into the next section of the melody. That passage has been haunting me for the last two or three days. It pops into my inner ear and plays over and over until...it doesn't or until I listen to this clip.
Why? Well, I suppose in part because I'm a trumpeter and the trumpets get to soar. I know what it is to play like that. But you don't have to be a trumpeter to get there. You just have to like music.
* * * * *
I love these guys too, the whole clip. But they're not guys, most of them are girls–though there are a few guys here and there. I've been listening to this one for maybe 10 years. That long?
They clearly love what they're doing. They're serious about it. Above all, they're dignified. And, in a way, they're adult. Better, when they're performing they're no longer kids–for, as you can plainly see, that's what they are–they're just people making music.
No particular age. Age doesn't matter here.
Just musical beings.
Stephen Wertheim, The Only Way to End ‘Endless War’, NYTimes, Sept. 14, 2019:
Four years ago, President Barack Obama denounced “the idea of endless war” even as he announced that ground troops would remain in Afghanistan. In his last year in office, the United States dropped an estimated 26,172 bombs on seven countries.
President Trump, despite criticizing Middle East wars, has intensified existing interventions and threatened to start new ones. He has abetted the Saudi-led war in Yemen, in defiance of Congress. He has put America perpetually on the brink with Iran. And he has lavished billions extra on a Pentagon that already outspends the world’s seven next largest militaries combined.
Give up dominance:
Like the demand to tame the 1 percent, or the insistence that black lives matter, ending endless war sounds commonsensical but its implications are transformational. It requires more than bringing ground troops home from Afghanistan, Iraq and Syria. American war-making will persist so long as the United States continues to seek military dominance across the globe. Dominance, assumed to ensure peace, in fact guarantees war. To get serious about stopping endless war, American leaders must do what they most resist: end America’s commitment to armed supremacy and embrace a world of pluralism and peace. [...]
In theory, armed supremacy could foster peace. Facing overwhelming force, who would dare to defy American wishes? That was the hope of Pentagon planners in 1992; they reacted to the collapse of America’s Cold War adversary not by pulling back but by pursuing even greater military pre-eminence. But the quarter-century that followed showed the opposite to prevail in practice. Freed from one big enemy, the United States found many smaller enemies: It has launched far more military interventions since the Cold War than during the “twilight struggle” itself. Of all its interventions since 1946, roughly 80 percent have taken place after 1991.
Why have interventions proliferated as challengers have shrunk? The basic cause is America’s infatuation with military force. Its political class imagines that force will advance any aim, limiting debate to what that aim should be. Continued gains by the Taliban, 18 years after the United States initially toppled it, suggest a different principle: The profligate deployment of force creates new and unnecessary objectives more than it realizes existing and worthy ones. [...]
Despite Mr. Trump’s rhetoric about ending endless wars, the president insists that “our military dominance must be unquestioned” — even though no one believes he has a strategy to use power or a theory to bring peace. Armed domination has become an end in itself. Which means Americans face a choice: Either they should openly espouse endless war, or they should chart a new course.
Bring the soldiers home:
On its own initiative, the United States can proudly bring home many of its soldiers currently serving in 800 bases ringing the globe, leaving small forces to protect commercial sea lanes. It can reorient its military, prioritizing deterrence and defense over power projection. It can stop the obscenity that America sends more weapons into the world than does any other country. It can reserve armed intervention, and warlike sanctions, for purposes that are essential, legal and rare.
Shrinking the military’s footprint will deprive presidents of the temptation to answer every problem with a violent solution. It will enable genuine engagement in the world, making diplomacy more effective, not less. As the United States stops being a party to every conflict, it can start being a party to resolving conflicts.
On the question of why we continue to wage war "as challengers have shrunk", see TO WAR! Part 1: War and America's National Psyche and TO WAR! Part 2: A Marx Brothers Analysis of America's War Craziness.
Saturday, September 14, 2019
I’m an American so by another culture I mean, well... I wouldn’t consider Adam Roberts foreign, though he’s British. I know there are some (relatively minor) cultural differences despite the fact that we speak and write pretty much the same language. Would I think of Jules Verne as foreign? Probably not. Perhaps for me foreign means non-Western.
I’ve got two recent examples in mind: Cixin Liu, The Three-Body Problem (2006), and Tade Thompson, Rosewater (2016). If you changed obvious non-English and non-Western elements, such as names of people and places, would I have recognized either of these books as coming from a non-Western author? I don’t know. It’s not in the least obvious to me that I would. Oh, if pressed I might be able to make up something that makes one book Nigerian and the other Chinese, but I’d have no confidence in that. They’re both science fiction, and very different kinds of science fiction at that – Three-Body centers on computers and ranges over vast reaches of space and time, while Rosewater is set in a two-decade span later in this century and reads like a detective story – but it’s the SF that dominates my sense of either.
Does participation in the culture(s) of science fiction override ethnocentric thematics and resonances? Not necessarily universal, not at all, but cosmopolitan perhaps? Is science fiction an inherently cosmopolitan literary form? Or is it becoming, has it become, so?
* * * * *
And, yes, Thompson is an interesting case. He is a Yoruba who was born in London; his family moved to Nigeria while he was young. He is a native speaker of English. But then Nigeria is like that. English is the official language in a country where some 250 peoples live.
Language Log has an interesting post that contains most of a recent article from The Taiwan News (8.31.2019):
"World Civilization Research Association" (世界文明研究促進會) scholars are claiming that Western civilization originates from China and all European languages are merely Mandarin dialects, the Liberty Times reports.
World Civilization Research Association Vice President and Secretary-General Zhai Guiyun (翟桂鋆) said during an interview with Sina Online that some English words derive from Mandarin. For example, "yellow" resembles Mandarin for "leaf falling" (葉落, yeluo) because it is the color of autumn, while "heart" resembles "core" (核的, hede).
Zhai concluded this "proves" English is, in fact, a "dialect" of Mandarin. He further claimed that after Chinese formed the English language, Russian, French, German, and other European-based languages, went through a similar process of sinicization.
The World Civilization Research Association group of scholars are professors from a number of Chinese academic institutions. Association member Zhu Xuanshi (諸玄識) further claimed that Western civilization is a "sub-civilization" of Chinese culture.
He said Europeans "felt ashamed" due to the "fact" there was no history in Europe before the 15th century, compared to China. In an attempt to paper over this historical humiliation, the Europeans "fabricated" stories about ancient Egyptian, Greek, and Roman civilizations – all based on Chinese history.
World Civilization Research Association founder Du Gangjian (杜鋼建) said the organization has set up branches in the U.S., Canada, U.K., Thailand, South Korea, and Madagascar to "restore" the truth of world history. "Do not let fake, Western-centered history hinder the great Sino-Renaissance," he was quoted as saying.
They omitted the final paragraph:
Many Chinese citizens were unconvinced, however, with some mocking the association members by calling them “Wolf Warrior Scholars” – which references a patriotic Chinese movie. “Thanks, we can no longer laugh at the Koreans who claimed Confucius and Genghis Khan are Korean,” one commenter sardonically lamented.
When you scan down to the comments you find:
The author is not quite up to date. As P. Choudhury showed decades ago, the origins of Chinese language, culture, etc. have to be sought in ancient India:
Choudhury, P. Indian origin of the Chinese nation: a challenging, unconventional theory of the origin of the Chinese, Calcutta 1990.
What’s going on? I’m pretty sure that the people promulgating their ideas aren’t stupid or uneducated. So why these strange ideas? I think it’s nationalism gone amuck. And it seems to be akin to the sort of thinking that goes on in conspiracy theories of various kinds – flat earth, moon landings are fake, 9/11 was a false flag operation, etc. And, for that matter, it’s not unlike the narcissism that countermands the National Weather Service in favor of the Presidential Sharpie. It’s an attempt to make sense of the world albeit subject to certain preconditions.
So, if the Middle Kingdom is indeed the center of the universe, then it necessarily follows that Western Civilization is an offshoot of China. And for thousands of years it was reasonable for the Chinese to think of themselves as being the center. And now that China is on the rise, why not elide the embarrassments of the last two centuries or so and once again stand firm on that ancient truth?
I am a citizen of a large, rich, technologically advanced country, the remaining superpower, the United States of America. I may not like some of the things this superpower does in my name, I may have complex and ambivalent feelings about this superpower, but the fact that I am a citizen of the USofA is part of my identity, and it’s a way I think of myself in the world and in history.
What if you’re living in a nation-state that doesn’t have a very prominent place in world affairs, but you’ve got a TV and radio and through them have some sense of the larger world? You know that there’s a world beyond your village, or town, or your neighborhood. How do you relate to that larger world?
Some years ago I published an article about three post-WWII manga by Osamu Tezuka: “Dr. Tezuka’s Ontological Laboratory and the Discovery of Japan”. I argue that in these three manga, published between 1948 and 1951, Tezuka in effect creates a mythology which establishes Japan as a small peace-loving nation mediating between two warring superpowers. Only the third one, Next World, is overtly political. What interested me about these manga is that in them Tezuka reconstructs the whole world, the whole chain of being from top to bottom, thus giving them a science fiction cast. My point was that, as a conceptual device, nationalism pervades one’s worldview.
Given that, it’s not so hard to understand what this “World Civilization Research Association” is up to. Standard mythological business. Here’s another Language Log post about these scholar-mythographers.