Thursday, March 26, 2020

Public perception of the Trump administration's handling of the evolving coronavirus crisis

Shibley Telhami, Stella M. Rouse, How Are American Views of the Coronavirus Crisis Evolving as It Intensifies?, Lawfare, March 25, 2020. The authors conducted a nationally representative poll of 2,395 American adults from March 12 to March 20.
A plurality of respondents said they are unsatisfied with the Trump administration’s handling of the coronavirus (47 percent) while 42 percent said they are satisfied, and 10 percent said they are neither satisfied nor unsatisfied. As expected, there is a huge partisan divide, with 81 percent of Democrats saying they are unsatisfied, and 79 percent of Republicans saying they are satisfied. A slight majority of independents (51 percent) say they are dissatisfied, and 29 percent satisfied, with the Trump administration’s handling.
However, there was a small shift in attitudes over that period. The authors conclude:
These trends show that, as the crisis persists, more Americans are worried, but also slightly more Americans are satisfied with the administration’s handling of the crisis, transcending the partisan divide. At the same time, the lack of trust in President Trump has remained high, even increased. How does one explain the increase in American satisfaction—even if most Americans remain unsatisfied?

There are likely two principal reasons. First, the administration did move to announce some dramatic actions during this period—on March 13, Trump declared a national emergency; on March 14, the administration said that it would ban travel from the United Kingdom and Ireland; on March 16, the White House issued guidelines to Americans urging them to avoid bars and restaurants, to limit gatherings to 10 or fewer people, and to work from home and engage in homeschooling; and on March 18, the administration closed the U.S.-Canada border to nonessential traffic.

Second, as the crisis intensified, and Americans measurably became more worried—the president even called himself a “wartime president”—one might expect some rallying around the flag, even in a polarized time. In fact, the surprise here is not that there is some effect but that the effect seems relatively small in comparison to other crises: President George W. Bush, who had a contentious election in 2000, had an approval rating as low as 51 percent divided along partisan lines before the 9/11 attack; immediately after, his approval rating spiked to 90 percent.

Still, the full scale of the coronavirus crisis remains unknown, and, after announcing dramatic steps during the nine-day period of our poll, the president recently signaled he may soon change course. There is nothing we see in the poll to suggest that Americans will come together behind Trump’s handling of the crisis.

Wednesday, March 25, 2020

The President does not have the power to override state and local policies on social distancing

Trump can say what he wants, and that will affect those who believe in him and that "will translate into mounting political pressure on GOP officeholders at the state and local levels who might otherwise support these measures." And, as a practical matter, there's a limit to the government's ability to compel compliance with social-distancing measures. But the president does not have the legal authority to "order state and local officials to change their policies." Thus:
Our constitutional order has a federal structure, meaning that (a) federal powers are supreme, yes, but limited in scope and (b) the state governments are independent entities, not mere subordinate layers under and within the federal government (that is, the federal-state relationship is not similar to the way that counties and cities are subordinate layers under the state governments).

What follows from this? The federal government cannot commandeer the machinery of the state governments (or, by extension, of local governments). That is, the federal government cannot coerce the states into taking actions to suit federal policy preference. See, e.g., New York v. United States and Printz v. United States. And so, the federal government cannot compel state and local officials to promulgate different rules on social distancing and the like.
There's more at the link.

Cube Smart

Tuesday, March 24, 2020

Ghost Dancing in the USA [Magical thinking, Trump and Covid-19]

Or, Why the Old Myths and Magic Don’t Work Anymore
I published this in Buffalo Report fifteen years ago on 1 March 2005 (the URL now belongs to someone else and the old Buffalo Report is defunct). It’s about the collapse of the symbol systems that made the nation a coherent political body. As such, it remains relevant. Consider it as both precursor to and follower of The King's Phallus: Gold or Lead? and War and America's National Psyche.
In 1889 a young Paiute Indian named Wovoka fell ill with a fever and, in his delirium, visited heaven. While there he talked with God and saw that all the Indians who had died were now young and happy doing the things they had done before the White Man had come upon them. News of the new messiah spread rapidly among the remnants of the Indian tribes. If they danced the right dances, sang the right songs, and wore their consecrated Ghost Shirts, not only would they be immune to the White Man’s bullets, but their loved ones would return to them, the White Man would vanish from the face of the earth, and the buffalo would once again be plentiful. Their fervor and belief were not rewarded and the Ghost Dance, as this last wave of revivals came to be known, soon passed into history.

That, however, is not the Ghost Dancing that concerns me. I mention it only to provide some comparative perspective. Anthropologists and historians have told that story hundreds if not thousands of times. It is the story of a people’s last desperate attempt to retain symbolic control over their world. Such revivals occur when a way of life has become impossible, for whatever reason, but the people themselves continue to live. In desperation they resort to magic to remake the world in terms they understand.

The Ghost Dancing that concerns me is not that of Stone Age people displaced and conquered by iron-mongering and coal-burning industrialists. My concern is the Ghost Dancing that has become a major force in contemporary American cultural and political life. Widespread belief in the impending Rapture – when all good Christians will be taken to heaven and all unbelievers consigned to hell – is the most obvious manifestation of the contemporary Ghost Dance. But it is hardly the only manifestation. Refusal to accept evidence of global warming is another symptom, as is refusal to attend to ground intelligence in conducting the war and reconstruction in Iraq.

For that matter, belief that the so-called Singularity is at hand – when computers will surpass humans in intelligence – is Ghost Dancing as well. This type of Ghost Dancing may seem rather geekish and harmless, for there aren’t all that many of these particular believers. Belief in the Singularity, however, is close kin to continued belief in the feasibility of the Star Wars anti-missile defense systems, in the Pentagon’s desire to develop a highly robotized military where the machines do the riskiest jobs, and in a more general belief that technology will fix everything.

Contemporary American Ghost Dancing has not, of course, been occasioned by colonialism or conquest. The modern American way of life has not been destroyed by external enemies. America has become and remains the mightiest nation on earth. Our vulnerability has subtler sources.

The American way of life has indeed suffered grievously in the past half-century. Perhaps the most substantial assault has been to our economy, with industrial and manufacturing jobs going overseas to be replaced by service jobs and high tech jobs. That has set off rolling economic displacement that will continue for the foreseeable future as many service and high tech jobs follow the steel industry to Asia and elsewhere.

The civil rights movement also forced major change. The struggle between liberty and racism is deep in the American soul, deeper than we can as yet comprehend. The civil rights movement changed America’s political culture in ways both good and unfortunate, and helped catalyze a wide range of social and cultural transformations that we too easily summarize as “the sixties.”

Finally, there is the collapse of the Soviet empire. During the Cold War Americans were encouraged to lavish animosity and hatred on the Soviets, their allies, and their clients. When that enemy dissolved, the animosity had to be redirected. I suspect that some of it was channeled into the War on Drugs, and, more generally, into the so-called Culture Wars. That is to say, we devoted more time and effort to fearing our immediate neighbors rather than our distant enemies.

The upshot of these events is injury to the beliefs and attitudes of many Americans as deep as that forced on many of the world’s peoples through conquest and colonialism. The world is changing in ways most Americans cannot control, for reasons they cannot grasp, and there seems little hope for a future filled with the familiar comforts of home. It is thus not surprising that we are retreating behind revivalisms of various kinds. That is what societies do when their lifeways collapse.

It is not at all clear that America has the economic, political, social and cultural reserves needed to stop the retreat and once again to face the future with imagination and a realistic optimism. Our response to the bombing of the World Trade Center does not augur well. We have set out to fight a vaguely defined enemy – terrorism – of unbounded scope and magnitude while doing little of substance to address the vulnerability of our airlines, our borders, and our ports. These actions and inactions – and a host more – are those of an administration that is Ghost Dancing into the future with its eyes fixed firmly on the past, and a largely imaginary past at that. It is possible that, in the manner of an alcoholic who must “hit bottom” before he can cure himself, this administration, and perhaps its anointed successors, will simply continue on this path until even the most vigorous Ghost Dancing loses all plausibility.

When that happens, what form will “hitting bottom” take? Who will remain to pick up the pieces? Will the Europeans and the Chinese formulate a twenty-first century Marshall Plan for the rehabilitation of the United States? Is it possible that, the sooner we admit to hitting bottom, the sooner we can join the rest of the world and move forward together?

Monday, March 16, 2020

Treasure trove of interviews with jazz musicians [Monk Rowe]

Check out the Fillius Jazz Archive at Hamilton College, which has 25 years of live interviews with jazz musicians conducted by Monk Rowe. Many of the interviews have been transcribed and the transcriptions are downloadable.

This clip has excerpts from ten of these interviews: Jon Hendricks, Kenny Davern, Nat Adderley, Annie Ross, Nicki Parrott, Eiji Kitamura, Charlie Gabriel, Eddie Locke, Denis DiBlasio, and Frank Foster (my teacher years ago).

Here's the full Frank Foster interview:

Mindset theory (?)

Sunday, March 15, 2020

Death, Terror Management, and Nationalism [#pandemic]

In Beethoven’s Anvil I cited Kierkegaard, Ernest Becker and Franz Borkenau in asserting (p. 90): “Our intelligence allows us to know that we will die, and the rituals through which we mark death are among the most important and intense we perform. I suggest that without such rituals, death threatens to become a psychological trap for the living. Periodic participation in ritual musicking reduces one’s sense of isolation and attaches one to the group, as Freeman has suggested, making one’s individual fate a matter of less concern.”

There is a body of theory more or less devoted to arguing that culture is primarily a device for dealing with our fear of death, Terror Management Theory. Here’s the opening of the Wikipedia entry:
In social psychology, terror management theory (TMT) proposes a basic psychological conflict that results from having a desire to live but realizing that death is inevitable. This conflict produces terror, and is believed to be unique to human beings. Moreover, the solution to the conflict is also generally unique to humans: culture. According to TMT, cultures are symbolic systems that act to provide life with meaning and value. Cultural values therefore serve to manage the terror of death by providing life with meaning.
At roughly the same time I came across an article with a very long title: Philip T. Hoffman, Why was it that Europeans conquered the rest of the world? The politics and economics of Europe’s comparative advantage in violence (PDF). The article argues that while it is not clear, in general, just when “Western Europe first forged ahead of other parts of the world,” it is clear that in one area, the ability to wage war, Europe had “an undeniable comparative advantage well before 1800...” While the whole argument is interesting, I’m interested in one sentence, from page 11: “In an era before nationalism motivated troops, armies had to be centralized, for if soldiers (many of whom were mercenaries) were scattered across a country, desertions would soar.”

There it is, our old friend death. Nationalism made a difference in how states could motivate their troops. Nationalism is one of those cultural inventions that distances us from death.

Thursday, March 12, 2020

Last fall's camping trip

And when the end comes...(of the world)

Agnes Callard, The End is Coming, The Point, March 11, 2020.
How long have we got? At a recent public talk, the economist Tyler Cowen spitballed the number of remaining years at 700. But who knows? The important thing is that the answer is not: infinity years. Forever is a very long time, and humanity is not going to make it.

A crisis of meaning looms, one that will only deepen as we feel ourselves approaching the end. The Schefflerian edifice is doomed to collapse. Just as the thought that other people might be about to stockpile food leads to food shortages, so too the prospect of a depressed, disaffected and de-energized distant future deprives that future of its capacity to give meaning to the less distant future, and so on, in a kind of reverse-snowball effect, until we arrive at a depressed, disaffected and de-energized present.

The last generation is the linchpin of the whole system. But how can their lives have meaning, if the mere thought of the abyss sends a person collapsing into panic and depression? The answer is that the last generation is going to have to be composed of people better and braver than we are now—and it is our job to help them end up that way. We must take the first steps toward learning to make the unthinkable thinkable, so that they can take the last ones.

On 9/11, some of the passengers on United Airlines Flight 93 did something very heroic: they rose up against the terrorists holding them hostage, with the result that their plane crashed into a field rather than the Capitol building. Viewed from a certain angle, you might wonder why this was so impressive: if you know you are going to die either way, why not do some good while you are at it? But this would be a mistake. It takes incredible energy, passion and conviction to rush at your captors, and mustering all that up in the face of the certainty of death is an astonishing feat. Courage means that things can still matter to you—a lot—even when you know you are going to die. Courage means seeing the value of your life as being about more than survival—living ethically, not merely biologically.

Tuesday, March 10, 2020

Oprah and Trump, two sides of the same coin?

Natasha Zaretsky, The Odd Couple: Donald Trump, Oprah Winfrey, and Contemporary Charisma, The Hedgehog Review, Spring 2020:
How, then, do we explain the respective charismas of Oprah Winfrey and Donald Trump? I propose that their charismatic powers make sense only in light of the dramatic shift in authority relations that has been underway since the 1970s. During the last five decades, traditional racial, gender, and sexual hierarchies have toppled as women, people of color, and sexual minorities have gained greater visibility in public life. At the same time, economic and social inequality has sharpened while the distribution of wealth has become precariously asymmetrical, workers’ rights have been obliterated, and public goods like health care, education, and housing have been degraded. We thus find ourselves moving simultaneously forward and backward in time: forward into a public sphere that is more gender egalitarian, more multiracial, and more sexually capacious; and backward into a winner-take-all economy and culture that is often described as a new Gilded Age. This backward-forward motion is the product of a comprehensive shift in authority relations. Some, such as those within the traditional family, have loosened, while others, such as those that revolve around property, have tightened. This is what we might call the neoliberal paradox, and it is a defining feature of our time.

On the surface, it appears that Winfrey and Trump reflect the two sides of this paradox, with Winfrey capturing the dissolution of traditional gender and racial hierarchies and Trump symbolizing the boss’s ever-tightening grip. But Winfrey and Trump are not oppositional figures. Rather, each signals the simultaneously occurring breakdown of patriarchal authority and consolidation of market forces throughout the society. Consequently, Winfrey and Trump work with rather than against each other by accelerating a historical transition underway in the late capitalist family and workplace.

Winfrey and Trump harnessed the energies unleashed by the gender revolution of the late twentieth century to consolidate their charismatic authority. Over the course of their careers, they have used the medium of television to translate these energies into lessons for their followers about how to navigate life in workplaces that are at once more meritocratic and more predatory. Ultimately, as contemporary charismatic leaders par excellence, Winfrey and Trump fulfill a crucial need among their devotees: They guide them as they live through the dissolution of patriarchy and the intensification of market fundamentalism and economic inequality.
At the end of the article:
If we take seriously Weber’s insights about charismatic authority, the two charismas of Winfrey and Trump track both the breakdown of patriarchal authority and the consolidation of market domination in contemporary life. This tale of two charismas reveals that twenty-first-century capitalism legitimates itself in the midst of so much predation by manipulating and exploiting the antiauthoritarian energies unleashed by the gender revolution, harnessing those energies rather than suppressing them. Even as Winfrey and Trump appeared estranged from the social movements that took off in the 1960s, both drew on those movements without realizing that that was what they were doing. And both relied on the intimate, pervasive medium of television to reroute the energies of those movements in ways that have ended up strengthening the winner-take-all ethos of market fundamentalism in our social and cultural imaginary.

It may be tempting to see Winfrey as the embodiment of everything good about our age and Trump as the embodiment of everything bad. But in the end, both endorse the same belief: that there are only winners and losers. Winfrey’s cruelty is shrouded in therapeutic language, while Trump is bald-faced about the brute forces that pervade society. The world according to Trump is one of tough operators and cutthroat financial killers, “the kind of people who leave blood all over the boardroom table.”26 While most American workers today do not move in Trump’s circles, they do inhabit workplaces where, no matter how hard they work, their fates are determined by forces beyond their control, and they experience life as largely a series of accidents, contingencies, lucky breaks, and sudden reversals of fortune.

Winfrey rejects this grim take on the winner-take-all society. While she rose to fame by tearing back the curtain on the ugly side of heterosexual relations, she has gone on to cultivate a self-help philosophy that insists that people create their own realities. Winfrey hates the concept of luck and considers herself in touch with the divine. “Luck is a matter of preparation,” she has said. “I am highly attuned to my divine self.” The callousness of this perspective came into sharp relief when she once suggested to Elie Wiesel that his survival at Auschwitz constituted a direct miracle from God. “If a miracle of God to spare me, why?” he countered. “There were people much better than me…. No, it was an accident.”

In Winfrey’s world there are no accidents; everything happens for a reason. In Trump’s world, might makes right and coercion rules. Trump’s worldview resonates with the lived experiences of workers, while also trafficking in the seductive fantasy that a baby can become the boss. Winfrey also offers a fantasy figure: a guardian angel with whose help working people imagine they might escape. But neither a dealmaker nor a fairy godmother offers us a way out of our new Gilded Age.

Sunday, March 8, 2020

Music education in the USA in the 1920s and 30s [#LingLing40hours]

Friday, March 6, 2020

Wes Montgomery, RIP

Tuesday, March 3, 2020

How groups of animals make decisions

Elizabeth Preston, Sneezing Dogs, Dancing Bees: How Animals Vote, NYTimes, March 2, 2020:
Any animal living in a group needs to make decisions as a group, too. Even when they don’t agree with their companions, animals rely on one another for protection or help finding food. So they have to find ways to reach consensus about what the group should do next, or where it should live. While they may not conduct continent-spanning electoral contests like this coming Super Tuesday, species ranging from primates all the way to insects have methods for finding agreement that are surprisingly democratic.
Preston discusses meerkats, honeybees, African wild dogs, rock ants, and baboons:
Primates, our closest relatives, have provided lots of material for researchers studying how groups make decisions. Scientists have seen gibbons following female leaders, mountain gorillas grunting when they’re ready to move and capuchins trilling to each other.

Sometimes the process is more subtle. A group may move across the landscape as a unit without any obvious signals from individuals about where they’d like to go next. To figure out how wild olive baboons manage this, the authors of a 2015 paper put GPS collars on 25 members of one troop in Kenya. They monitored the monkeys’ every step for two weeks. Then they studied the movements of each individual baboon in numerous combinations to see who was pulling the group in new directions.

The data showed that any baboon might start moving away from the others as if to draw them on a new course — male or female, dominant or subordinate. When multiple baboons moved in the same direction, others were even more likely to come along. When there was disagreement, with trailblazing baboons moving in totally different directions, others would eventually follow the majority. But if two would-be leaders were tugging in directions less than 90 degrees apart, followers would compromise on a middle path. No matter what, the whole group ended up together.
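The decision rule reported in that passage can be captured in a few lines. This is a toy sketch, not the study's actual analysis code; the function name and the way ties are broken are my own assumptions. It illustrates the reported behavior: when two pulling directions differ by less than 90 degrees, followers split the difference; otherwise they commit to one direction.

```python
def follower_heading(a: float, b: float) -> float:
    """Toy model of the reported baboon rule.

    a, b -- headings of two would-be leaders, in degrees (0-360).
    Returns the heading a follower adopts: the midpoint of the two
    directions if they differ by less than 90 degrees, otherwise
    one leader's heading (arbitrarily, the first).
    """
    # Signed smallest rotation from a to b, in (-180, 180]
    delta = (b - a + 180) % 360 - 180
    if abs(delta) < 90:
        # Compromise: midpoint along the shorter arc
        return (a + delta / 2) % 360
    # Disagreement too large: follow one leader outright
    return a

# Leaders 60 degrees apart: followers take the middle path
print(follower_heading(0, 60))    # 30.0
# Leaders pulling opposite ways: followers pick one
print(follower_heading(0, 180))   # 0
```

Note that the midpoint is computed along the shorter arc, so headings that straddle 0 degrees (e.g. 350 and 10) average to 0 rather than 180.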

Ariana Strandburg-Peshkin, an animal-behavior researcher at the University of Konstanz in Germany who led the baboon study, points out that unlike in humans, no one authority tallies up baboon votes and announces the result. The outcome emerges naturally. But the same kind of subtle consensus-building can be part of our voting process, too.

“For instance, we might influence one another’s decisions on who to vote for in the lead-up to an election, before any ballots are cast,” she said.

Newborns 'wired' for face and scene recognition

Saturday, February 29, 2020

Netflix goes to Africa: Queen Sono

Friday, February 28, 2020

The Hudson, an orange buoy, and the George Washington Bridge

The world is changing, and increasingly uncertain

Farhad Manjoo, Admit It: You Don’t Know What Will Happen Next, NYTimes, Feb 26, 2020.
A projection of certainty is often a crucial part of commentary; nobody wants to listen to a wishy-washy pundit. But I worry that unwarranted certainty, and an under-appreciation of the unknown, might be our collective downfall, because it blinds us to a new dynamic governing humanity: The world is getting more complicated, and therefore less predictable.

Yes, the future is always unknowable. But there’s reason to believe it’s becoming even more so, because when it comes to affairs involving masses of human beings — which is most things, from politics to markets to religion to art and entertainment — a range of forces is altering society in fundamental ways. These forces are easy to describe as Davos-type grand concepts: among others, the internet, smartphones, social networks, the globalization and interdependence of supply chains and manufacturing, the internationalization of culture, unprecedented levels of travel, urbanization and climate change. But their effects are not discrete. They overlap and intertwine in nonlinear ways, leaving chaos in their wake.

In the last couple of decades, the world has become unmoored, crazier, somehow messier. The black swans are circling; chaos monkeys have been unleashed. And whether we’re talking about the election, the economy, or most any other corner of humanity, we in the pundit class would do well more often to strike a note of humility in the face of the expanding unknown. We ought to add a disclaimer to everything we say: “I could be wrong! We all could be wrong!”

Thursday, February 27, 2020

Norman Rockwell gets political

Hudson River, Verrazano-Narrows Bridge in the distance

A guerrilla attack on (bogus) copyright claims over musical melodies: Create them all and release them into the public domain

Two programmer-musicians wrote every possible MIDI melody in existence to a hard drive, copyrighted the whole thing, and then released it all to the public in an attempt to stop musicians from getting sued.

Programmer, musician, and copyright attorney Damien Riehl, along with fellow musician/programmer Noah Rubin, sought to stop copyright lawsuits that they believe stifle the creative freedom of artists.

Often in copyright cases for song melodies, if the artist being sued for infringement could have possibly had access to the music they're accused of copying—even if it was something they listened to once—they can be accused of "subconsciously" infringing on the original content. One of the most notorious examples of this is Tom Petty's claim that Sam Smith's “Stay With Me” sounded too close to Petty's “I Won’t Back Down." Smith eventually had to give Petty co-writing credits on his own chart-topping song, which entitled Petty to royalties.

Defending a case like that in court can cost millions of dollars in legal fees, and the outcome is never assured. Riehl and Rubin hope that by releasing the melodies publicly, they'll prevent a lot of these cases from standing a chance in court.

In a recent talk about the project, Riehl explained that to get their melody database, they algorithmically determined every melody contained within a single octave.
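The enumeration idea itself is simple brute force. Here is a minimal sketch of the approach, not Riehl and Rubin's actual code: treat a melody as a fixed-length sequence of pitches drawn from one octave, and take the Cartesian product. (The pitch set, the MIDI note numbers, and the melody length are illustrative assumptions; the real project's parameters may differ.)

```python
from itertools import product

# One octave of MIDI note numbers, C4 through B4 (assumed pitch set)
PITCHES = list(range(60, 72))

def melodies(length):
    """Yield every melody of `length` notes drawn from PITCHES.

    There are 12 ** length such melodies, so this grows fast:
    enumerate lazily rather than building a list for large lengths.
    """
    yield from product(PITCHES, repeat=length)

# For 3-note melodies there are 12 ** 3 = 1,728 possibilities
print(sum(1 for _ in melodies(3)))  # 1728
```

The combinatorics are the whole point: even short melodies number in the billions once the length reaches eight or more notes, which is why the pair wrote the output straight to disk rather than holding it in memory.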

Monday, February 24, 2020

Wynton Marsalis on the difference between African rhythm and jazz rhythm

Ethan Iverson [EI] is interviewing Wynton Marsalis [WM] about his composition Congo Square, which combines the Lincoln Center Jazz Orchestra with Odadaa!, a West African drum ensemble led by Yacub Addy. At various points in the interview Iverson plays a short clip for Marsalis. He did so just before this passage:
EI: Now, what is that break?

WM: Carlos Henriquez showed me that one. If you hear it in 4, it’s easy, but if you hear it in 6, it’s hard. But in 4 it is square, right on the beat, but maybe we “place” them a little bit. We have to adjust to the 6 Odadaa! is playing, especially since they are in the middle of a phrase. As conductor, I adjust to the bell.

EI: That’s a mysterious moment; that’s why I like it so much.

WM: The hardest thing is to get us to play with the bell pattern.

EI: The up-and-down of the beat is not American.

WM: No, it’s not, it’s more like a clave. And like Yacub told me: “In order for us all to play together, y’all will have to play with us.” For me, it was a blessing to have Carlos and Ali [Jackson], who spent a lot of time at night working it out; a real labor of love. They would sit up with me and go through rhythm patterns and say, “No, that’s not it.” Then, eventually, “This is it.”
YES! to this:  "The up-and-down of the beat is not American." I learned that from years of playing with the late Ade Knowles when I was living in Troy, New York. Early in his career Ade had toured as a drummer and percussionist with Gil Scott-Heron. I met him when he was an administrator at RPI (Rensselaer Polytechnic Institute) and I was on the faculty. "The Magic of the Bell" is a piece I wrote about a particularly magical rehearsal with Ade.

Abstract words in prose fiction [#DH]

Friday, February 21, 2020

Of telomeres, senescence, cancer, laboratory mice, evolutionary biology, and institutional failure: From the life of Bret Weinstein

This is a long podcast (over two hours), and it takes awhile to get off the ground, but it's worth your attention.

About the podcast:
All of our Mice are Broken.

On this episode of The Portal, Bret and Eric sit down alone with each other for the first time in public. There was no plan.

There was, however, a remarkable story of science at both its best and worst that had not been told in years. After an initial tussle, we dusted off the cobwebs and decided to reconstruct it raw and share it with you, our Portal audience, for the first time. I don't think it will be the last as we are now again looking for our old notes to tighten it up for the next telling. We hope you find it interesting, and that it inspires you younger and less established scientists to tell your stories using this new medium of long form podcasting. We hope the next place you hear this story will be in a biology department seminar room in perhaps Cambridge, Chicago, Princeton, the Bay Area or elsewhere. Until then, be well and have a listen to this initial and raw version.

Louis Armstrong on the cover of Time Magazine

Thursday, February 20, 2020

The egalitarian proclivities of Louis Armstrong

M.H. Miller, Louis Armstrong, The King of Queens, NYTimes, 20 Feb 2020:
Armstrong was born in New Orleans in 1901, dropped out of school as a child and was a successful touring musician in his early 20s. By 1929, he was living in Harlem, though as one of the most popular recording artists in the country, he traveled about 300 nights a year. In 1939, he met his fourth and final wife, Lucille Wilson, a dancer at Harlem’s Cotton Club. Lucille, who spent part of her childhood in Corona, decided it was time for her husband to settle down in a house, a real house, instead of living out of hotel rooms. (Even their wedding took place on the road, in St. Louis, at the home of the singer Velma Middleton.) One day, when Armstrong was away at a gig, she put a down payment of $8,000 (around $119,000 in today’s money) on 34-56 107th Street. She didn’t tell him she’d done this until eight months later, during which time she made the mortgage payments herself. [...]

From the outside, the two-bedroom, 3,000-square-foot house looks just like any other on the block, which was deliberate. Armstrong often referred to himself as “a salary man” and felt at ease alongside the telephone operators, schoolteachers and janitors of Corona, a neighborhood that, in a testament to how much of his life was spent in jazz clubs, he referred to affectionately as “that good ol’ country life.” One of the earliest integrated areas of New York, Corona was mostly home to middle-class African-Americans and Italian immigrants when the Armstrongs moved in. The demographics would change in the coming decades — Latin Americans began replacing the Italians in the ’60s, and now make up most of the neighborhood — but not much else. There was never a mass wave of gentrification or development here, and Armstrong himself was so concerned with blending in with his working-class neighbors that when his wife decided to give the house a brick facade, Armstrong went door-to-door down the block asking the other residents if they wanted him to pay for their houses to receive the same upgrade. (A few of his neighbors took him up on the offer, which accounts for the scattered presence of brick homes on the street to this day.) [...]

He played behind the Iron Curtain during the Cold War and in the Democratic Republic of Congo during decolonization in 1960, during which both sides of a civil war called a truce to watch him perform, then picked up fighting again once his plane took off. There are few American figures as legendary and beloved, and yet, as Harris told me, a common reaction people have upon entering his home is, “This reminds me of my grandmother’s house.” Certainly the living room recalls a ’60s vision of Modernism with a vaguely minimalist formality.

Wednesday, February 19, 2020

Illustrated Japanese books from 1600-1912

Surveillance tech is deeply flawed [[Surprise! Surprise!]]

Charlie Warzel, All This Dystopia, and for What?, NYTimes 20 Feb 2020.
The above examples all represent a different, equally troubling brand of dystopia — one full of false positives, confusion and waste. In these examples the technology is no less invasive. Your face is still scanned in public, your online information is still leveraged against you to manipulate your behavior and your financial data is collected to compile a score that may determine if you can own a home or a car. Your privacy is still invaded, only now you’re left to wonder if the insights were accurate.

As lawmakers ponder facial recognition bans and comprehensive privacy laws, they’d do well to consider this fundamental question: Setting aside even the ethical concerns, are the technologies that are slowly eroding our ability to live a private life actually delivering on their promises? Companies like NEC and others argue that outright bans on technology like facial recognition “stifle innovation.” Though I’m personally not convinced, there may be kernels of truth to that. But before giving these companies the benefit of the doubt, we should look deeper at the so-called innovation to see what we’re really gaining as a result of our larger privacy sacrifice.

Friday, February 14, 2020

Romantic kissing is not universal

From pop culture to evolutionary psychology, we have come to take kissing for granted as universally desirable among humans and inseparable from other aspects of affection and intimacy. However, a recent article in American Anthropologist by Jankowiak, Volsche and Garcia questions the notion that romantic kissing is a human universal by conducting a broad cross-cultural survey to document the existence or non-existence of the romantic-sexual kiss around the world.

The authors based their research on a set of 168 cultures compiled from eHRAF World Cultures (128 cultures) as well as the Standard Cross Cultural Sample (27 cultures) and by surveying 88 ethnographers (13 cultures). The report’s findings are intriguing: rather than an overwhelming popularity of romantic smooching, the global ethnographic evidence suggests that it is common in only 46% (77) of the cultures sampled. The remaining 54% (91) of cultures had no evidence of romantic kissing. In short, this new research concludes that romantic-sexual kissing is not as universal as we might presume.

The report also reveals that romantic kissing is most common in the Middle East and Asia, and least common of all among Central American cultures. Similarly, the authors state that “no ethnographer working with Sub-Saharan African, New Guinea, or Amazonian foragers or horticulturalists reported having witnessed any occasion in which their study populations engaged in a romantic–sexual kiss”, whereas it is nearly ubiquitous in northern Asia and North America.

In addition, cross-cultural ethnographic data was used to analyze the relationship between any presence of romantic kissing and a culture’s complexity of social stratification. The report finds that complex societies with distinct social classes (e.g. industrialized societies) have a much more frequent occurrence of this type of kissing than egalitarian societies (e.g. foragers).
More at the link (H/t Tyler Cowen).
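As a quick arithmetic check on the figures quoted above (a sketch of my own, not from the article), the three sources do sum to the stated sample, and the reported percentages follow from the counts:

```python
# Figures as quoted: cultures drawn from three sources.
ehraf, sccs, ethnographers = 128, 27, 13
total = ehraf + sccs + ethnographers          # sample size
kissing, no_kissing = 77, 91                  # cultures with / without romantic kissing

assert kissing + no_kissing == total == 168
print(f"present: {kissing/total:.0%}, absent: {no_kissing/total:.0%}")
# → present: 46%, absent: 54%
```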

Thursday, February 13, 2020

Conjunctions: transportation and graffiti [Jersey City]

Bernie Sanders isn't a socialist

The thing is, Bernie Sanders isn’t actually a socialist in any normal sense of the term. He doesn’t want to nationalize our major industries and replace markets with central planning; he has expressed admiration, not for Venezuela, but for Denmark. He’s basically what Europeans would call a social democrat — and social democracies like Denmark are, in fact, quite nice places to live, with societies that are, if anything, freer than our own.

So why does Sanders call himself a socialist? I’d say that it’s mainly about personal branding, with a dash of glee at shocking the bourgeoisie. And this self-indulgence did no harm as long as he was just a senator from a very liberal state.

But if Sanders becomes the Democratic presidential nominee, his misleading self-description will be a gift to the Trump campaign. So will his policy proposals. Single-payer health care is (a) a good idea in principle and (b) very unlikely to happen in practice, but by making Medicare for All the centerpiece of his campaign, Sanders would take the focus off the Trump administration’s determination to take away the social safety net we already have.

Has "civilization" entered a phase of decadence?

That's what Ross Douthat argues in his new book, The Decadent Society: How We Became the Victims of Our Own Success, which Damon Linker reviews in The Week, February 13, 2020. Decadence?
By calling us "decadent," Douthat doesn't mean that we're succumbing to imminent decline and collapse. Following esteemed cultural critic Jacques Barzun, Douthat instead defines decadence as a time when art and life seem exhausted, when institutions creak, the sensations of "repetition and frustration" are endemic, "boredom and fatigue are great historical forces," and "people accept futility and the absurd as normal."

Douthat goes on to refine the definition:
Decadence refers to economic stagnation, institutional decay, and cultural and intellectual exhaustion at a high level of material prosperity and technological development. It describes a situation in which repetition is more the norm than innovation; in which sclerosis afflicts public institutions and private enterprises alike; in which intellectual life seems to go in circles; in which new developments in science, new exploratory projects, underdeliver compared with what people recently expected. And crucially, the stagnation and decay are often a direct consequence of previous development: the decadent society is, by definition, a victim of its own significant success.
Douthat certainly isn't a favorite of mine, and I've got problems with the word "decadent", but that description is consistent with my own view, based on the theory of cultural ranks that David Hays and I developed, that we're exhausting the cultural resources we've inherited but have not yet managed to invent new modes of thinking, feeling, living, and exploring.

Near the end Linker observes:
Interestingly, one way to describe the populist insurgencies taking place around us is to say that they're a rebellion against the decadence of the post-Cold War world — the sense that history came to an end in 1989, with all significant ideological disputes resolved and politics reduced to the fine-tuning of liberal democratic government. Francis Fukuyama's own high-level punditry on the subject was actually far more ambivalent than it's usually credited with being. Although Fukuyama argued that liberal democracy triumphed over communism because it was more capable of fulfilling humanity's material and spiritual needs than any other political and economic system, he also worried with uncanny prescience that a world in which liberal democracy was the only available option could be marked by boredom, repetition, and sterility — and that the intolerable character of such decadence could inspire anti-liberal movements that aimed to restart history once again.

Douthat's book can be read as a melancholy sequel to Fukuyama's The End of History and the Last Man that confirms the author's darkest predictions but without endorsing (or seriously wrestling with) any of the concrete efforts going on around us to overcome our own malaise by breaking away from decadent liberalism — whether it's Donald Trump's MAGA presidency, the Catholic conservatism of Poland's Law and Justice Party, Marion Maréchal's National Rally in France, the National Conservatism spearheaded by Yoram Hazony, or Viktor Orban's anti-liberal and pro-natalist populism in Hungary. Given that Douthat is a conservative who longs for renewal, rebirth, and revitalization — for an end to the decadence he thinks plagues us — it's surprising that he has so little to say about these efforts in the book. [...]

Douthat sees a lot, and far more than most of our less profoundly discontented commentators. That makes him an excellent pundit — maybe the best of our moment. But in his new book he also avoids a forthright confrontation with the political correlates of his own moral, aesthetic, intellectual, and spiritual dissatisfactions. In its place we find idle speculations about alternative realities. Which may mean that, for all its strengths, Douthat's book about decadence is more than a little decadent itself.
That is to say that Douthat is himself trapped in the same exhausted cultural forms. 

Who among us isn't?

Wednesday, February 12, 2020

Pandemics and cooperation between nation-states

Thomas Bollyky and Samantha Kiernan, No Nation Can Fight Coronavirus on Its Own, Lawfare, February 12, 2020: "Infectious diseases were the first global problem that nation-states realized they could not solve without international cooperation." This came about in the mid-19th century:
For most of human history, plagues, parasites and pests were a domestic affair. Quarantine was the principal means by which nations contained the microbes that were brought by invading armies and the passengers, both human and vermin, on trading ships and caravans.

Those isolation measures proved ineffective, however, against the six pandemics of cholera that swept the United States, the Middle East, Russia and Europe in the 19th century. A terrifying disease that struck seemingly healthy people, cholera killed tens of thousands in the cities of Europe and the United States—and, very likely, many more in India, where the pandemics originated. The economic costs of uncoordinated quarantines hurt nations and merchants alike.

In 1851, European states gathered for the first International Sanitary Conference to discuss cooperation on cholera, plague and yellow fever. That convention, and those that followed, led to the first treaties on international infectious disease control and—in 1902—the International Sanitary Bureau, which later became the Pan American Health Organization. These international initiatives were the early models for later agreements and agencies on other transnational concerns, such as pollution, the opium trade and unsafe labor practices.

Microbes have continued to inspire episodes of cooperation among even bitter rivals. The WHO, the United Nation’s first specialized agency, was created in 1946 in response to the horrors of World War II. Its early days were devoted to international campaigns against the great scourges of that era, such as malaria, smallpox and tuberculosis. At the height of the Cold War, the smallpox immunization campaign motivated the United States and the Soviet Union to join forces in an effort that succeeded in eradicating the disease in 1980. In El Salvador, an international vaccination campaign against pediatric infections led to a pause in the country’s 14-year civil war for the sole purpose of immunizing children.
And the current coronavirus epidemic?
There is much we do not know yet about how easily the virus spreads or its severity. But there is reason to think that the scale of this coronavirus outbreak and the likelihood of epidemics of the virus occurring outside China may inspire more cooperation than even the five previous occasions that the WHO designated as international public health emergencies: the H1N1 influenza pandemic (2009), the re-emergence of polio in several nations (2014), the Ebola outbreak in West Africa (2014), the Zika virus outbreak (2016) and the Ebola virus outbreak in the Democratic Republic of Congo (2019).

In a little over one month, the coronavirus has more than five times the number of laboratory-confirmed cases (43,114 as of Feb. 11) than the outbreak of SARS did in four months (8,096). The novel coronavirus has already spread to at least 26 countries, far more than the current outbreak of the Ebola virus in the Democratic Republic of Congo, its predecessor in West Africa in 2013-2015, or during the resurgence of polio in Afghanistan, Nigeria and Pakistan in 2014. The mortality rate for known cases of the novel coronavirus has been about 2-3 percent, deadlier than the Zika virus or the 2009 H1N1 swine flu. [...]

Perhaps a pandemic of novel coronavirus, if it occurs, would be a sufficiently frightening antagonist to force international cooperation, even at a moment that otherwise has proved inhospitable to global governance. If so, this novel coronavirus will do what climate change, tariff threats and the prospect of nuclear proliferation on the Korean peninsula could not: force nations to work together.
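The "more than five times" comparison in the quoted passage checks out against the stated counts (a quick sketch of my own, not from the Lawfare piece):

```python
# Case counts as quoted above (as of Feb. 11, 2020).
ncov_cases = 43_114   # novel coronavirus, a little over one month
sars_cases = 8_096    # SARS, over four months

ratio = ncov_cases / sars_cases
print(f"{ratio:.1f}x")   # → 5.3x, i.e. "more than five times"
```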

Bird in red and black (flooded by light)

Rodney Brooks on AI and robotics

As you may know, Rodney Brooks is a pioneering robotics researcher and entrepreneur (his company markets the Roomba) who once headed the AI lab at MIT. He has a blog where he's been commenting on AI. One post, Future of Robotics and Artificial Intelligence, gathers links to eight posts on the future of AI and robotics that he published between August 2017 and July 2018. In a post from July 2018, Steps Toward Super Intelligence I, How We Got Here, he gives a capsule overview of the history of AI, listing four main approaches with approximate start dates:
1. Symbolic (1956)
2. Neural networks (1954, 1960, 1969, 1986, 2006, …)
3. Traditional robotics (1968)
4. Behavior-based robotics (1985)
Neural networks, as you see, have a spotty history. The basic idea is relatively old (as work in AI goes). 1986 marks the advent of back-propagation along with multilayered networks, while the 2006 date marks some new techniques ("deep learning"), much more computing power, and huge sets of training data. I found this discussion particularly useful. He shows us the following photo:

A Google program was able to generate the caption “A group of young people playing a game of Frisbee” for the photo. Brooks goes on to note:
I think this is when people really started to take notice of Deep Learning. It seemed miraculous, even to AI researchers, and perhaps especially to researchers in symbolic AI, that a program could do this well. But I also think that people confused performance with competence (referring again to my seven deadly sins post). If a person had this level of performance, and could say this about that photo, then one would naturally expect that the person had enough competence in understanding the world, that they could probably answer each of the following questions:
  • what is the shape of a Frisbee?
  • roughly how far can a person throw a Frisbee?
  • can a person eat a Frisbee?
  • roughly how many people play Frisbee at once?
  • can a 3 month old person play Frisbee?
  • is today’s weather suitable for playing Frisbee?
But the Deep Learning neural network that produced the caption above can not answer these questions. It certainly has no idea what a question is, and can only output words, not take them in, but it doesn’t even have any of the knowledge that would be needed to answer these questions buried anywhere inside what it has learned.
Brooks' own work has been in the fourth approach, behavior-based robotics, where he is a pioneer. He remarks:
...I started to reflect on how well insects were able to navigate in the real world, and how they were doing so with very few neurons (certainly less than the number of artificial neurons in modern Deep Learning networks). In thinking about how this could be I realized that the evolutionary path that had led to simple creatures probably had not started out by building a symbolic or three dimensional modeling system for the world. Rather it must have begun by very simple connections between perceptions and actions.

In the behavior-based approach that this thinking has led to, there are many parallel behaviors running all at once, trying to make sense of little slices of perception, and using them to drive simple actions in the world. Often behaviors propose conflicting commands for the robot’s actuators and there has to be some sort of conflict resolution. But not wanting to get stuck going back to the need for a full model of the world, the conflict resolution mechanism is necessarily heuristic in nature. Just as one might guess, the sort of thing that evolution would produce.

Behavior-based systems work because the demands of physics on a body embedded in the world force the ultimate conflict resolution between behaviors and their interactions. Furthermore, by being embedded in a physical world, as a system moves about it detects new physical constraints, or constraints from other agents in the world.
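The scheme Brooks describes, with parallel behaviors each proposing actuator commands and a simple heuristic arbiter resolving conflicts, might be sketched like this (a toy illustration of my own; the behavior names and the fixed-priority arbiter are invented, not Brooks's actual architecture):

```python
# Minimal behavior-based (subsumption-style) control sketch.
# Each behavior maps a slice of perception to a proposed action, or None.

def avoid_obstacle(percept):
    # Highest priority: reflexively turn away from anything close.
    if percept["obstacle_distance"] < 0.5:
        return "turn_left"
    return None

def seek_light(percept):
    # Lower priority: steer toward the brighter side, if there is one.
    if percept["light_right"] > percept["light_left"]:
        return "turn_right"
    if percept["light_left"] > percept["light_right"]:
        return "turn_left"
    return None

def wander(percept):
    # Default behavior: keep moving.
    return "forward"

# A fixed priority ordering serves as the heuristic conflict resolver.
BEHAVIORS = [avoid_obstacle, seek_light, wander]

def arbitrate(percept):
    for behavior in BEHAVIORS:
        action = behavior(percept)
        if action is not None:
            return action

print(arbitrate({"obstacle_distance": 0.2, "light_left": 1.0, "light_right": 0.0}))
# → turn_left  (obstacle avoidance subsumes light seeking)
```

There is no world model here: each behavior reads raw perception directly, and the arbiter never reasons about why behaviors conflict, which is exactly the heuristic character Brooks points to.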
Finally, Brooks has created a predictions scorecard in three areas: self-driving cars, AI and machine learning, and the space industry. He first posted it on January 1, 2018, and has updated it on January 1 of 2019 and again of 2020. The list contains (I would guess) over 50 specific items distributed over those categories, with specific dates attached. It makes for very interesting reading.

Spain is now the world's healthiest country

Music versus algorithms

Alexis Petridis reviews Ted Gioia's current book, Music: A Subversive History:
In terms of scope, well, put it this way: it starts out talking about a bear’s thighbone that Neanderthal hunters apparently turned into a primitive flute somewhere between 43,000 and 82,000 years ago and ends up, 450 pages later, discussing K-pop and EDM. His central theory: music is a kind of magical, ungovernable force that connects us to ancient shamanistic rituals, it’s primarily fuelled by sex and violence – anyone horrified by the lyrics of drill or death metal should consider that the first instruments were made from body parts and would once have literally dripped with blood – and all attempts to reduce it to mathematical formulae or “quasi-science”, while useful, go against its intrinsic nature. He’s really not keen on Pythagoras, whose mathematical theories about tuning underpin “music as it is taught in every university and conservatory in the world today”.

I didn’t agree with everything Gioia had to say, but something about that central theory stuck with me. For one thing, there is something magical and ungovernable about music: that weird tingling sensation you get when you hear something you love - a friend of mine calls it the Holy Shiver - is involuntary. It just happens. And we live in an era when music has never been more governed by mathematics. Algorithms are supposed to be able to predict everything, from what you want to hear next to whether or not a song’s going to be a hit: the digital strategist who developed the software behind the AI record label that’s just launched was also “involved in the development and marketing of stars such as Avicii, Logic, Mike Posner and Swedish House Mafia”.
For a series of anecdotes illustrating music's power, see my working paper, Emotion & Magic in Musical Performance.

Tuesday, February 11, 2020

This is your brain on art

Monday, February 10, 2020

Fallen angel

When I saw this at night I wondered what it was, perhaps some strange art project?

When I came back the next day I realized that it was simply a decorative angel that had fallen down.

Howard Rheingold on democracy and online media

Howard Rheingold, Democracy is losing the online arms race, February 4, 2020. Opening paragraphs:
Democracy is threatened by an arms race that the forces of deception are winning. While microtargeted computational propaganda, organized troll brigades, coordinated networks of bots, malware, scams, epidemic misinformation, miscreant communities such as 4chan and 8chan, and professionally crafted rivers of disinformation continue to evolve, infest, and pollute the public sphere, the potential educational antidotes – widespread training in critical thinking, media literacies, and crap detection – are moving at a leisurely pace, if at all.

When I started writing about the potential for computer-mediated communication, decades before online communication became widely known as “social media,” my inquiries about where the largely benign online culture of the 1980s might go terribly wrong led me to the concept of the “public sphere,” most notably explicated by the German political philosopher Jurgen Habermas. “What is the most important critical uncertainty about mass adoption of computer mediated communication?” was the question I asked myself, and I decided that the most serious outcome of this emerging medium would have to do with whether citizens gain or lose liberty with the rising adoption of digital media and networks. It didn’t take a lot of seeking to find Habermas’ work when I started pursuing this question.

Although Habermas’ prose is dense, the notion is simple: Democracies are not just about voting for leaders and policy-makers; democratic societies can only take root in populations that are educated enough and free enough to communicate about issues of concern and to form public opinion that influences policy.
Five skillsets for online life:
When I set out to write Net Smart: How to Thrive Online, I decided that five essential skillsets/bodies of lore/skills were necessary to thrive online – and by way of individual thriving, to enhance the value of the commons: literacies of attention, crap detection, participation, collaboration, and network awareness:

· Attention because it is the foundation of thought and communication, and even a decade ago it was clear that computer and smartphone screens were capturing more and more of our attention.

· Crap detection because we live in an age where it is possible to ask any question, any time, anywhere, and get a million answers in a couple seconds – but where it is now up to the consumer of information to determine whether the information is authentic or phony.

· Participation because the birth and the health of the Web did not come about because of, and should not depend upon, the decisions of five digital monopolies, but was built by millions of people who put their cultural creations and their inventions online, nurtured their own communities, invented search engines in their dorm rooms and the Web itself in a physics lab.

· Collaboration because of the immense power of social production, virtual communities, collective intelligence, smart mobs afforded by access to tools and knowledge of how to use them.

· Network awareness because we live in an age of social, political, and technological networks that affect our lives, whether we understand them or not.

In an ideal world, the social and political malignancies of today’s online culture could be radically reduced, although not eliminated, if a significant enough portion of the online population was fluent or at least basically conversant in these literacies – in particular, while it seems impossible to stem the rising tide of crap at its sources, its impact could be significantly reduced if most of the online population was educated in crap detection.
On attention:
I confronted issues of attention in the classroom during my decade of teaching at UC Berkeley and Stanford – as does any instructor who faces a classroom of students who are looking at their laptops and phones in class. Because I was teaching social media issues and social media literacies, it seemed to me to be escaping the issue by simply banning screentime in class – so we made our attention one of our regular activities. I asked my co-teaching teams (I asked teams of three learners to take responsibility for driving conversation during one-third of our class time) to make up “attention probes” that tested our beliefs and behavior. When I researched attentional discipline for Net Smart, I found an abundance of evidence from millennia-old contemplative traditions to contemporary neuroscience for the plasticity of attention. Simply paying attention to one’s attention – the methodology at the root of mindfulness meditation – can be an important first step to control. It doesn’t seem inevitable that attention engineers, despite their wild success, hold an overwhelming advantage in the arms race between attention education and what surveillance capitalists and computational propagandists deploy with their big data, bots, and troll armies.
The lopsided arms race is what leads me to conclude that education in crap detection, attention control, media literacy, and critical thinking are important, but are not sufficient. Regulation of the companies who wield these new and potentially destructive powers will also be necessary.
There's more at the link.

Saturday, February 8, 2020

Wild Child

The hill was covered with strange grassy mounds about the size of molehills. The adults had no idea what they were — which was very exciting to me, realizing that there were things in the world that not even the adults understood. So I filled in the blanks for myself and decided they must be burial mounds for fairies. This was the magical landscape that inspired my book “The Wizards of Once.”

For the wildwood in that book, I took particular inspiration from the ancient wood of Kingley Vale in Sussex. Its trees have gnarled, expressive faces, and roots that embed into the earth with an almost visceral power. The more you learn about trees, the more magical you realize they are. Did you know, for example, that trees can communicate with each other through their roots, even when they are many miles apart?

Trees grow throughout children’s books. From “Peter Pan” to “A Monster Calls,” “The Lord of the Rings” to “Harry Potter,” trees are refuges, prisons and symbols of nature’s potency. They can be a friendly home, like the Hundred Acre Wood in “Winnie-the-Pooh,” or give a sense of menace, like the snowy forest in “The Lion, the Witch and the Wardrobe.” They can also be symbolic, like the cement-filled dying tree in “To Kill a Mockingbird.” The writers I loved when I was a child were similarly inspired by magical landscapes and nature: Ursula K. Le Guin, J.R.R. Tolkien, L. Frank Baum, Diana Wynne Jones, Lloyd Alexander, Robert Louis Stevenson, T.H. White — and so many others.

Today, children have much less unsupervised access to the countryside. I worry that they may never know the magic of the wilderness, the power of trees and the thrilling excitement of exploring nature without an adult hovering behind them. And so I write books for children who will never know what the freedom of my childhood was like.

Friday, February 7, 2020


“Undecidability, Uncomputability and the Unity of Physics. Part 1.”

That's the title of a post by Tim Palmer at Backreaction. Here's the opening:
Our three great theories of 20th Century physics – general relativity theory, quantum theory and chaos theory – seem incompatible with each other.

The difficulty combining general relativity and quantum theory to a common theory of “quantum gravity” is legendary; some of our greatest minds have despaired – and still despair – over it.

Superficially, the links between quantum theory and chaos appear to be a little stronger, since both are characterised by unpredictability (in measurement and prediction outcomes respectively). However, the Schrödinger equation is linear and the dynamical equations of chaos are nonlinear. Moreover, in the common interpretation of Bell’s inequality, a chaotic model of quantum physics, since it is deterministic, would be incompatible with Einstein’s notion of relativistic causality.

Finally, although the dynamics of general relativity and chaos theory are both nonlinear and deterministic, it is difficult to even make sense of chaos in the space-time of general relativity. This is because the usual definition of chaos is based on the notion that nearby initial states can diverge exponentially in time. However, speaking of an exponential divergence in time depends on a choice of time-coordinate. If we logarithmically rescale the time coordinate, the defining feature of chaos disappears. Trouble is, in general relativity, the underlying physics must not depend on the space-time coordinates.

So, do we simply have to accept that, “What God hath put asunder, let no man join together”? I don’t think so. A few weeks ago, the Foundational Questions Institute put out a call for essays on the topic of “Undecidability, Uncomputability and Unpredictability”. I have submitted an essay in which I argue that undecidability and uncomputability may provide a new framework for unifying these theories of 20th Century physics. I want to summarize my argument in this and a follow-on guest post.
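The definition of chaos the quoted passage invokes, exponential divergence of nearby initial states, is easy to see numerically. Here is a toy illustration of my own (not from Palmer's post) using the logistic map:

```python
# Two trajectories of the logistic map x -> r*x*(1-x), started 1e-10 apart.
r = 4.0                       # a chaotic parameter value
x, y = 0.3, 0.3 + 1e-10

separations = []
for n in range(50):
    x, y = r * x * (1 - x), r * y * (1 - y)
    separations.append(abs(x - y))

# The gap grows roughly exponentially (at the Lyapunov rate) before
# saturating at the size of the attractor. It is exactly this exponential
# signature that a logarithmic rescaling of the time coordinate would erase.
print(max(separations))   # many orders of magnitude larger than 1e-10
```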
What interests me is simply the conjunction mentioned in the opening line, of general relativity theory, quantum theory and chaos theory, which featured in a January post, The third 20th-century revolution in physics [non-linear dynamics]. Here are two (of three) observations I made at the end of that post:
  • It seems to me that quantum mechanics and relativity are focused on explanatory principles whereas non-linear dynamics tends more toward description, description of a wide variety of phenomena. Moreover quantum mechanics and relativity are most strongly operative in different domains, the microscopic and macroscopic respectively.
  • Computation: in many cases there are various computational paths from the initial state to the completion of the computation. As a simple example, when adding a group of numbers, the order of the numbers doesn't matter; the sum will be the same in each case. In the case of non-linear systems, successive states in the computation 'mirror' successive states in the system being modeled, so the temporal evolution of the computation is intrinsic to the model rather than extrinsic.
Does that second observation imply that (something like) computation is inherent in the physical nature of the universe and is not merely an intellectual operation carried out by various artificial means?
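The second observation can be made concrete with a toy contrast of my own: the terms of a sum can be combined along any computational path, while a nonlinear iteration admits only one order, since each state depends on the one before it.

```python
import random

# Addition: the computational path does not matter.
nums = [3, 1, 4, 1, 5, 9, 2, 6]
shuffled = nums[:]
random.shuffle(shuffled)
assert sum(nums) == sum(shuffled) == 31   # any order gives the same sum

# A nonlinear iteration: each state depends on the previous one, so the
# computation's successive states mirror the system's temporal evolution.
def step(x):
    return 3.7 * x * (1 - x)   # logistic map in its chaotic regime

x = 0.5
trajectory = [x]
for _ in range(5):
    x = step(x)
    trajectory.append(x)
print(trajectory)   # trajectory[1] is 0.925; later states inherit the order
```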

Life in the ocean depths, minerals too

Wil S. Hylton, History's Largest Mining Operation Is About to Begin, The Atlantic, January/February 2020. As the title indicates, the article is primarily about mining the ocean depths, but this passage struck me:
Until recently, marine biologists paid little attention to the deep sea. They believed its craggy knolls and bluffs were essentially barren. The traditional model of life on Earth relies on photosynthesis: plants on land and in shallow water harness sunlight to grow biomass, which is devoured by creatures small and large, up the food chain to Sunday dinner. By this account, every animal on the planet would depend on plants to capture solar energy. Since plants disappear a few hundred feet below sea level, and everything goes dark a little farther down, there was no reason to expect a thriving ecosystem in the deep. Maybe a light snow of organic debris would trickle from the surface, but it would be enough to sustain only a few wayward aquatic drifters.

That theory capsized in 1977, when a pair of oceanographers began poking around the Pacific in a submersible vehicle. While exploring a range of underwater mountains near the Galápagos Islands, they spotted a hydrothermal vent about 8,000 feet deep. No one had ever seen an underwater hot spring before, though geologists suspected they might exist. As the oceanographers drew close to the vent, they made an even more startling discovery: A large congregation of animals was camped around the vent opening. These were not the feeble scavengers that one expected so far down. They were giant clams, purple octopuses, white crabs, and 10-foot tube worms, whose food chain began not with plants but with organic chemicals floating in the warm vent water.

For biologists, this was more than curious. It shook the foundation of their field. If a complex ecosystem could emerge in a landscape devoid of plants, evolution must be more than a heliological affair. Life could appear in perfect darkness, in blistering heat and a broth of noxious compounds—an environment that would extinguish every known creature on Earth. “That was the discovery event,” an evolutionary biologist named Timothy Shank told me. “It changed our view about the boundaries of life. Now we know that the methane lakes on one of Jupiter’s moons are probably laden with species, and there is no doubt life on other planetary bodies.”
As for mining:
Deepwater plains are also home to the polymetallic nodules that explorers first discovered a century and a half ago. Mineral companies believe that nodules will be easier to mine than other seabed deposits. To remove the metal from a hydrothermal vent or an underwater mountain, they will have to shatter rock in a manner similar to land-based extraction. Nodules are isolated chunks of rocks on the seabed that typically range from the size of a golf ball to that of a grapefruit, so they can be lifted from the sediment with relative ease. Nodules also contain a distinct combination of minerals. While vents and ridges are flecked with precious metal, such as silver and gold, the primary metals in nodules are copper, manganese, nickel, and cobalt—crucial materials in modern batteries. As iPhones and laptops and electric vehicles spike demand for those metals, many people believe that nodules are the best way to migrate from fossil fuels to battery power.

The ISA has issued more mining licenses for nodules than for any other seabed deposit. Most of these licenses authorize contractors to exploit a single deepwater plain. Known as the Clarion-Clipperton Zone, or CCZ, it extends across 1.7 million square miles between Hawaii and Mexico—wider than the continental United States. When the Mining Code is approved, more than a dozen companies will accelerate their explorations in the CCZ to industrial-scale extraction. Their ships and robots will use vacuum hoses to suck nodules and sediment from the seafloor, extracting the metal and dumping the rest into the water. How many ecosystems will be covered by that sediment is impossible to predict. Ocean currents fluctuate regularly in speed and direction, so identical plumes of slurry will travel different distances, in different directions, on different days. The impact of a sediment plume also depends on how it is released. Slurry that is dumped near the surface will drift farther than slurry pumped back to the bottom. The circulating draft of the Mining Code does not specify a depth of discharge. The ISA has adopted an estimate that sediment dumped near the surface will travel no more than 62 miles from the point of release, but many experts believe the slurry could travel farther. A recent survey of academic research compiled by Greenpeace concluded that mining waste “could travel hundreds or even thousands of kilometers.”

Wednesday, January 29, 2020

Sofa in the wild with graffiti

Ezra Klein on the deep problem posed by social media

I think that social media and the way we deal with it — and this is true in a lot of places — we end up focusing on, one, the easy cases rather than the hard cases, like fake news as opposed to real news. Everybody agrees that fake news is bad, and you shouldn’t have it. Real news can also be very bad in terms of what it emphasizes, or the quality of the work, and so on. But the question of how to handle it is much, much harder, and it’s not going to be something that a Facebook supreme court handles.

I think the underlying and very deep problem with Facebook, with Twitter, with a bunch of them, is building the future of our communication commons atop a business model that is about engagement mediated through the intensity of the viewers’ or audiences’ emotional reaction. I don’t think that’s something the Facebook supreme court can solve, and I also don’t think it is a good thing for the future. But nobody really seems to want to fight it.

The questions about privacy — I think they’re important. The questions about fake news are important. All the questions people bring up in these cases are important. But I think all of them are also less important than the question of, is the future of how we will communicate with each other, of how politicians will communicate with the public, of how, basically, all important communication will be structured and incentivized — is it what gives you the strongest emotional countercharge?

If so, I think that we are in for this period where a lot of energy is going to go towards the most outrageous and most offensive players because they both get the energy of the people they inspire and the people who hate them. And it’s the combination of the energy and counter-energy that gives them so much control of the conversation.

Tuesday, January 28, 2020

Trump and TV: They grew up together

In this episode Aryeh Cohen-Wade interviews James Poniewozik, chief TV critic for The New York Times. Poniewozik has just published Audience of One: Donald Trump, Television, and the Fracturing of America, in which he traces the parallel development of Trump's career and of television.

From the book's jacket copy:
Audience of One shows how American media have shaped American society and politics, by interweaving two crucial stories. The first story follows the evolution of television from the three-network era of the 20th century, which joined millions of Americans in a shared monoculture, into today’s zillion-channel, Internet-atomized universe, which sliced and diced them into fractious, alienated subcultures. The second story is a cultural critique of Donald Trump, the chameleonic celebrity who courted fame, achieved a mind-meld with the media beast, and rode it to ultimate power.

Braiding together these disparate threads, Poniewozik combines a cultural history of modern America with a revelatory portrait of the most public American who has ever lived. Reaching back to the 1940s, when Trump and commercial television were born, Poniewozik illustrates how Donald became “a character that wrote itself, a brand mascot that jumped off the cereal box and entered the world, a simulacrum that replaced the thing it represented.” Viscerally attuned to the media, Trump shape-shifted into a boastful tabloid playboy in the 1980s; a self-parodic sitcom fixture in the 1990s; a reality-TV “You’re Fired” machine in the 2000s; and finally, the biggest role of his career, a Fox News–obsessed, Twitter-mad, culture-warring demagogue in the White House.
The hour+ discussion covers that full story. Trump used his early real-estate career to become known in local and regional media. From there he went national and made himself into the paradigmatic contemporary example of a modern business tycoon and, in turn, used the reputation radiating from that image to pull himself out of business reverses. By the time The Apprentice came to a close he had become a creature of the media. The smooth operator of his earliest TV appearances had become the crude, opportunistic populist firebrand whose campaign rallies were such good TV that cable channels were happy to broadcast them to take up airtime.