Saturday, November 30, 2024

A.I. as religion

While I was studying toward ordination as secular humanist clergy two decades ago, I had the fortune of meeting the late rabbi Sherwin Wine, a brilliant philosopher whom TIME magazine profiled as “the atheist rabbi” in 1967. He became my favorite teacher and mentor as I trained to serve communities of atheists and agnostics in ways that parallel how religious leaders typically minister to their congregations. Sherwin’s go-to line about technology was, “I’ve always said there is no God. I never said there wouldn’t be one in the future.”

I heard his quip around 2002, and took it as generalized sarcasm about the state of technology and science fiction. Little did I know he’d had a premonition.

Take, for example, Way of the Future, an official AI-worshipping religion created by Anthony Levandowski, a former Google AI engineer who earned hundreds of millions of dollars as a leader in the development of autonomous vehicle technology. Levandowski went as far as filing all the requisite paperwork to register as a church with the IRS, telling the agency that the faith would focus on “the realization, acceptance, and worship of a Godhead based on Artificial Intelligence (AI) developed through computer hardware and software.” In a 2017 interview, Levandowski told Wired that “what is going to be created [as AI] will effectively be a god . . . not a god in the sense that it makes lightning or causes hurricanes. But if there is something a billion times smarter than the smartest human, what else are you going to call it?”
There's more at the link.

Don't blink. Smile!

Neural maturation, cerebral plasticity, and the adaptive value of vacations

I’ve had another session with Claude (3.5 Sonnet) in connection with the article I’m writing. Tentative title: “Melancholy and Growth: Toward a Mindcraft for an Emerging World.” The post title carries the topics under discussion.

I’ve read widely in neuroscience over the years, and continue to dip into the technical literature now and then. Some years ago David Hays and I published on “Principles and Development of Natural Intelligence” in which we reviewed a wide range of material: neural processes and organization, behavioral correlates, the phylogeny of the nervous system, and growth in human development. That’s my basic framework. On that basis Claude’s remarks make sense to me. Still, I’d advise caution in making use of those remarks.

I’ve appended its remarks below. I’m using my standard convention, setting my words in bold Courier and Claude’s reply in plain Courier.

Version: Claude 3.5 Sonnet

* * * * *

What can you tell me about the development of the nervous system in humans? I know something about the course of myelination, which I believe is mostly complete by the early 20s, when the sutures of the skull finally set. I also know about the proliferation and pruning of dendrites early. Give me some more information.

I see you already have some knowledge about neural development! Let me add to that while gently correcting a small misconception - the cranial sutures actually begin fusing in early childhood and are typically mostly fused by age 6, though some continue fusing into adulthood. This is separate from the process of myelination you mentioned.

Neural development is a fascinating process that begins very early in embryonic development. Here are some key aspects and timeline points:

During embryonic development, the nervous system starts forming from the neural tube around week 3 post-conception. This process, called neurulation, creates the foundation for the entire central nervous system.

The early stages involve rapid neurogenesis (creation of new neurons), with neurons being produced at a rate of about 250,000 per minute during peak periods of fetal development. Most of our neurons are actually produced before we're born!

You mentioned dendritic pruning - this is indeed a crucial process. After birth, there's massive synaptic proliferation (synaptogenesis) creating many more connections than we'll ultimately keep. This peaks in different regions at different times:

  • Visual cortex peaks around 6 months
  • Prefrontal cortex peaks around age 1-2

The subsequent pruning you referenced continues through adolescence, following a "use it or lose it" principle where frequently used connections are strengthened and unused ones are eliminated.

Regarding myelination, it proceeds in a predictable pattern:

  • Begins in the spinal cord during fetal development
  • Progresses from back to front in the brain
  • Sensory pathways myelinate before motor pathways
  • Higher-order association areas myelinate last
  • The prefrontal cortex indeed continues myelinating into the early 20s

One fascinating aspect is how this development correlates with function. For instance, the areas controlling basic functions like breathing and temperature regulation are well-developed at birth, while areas involved in executive function continue developing into early adulthood.

Would you like me to elaborate on any particular aspect of neural development? I find the relationship between developmental timing and functional capabilities particularly interesting.

Tell me more about this, “though some continue fusing into adulthood,” and this “The prefrontal cortex indeed continues myelinating into the early 20s.”

Let me address both of those ongoing developmental processes.

Regarding cranial sutures, while most fusion occurs early, the sagittal suture (running along the top of the skull from front to back) can continue its fusion process into the 30s. The sphenofrontal and frontoethmoidal sutures typically complete fusion between ages 20-40. Some parts of the lambdoid sutures (at the back of the skull) may not fully fuse until around age 60, though this varies considerably between individuals.

As for prefrontal cortex myelination, this prolonged development has significant implications. The extended myelination period correlates with ongoing refinement of executive functions including:

  • Complex decision-making
  • Impulse control
  • Planning ability
  • Social cognition
  • Risk assessment

Recent research has shown that myelination isn't simply a linear process of adding more myelin. Instead, it involves active remodeling where myelin sheaths can be modified based on neural activity and learning. This process is particularly active in the prefrontal cortex through adolescence and early adulthood.

This extended development period helps explain certain patterns of adolescent behavior and why capabilities like judgment and impulse control continue to mature into the early-to-mid 20s. It's also relevant to understanding vulnerability to certain psychiatric conditions that often emerge during this period.

Would you like me to expand on the functional implications of either of these developmental processes?

Friday, November 29, 2024

Handwriting stimulates more neural mesh than typing does

That's very interesting. But I do know that, for example, some writers prefer longhand over keyboarding. Neal Stephenson prefers to write drafts in longhand. This seems relevant.

Friday Fotos: Hoboken April-May 2011

More Cobra Kai: Freud, Girard, and History [Media Notes 143b]

I wanted to follow up on my earlier post about Cobra Kai. Here’s my point of departure:

So, what do we have? A conflict between two teenage boys gets reignited in their early middle age where it becomes amplified into a war for dominance of the karate world. On the one hand, it is rather ridiculous, adults using their dojos as vehicles for working out their conflicts. How Freudian! [Indeed. Think about that for a minute, think about it very seriously.] The series is aware of this and indicates that awareness in various ways.

Freud was interested in how our early life influenced, strongly influenced, our behavior as adults. His theory of psychosexual development placed a lot of emphasis on the early years, especially the first six years, but also – but also of course, especially of course – dealt with the emergence of sexuality in puberty.

Cobra Kai centers on Johnny Lawrence and Daniel LaRusso, but their story originates in their middle-to-late adolescence in The Karate Kid and its sequels. That’s a bit late for the really deep Freudian stuff to kick in, but the causal dynamic is nonetheless clear and Freudian in kind. The attitudes, feelings, and actions of these middle-age adult men follow patterns set in place in their adolescence. Not only are we told about this past, but we see scenes from the early films.

But the Freudian dynamic goes deeper than that. For it also explores the earlier life of the men who taught Lawrence and LaRusso, John Kreese and Mr. Miyagi respectively. We see scenes from Kreese’s experience during the Vietnam War and before and are led to understand how they have made him the bitter man he is today. LaRusso travels to Okinawa and learns things about Miyagi that he hadn’t known, including something about his early experience with competitive karate the full nature of which has yet to be revealed (we’re two-thirds of the way into the final season). Moreover, when Terry Silver is brought into the film, another bad guy from the past, we learn more about the origins of the Cobra Kai school of karate, strengthening the show’s depiction of the pull of the past.

Cobra Kai is also shot through with mimetic desire and sacrifice. The original conflict between LaRusso and Lawrence centers on a mimetic conflict over a woman, Lawrence’s ex-girlfriend who becomes attracted to LaRusso. In the present, LaRusso’s decision to start a dojo is a mimetic answer to Lawrence starting a new Cobra Kai studio – notice that it is that form of karate which gives the series its name. And mimetic dynamics play out in the relationships among the teenagers at the center of the series, LaRusso’s daughter, Samantha, Lawrence’s son, Robby, and his young protégé, Miguel, and a young Cobra Kai student, Tory Nichols.

The sacrifice dynamic is a bit trickier, but the series features a number of karate tournaments and a number of gang battles between the rival dojos. Kenny Payne and “Stingray,” among others, emerge as sacrificial victims in those battles, and then we have the grand melee that ends episode ten of the sixth season. This takes place at the Sekai Taikai, an international karate tournament to determine, in effect, the dojo of dojos and anoint the best karate style. The tournament breaks down into a free-for-all battle involving, not only the karate students, but their teachers as well. [Spoiler alert] The battle comes to a dramatic end when one of the student fighters, Kwon, a particularly arrogant fighter, accidentally kills himself with a knife he’d picked up off the floor. The knife had been dropped by Kreese, who’d intended to kill Silver. We’ll have to wait until February to find out what happens next.

Think of the scope covered in the series: from Vietnam, Okinawa, Korea, and America in the mid 20th century through to an international tournament a quarter of the way into the 21st century. That’s a big chunk of recent history, albeit centered on actions involving only a small group of individuals. No nation-states take part in the conflict. This is no War and Peace, but it is, nonetheless, history, and it is history displayed in patterns analyzed by Freud and Girard.

Neurons fire in a sequence that encodes information

Abstract for the linked article:

Neural coding has traditionally been examined through changes in firing rates and latencies in response to different stimuli. However, populations of neurons can also exhibit transient bursts of spiking activity, wherein neurons fire in a specific temporal order or sequence. The human brain may utilize these neuronal sequences within population bursts to efficiently represent information, thereby complementing the well-known neural code based on spike rate or latency. Here we examined this possibility by recording the spiking activity of populations of single units in the human anterior temporal lobe as eight participants performed a visual categorization task. We find that population spiking activity organizes into bursts during the task. The temporal order of spiking across the activated units within each burst varies across stimulus categories, creating unique stereotypical sequences for individual categories as well as for individual exemplars within a category. The information conveyed by the temporal order of spiking activity is separable from and complements the information conveyed by the units’ spike rates or latencies following stimulus onset. Collectively, our data provide evidence that the human brain contains a complementary code based on the neuronal sequence within bursts of population spiking to represent information.
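The core idea can be illustrated with a toy sketch of my own (not the authors' analysis, and the numbers are made up): two stimulus categories drive the same units with identical spike counts, so a pure rate code can't tell them apart, but the temporal order of first spikes within the burst can.

```python
# Toy illustration of sequence (rank-order) coding. Two stimulus
# categories evoke one spike each from units A, B, and C, so their
# rates are identical; only the firing order within the burst differs.
# Hypothetical first-spike times in milliseconds.

burst_cat1 = {"A": 5, "B": 12, "C": 20}   # order: A -> B -> C
burst_cat2 = {"A": 20, "B": 5, "C": 12}   # order: B -> C -> A

def rate_code(burst):
    # One spike per unit in both bursts: the rate code is identical.
    return {unit: 1 for unit in burst}

def sequence_code(burst):
    # Units sorted by first-spike time: the order carries the category.
    return tuple(sorted(burst, key=burst.get))

assert rate_code(burst_cat1) == rate_code(burst_cat2)          # rates match
assert sequence_code(burst_cat1) != sequence_code(burst_cat2)  # orders differ
print(sequence_code(burst_cat1))  # ('A', 'B', 'C')
print(sequence_code(burst_cat2))  # ('B', 'C', 'A')
```

The point of the sketch is only that order and rate are separable channels; the paper's actual claim rests on recorded human single-unit data, not on anything this simple.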

Thursday, November 28, 2024

Jersey City approach and exit, 2009

Depression and Creativity

I’m working on an article about depression and decided to ask about the connection between depression and creativity. Claude makes a few general remarks and asks for my thoughts. So I mention some remarks that my teacher, David Hays, had made years ago, mentioning that he was a computational linguist. After some general remarks, Claude asks:

... did Hays develop this idea through his work in computational linguistics? It seems like there might be interesting parallels between how computer systems need to reorganize data structures and how human minds might need to reorganize mental frameworks.

Interesting, most interesting. Our ensuing interaction felt a bit like a real conversation. I was surprised and pleased. This felt a bit like, you know, real intelligence.

I’ve appended its remarks below. I’m using my standard convention, setting my words in bold Courier and Claude’s reply in plain Courier.

Version: Claude 3.5 Sonnet

* * * * *

Is there evidence for a relationship between depression and creativity?

The relationship between depression and creativity is complex and has been studied extensively. The research shows several interesting patterns:

Historical observations have noted many renowned artists, writers, and musicians who experienced depression, like Virginia Woolf, Vincent van Gogh, and Sylvia Plath. However, we should be careful not to romanticize mental illness or assume correlation equals causation.

Scientific studies have found some evidence for links between depression and certain aspects of creativity:

Some research suggests that people with mood disorders may experience enhanced divergent thinking during certain phases, allowing them to generate more unique associations and ideas. The changes in emotional processing and perspective that can come with depression might contribute to seeing things in new ways or having deeper emotional insights.

However, severe depression typically impairs cognitive function and motivation, which can significantly reduce creative output. Many artists report that while their experiences with depression may inform their work, the actual creative process usually happens during periods of better mental health.

A key factor seems to be that creative pursuits can serve as a coping mechanism and form of emotional expression for people experiencing depression. The relationship may also be bidirectional - creative personalities might be more prone to rumination and emotional sensitivity.

What are your thoughts on this relationship between mental health and creativity? Have you observed any patterns in your own creative process?

That’s tricky. A teacher of mine, the computational linguist David Hays, thought that depression might indicate a process of mental reorganization that precedes a period of creativity. It’s as though you get stuck on a line of thought and have to undergo something like mental hibernation. During that “hibernation” the mind reorganizes.

That's a fascinating perspective from David Hays. His idea of depression as a period of mental reorganization or "hibernation" adds an interesting cognitive dimension to the discussion. It aligns with some modern theories about the potential adaptive functions of depression, particularly the "analytical rumination hypothesis" proposed by evolutionary psychologists.

This hypothesis suggests that depression's characteristic features - social withdrawal, reduced activity, and intense focus on problems - might serve to create mental space for working through complex problems or restructuring one's understanding. The "stuck" feeling you mentioned could represent this deep processing phase.

The metaphor of hibernation is particularly apt because it suggests both:

1. A necessary withdrawal or dormancy period
2. An underlying productive process that sets the stage for future growth

I'm curious - did Hays develop this idea through his work in computational linguistics? It seems like there might be interesting parallels between how computer systems need to reorganize data structures and how human minds might need to reorganize mental frameworks. His background in computational thinking might have given him unique insights into cognitive restructuring processes.

I think Hays primarily developed the idea from observing his own creative process. But he might also have been thinking about computers. Your remark about reorganizing data structures is suggestive. I know one thing that programmers do is refactor their code. That’s a specific kind of reorganization, one that leaves the code more coherently organized.

The parallel between mental reorganization and code refactoring is quite illuminating. In refactoring, you don't change what the code does (its external behavior), but you restructure its internal organization to make it clearer, more efficient, or more maintainable. This often involves recognizing patterns that weren't obvious in the original implementation and finding a better way to express them.

This maps interestingly onto Hays's idea about depression and creativity. Just as a programmer might hit a point where the code has become too unwieldy and needs refactoring, a person might reach a point where their current mental models or ways of thinking have become inadequate for handling their experiences or insights. The depressive period could be like the uncomfortable but necessary process of pulling apart the existing structure to create something more robust.
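To make the refactoring analogy concrete, here's a minimal sketch of my own (a hypothetical example, nothing from Hays or the Claude conversation): the same computation before and after refactoring, with the external behavior preserved while the internal organization becomes clearer.

```python
# Before: duplicated logic and a magic number buried in the loop.
def total_price_v1(items):
    total = 0
    for name, price, qty in items:
        if qty >= 10:
            total += price * qty * 0.9   # bulk discount
        else:
            total += price * qty
    return total

# After: the discount rule is named and isolated; the loop just sums
# line totals. External behavior is unchanged.
BULK_THRESHOLD = 10
BULK_DISCOUNT = 0.9

def line_total(price, qty):
    """Price for one line item, applying the bulk discount if it qualifies."""
    subtotal = price * qty
    return subtotal * BULK_DISCOUNT if qty >= BULK_THRESHOLD else subtotal

def total_price_v2(items):
    return sum(line_total(price, qty) for _, price, qty in items)

cart = [("widget", 2.0, 12), ("gadget", 5.0, 3)]
assert total_price_v1(cart) == total_price_v2(cart)  # same answer, clearer code
```

That is the sense in which refactoring parallels Hays's "hibernation": nothing the system does changes from the outside, but the internal structure is pulled apart and put back together in a more workable form.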

Large language models surpass human experts in predicting neuroscience results

Luo, X., Rechardt, A., Sun, G. et al. Large language models surpass human experts in predicting neuroscience results. Nat Hum Behav (2024). https://doi.org/10.1038/s41562-024-02046-9

Abstract: Scientific discoveries often hinge on synthesizing decades of research, a task that potentially outstrips human information processing capacities. Large language models (LLMs) offer a solution. LLMs trained on the vast scientific literature could potentially integrate noisy yet interrelated findings to forecast novel results better than human experts. Here, to evaluate this possibility, we created BrainBench, a forward-looking benchmark for predicting neuroscience results. We find that LLMs surpass experts in predicting experimental outcomes. BrainGPT, an LLM we tuned on the neuroscience literature, performed better yet. Like human experts, when LLMs indicated high confidence in their predictions, their responses were more likely to be correct, which presages a future where LLMs assist humans in making discoveries. Our approach is not neuroscience specific and is transferable to other knowledge-intensive endeavours.

Looking out of the window

"Take Five," five ways

From Wikipedia:

"Take Five" is a jazz standard composed by Paul Desmond. It was first recorded in 1959 and is the third track on Time Out by the Dave Brubeck Quartet. Frequently covered by a variety of artists, the track is the biggest-selling jazz song of all time and a Grammy Hall of Fame inductee.

Dave Brubeck was inspired to create an album based on odd time signatures during his state-sponsored 1958 Eurasia trip. The track was written after the Quartet's drummer, Joe Morello, requested a song in quintuple (5/4) meter. Desmond composed the melodies on Morello's rhythms while Brubeck arranged the song. The track's name is derived from its meter, and the injunction, "Take five", meaning "take a break for five minutes". The track is written in E♭ minor and is in ternary (ABA) form.

Released as a promotional single in September 1959, "Take Five" became a sleeper hit in 1961, and then went on to become the biggest-selling jazz single of all time. The track still receives significant radio airplay.

Yesterday I cruised YouTube for recordings. I listened to over a dozen. Most of them had the same time feel as Brubeck's original recording. The instrumentation would be different and the tempo might be faster or slower, but the basic time feel was the same, a jazz swing. But I found four versions that were distinctly different, though I assume there are others out there. I've presented them below, after Brubeck's original recording.

Jazz Swing: Dave Brubeck Quartet (1959, the original recording)

Salsa: Tito Puente (1985)

I heard Tito Puente live at the Troy Savings Bank Music Hall in the early 1980s. They were hot! hot!! hot!!! People were dancing in the aisles. Not bad for a place originally built to present the European classics. Pictures of some of them – Bach, Haydn, Mozart, and the rest – lined the walls. I wonder what they were thinking about Puente. I'd like to think they were savvy enough to smile. [Note: They take it in four, but four with a clave.]

Ska: New York Ska-Jazz Ensemble (2009)

I know next to nothing about ska, but this is a good performance. I like it. [Also in four.]

Pakistani: Sachal Studios (2011)

This is interesting. It's a large group, lots of fiddles. The arrangement borrows heavily from Brubeck's original. The major difference is in the percussion, which is handled by a tabla player, a very good one, and that gives the whole thing a different underlying rhythmic pulse.

Africans: Chyke Martins and Udoh Band (2024)

I love this, so smooth and slinky. Listen to Martins throw in a bit of "Summertime" at 4:30.

Is it over? States and State Systems After Globalism

Christopher Caldwell, This Maverick Thinker Is the Karl Marx of Our Time, NYTimes, Nov. 28, 2024:

Who could have seen Donald Trump’s resounding victory coming? Ask the question of an American intellectual these days and you may meet with embittered silence. Ask a European intellectual and you will likely hear the name of Wolfgang Streeck, a German sociologist and theorist of capitalism.

In recent decades, Mr. Streeck has described the complaints of populist movements with unequaled power. That is because he has a convincing theory of what has gone wrong in the complex gearworks of American-driven globalization, and he has been able to lay it out with clarity. Mr. Streeck may be best known for his essays in The New Left Review, including a dazzling series on the cascade of financial crises that followed the crash of 2008. He resembles Karl Marx in his conviction that capitalism has certain internal contradictions that make it unsustainable — the more so in its present “neoliberal” form. His latest book, “Taking Back Control? States and State Systems After Globalism,” published this month, asks whether the global economy as it is now set up is compatible with democracy. He has his doubts.

The Rubicon:

But starting in the 1970s, things began to change. Sometime after the Arab oil embargo of 1973, investors got nervous. The economy began to stall. This placed politicians in a bind. Workers had the votes to demand more services. But that required making demands on business, and business was having none of it. States finessed the matter by permitting the money supply to expand. For a brief while, this maneuver allowed them to offer more to workers without demanding more of bosses. Essentially, governments had begun borrowing from the next generation.

That was the Rubicon, Mr. Streeck believes: “the first time after the postwar growth period that states took to introducing not-yet-existing future resources into the conflict between labor and capital.” They never broke the habit.

The rise of the technocrats:

At each stage of neoliberalism’s evolution, Mr. Streeck stresses, key decisions have been made by technocrats, experts and other actors relatively insulated from democratic accountability. When the crash came in 2008, central bankers stepped in to take over the economy, devising quantitative easing and other novel methods of generating liquidity. During the Covid emergency of 2020 and 2021, Western countries turned into full-blown expertocracies, bypassing democracy outright. A minuscule class of administrators issued mandates on every aspect of national life — masks, vaccinations, travel, education, church openings — and incurred debt at levels that even the most profligate Reaganite would have considered surreal.

Mr. Streeck has a clear vision of something paradoxical about the neoliberal project: For the global economy to be “free,” it must be constrained. What the proponents of neoliberalism mean by a free market is a deregulated market. But getting to deregulation is trickier than it looks because in free societies, regulations are the result of people’s sovereign right to make their own rules. The more democratic the world’s societies are, the more idiosyncratic they will be, and the more their economic rules will diverge. But that is exactly what businesses cannot tolerate — at least not under globalization. Money and goods must be able to move frictionlessly and efficiently across borders. This requires a uniform set of laws. Somehow, democracy is going to have to give way.

I'm ordering the book. There's much more at the link.

Wednesday, November 27, 2024

Cat [Japanese]

Claude parodies “Kubla Khan” using Elon Musk, FDR, and Walt Disney as subjects

As you may know, there’s been a fair amount of interest in the ability of LLMs to create poetry. I’ve done a bit of that myself with ChatGPT. The results were not good.

Then I had an idea: Why not do with poetry what I’ve done with stories? What I did with stories was give ChatGPT a short fairytale along with an instruction to use that as the basis for a new story. But the new story was to have a different protagonist, which I’d specify. The results were interesting.

I used Princess Aurora as the protagonist in the source story. When asked for a new story where Prince Harry was the protagonist, ChatGPT produced a new story that was very much like the original, except for where the protagonist was mentioned. But when the new protagonist was a giant chocolate milkshake or XP-708-DQ, hoo boy! did things change. What fun, and interesting as well: ChatGPT tells stories, and a note about reverse engineering: A Working Paper, Version 3.

Why not try the same thing with poetry? So, I presented ChatGPT with Wallace Stevens’s famous “Thirteen Ways of Looking at a Blackbird” and asked it to produce a poem about irises and another about AI. I did the same for “Kubla Khan.”

I’ve now taken the game to Claude, asking it to produce poems with Elon Musk, FDR, and Walt Disney as protagonists. It did so. The poems, as you can see, are not good poems. It seems to me, though, that they are bad in a way that no human performing the task would be. A really good poet might refuse the task as ridiculous on the face of it, realizing that “Kubla Khan” is steeped in an exotic Orientalism that was appropriate in its time, but that is out of place in 21st century America. A middling poet might take it on and produce middling satire. A bad poet, with an inflated sense of themselves, might produce really bad poems, worse than Claude’s. But they would be bad in a human way.

Note that I only presented Claude with the first thirty-six lines of “Kubla Khan.” As you know there’s a yawning discontinuity between the first 36 and the last 18 lines of the poem. I wanted to spare Claude the problem of bridging that gap.

The Technocrat’s Dream (After Coleridge)

In Palo Alto, Elon Musk
A vast electric realm decreed:
Where Tesla's charging stations thrust
Through highways measureless to dust,
Down to the western sea.
So thousand cars of gleaming chrome
Beneath the falcon's wingspan roamed,
And there were factories bright with robotic drills,
Where hummed the lithium-burning hearts;
And here were server farms on desert hills,
Encoding dreams in circuit parts.

But oh! that deep Martian rift that beckoned
Up toward stars athwart the rocket's glare!
A barren place! as red and reckoned
As e'er beneath Phobos' shadow beckoned
To pioneers who dreamed to venture there!
And from this base, with deuterium flowing,
As if the planet's core were glowing,
A Starship's flame to heaven pressed:
Amid whose thrust and vapor-stress
Huge stages fell like titanium rain,
While Earth turned small upon the cosmic plane:
And 'mid these burning hours at once and ever
It carved through space, a steel endeavor.
Past stratosphere and thermal shield,
Through vacuum vast where no life stirred,
Until the red world's face appeared,
Its ancient secrets unrevealed:
And 'mid this quest Musk heard from far
The quantum computers plotting quasar!

The shadow of the mission's reach
Rippled through the market's dream;
Where neural networks sought to teach
Machines to think, machines to gleam.

It was an empire carved from light,
Where silicon dreams eclipsed the night!

WKRP bombs Cincinnati with turkeys [instant classic!]

Who we spend time with over the course of our lives

H/t Tyler Cowen.

Tuesday, November 26, 2024

Weight loss, bariatric surgery, and the new drugs

Time stamps:

00:00 Intro
01:06 Set Point Theory
11:24 Calories In / Calories Out
18:47 Biggest Hurdles To Weight Loss
23:05 What Is Bariatric Surgery?
30:10 Success Rates
36:08 BMI
40:50 My 600 Pound Life
48:22 Side Effects of Surgery / Patient Satisfaction
54:23 Ozempic vs. Surgery
1:06:42 Life After Ozempic
1:10:28 Is Ozempic Unethical?
1:15:18 Food Addiction / Muscle Loss
1:18:05 The Process / Misconceptions
1:25:35 Innovation / Motivation

Fennel

Will Medicare cover the new GLP-1 weight-loss drugs?

Margot Sanger-Katz, Obesity Drugs Would Be Covered by Medicare and Medicaid Under Biden Proposal, NYTimes, Nov. 26, 2024.

The Biden administration, in one of its last major policy directives, proposed on Tuesday that Medicare and Medicaid cover obesity medications, a costly and probably popular move that the Trump administration would need to endorse to become official.

The proposal would extend access of the drugs to millions of Americans who aren’t covered now.

The new obesity drugs, including Wegovy from Novo Nordisk and Zepbound from Eli Lilly, have been shown to improve health in numerous ways, but legislation passed 20 years ago prevents Medicare from covering drugs for “weight loss.”

The new proposal sidesteps that restriction, specifying that the drugs would be covered to treat the disease of obesity and prevent its related conditions.

“We don’t want to see people having to wait until they have these additional diseases before they get treatment,” said Chiquita Brooks-LaSure, the administrator of the Centers for Medicare and Medicaid Services, or C.M.S., noting the growing medical consensus that obesity is a chronic health condition.

What will the Trump administration do?

Given Mr. Kennedy’s skepticism, and the new drugs’ potentially large cost, Trump officials might not have proposed a coverage requirement. But they may now face pressure to approve it.

Demand for the drugs is strong, and some Medicare beneficiaries are already taking them, even paying full price themselves. A recent survey from KFF, a health policy group, found that 61 percent of Americans support Medicare coverage for these drugs. The Biden administration’s proposal will set off a public comment period, in which doctors and patients will share their views on the decision with government officials.

There's more at the link.

Freddie Hubbard, First Light

Perhaps more than any other, Freddie Hubbard set the style for mainstream 1960s-1970s jazz trumpet. Sure, there’s Miles Davis, but he’s his own universe and set course for planet Jazz-Hendrix-Fusion during this period. And I preferred Lee Morgan myself; he influenced me in a way that Freddie never did (a bit more fire). But I loved Freddie and have fond memories of the time he played the snot out of “Here’s That Rainy Day” in the Troy Savings Bank Music Hall.

He recorded “First Light” in 1971 and put it out on a Grammy-winning album of the same name:

Notice the slick cover. That’s typical of CTI, Creed Taylor’s label. He was trying to get jazz more play by taking off some of the edges and wrapping it in a pretty package. It worked pretty well, at least for a while. But the fact is jazz has never recovered from the rise of rock and pop and probably never will.

The band consists of a small complement of strings, woodwinds, and two French horns wrapped around a jazz core. Some crazy good cats in the band: George Benson on guitar, Jack DeJohnette on drums, Airto Moreira on percussion, Ron Carter on bass, Herbie Hancock on Fender Rhodes, and Hubert Laws on flute. It’s the eleven violins and two cellos that take the edge off while the French horns make it look sassy.

The tune has two sections. There’s an opening section in free time where Freddie floats around over the rhythm section. Then the piano initiates a vamp in tempo at about 1:15 and Freddie comes in on the melody at 1:41. At 2:07 we have a transitional section where Laws floats on flute over low-key driving on the drums and regular chords in the piano. Freddie’s back in on the melody at 2:32 and starts improvising at about 2:57. Hubbard finishes his solo at 5:49 and we have another transitional interlude which gives way to a George Benson guitar solo at 6:13 (in tempo) which, in turn, gives way to our transitional interlude at 8:26. Freddie’s back in on the melody at 8:50. At 9:15 or so we start vamping to the end, Freddie noodling around, slowly fading to silence a bit over 11 minutes out.

AI granny vs. the phone scammers

Ali Watkins, An A.I. Granny Is Phone Scammers’ Worst Nightmare, NYTimes, Nov. 25, 2024:

Daisy Harris likes birds — there’s one outside her window. Her cat is named Fluffy, and she’d love to tell you about her knitting hobby. She likes tea and biscuits. And she just can’t seem to figure out this internet thing.

But perhaps you can explain it just one more time?

That’s the general vibe of a chat with Daisy Harris, an A.I.-generated granny unveiled this month by the big British phone company O2, as part of its efforts to curb phone scammers. She doesn’t have a badge, or a warrant, or any way to really stop them. But when an unlucky fraudster dials her number, Daisy does have the power to waste unlimited amounts of their time.

There are plenty of human scambaiters out there — people who identify would-be thieves and turn the tables, leading them on meandering conversations that keep them from calling other potential victims. Unlike them, Daisy is free of encumbrances like the need to sleep. [...]

Phone scams have reached mind-boggling levels: Tens of millions of scam calls rocketed around the world every day last year, according to the phone security company Hiya. An anti-scam consortium says that more than $1 trillion was stolen, often when targets unsuspectingly turned over bank details, passwords or other personal information.

Can we have an email version that engages Nigerian princes seeking to deposit large amounts of money in our bank accounts?

There's more at the link.

The cost of education in the Roman Empire

Monday, November 25, 2024

A festival of synchronized human activity, but also birds, fish, and other creatures

Surprise!

My Streaming Diet [Media Notes 145]

I’ve written previously about my posting habits over the long term, posting charts showing wide variation over time, with periods lasting two to four months where I post very little, between longer periods with many posts per month, over 100 in recent years. I’ve attributed this to mood swings; I’m depressed during periods when I’m not posting much.

What do I do with my time when I’m not making many blog posts? I’m streaming a lot of movies and TV programs on Netflix and Amazon Prime and watching more YouTube videos than I otherwise do. I’ve been keeping a list of movies and TV programs since sometime in 2018. I’ve decided to post that list online as deep background for an article I’m preparing on my mood swings.

The list is simple. I list the titles by year, and then by whether they’re movies (films) or TV programs. I define movies as titles that are somewhat over an hour long and that reach a definite conclusion at the end. This, I realize, is a crude criterion. There may, of course, be sequels. Everything else is a TV program. TV programs typically have multiple episodes and may or may not continue the same storylines from one episode to the next. Episodes tend to run somewhat shorter than a half hour or somewhat shorter than an hour.

The distinction is rough, but reasonable. The list says almost nothing about how many episodes of a TV program I watch. In many cases I’ve watched all of the available episodes, which may stretch into the hundreds. When I’ve watched a title more than once, I’ve generally listed it more than once.

* * * * *

2018 more or less

TV

Bourdain: A Cooks’ Tour, Parts Unknown
Death in Paradise
Shetland
Iron Fist
Luke Cage
Jessica Jones
Lilyhammer
The Get Down
Samurai Gourmet
The Ranch
The Chinese Mayor
The Crown
Sunny in Philadelphia
Bordertown
Sherlock
Master of None (Aziz Ansari)
Hip-Hop Evolution
Sense8
Fyre
Grace and Frankie
Jack Ryan
Lost in Space, S1
The Man in the High Castle
Transparent
Sneaky Pete

Movies

Jurassic Park, 1, 2, 3
Gimme the Loot (2 graffiti artists)
Sand Castle (war in the Middle East)
Road House (Patrick Swayze)
Indiana Jones, all of them
Ex Machina
One Mississippi
Running Man
American Playboy
Spectre
Wild Card
Julie & Julia

Evidence of rabbit

The last universal common ancestor and its impact on the early Earth system

Moody, E.R.R., Álvarez-Carretero, S., Mahendrarajah, T.A. et al. The nature of the last universal common ancestor and its impact on the early Earth system. Nat Ecol Evol 8, 1654–1666 (2024). https://doi.org/10.1038/s41559-024-02461-1

Abstract: The nature of the last universal common ancestor (LUCA), its age and its impact on the Earth system have been the subject of vigorous debate across diverse disciplines, often based on disparate data and methods. Age estimates for LUCA are usually based on the fossil record, varying with every reinterpretation. The nature of LUCA’s metabolism has proven equally contentious, with some attributing all core metabolisms to LUCA, whereas others reconstruct a simpler life form dependent on geochemistry. Here we infer that LUCA lived ~4.2 Ga (4.09–4.33 Ga) through divergence time analysis of pre-LUCA gene duplicates, calibrated using microbial fossils and isotope records under a new cross-bracing implementation. Phylogenetic reconciliation suggests that LUCA had a genome of at least 2.5 Mb (2.49–2.99 Mb), encoding around 2,600 proteins, comparable to modern prokaryotes. Our results suggest LUCA was a prokaryote-grade anaerobic acetogen that possessed an early immune system. Although LUCA is sometimes perceived as living in isolation, we infer LUCA to have been part of an established ecological system. The metabolism of LUCA would have provided a niche for other microbial community members and hydrogen recycling by atmospheric photochemistry could have supported a modestly productive early ecosystem.

Driftwood and rocks

Sunday, November 24, 2024

Depression is on the rise

I’m preparing an article on depression and want to say a few words about rising rates of depression. I decided to ask Claude about it. Note its remarks about long-term historical trends, Robert Burton's The Anatomy of Melancholy, and about the “melancholy Dane” (Hamlet). I’ve not attempted to verify any of this, but it looks reasonable to me. It’s certainly a reasonable starting point for a serious look through the archives; here are some posts about depression at New Savanna. I did read large chunks of Burton’s remarkable book while in graduate school.

I’ve appended its remarks below. I’m using my standard convention, setting my words in bold Courier and Claude’s reply in plain Courier.

Version: Claude 3.5 Sonnet

* * * * *

Is there any evidence that depression has been increasing in the US in the last couple of decades?

Yes, there is evidence of increasing depression rates in the US over recent decades, with several notable trends. Data from major studies shows rising prevalence, particularly among certain groups:

The National Survey on Drug Use and Health found the percentage of adults experiencing a major depressive episode increased from 6.6% in 2005 to 8.4% in 2020, with particularly sharp increases among young adults aged 18-25.

Youth depression has shown especially concerning trends. The CDC reported that the percentage of high school students experiencing persistent feelings of sadness or hopelessness increased from 26% in 2009 to 37% in 2019.

Some demographic patterns in the increases include:

  • Higher rates among women compared to men
  • Particularly sharp rises among adolescents and young adults
  • Greater increases among people with lower incomes and education levels
  • Notable increases across racial and ethnic groups, though with some variations

Several potential contributing factors have been identified:

  1. Increased social media use and screen time
  2. Economic pressures and inequality
  3. Changes in sleep patterns and physical activity
  4. Decreased in-person social connection
  5. Greater awareness and willingness to report symptoms

It’s worth noting that some of the reported increase may also reflect improved recognition and diagnosis of depression, as well as reduced stigma around mental health that leads more people to seek help and report symptoms. However, researchers generally agree there has been a real increase in depression beyond just better detection.

Would you like me to elaborate on any particular aspect of these trends or their potential causes?

Using a 300-ton hydraulic press to crush a block of 10,000 Lego bricks

 

The really fun stuff starts at about 8 minutes in, but, really, if you're going to watch that, why not watch the whole thing? 

AND: What kind of person laboriously assembles 10,000 Lego bricks into a single block for the purpose of seeing how it will respond to pressure from a 300-ton hydraulic press? 

They're having so much FUN doing this.

Trump 47 and the battle for the soul of America

I was happy when Biden dropped out of the race. But why’d it take him so long? And I was pleased to see Harris rake in all the $$$ and do well in the polls, not so much because I liked her as a candidate [meh], but because it indicated that Trump might be defeated. That’s something I very much wanted to happen.

By the time I cast my vote for Harris, however, my doubts were inching ahead of my hopes. When I woke up the next morning to discover that Trump had won, clearly (but, please, don’t blather about a landslide; it was a clean win, both the popular vote and the electoral college, but no more), I was not happy. But I was not surprised.

What next?

The writing’s on the wall: Graffiti

Before I get to Trump’s victory, however, I want to talk about graffiti, perhaps the most significant development in American art in the last century. It started on the streets of New York City and Philadelphia in the late 1960s and early 1970s and two decades later it had made its way across the country and around the world. The people who started it did not go to art schools or exhibit in galleries. It was born outside the legit art world.

In the summer of 2015 my friend Greg Edgell curated a large graffiti show covering the interior and exterior walls of a Pep Boys building in Jersey City. Forest City had acquired the property and was going to demolish the building to make way for an apartment complex. Greg contacted them and got permission to cover the walls with graffiti; he also talked them into a small budget. Since the building was slated for demolition, we called it The Demolition Exhibition.

Here's a shot of one of the exterior walls [click on photo to enlarge]:

It’s a battle scene depicting characters from the all-American comic strip Peanuts, fractured, crossed over, and messed-up. There’s Charlie Brown’s head in the center, Linus and his blanket at the right edge, and Snoopy’s in the sky in his aviator togs, pretending to be a WWI flying ace. Who’s that dark little creature to the left of Charlie Brown? Looks like Snoopy’s silhouette wandering the battlefield. Call it the soul of America.

People enjoying the show [click on photo to enlarge]:

Demolition in progress [click on photo to enlarge]:

Glenn Loury’s pleased as punch: A working class party

I’ve been watching these guys for years. Glenn Loury’s a distinguished economics professor at Brown and John McWhorter is a linguist at Columbia; he’s also a New York Times columnist. Their political views are complex.

John doesn’t like Trump at all, thinks he’s an idiot (his word), and dismisses him as a standup comedian. I find that a bit harsh, but boy! do I understand it. I don’t think Loury likes Trump either (& I don’t think he voted for him). But he can’t stand Obama, doesn’t like Harris, and is glad Trump won. The two share a disdain for wokeness (as do I).

Why’s Loury glad? Primarily, I believe, because he’s happy to see a working-class party in power. Imagine that, the Republicans are a working-class party! It wasn’t like that when Eisenhower was President. And the Democrats seem to have been captured by (out of touch) coastal elites; JFK’s best and brightest. The times they are changing – perhaps not in the ways Dylan had in mind when first he sang that song – but changing they are. We can’t go back.

Ross Douthat: What about those cabinet picks?

Now we have Trump’s cabinet picks. When that incendiary clown Matt Gaetz got the nod for Attorney General I lost it. Fortunately Trump listened to all the howls, took a deep breath, and decided to ditch him in favor of a woman with substantial legal experience. Pete Hegseth for Secretary of Defense did not make me happy either. A MAGA apologist for war crimes with no managerial experience, no way! Nor do I want a science-challenged Kennedy as Secretary of Health and Human Services. As for Tulsi Gabbard, I’m OK with her anti-war stance (I was a conscientious objector during the Vietnam War) as Director of National Intelligence.

On these lines, I found Ross Douthat’s recent NYTimes column, Three Theories of the Trump Cabinet, quite interesting. He takes his first theory from Matthew Zeitlin, who sees Trump’s “picks as making up an American version of a European-style coalition government, where small parties join with a bigger party and receive various ministries in exchange for their support.” After taking that out for a spin he offers a view he takes from Yuval Levin: “they’re designed to stoke conflict within the different agencies rather than within the cabinet. [...] Which suggests that what Trump 2.0 is seeking is less the representation of different factions and more just disruption of all kinds.”

Douthat lands on a third interpretation:

That he’s assembling a “team of podcasters,” to use the conservative writer Ben Domenech’s formulation, a cabinet of “communicators, not administrators,” who are picked for their celebrity and their experience as faces and voices [...] In which case it’s a mistake to look too closely at either their ideological commitments or their administrative experience. Trump mostly just wants them as charismatic faces who will be public salespeople for whatever he decides to do.

But the actual administration of the cabinet agencies still needs to happen, and Trump’s policy decisions are still likely to be strongly influenced by the ideas and proposals that are surfaced from below.

Hmmmm. I like it. Independently of what I’d like to see happen, will it accomplish what Trump wants it to accomplish and will it serve the working-class people who voted for him? I don’t know. But I do find it hard to believe that he cares about those Americans, though he may well have convinced himself otherwise.

But it is suggestive, very suggestive.

A tectonic shift?

Loury frequently says we’re witnessing a tectonic shift in American politics. He’s right. The realignment that’s given us a working-class Republican party is important. But there’s also the way the whole process has worked, the social media, the showmanship, the celebrity – and don’t forget Ronald Reagan, who was governor of California before he was President, not to mention Arnie “The Governator” Schwarzenegger. I believe Douthat’s got the same thing in mind with the team of podcasters idea. The whole process, the mechanism, is changing, top-to-bottom and inside out. I don’t know how to conceptualize it, but, yes, it’s happening and it’s tectonic.

* * * * *

Let the graffiti writers have the last word. I took this in FDR skatepark in South Philadelphia in December 2018, beneath the thruway and just north of the naval yard. Built by skateboarders, who have a strong DIY culture, FDR is legendary among skateboarders. Notice the initials “FDR” on the Pabst can. [click on photo to enlarge]

Romantic Love, Conversation, Biology, and Culture

[I'm bumping it yet again in 2024]
Once more I'm bumping this to the top of the queue. Why? It discusses a methodological problem that is of current interest to me.  [2021]
* * * * *
I'm bumping this to the top of the queue because it's one of my favorite posts. FWIW, this is the post that got me my monthly slot at 3 Quarks Daily.

Note: This post grew out of reflection on an earlier post on bundling.
When I was an undergraduate at Johns Hopkins I took a course in Medieval literature and was thoroughly gobsmacked when I learned that romantic love had been invented in 12th century France. Until then I’d believed it to be a human universal – one and only, forever and ever, that was just how it was, no? Well, not quite.

What arose in Medieval Europe is something called Courtly Love, a set of conventions used by high-born men in wooing their lovers. And these lovers were not their wives, nor wives to be. For aristocratic marriage had little to do with personal preference; it was politics. Powerful families would forge alliances by arranging marriages among their young.

In time, over the course of centuries, so the story went, romantic love was transformed from an aristocratic game into a set of conventions used to define the necessary, or at least the ideal, precondition for any marriage. This set of conventions was in place, at least among the middle class, by the time Jane Austen wrote her novels in the early 19th Century. Those conventions have remained more or less in place up to the present, though they’ve become a bit tattered in the last decade or three as a soaring divorce rate has made it abundantly clear that true love does not last forever. That, of course, is not exactly news – why, for example, did Flaubert write Madame Bovary? – but the myth is so attractive that it dies hard.

That was the state of things during my undergraduate years – which coincided with the emergence of feminist activism in the late 1960s. Whatever their personal experience, everyone gave lip service to one and only forever and ever and believed that it was human nature. In that context, then, the revelations of the learned scholars shook my world.

Counter-Revolution

Learned scholars, however, do not constitute a single tribe. Their tribes are many, and often contentious. Even as the literati were blissfully proclaiming the recent and Western origin of romantic love, other scholars set out to prove them wrong. In 1992, for example, W. R. Jankowiak and E. F. Fischer published “A cross-cultural perspective on romantic love” (Ethnology 31: 149-155). They defined romantic love as “any intense attraction that involves the idealization of the other, within an erotic context, with the expectation of enduring for some time into the future” and they contrasted this with “the companionship phase of love . . . which is characterized by the growth of a more peaceful, comfortable, and fulfilling relationship.” They examined ethnographic data on 166 societies from around the world and discovered romantic love in 88.5 percent of them, suggesting “that romantic love constitutes a human universal, or at the least a near-universal.”

More recently Jonathan Gottschall and Marcus Nordlund published Romantic Love: A Literary Universal? (Philosophy and Literature 30: 450-470, 2006). They conducted a cross-cultural study of folktales from 79 cultures and found at least one reference to romantic love in 55 of those collections and multiple references in 39 collections. They assert that their study “offers staunch support to the existing evidence that romantic love is a statistical cultural universal. It would also seem to increase the probability that romantic love may be an absolute cultural universal.” “Statistical universal” is a term of art meaning that something is in a lot of places, but not everywhere, yet. It seems clear that if Gottschall and Nordlund were to place a bet, they'd bet that further research would find that romantic love is a true cultural universal, present in every culture for which we have reliable records.

Still more recently, just yesterday in the time-scale of academic publishing, Brian Boyd has asserted, with the calm assurance of a senior scholar in command of wide learning, that “cross-cultural, neurological, and cross-species studies have demonstrated the workings of romantic love across societies and even species” (The Origin of Stories, Harvard 2009, p. 341). To this, Michael Bérubé has replied, with the calm assurance of a senior scholar in command of wide learning, but learning leavened with a dash of school-boy wit:
This just won’t wash. Other species might court and mate for life, but they do not engage in romantic love in the sense that humanists employ the term, save perhaps for the cartoon skunk Pepé Le Pew. “Romantic love” does not mean “mammals doing it like mammals”; it refers to the conventions of courtly love, which were indeed invented in the European middle ages and cannot be found in ancient literatures or cultures. Those conventions are culturally and historically specific variations on our underlying (and polymorphous) biological imperatives, just as the institution of the Bridezilla and the $25,000 wedding is specific to our own addled time and place.
What’s going on here? Who’s right?

Back to the Drawing Board: There's that pesky elephant

I don’t know. We don’t know. Not any more.

I’m inclined to invoke that hoary old story of the blind men who, upon examining a large beast, are unable to decide what beast it is, or even whether or not it is a beast at all. We, of course, know that they’re examining an elephant, but they examine such different parts of the elephant – tusks, ears, legs, tail – that they reach vastly different conclusions about the object under scrutiny.

In the case of romantic love, I believe we’re in much the same position as those blind men. But, in our case, there is no transcendent story-teller who actually knows what creature is under scrutiny. Rather, it is up to us to approximate that story-teller by making more and more sophisticated observations and examining them through richer concepts and models about human culture and behavior.

There is no point in continuing to argue using existing observations, methods, and theories. In light of the existing contretemps we would do well to consider such arguments to be ideological in nature and thus pointless, except, of course, to all-knowing ideologues. Meanwhile, let’s take a look around and see what else there is to be explained.

Companionship, Conversation, and the Novel

Let’s return to Jankowiak and Fischer, and their contrast between the romantic phase and the companionship phase of love, a distinction, I believe, that is common, and which I accept. This companionship, is it, too, universal?

Take those Medieval aristocrats who were playing courtship games on the side: Did they have a companionate relationship with their spouses? I’m guessing that in some cases, yes, and in other cases no. These marriages, after all, were arranged by parents for political ends. If companionship developed in the marriage, fine; if not, no big deal. For companionship was not the point, it was not part of the ideology.

And then we have my standard passage from John Milton's Doctrine and Discipline of Divorce, Preface to Book 1:
God in the first ordaining of marriage taught us to what end he did it, in words expressly implying the apt and cheerful conversation of man with woman, to comfort and refresh him against the evil of solitary life, not mentioning the purpose of generation till afterwards, as being but a secondary end in dignity, though not in necessity: yet now, if any two be but once handed in the church, and have tasted in any sort the nuptial bed, let them find themselves never so mistaken in their dispositions through any error, concealment, or misadventure, that through their different tempers, thoughts and constitutions, they can neither be to one another a remedy against loneliness nor live in any union or contentment all their days…
That strikes me as an assertion of the need for companionship between spouses – cheerful conversation – and a rather emphatic assertion at that. Would Milton have made such an assertion if it had, in fact, been the common understanding of the day? That seems unlikely to me, though I could be wrong, as I am not a scholar of 17th century English family practices.

But the late Lawrence Stone was, and in 1977 he published a ground-breaking study, The Family, Sex and Marriage in England 1500-1800 (Harper & Row) in which he argued that, over a period of three centuries, family organization underwent a transition that started with the Open Lineage Family – permeable by outside influences, with strong “loyalty to ancestors and to living kind” (p. 4). It was succeeded by the Restricted Patriarchal Nuclear Family (p. 7):
which saw the decline of loyalties to lineage, kin, patron, and local community as they were increasingly replaced by more universalistic loyalties to the nation state and its head and to a particular sect or Church. As a result, ‘boundary awareness’ became more exclusively confined to the nuclear family, which consequently became more closed off from external influences, either of the kin or of the community.
Finally, during the last half of the 19th Century, the Closed Domesticated Nuclear Family emerges among “the upper bourgeoisie and squirarchy” (pp. 7-8):
This was the decisive shift, for this new type of family was the product of the rise of Affective Individualism. It was a family organized around the principle of personal autonomy, and bound together by strong affective ties. Husbands and wives personally selected each other rather than obeying parental wishes, and their prime motives were now long-term personal affection rather than economic or status advantage for the lineage as a whole … Patriarchal attitudes within the home markedly declined, and greater autonomy was granted not only to children, but also to wives.
This was a family in which companionship between husband and wife was important, for that companionship was now the foundation of family organization. And this is the family structure that is at the heart of the British novel in the late 18th century and into the 19th century. The novel and the family structure had a reciprocal relationship (dialectical?) in which the demands of this family structure created an audience for the novel and the novel, in turn, articulated the inner workings and hidden designs of the family.

Out of what biological equipment did the novel help people construct their familial relations? Let us speculate, and freely – for what else can we do? If we’re to search for evidence, we’ve got to make a guess about what we’re looking for before there’s any point to setting out. However, we do want our speculation to be biologically plausible. So, calling on chess as a metaphor, let’s select our pieces from biology while our speculation will be the game play.