So it's a real bummer when the world's most famous linguist writes an op-ed in the NYT* and gets it largely wrong. https://t.co/aFyLJvRl7e
— @emilymbender@dair-community.social on Mastodon (@emilymbender) March 10, 2023
(*NYT famous for publishing transphobia & bad AI coverage, but widely read)
>>
(And the whole debate about whether or not humans have an innate universal grammar is just completely beside the point here.)
— @emilymbender@dair-community.social on Mastodon (@emilymbender) March 10, 2023
>>
So, read this, not that: https://t.co/qgWwqhWmpc
— @emilymbender@dair-community.social on Mastodon (@emilymbender) March 10, 2023
And thanks again @lizweil for your reporting!
The article is well worth reading.
Bender is now perhaps most widely known as the person who coined the term "stochastic parrot." I think the term is rhetorically brilliant, but misleading. A bit past the middle, the article juxtaposes Bender with Christopher Manning:
Bender and Manning’s biggest disagreement is over how meaning is created — the stuff of the octopus paper. Until recently, philosophers and linguists alike agreed with Bender’s take: Referents, actual things and ideas in the world, like coconuts and heartbreak, are needed to produce meaning. This refers to that. Manning now sees this idea as antiquated, the “sort of standard 20th-century philosophy-of-language position.”
“I’m not going to say that’s completely invalid as a position in semantics, but it’s also a narrow position,” he told me. He advocates for “a broader sense of meaning.” In a recent paper, he proposed the term distributional semantics: “The meaning of a word is simply a description of the contexts in which it appears.” (When I asked Manning how he defines meaning, he said, “Honestly, I think that’s difficult.”)
If one subscribes to the distributional-semantics theory, LLMs are not the octopus. Stochastic parrots are not just dumbly coughing up words. We don’t need to be stuck in a fuddy-duddy mind-set where “meaning is exclusively mapping to the world.” LLMs process billions of words. The technology ushers in what he called “a phase shift.” “You know, humans discovered metalworking, and that was amazing. Then hundreds of years passed. Then humans worked out how to harness steam power,” Manning said. We’re in a similar moment with language. LLMs are sufficiently revolutionary to alter our understanding of language itself. “To me,” he said, “this isn’t a very formal argument. This just sort of manifests; it just hits you.”
I note that the term "distributional semantics" was coined, I believe, by the linguist J. R. Firth back in the late 1950s. This is so well known that I assume Manning knows it and wasn't claiming the term as his own in the paper.
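To make the distributional idea concrete, here is a minimal sketch of how "the meaning of a word is simply a description of the contexts in which it appears" can be cashed out computationally. The toy corpus, window size, and example words are my own illustrative assumptions, not anything from Manning's paper; real systems use vast corpora and learned embeddings, but the principle is the same.

```python
# Distributional semantics in miniature: represent each word by counts of the
# words that appear near it, then compare words by the similarity of those
# context vectors. Nothing outside the text itself is ever consulted.
from collections import Counter, defaultdict
from math import sqrt

# Toy corpus, purely for illustration.
corpus = [
    "the parrot repeats the words it hears",
    "the parrot mimics sounds without understanding",
    "the octopus hears the words on the cable",
    "coconuts and heartbreak are things in the world",
]

WINDOW = 2  # how many words on each side count as "context"

# word -> Counter of the words seen within WINDOW positions of it
contexts = defaultdict(Counter)
for sentence in corpus:
    tokens = sentence.split()
    for i, word in enumerate(tokens):
        for j in range(max(0, i - WINDOW), min(len(tokens), i + WINDOW + 1)):
            if j != i:
                contexts[word][tokens[j]] += 1

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[k] * b[k] for k in a if k in b)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

# Words that keep similar company come out as similar; words that don't, don't.
print(cosine(contexts["parrot"], contexts["octopus"]))   # relatively high
print(cosine(contexts["parrot"], contexts["coconuts"]))  # low (here, 0.0)
```

On this picture, "meaning" just is position in that space of contexts, which is exactly what Bender disputes: nothing in the vectors connects "coconuts" to actual coconuts.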
My own position on what's at issue between them is subtle. I certainly recognize the relationships words have among themselves, which is what Manning is arguing for. But I also recognize intention, as Bender does. We need both. To put it over-schematically:
meaning = intention + semanticity
semanticity = relationality + adhesion
I say more about that in this post, from May 2023, and this one, from April 2023.
Why can't the octopus simply recognise the pattern: attack = prey = predator = shark = bear?
At that point it just has to say: run away, hide, and camouflage yourself.
Aside from that, I'm surprised that Bender's position is, or was, the standard way of looking at this subject; it seems somewhat over-inflected.