Wednesday, July 21, 2021

What’s the difference between a conspiracy theory and a far-out, but reasonable, idea? When does signaling go haywire?

My answer: It’s hard to tell. It may come down to the identity of the believing community and, of course, to who’s making the judgment. I am thus considering some kind of continuum, with crazy conspiracy theories at one end and far-out speculative ideas at the other. Ideas that fall in the middle... who knows?

Republicans and Trump

Before we go there, however, let’s look at a recent column by Paul Krugman, Republicans Have Their Own Private Autocracy, NYTimes, July 19, 2021. Here are the opening three paragraphs:

I’m a huge believer in the usefulness of social science, especially studies that use comparisons across time and space to shed light on our current situation. So when the political scientist Henry Farrell suggested that I look at his field’s literature on cults of personality, I followed his advice. He recommended one paper in particular, by the New Zealand-based researcher Xavier Márquez; I found it revelatory.

“The Mechanisms of Cult Production” compares the behavior of political elites across a wide range of dictatorial regimes, from Caligula’s Rome to the Kim family’s North Korea, and finds striking similarities. Despite vast differences in culture and material circumstances, elites in all such regimes engage in pretty much the same behavior, especially what the paper dubs “loyalty signaling” and “flattery inflation.”

Signaling is a concept originally drawn from economics; it says that people sometimes engage in costly, seemingly pointless behavior as a way to prove that they have attributes others value. For example, new hires at investment banks may work insanely long hours, not because the extra hours are actually productive, but to demonstrate their commitment to feeding the money machine.

Signaling is the key here. One of the things that happens in dictatorial regimes is that underlings will assent to all sorts of problematic things, not because they really believe in them, but to signal that they believe in and are loyal to the Great Leader. And, Krugman argues, Republicans are doing this all the time now, with Trump, even in exile, as the leader at the center of this signaling activity.

We saw it as soon as Trump was inaugurated. Remember how he insisted that he had the largest crowd EVER at his inauguration even though the photographs did not show it? The fact that so many, from the hapless Sean Spicer on down, agreed with him shows that they were willing to bow to his will, even in the face of reality. Trump’s been doing it ever since. The Republicans have become fawning courtiers at the foot of the great prince.

QAnon and conspiracy theories

A similar argument can easily be made about QAnon and various other conspiracy theories. People assent to these crazy beliefs as a way of signaling their loyalty to the group. The moment a person assents to these beliefs, they become a member of the group. Thus, from the NYTimes, Jan 17, 2021:

What attracts Ms. Gilbert and many other people to QAnon isn’t just the content of the conspiracy theory itself. It’s the community and sense of mission it provides. New QAnon believers are invited to chat rooms and group texts, and their posts are showered with likes and retweets. They make friends, and are told that they are not lonely Facebook addicts squinting at zoomed-in paparazzi photos, but patriots gathering “intel” for a righteous revolution.

This social element also means that QAnon followers aren’t likely to be persuaded out of their beliefs with logic and reason alone.

“These people aren’t drooling, mind-controlled cultists,” Mr. Rothschild said. “People who are in Q like it. They like being part of it. You can’t debunk and fact-check your way out of this, because these people don’t want to leave.”

Our need for community is so strong that, in the right circumstances, we’ll choose community over a clear-eyed perception of reality. We signal that choice by giving assent to beliefs with little or no evidence to back them up; indeed, beliefs that often have quite a bit of evidence against them.

Religion is caused by memes

Let’s consider a somewhat different case, that of memes. Here I’m not talking about internet memes, though there is a kinship here, but about the idea that culture is organized by ideas that flit around from one person’s head to another. These ideas are thus immaterial homuncular bots of some kind. As I explain in Q: Why is the Dawkins Meme Idea so Popular?, Dawkins created the idea of memes in 1976 in The Selfish Gene. He proposed it as the cultural equivalent of the biological gene.

So far so good. It was just a casual idea. Dawkins had no clear idea of how these things worked or even whether or not they existed inside people’s brains or outside in the world; he entertained both possibilities in 1976. But things changed as the idea caught on. In particular, the meme idea became a favorite way of explaining religious belief, at least in a certain intellectual community, one that believes in scientific materialism, reductive explanation, and human reason.

Religious belief is irrational and, in some cases, goes quite against the grain of the biological imperative to reproduce – I’m talking about celibacy among priests and nuns and other religious. Why, given that people are basically rational beings – so this community believes – would people harbor such irrational beliefs? It’s obvious, isn’t it? Their minds have been taken over by memes.

This explanation would be one thing if these meme-believers could explain how memes worked. But they can’t. These memes just somehow flit from mind to mind, converting one person after another. And that’s that. That is not much of an explanation coming from someone who believes in materialism, science, and human reason. For such a person to believe such a flimsy idea is, well, irrational!

And maybe that’s the point. It IS irrational. And so it is a good vehicle for these rationalists, the “Brights,” as Dan Dennett likes to call them, to signal their loyalty to their belief system, which otherwise has many commendable aspects. These beliefs certainly aren’t as irrational as Republican groveling at the feet of Trump, or as QAnon, but they don’t make much sense either. That people do believe in memebots (my term, not theirs) and argue passionately for them requires an explanation. I think signaling may do the trick.

The Tech Singularity

Where does the Technological Singularity fall on this continuum? By this I mean a loosely related complex of beliefs. The central belief is that at some point in the future computers will become so “intelligent” that they will surpass us and then, who knows? One possibility is that they will prove malevolent and turn on us. This shows up often in science fiction – see, e.g., my current article at 3 Quarks Daily on Forbidden Planet and The Terminator. Related beliefs include 1) the possibility of uploading (or is it downloading?) one’s mind to a computer and thus achieving immortality, 2) the idea that we’re right now living in a computer simulation created by a super-advanced civilization, and 3) the possibility of establishing direct links between human brains and AIs and, for that matter, between one brain and another.

There is very little reason to believe that any of this is likely. As far as I can tell, the strongest argument is: “Well, you can’t prove it won’t happen.” No, I can’t, nor can anyone else. So we’re at a standoff.

Except that one could argue that these ideas are mere projection. Here’s something David Hays and I published in 1990:

The computer is similarly ambiguous. It is clearly an inanimate machine. Yet we interact with it through language; a medium heretofore restricted to communication with other people. To be sure, computer languages are very restricted, but they are languages. They have words, punctuation marks, and syntactic rules. To learn to program computers we must extend our mechanisms for natural language.

As a consequence it is easy for many people to think of computers as people. Thus Joseph Weizenbaum, with considerable dis-ease and guilt, tells of discovering that his secretary “consults” Eliza—a simple program which mimics the responses of a psychotherapist—as though she were interacting with a real person (Weizenbaum 1976). Beyond this, there are researchers who think it inevitable that computers will surpass human intelligence and some who think that, at some time, it will be possible for people to achieve a peculiar kind of immortality by “downloading” their minds to a computer. As far as we can tell such speculation has no ground in either current practice or theory. It is projective fantasy, projection made easy, perhaps inevitable, by the ontological ambiguity of the computer. We still do, and forever will, put souls into things we cannot understand, and project onto them our own hostility and sexuality, and so forth.

In the middle of that second paragraph we assert that two singularity beliefs are projective fantasy. That’s not an argument. But the idea that the computer is somehow ambiguous to people raised in a world of animals, mechanical devices, and human beings, so that it doesn’t really fit into any of those categories, that could be put to use in an actual argument. To that we can add the cultural lineage I sketch in that 3 Quarks Daily piece. Then let’s toss in an alternative interpretation of the singularity, that it is taking place in our ideas: Redefining the Coming Singularity – It’s not what you think. Now we’ve got some material with which to fashion an argument that Singularitarianism, if you will, is best seen as a quasi-religious cult rather than as a set of possibly fruitful speculative beliefs about the future of technology. The fact that this set of beliefs is backed by high-tech billionaires is no more an argument in their favor than is the fact that Republican denial of reality is backed by a different set of billionaires.

* * * * *

What do I believe? I’m not sure. I want to think about it some more. I’m talking about four different groups of people here, or is it only two? There is, after all, some overlap between Trumpists and conspiracy theorists, on the one hand, and Dawkinsian memeticists and Singularitarians on the other. I rather suspect that each group eyes the other with suspicion, when they do so at all. But are they really so different? I’m not sure.

An exercise for the reader

One way to think of this is to think of the relationship between the adherents (of some belief or belief system) and non-adherents. Is the distinction a sharp one or a loose one? Do non-adherents respect the ideas of adherents, or do they think they’re crazy? Do adherents respect the ideas of non-adherents, or do they think they’re crazy?

How do those considerations work out with respect to the four groups above?

2 comments:

  1. I don't think I've ever seen anybody arguing that memes just "flit around from brain to brain". A phrase (or really kind of a sheaf of phrases) you might find useful if you want to follow that up is "belief cascade" or "rumor cascade" (or informational cascade, or opinion diffusion) - there's been lots of study of how atomic beliefs are transmitted between people, through the obvious mechanism of just plain talking.

    There's a thread of interesting research on how farming techniques get spread through communities along social network lines (not via computer, but in the more abstract sense of networks of meat-space social relationships). [e.g. Bala & Goyal (1998)]

    And there's Tim Clancy, who's working on his doctorate based on a massive dynamic-systems approach to tracking the spread of terrorist ideation through populations (he doesn't call that "memes" because doctoral committees look askance at that terminology; I think he's calling them "cultural scripts"). Again: nobody says it's just magically in the air. People *talk*. All the time. About stuff they care about. You can model it and make real, testable predictions about outcomes and possible interventions.
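    To make that concrete, here is a minimal sketch of the kind of model I mean: a plain independent-cascade simulation over a toy social network, written in Python with the networkx library. Everything in it is illustrative and mine alone (the function name, the small-world graph, the seed nodes, and the probabilities are made up for the example), not anyone's published model:

    import random
    import networkx as nx

    def independent_cascade(graph, seeds, adopt_prob, rng):
        # Each newly convinced person gets one chance to convince
        # each of their neighbors, succeeding with probability adopt_prob.
        active = set(seeds)
        frontier = list(seeds)
        while frontier:
            next_frontier = []
            for person in frontier:
                for neighbor in graph.neighbors(person):
                    if neighbor not in active and rng.random() < adopt_prob:
                        active.add(neighbor)
                        next_frontier.append(neighbor)
            frontier = next_frontier
        return active

    # A toy meat-space network: 500 people with small-world structure.
    g = nx.watts_strogatz_graph(n=500, k=6, p=0.05, seed=42)
    seeds = [0, 1, 2]  # the handful of people who start talking about the idea
    reached = independent_cascade(g, seeds, adopt_prob=0.15, rng=random.Random(42))
    print(f"{len(reached)} of {g.number_of_nodes()} people end up adopting the belief")

    Run something like that a few thousand times with different parameters and you get distributions of outcomes you can actually compare against data, which is the whole point: no magic transmission, just people talking along network lines.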

    As to Singularitarianism, yeah. I personally believe that it's inevitable that computers will reach and surpass our level of intelligence for whatever metric of intelligence you choose to pick, eventually, as well as believing that if I keep my meatsack running long enough I'll be able to bootstrap my self onto longer-lasting hardware - but there is *absolutely* a cult built around that as well. It would be difficult for me to say whether I'm part of it or not; I think the only test of that would have to be just measuring the rationality of my actions - but there are so many other sources of irrationality in my actions I'd be hard put to allocate any specific portion to Singularitarianism.

    This post has really been interesting. I look forward to browsing around in the rest of your online work here.

    Replies
    1. You seem to be talking about cultural evolution generally, not memetics, which is a specific approach to cultural evolution. Memeticists think of memes as agents that travel from one brain to another. Of course, they have to have some vehicle, so they may use language or music or whatever. But it's the memes that are doing the driving. It's because the term is used in that way that academics look askance at it.
