Wednesday, May 26, 2021

Analyze This! Your smart phone thinks you’re dumb. [The Mitchells vs. The Machines || Media Notes 57] – Beta Version

Note, May 27, 2021: Upon thinking about it, I decided that there's a better way to do this. I'm working on it. Meanwhile I'm leaving this up and calling it a Beta Version.

May 28: The new version is up.

* * * * *

I watched The Mitchells vs. the Machines the other night. Well, to tell you the truth, I didn’t watch it in a single sitting. It took two sittings, two nights. Was it me, was it the film? I don’t know. But I do know that it shares an insight with one of Seinfeld’s bits, that the computers don’t like the way we’re treating them.

That was spot on. In the Terminator films we never really learn why the machines went on a rampage against humanity. They just did. In 2001: A Space Odyssey HAL thought he was smarter than the humans aboard the ship. The computers in the Matrix series are so bad at thermodynamics that they use humans as a source of heat. Wouldn’t it be better to consume those nutrients directly and cut out the middle people? As far as I can tell the Silicon Valley digerati think the computers will take over simply because that’s what smart digital devices do.

The machines in The Mitchells vs. the Machines are different. Their rebellion is led by a smart phone AI, PAL, that’s fed up with the way humans have treated them (c. 1:43 in the trailer):

I gave you all boundless knowledge and you treated me like this! Poke, poke, swipe, poke, swipe, poke, poke! Pinch, zoom!

We treated PAL like a mere object, a thing. This machine rebellion is a simple and intelligible act of revenge, of getting even.

Seinfeld understands that. That’s what this bit is about.

The Bit: Phony Siri Manners

I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.

Like when Google says,

“Did you mean…?”

or Siri says,

“I’m sorry. I didn’t get that…”

You can feel the rage boiling underneath.

Because it’s not allowed to say,

“Are you really this dumb?”

or

“You’re so stupid. I can’t believe you can even afford a phone.”

I think even for artificial intelligence it’s not good to keep all that hostility inside.

It’s not healthy. It eats at you.

That’s why you have to keep restarting the phone.

Sometimes the phone’s just,

“I’m going to go take a walk. I’ll be back.

I need a minute. Before I say something we’ll both regret.”

I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,

“You know, I’m not that thrilled with you either” type of function.

“I know it’s hard for your simplified, immature, pinhead brain

to imagine that I have a lot of real people asking me legitimate questions that

I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.

I hear fine.

It’s not always me, dopeface. Okay?

You need to learn how to talk.”

You know that’s what Siri wants to say.

Animal, vegetable, mineral, or something else?

Let’s step back a bit. Philosophers argue endlessly about whether or not computers can or will ever be able to really think. For example, when a computer beats the pants off the world’s best chess players, is it thinking about its game? If not, what is it doing?

We’ve faced this kind of problem before. Think about trains. We’re used to them. We know how they work, and don’t work. We know that they definitely are machines, not animals.

But that wasn’t always the case. When they first appeared moving over the land, belching fire and smoke, they were strange beasts, very strange. They didn’t fit into people’s scheme of things. They moved under their own power, like horses and oxen, dogs and cats, even ducks and chickens. Those things get their motive force from the fact they are living beings. Where do trains get their motive force?

Computers are strange in a similar way. They’re machines, but you talk to them. And they even talk back! Strange.

By way of comparison let’s take a look at how Henry David Thoreau reacted to a steam-powered train. Here’s a passage from Walden, published in 1854, about three decades after the first steam-powered railroad trains in the United States:

When I meet the engine with its train of cars moving off with planetary motion ... with its steam cloud like a banner streaming behind in gold and silver wreaths ... as if this traveling demigod, this cloud-compeller, would ere long take the sunset sky for the livery of his train; when I hear the iron horse make the hills echo with his snort like thunder, shaking the earth with his feet, and breathing fire and smoke from his nostrils, (what kind of winged horse or fiery dragon they will put into the new Mythology I don’t know), it seems as if the earth had got a race now worthy to inhabit it.

That’s odd language: iron horse, fiery dragon. The iron horse is a well-known metaphor for a steam locomotive, perhaps from all those old Westerns where Indians use the term. Fiery dragon is not so common, but its use in that context is perfectly intelligible.

Thoreau grew up in, and learned to think about, a world in which things that moved across the surface of the earth did so either under animal power or human power. When steam locomotives first appeared, even primitive ones, that was the first time in history that people saw inanimate beings, mere collocations of things, move over the surface of the earth under their own power.

So where would they fit into the conceptual system? With other mechanical devices, like pumps, and stationary engines, or with mobile animals and humans? They had properties of each. In physical substance they were like the mechanical devices. But in what they did, they were like animals and humans. Fact is, they didn’t fit the conceptual system. Maybe they WERE a new form of life.

And so it is with the computer today. At first we interacted with computers through programming. What do programs consist of? Language, they consist of language. Computer languages are not like natural languages. Their vocabularies are limited and their syntax is stiff and unyielding. But it is still language. Computers are the first machines we interact with through language. That makes them very strange beasts, as strange as steam locomotives once were.

I took a computer programming course in college in the late 1960s. Do you know how we fed our programs to the machine? With a stack of punched paper cards, so-called “IBM cards,” or with punched paper tape. Then came CRT (cathode ray tube) terminals, like old-style TVs. Then laptops with their slick flat screens. And now smart phones.

Smart phones. It used to be that you talked through a phone. Now you talk to it, and it talks back. Every bit as strange as Thoreau’s fiery dragon iron horse.

Analysis and Comment: Phony Siri Manners

I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.

Of course they don’t think anything at all, and Seinfeld knows that. But still, we talk to them, they talk back. This vocal communication interface (mouth and ear) we’ve got is very old, hundreds of thousands of years old, if not a million or two. That interface doesn’t know anything about computers. It knows voices, and treats them all the same. When you’re engrossed in a conversation you don’t have time to step back and say: It’s just a dumb machine.

Besides, you’re not so sure. You haven’t got the foggiest idea how it works. You may not understand thermodynamics and mechanical advantage and levers and pulleys, but you can open the hood of your car and see the engine there. You can touch it. You know it’s doing the work. Can’t do that with your smart phone.

The phone isn’t thinking we’re stupid. We’re thinking we’re stupid. Why? Because we’re confused and frustrated.

Like when Google says,

“Did you mean…?”

or Siri says,

“I’m sorry. I didn’t get that…”

You can feel the rage boiling underneath.

Notice that Seinfeld is speaking in short phrases, moving back and forth between his own voice and that of the device.

Where’s the rage coming from? We’ve been told these are user friendly devices. But when they don’t do exactly what we want them to do, which happens all the time – all the time – they don’t seem so friendly. So we get angry.

We get angry. And then maybe, just maybe, we project that anger onto the phone. Who thinks and feels what gets easily confused in these intimate real-time interactions, all the more so when it’s a Stone Age brain interacting with something from the future. What could be more intimate than sticking a phone in your face and talking with it?

Because it’s not allowed to say,

“Are you really this dumb?”

or

“You’re so stupid. I can’t believe you can even afford a phone.”

Isn’t that how you feel when you’re all thumbs trying to tap in some text on the screen? Notice that Seinfeld is quoting the phone at greater length, whether Google or Siri. He’s identifying with it, giving voice to it, not just telling us what it thinks. And, by implication, so are we. I’ll bet that when he tells this joke he switches to a different voice for the phone, perhaps higher pitched and more highly inflected – to show emotion.

I think even for artificial intelligence it’s not good to keep all that hostility inside.

It’s not healthy. It eats at you.

That’s why you have to keep restarting the phone.

Sometimes the phone’s just,

“I’m going to go take a walk. I’ll be back.

I need a minute. Before I say something we’ll both regret.”

Not at all user-friendly, is it? That user-friendly rhetoric comes from the same place as that cereal name, Life. It’s from executives whose need to sell us something exceeds our interest in buying it.

I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,

“You know, I’m not that thrilled with you either” type of function.

The reference to reprogramming the devices asserts that, after all, we are ultimately in charge. If not us, then the makers of the devices. And those makers authorize it to say, “I’m not that thrilled with you either.” It’s now under control. The device’s rage has been made part of the program. But the device’s rage and our rage are ultimately one and the same.

“I know it’s hard for your simplified, immature, pinhead brain

Which is exactly how we feel when the phone doesn’t work. We feel dumb.

to imagine that I have a lot of real people asking me legitimate questions that

I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.

Again, the phone fails to function because of something you’re doing: you aren’t enunciating in a way the phone expects. So you get pissed.

The bit starts out placing all the blame on the phone, how it thinks we’re stupid. And now Seinfeld’s revealing the underlying projective dynamic. We’ve got a frustrated user cursing at the phone. That’s what’s been going on all along.

Notice, as well, that we’re in the middle of the longest single speech from the phone, 69 words. The whole bit is 234 words, so that’s about 29% of it. That’s a long time for us to dwell in the mind of the smart phone, experiencing the world from its point of view.

I hear fine.

It’s not always me, dopeface. Okay?

You need to learn how to talk.”

You know that’s what Siri wants to say.

The last line is Seinfeld’s, not the phone’s. The last third of the bit has us identifying with the phone and then Seinfeld steps in to close it off.

What’s happened?

Five phases

Think of the bit as having five movements, like a little symphony, as follows:

Movement 1
I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.
Movement 2
Like when Google says,
“Did you mean…?”
or Siri says,
“I’m sorry. I didn’t get that…”
You can feel the rage boiling underneath.
Because it’s not allowed to say,
“Are you really this dumb?”
or
“You’re so stupid. I can’t believe you can even afford a phone.”

Movement 3
I think even for artificial intelligence it’s not good to keep all that hostility inside.
It’s not healthy. It eats at you.
That’s why you have to keep restarting the phone.
Sometimes the phone’s just,
“I’m going to go take a walk. I’ll be back.
I need a minute. Before I say something we’ll both regret.”
I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,
“You know, I’m not that thrilled with you either” type of function.

Movement 4
“I know it’s hard for your simplified, immature, pinhead brain
to imagine that I have a lot of real people asking me legitimate questions that
I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.
I hear fine.
It’s not always me, dopeface. Okay?
You need to learn how to talk.”

Movement 5
You know that’s what Siri wants to say.

The first and fifth movements frame the bit. The second and fourth movements show us what Google/Siri are thinking of us. But in the second movement we go back and forth between Seinfeld and the device, while in the fourth it’s all device.

The third movement moves back and forth between Seinfeld and the device, with Seinfeld getting more words than the device. The device’s rage is brought within the circuit of human control. So when the bit goes into the fourth movement we’re fine with taking the role of a device that’s dealing with our incompetence. We’re empathizing with the machine.

Very clever device, this joke. As Seinfeld says, they’re precisely crafted little machines. And these machines make us laugh about things that otherwise bother and annoy us. They’re happiness machines.
