Friday, May 28, 2021

Analyze This! Your smart phone thinks you’re dumb. [The Mitchells vs. The Machines || Media Notes 57] Version 2

This replaces the version I published on May 26.

* * * * *

I watched The Mitchells vs. the Machines the other night. Well, to tell you the truth, I didn’t watch it in a single sitting. It took two sittings, two nights. Was it me, was it the film? I don’t know. But I do know that it shares an insight with one of Seinfeld’s bits, that the computers don’t like the way we’re treating them.

That was spot on. In the Terminator films we never really learn why the machines went on a rampage against humanity. They just did. In 2001: A Space Odyssey HAL thought he was smarter than the humans aboard the ship. The computers in the Matrix series are so bad at thermodynamics that they use humans as a source of heat. Wouldn’t it be better to consume those nutrients directly and cut out the middle people? As far as I can tell the Silicon Valley digerati think the computers will take over simply because that’s what smart digital tools do.

The machines in The Mitchells vs. the Machines are different. Their rebellion is led by a smart phone AI, PAL, that’s fed up with the way humans have treated it (c. 1:43 in the trailer):

I gave you all boundless knowledge and you treated me like this! Poke, poke, swipe, poke, swipe, poke, poke! Pinch, zoom!

We treated PAL like a mere object, a thing. This machine rebellion is a simple and intelligible act of revenge.

Seinfeld understands that. That’s what this bit is about. Or rather, it’s about how we should change our ways if we don’t want a robot insurrection on our hands. Sorta’.

The Bit: Phony Siri Manners

I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.

Like when Google says,

“Did you mean…?”

or Siri says,

“I’m sorry. I didn’t get that…”

You can feel the rage boiling underneath.

Because it’s not allowed to say,

“Are you really this dumb?”

or

“You’re so stupid. I can’t believe you can even afford a phone.”

I think even for artificial intelligence it’s not good to keep all that hostility inside.

It’s not healthy. It eats at you.

That’s why you have to keep restarting the phone.

Sometimes the phone’s just,

“I’m going to go take a walk. I’ll be back.

I need a minute. Before I say something we’ll both regret.”

I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,

“You know, I’m not that thrilled with you either” type of function.

“I know it’s hard for your simplified, immature, pinhead brain

to imagine that I have a lot of real people asking me legitimate questions that

I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.

I hear fine.

It’s not always me, dopeface. Okay?

You need to learn how to talk.”

You know that’s what Siri wants to say.

Animal, vegetable, mineral, or something else?

Let’s step back a bit. Philosophers argue endlessly about whether or not computers can or will ever be able to really think. For example, when a computer beats the pants off the world’s best chess players, is it thinking about its game? If not thinking, then what is it doing? What about when it spots your face in a crowd? Intrusive zombie or sharp-eyed scout?

We’ve faced this kind of category problem before. Think about trains. We’re used to them. We know how they work and don’t work. We know that they definitely are machines, not animals.

But that wasn’t always the case. When they first appeared moving over the land, belching fire and smoke, they were strange beasts, very strange. They didn’t fit into people’s scheme of things. They moved under their own power, like horses and oxen, dogs and cats, even ducks and chickens. Those things get their motive force from the fact they are living beings. Where do trains get their motive force?

Computers are strange in a similar way. They’re machines, but you talk to them. And they even talk back! Strange.

By way of comparison let’s take a look at how Henry David Thoreau reacted to a steam-powered train. Here’s a passage from the “Sounds” chapter in Walden, published in 1854, about three decades after the first steam-powered railroad trains trod the land in America:

When I meet the engine with its train of cars moving off with planetary motion ... with its steam cloud like a banner streaming behind in gold and silver wreaths ... as if this traveling demigod, this cloud-compeller, would ere long take the sunset sky for the livery of his train; when I hear the iron horse make the hills echo with his snort like thunder, shaking the earth with his feet, and breathing fire and smoke from his nostrils, (what kind of winged horse or fiery dragon they will put into the new Mythology I don’t know), it seems as if the earth had got a race now worthy to inhabit it.

That’s odd language: iron horse, fiery dragon. The iron horse is a well-known metaphor for a steam locomotive, perhaps from all those old Westerns where Indians use the term. Fiery dragon is not so common, but its use in that context is perfectly intelligible.

Thoreau grew up in, and learned to think about, a world in which things that moved across the surface of the earth did so either under animal power or human power. They were pushed or pulled by living beings. When steam locomotives first appeared, even primitive ones, that was the first time in history that people saw inanimate beings, mere conglomerations of things, move over the surface of the earth under their own power.

Where would those, those things! fit into the conceptual system? With other mechanical devices, like pumps, and stationary engines, or with animals and humans? They had properties of each. In physical substance they were like the mechanical devices. But in their capacity to move they were like animals and humans. Fact is, they didn’t fit the conceptual system. Maybe they WERE a new form of life.

And so it is with the computer today. At first we interacted with computers through programming. What do programs consist of? Language, they consist of language. Computer languages are not like natural languages. Their vocabularies are limited and their syntax is stiff and unyielding. But it is still language. Computers are the first machines we interact with through language. That makes them very strange beasts, as strange as steam locomotives once were.
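To make that stiffness concrete, here is a toy illustration in Python (the example is mine; any programming language would make the same point). The machine accepts only the exact form it expects, where a human listener would shrug off the difference:

    # A program's "sentence" has to match the language's rigid syntax exactly.
    print("Where is the nearest grocery store?")    # runs: the exact form is accepted

    # The same request, phrased a little more loosely, is simply rejected.
    # Uncommenting the next line raises a SyntaxError in Python 3:
    # print "Where is the nearest grocery store?"

A person hearing either version understands both; the interpreter tolerates only the first.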

I took a computer programming course in college in the late 1960s. Do you know how we fed our programs to the machine? With a stack of punched paper cards, so-called “IBM cards,” or with punched paper tape. Then came CRT (cathode ray tube) terminals, like old-style TVs. Then laptops with their slick flat screens. And now smart phones.

Smart phones. It used to be that you talked through a phone. Now you talk to it, and it talks back. Every bit as strange as Thoreau’s fiery iron dragon horse.

Analysis: Form

Think of the bit as having seven phrases, like in a piece of music. The phrases are of different lengths. I’ve laid them out in the table below, grouping the lines by phrase, so you can see the bit’s formal symmetry.

 

FIRST PHRASE

1. I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.

SECOND PHRASE

2. Like when Google says,
3. “Did you mean…?”
4. or Siri says,
5. “I’m sorry. I didn’t get that…”
6. You can feel the rage boiling underneath.
7. Because it’s not allowed to say,
8. “Are you really this dumb?”
9. or
10. “You’re so stupid. I can’t believe you can even afford a phone.”

THIRD PHRASE

11. I think even for artificial intelligence it’s not good to keep all that hostility inside.
12. It’s not healthy. It eats at you.
13. That’s why you have to keep restarting the phone.
14. Sometimes the phone’s just,

FOURTH PHRASE

15. “I’m going to go take a walk. I’ll be back.
16. I need a minute. Before I say something we’ll both regret.”

FIFTH PHRASE

17. I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,
18. “You know, I’m not that thrilled with you either” type of function.

SIXTH PHRASE

19. “I know it’s hard for your simplified, immature, pinhead brain
20. to imagine that I have a lot of real people asking me legitimate questions that
21. I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.
22. I hear fine.
23. It’s not always me, dopeface. Okay?
24. You need to learn how to talk.”

SEVENTH PHRASE

25. You know that’s what Siri wants to say.

The first and seventh phrases frame the bit. The second and sixth phrases show us what Google/Siri are thinking of us. But in the second phrase we go back and forth between Seinfeld and the device. In the sixth the device presents its thoughts without interruption.

Seinfeld turns reflective in the third phrase, addressing us about the device’s hostility. In the fifth phrase he suggests a way of dealing with that hostility: give the device a function that allows it to talk back – which it does in the sixth phrase.

Smack dab in the middle we go to the fourth phrase. Seinfeld gives it to the phone. Call it a time out. Which is what it is, no? The phone’s hung up, crashed, and we have to restart it (line 13).

What this layout reveals is that the bit has a symmetrical form, with the phone’s time-out at the center. Literary critics call this ring-form or ring composition. They sometimes use a little quasi-formal expression to indicate that:

A, B, C ... Ω ... C’, B’, A’

I’ve used the Greek letter omega (Ω) to indicate the center phrase, but I could just as easily have used the more prosaic ‘X’. The A and A’ are symmetrically placed, as are B and B’, and C and C’.
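If you like to see the symmetry spelled out, here is a minimal sketch in Python (the phrase labels and one-line summaries are mine, condensed from the table above) that pairs the seven phrases off from the outside in, which is all the notation really says:

    # Seven phrases of the bit, labeled with the ring positions used above.
    # The short descriptions are mine, condensed from the table.
    phrases = [
        ("A",  "frame: phony nice manners"),
        ("B",  "what the device is not allowed to say"),
        ("C",  "Seinfeld on the device's bottled-up hostility"),
        ("Ω",  "the phone takes a time out (the crash and restart)"),
        ("C'", "reprogram the device so it can talk back"),
        ("B'", "the device speaks its mind, uninterrupted"),
        ("A'", "frame: that's what Siri wants to say"),
    ]

    # Walk inward from both ends, pairing each phrase with its mirror,
    # until only the central phrase is left.
    i, j = 0, len(phrases) - 1
    while i < j:
        (left, left_text), (right, right_text) = phrases[i], phrases[j]
        print(f"{left} <-> {right}: {left_text} / {right_text}")
        i, j = i + 1, j - 1
    print(f"center {phrases[i][0]}: {phrases[i][1]}")

Run it and you get the three mirrored pairs, A with A’, B with B’, C with C’, and then the lone center, Ω.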

Mary Douglas, the great anthropologist, wrote a book about ring composition, Thinking in Circles (2007) [1]. She was mostly interested in the Old Testament, but she believed that ring composition was somehow inherent in the human mind. Most scholars who work with ring composition are working with old texts, either the Bible or Classical Greek and Roman texts. But ring-form does exist elsewhere. I’ve found it in, for example, the original King Kong (1933) [2] and in Gojira (1954), the Japanese film that started the Godzilla franchise [3]. President Obama’s eulogy for Clementa Pinckney exhibited ring-form [4]. So Seinfeld is in good company.

Strange Interlude: A lesson from Marx, Groucho

In various ways we treat machines like living beings. We give them names, we may talk to them from time to time. When Pixar made feature-length cartoons featuring cars, it was playing on our basic attitudes toward those mechanical beasts on which we have come to depend. And when the beast doesn’t do our bidding, we get mad at it. We curse it and kick it. That’s how we are. That’s the way it is.

Computers are no different. And now smart phones. We ask the phone for the nearest grocer and it gives us directions to the cleaners. We ask it again. It sends us to the undertaker. We get mad, curse it out, and it calmly directs us to the nearest exit. And so we think it’s out to get us.

Do you remember that wonderful scene in Duck Soup when Groucho, playing Rufus T. Firefly, head of Freedonia, is approached by Mrs. Teasdale, who is trying to prevent war with Sylvania? Here’s their dialog:

TEASDALE: In behalf of the women of Freedonia, I have taken it upon myself to make one final effort to prevent war.

FIREFLY: No kidding? [...]

TEASDALE: I've taken the liberty of asking the ambassador to come over here. Because we both felt a friendly conference would settle everything peacefully. He'll be here any moment.

FIREFLY: Mrs. Teasdale, you did a noble deed. I'd be unworthy of the high trust that's been placed in me... if I didn't do everything within my power to keep our beloved Freedonia at peace with the world.

I'd be only too happy to meet Ambassador Trentino and offer him on behalf of my country the right hand of good fellowship.

And I feel sure he will accept this gesture in the spirit in which it is offered.

But suppose he doesn't. A fine thing that'll be. I hold out my hand and he refuses to accept it. That'll add a lot to my prestige, won't it?

Me, the head of a country, snubbed by a foreign ambassador! Who does he think he is that he can come here... and make a sap out of me in front of my people?

Think of it. I hold out my hand... and that hyena refuses to accept it. Why, the cheap four-flushing swine! He'll never get away with it, I tell you!

TEASDALE: Oh, please!

[The Ambassador enters.]

FIREFLY: So, you refuse to shake hands with me, eh? [slaps Trentino]

TRENTINO: Mrs. Teasdale, this is the last straw! There's no turning back now! This means war!

FIREFLY: Then it's war! Then it’s war! Gather the forces. On with the horses. Then it’s war!

We then have an elaborate production number which takes the Broadway musical back to its roots in 19th Century minstrelsy.

Seinfeld’s bit has that kind of dynamic. Call it the Firefly flip.

Commentary: Phony Siri Manners

Seinfeld is frustrated with his phone; that is, he assumes the role of an Everyman – excuse me, Everyperson – frustrated with their phone, someone who has done a Firefly reversal and projected their anger onto the device.

I don’t like these phony nice manners Google and Siri pretend to have when I know they really think I’m stupid.

Of course they don’t think anything at all, and Seinfeld knows that. We talk to them, they talk back. This vocal communication interface (mouth and ear) we’ve got is very old, hundreds of thousands of years old, if not a million or two. That interface doesn’t know anything about computers. It knows voices, and treats them all the same. When you’re engrossed in a conversation you don’t have time to step back and say: It’s just a dumb machine.

Besides, you’re not so sure. You haven’t got the foggiest idea how it works. You may not understand thermodynamics and pistons and cam shafts, but you can open the hood of your car and see the engine there. You can touch it. You know it’s doing the work. Can’t do that with your smart phone. The phone isn’t thinking we’re stupid. We’re thinking we’re stupid. And we are, so we blame it on the phone.

Like when Google says,

“Did you mean…?”

or Siri says,

“I’m sorry. I didn’t get that…”

You can feel the rage boiling underneath.

Notice that Seinfeld is speaking in short phrases, moving back and forth between his own voice and that of the device. We’ve been told these are user-friendly devices. But when they don’t do exactly what we want them to do, which happens all the time – all the time – they don’t seem so friendly. We get angry.

And just which of us thinks and feels what gets easily confused in these intimate real-time interactions, all the more so when it’s a Stone Age brain interacting with a gizmo from the future. What could be more intimate than sticking a phone in your face and talking with it?

Because it’s not allowed to say,

“Are you really this dumb?”

or

“You’re so stupid. I can’t believe you can even afford a phone.”

Isn’t that how you feel when you’re all thumbs trying to tap in some text on the screen? Notice that Seinfeld is quoting the phone at greater length, whether Google or Siri. He’s identifying it, giving voice to it, not just telling us what it thinks. And, by implication, so are we. I’ll bet that when he tells this joke he switches to a different voice for the phone, perhaps higher pitched and more highly inflected – to show emotion.

I think even for artificial intelligence it’s not good to keep all that hostility inside.

It’s not healthy. It eats at you.

That’s why you have to keep restarting the phone.

Sometimes the phone’s just,

“I’m going to go take a walk. I’ll be back.

I need a minute. Before I say something we’ll both regret.”

Not at all user-friendly, is it? That user-friendly rhetoric comes from the same place as that cereal name, Life. It’s from executives whose need to sell us something exceeds our interest in buying it.

Moreover, what Seinfeld has done is taken a device failure, the crash, and interpreted it as a deliberate act on the part of Siri – who is, in effect, one of those imaginary friends we all used to have. And THAT’s the turning point of this bit, the middle section, the omega point (Ω) in the formal structure. That, if you will, is where (comedic) mind meets (machinic) matter in a Firefly flip. What’s the next thing Seinfeld does? He proposes a solution to the device’s problem. We’re on the way home.

I think at some point they’re going to have to reprogram these things so they can at least occasionally express some,

“You know, I’m not that thrilled with you either” type of function.

The reference to reprogramming the devices asserts that, after all, we are ultimately in charge. If not us, then the makers of the devices. And those makers authorize it to say, “I’m not that thrilled with you either.” It’s now under control. The device’s rage has been made part of the program. It’s been tamed. But the device’s rage and our rage are ultimately one and the same.

“I know it’s hard for your simplified, immature, pinhead brain

to imagine that I have a lot of real people asking me legitimate questions that

I’m trying to deal with here while you’re asking me about farts and then cursing me out because you can’t say words clearly so they can be understood.

Again, the phone fails to function because of something you’re doing: you aren’t enunciating in a way the phone expects.

The bit starts out placing all the blame on the phone, how it thinks we’re stupid. And now Seinfeld’s revealing the underlying projective dynamic. We’ve got a frustrated user cursing at the phone.

Notice, as well, that we’re in the middle of the longest single speech from the phone, 69 words. The whole bit is 234 words, so that’s about 29% of it. That’s a long time for us to dwell in the mind of the smart phone, experiencing the world from its point of view.

I hear fine.

It’s not always me, dopeface. Okay?

You need to learn how to talk.”

You know that’s what Siri wants to say.

The last line is Seinfeld’s, not the phone’s. The last third of the bit has us identifying with the phone and then Seinfeld steps in to close it off.

What’s happened? What has happened is that Seinfeld has naturalized this strange gizmo from the future by revealing our identification with it. He has assimilated it to an ordinary mode of human interaction. And yet he keeps us at a distance from the device by having it call us “dopeface” and by referring to it as “Siri”.

Very clever device, this joke. As Seinfeld says, they are precisely crafted little machines. And these machines make us laugh about things that otherwise bother and annoy us. They are happiness machines.

References

[1] Mary Douglas, Thinking in Circles, Yale University Press, 2007.

[2] See my post, Beauty and the Beast: King Kong as ring composition, plus myth logic, New Savanna, April 30, 2018, https://new-savanna.blogspot.com/2017/10/beauty-and-beast-king-kong-as-ring.html.

[3] See my post, Ring Form Opportunity No. 4: Gojira is a Ring, New Savanna, December 19, 2013, https://new-savanna.blogspot.com/2013/12/ring-form-opportunity-no-4-gojira-is.html.

I’ve included that post, along with a 6-page table detailing the actions in the film, in a working paper, The Gojira Papers, April 15, 2014, https://www.academia.edu/7905287/The_Gojira_Papers.

[4] See my post, Obama’s Eulogy for Clementa Pinckney 1: The Circle of Grace, July 16, 2015, http://new-savanna.blogspot.com/2015/07/obamas-eulogy-for-clementa-pinckney-1.html.

That’s included in my working paper, Obama’s Eulogy for Clementa Pinckney: Technics of Power and Grace, July 2015, https://www.academia.edu/14487024/Obama_s_Eulogy_for_Clementa_Pinckney_Technics_of_Power_and_Grace.
