Wednesday, October 14, 2020

Facebook or freedom, Part 4: Notes on Doctorow, How to Destroy Surveillance Capitalism

Cory Doctorow recently published How to Destroy Surveillance Capitalism (Medium, August 26, 2020). He spends much of the first half of the book arguing that high-tech claims about the ability to leverage user data for targeted advertising are exaggerated. The technologists have drunk their own Kool-Aid and keep grasping for more data because, well, they don’t know what else to do. I think he’s right. He also points out, however, that if these same capacities were put to political use to round up and punish opponents, dissidents, and miscreants, they would wreak havoc.

I would also note that the whole world of personal computing and the web seems oriented toward a pre-existing commercial culture of passive consumerism, and THAT is the source of many of the difficulties.

Doctorow ends up arguing that the big companies need to be broken up.

Here are some specific passages from the last half of the book.

Dignity and Sanctuary

pp. 64-65

To grow, you need to be and expose your authentic self, and in that moment, you are vulnerable like a hermit crab scuttling from one shell to the next. The tender, unprotected tissues you expose in that moment are too delicate to reveal in the presence of another, even someone you trust as implicitly as a child trusts their parent.

In the digital age, our authentic selves are inextricably tied to our digital lives. Your search history is a running ledger of the questions you’ve pondered. Your location history is a record of the places you’ve sought out and the experiences you’ve had there. Your social graph reveals the different facets of your identity, the people you’ve connected with.

To be observed in these activities is to lose the sanctuary of your authentic self.

There’s another way in which surveillance capitalism robs us of our capacity to be our authentic selves: by making us anxious. Surveillance capitalism isn’t really a mind-control ray, but you don’t need a mind-control ray to make someone anxious.

Break them up

p. 84

Ultimately, we can try to fix Big Tech by making it responsible for bad acts by its users, or we can try to fix the internet by cutting Big Tech down to size. But we can’t do both. To replace today’s giant products with pluralistic protocols, we need to clear the legal thicket that prevents adversarial interoperability so that tomorrow’s nimble, personal, small-scale products can federate themselves with giants like Facebook, allowing the users who’ve left to continue to communicate with users who haven’t left yet, reaching tendrils over Facebook’s garden wall that Facebook’s trapped users can use to scale the walls and escape to the global, open web.

The problems of tight industries

pp. 85-86

Highly concentrated industries also present a regulatory conundrum. When an industry is dominated by just four or five companies, the only people who are likely to truly understand the industry’s practices are its veteran executives. This means that top regulators are often former execs of the companies they are supposed to be regulating. These turns in government are often tacitly understood to be leaves of absence from industry, with former employers welcoming their erstwhile watchdogs back into their executive ranks once their terms have expired.

All this is to say that the tight social bonds, small number of firms, and regulatory capture of concentrated industries give the companies that comprise them the power to dictate many, if not all, of the regulations that bind them.

This is increasingly obvious. Whether it’s payday lenders winning the right to practice predatory lending or Apple winning the right to decide who can fix your phone or Google and Facebook winning the right to breach your private data without suffering meaningful consequences or victories for pipeline companies or impunity for opioid manufacturers or massive tax subsidies for incredibly profitable dominant businesses, it’s increasingly apparent that many of our official, evidence-based truth-seeking processes are, in fact, auctions for sale to the highest bidder.

It’s really impossible to overstate what a terrifying prospect this is. We live in an incredibly high-tech society, and none of us could acquire the expertise to evaluate every technological proposition that stands between us and our untimely, horrible deaths.

Fake news

p. 88

Fake news — conspiracy theories, racist ideologies, scientific denialism — has always been with us. What’s changed today is not the mix of ideas in the public discourse but the popularity of the worst ideas in that mix. Conspiracy and denial have skyrocketed in lockstep with the growth of Big Inequality, which has also tracked the rise of Big Tech and Big Pharma and Big Wrestling and Big Car and Big Movie Theater and Big Everything Else.

No one can say for certain why this has happened, but the two dominant camps are idealism (the belief that the people who argue for these conspiracies have gotten better at explaining them, maybe with the help of machine-learning tools) or materialism (the ideas have become more attractive because of material conditions in the world).

p. 89

We have always had disagreements about what’s true, but today, we have a disagreement over how we know whether something is true. This is an epistemological crisis, not a crisis over belief. It’s a crisis over the credibility of our truth-seeking exercises, from scientific journals (in an era where the biggest journal publishers have been caught producing pay-to-play journals for junk science) to regulations (in an era where regulators are routinely cycling in and out of business) to education (in an era where universities are dependent on corporate donations to keep their lights on).

The value of online tools: coordination

pp. 90-92

But there’s one way in which I am a tech exceptionalist. I believe that online tools are the key to overcoming problems that are much more urgent than tech monopolization: climate change, inequality, misogyny, and discrimination on the basis of race, gender identity, and other factors. The internet is how we will recruit people to fight those fights, and how we will coordinate their labor. Tech is not a substitute for democratic accountability, the rule of law, fairness, or stability — but it’s a means to achieve these things.

The hard problem of our species is coordination. Everything from climate change to social change to running a business to making a family work can be viewed as a collective action problem.

The internet makes it easier than at any time before to find people who want to work on a project with you — hence the success of free and open-source software, crowdfunding, and racist terror groups — and easier than ever to coordinate the work you do. […]

The upshot of this is that our best hope of solving the big coordination problems — climate change, inequality, etc. — is with free, fair, and open tech. Our best hope of keeping tech free, fair, and open is to exercise caution in how we regulate tech and to attend closely to the ways in which interventions to solve one problem might create problems in other domains.

What is information?

pp. 94-95

The fact that information isn’t a good fit with property and markets doesn’t mean that it’s not valuable. Babies aren’t property, but they’re inarguably valuable. In fact, we have a whole set of rules just for babies as well as a subset of those rules that apply to humans more generally. Someone who argues that babies won’t be truly valuable until they can be bought and sold like loaves of bread would be instantly and rightfully condemned as a monster.

It’s tempting to reach for the property hammer when Big Tech treats your information like a nail — not least because Big Tech are such prolific abusers of property hammers when it comes to their information. But this is a mistake. If we allow markets to dictate the use of our information, then we’ll find that we’re sellers in a buyers’ market where the Big Tech monopolies set a price for our data that is so low as to be insignificant or, more likely, set at a nonnegotiable price of zero in a click-through agreement that you don’t have the opportunity to modify.

Meanwhile, establishing property rights over information will create insurmountable barriers to independent data processing. Imagine that we require a license to be negotiated when a translated document is compared with its original, something Google has done and continues to do billions of times to train its automated language translation tools. Google can afford this, but independent third parties cannot. Google can staff a clearances department to negotiate one-time payments to the likes of the EU (one of the major repositories of translated documents) while independent watchdogs wanting to verify that the translations are well-prepared, or to root out bias in translations, will find themselves needing a staffed-up legal department and millions for licenses before they can even get started.

Monopolism and inequality

p. 97

If racists haven’t gotten more convincing in the past decade, then how is it that more people were convinced to be openly racist at that time? I believe that the answer lies in the material world, not the world of ideas. The ideas haven’t gotten more convincing, but people have become more afraid. Afraid that the state can’t be trusted to act as an honest broker in life-or-death decisions, from those regarding the management of the economy to the regulation of painkillers to the rules for handling private information. Afraid that the world has become a game of musical chairs in which the chairs are being taken away at a never-before-seen rate. Afraid that justice for others will come at their expense. Monopolism isn’t the cause of these fears, but the inequality and material desperation and policy malpractice that monopolism contributes to is a significant contributor to these conditions. Inequality creates the conditions for both conspiracies and violent racist ideologies, and then surveillance capitalism lets opportunists target the fearful and the conspiracy-minded.

Will paying for access work?

p. 101

Behind the idea of paying for access is a belief that free markets will address Big Tech’s dysfunction. After all, to the extent that people have a view of surveillance at all, it is generally an unfavorable one, and the longer and more thoroughly one is surveilled, the less one tends to like it. Same goes for lock-in: If HP’s ink or Apple’s App Store were really obviously fantastic, they wouldn’t need technical measures to prevent users from choosing a rival’s product. The only reason these technical countermeasures exist is that the companies don’t believe their customers would voluntarily submit to their terms, and they want to deprive them of the choice to take their business elsewhere.

What’s wrong with forcing Big Tech to police their users?

pp. 104-106

Yet governments confronting all of these problems all inevitably converge on the same solution: deputize the Big Tech giants to police their users and render them liable for their users’ bad actions. The drive to force Big Tech to use automated filters to block everything from copyright infringement to sex-trafficking to violent extremism means that tech companies will have to allocate hundreds of millions to run these compliance systems.

These rules — the EU’s new Directive on Copyright, Australia’s new terror regulation, America’s FOSTA/SESTA sex-trafficking law and more — are not just death warrants for small, upstart competitors that might challenge Big Tech’s dominance but who lack the deep pockets of established incumbents to pay for all these automated systems. Worse still, these rules put a floor under how small we can hope to make Big Tech.

That’s because any move to break up Big Tech and cut it down to size will have to cope with the hard limit of not making these companies so small that they can no longer afford to perform these duties — and it’s expensive to invest in those automated filters and outsource content moderation. […]

Allowing the platforms to grow to their present size has given them a dominance that is nearly insurmountable — deputizing them with public duties to redress the pathologies created by their size makes it virtually impossible to reduce that size. Lather, rinse, repeat: If the platforms don’t get smaller, they will get larger, and as they get larger, they will create more problems, which will give rise to more public duties for the companies, which will make them bigger still.

A choice…

p. 106

We can work to fix the internet by breaking up Big Tech and depriving them of monopoly profits, or we can work to fix Big Tech by making them spend their monopoly profits on governance. But we can’t do both. We have to choose between a vibrant, open internet or a dominated, monopolized internet commanded by Big Tech giants that we struggle with constantly to get them to behave themselves.

MORE GOOD STUFF AT THE END ABOUT BREAKING UP MONOPOLIES
