Tuesday, December 18, 2018

Stagnation 1.2: Energy efficiency and the cost of deepening our understanding of the world

Yesterday (12.17.18) Alex Tabarrok had a post about the energy efficiency of the ever cheaper logic engines in our many digital devices. That, of course, is territory covered by one of the three case studies in Bloom, Jones, Van Reenen, and Webb [1]. In the next section of this post I explicate that connection and then offer some rough and informal remarks about the energy costs of understanding the world.

Silicon productivity and energy efficiency

Over the past 60 years, the energy efficiency of ever-less-expensive logic engines has improved by over a billionfold. No other machine of any kind, in all of history, has come remotely close to matching that.

Consider the implications even from 1980, the Apple II era. A single iPhone at 1980 energy efficiency would require as much power as a Manhattan office building. Similarly, a single datacenter at circa-1980 efficiency would require as much power as the entire U.S. grid. But because of efficiency gains, the world today has billions of smartphones and thousands of datacenters.
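A quick order-of-magnitude check on that iPhone claim, treating every input as a loose assumption rather than a measurement: take a smartphone’s average draw to be about a watt, and take the gain in computations-per-joule since 1980 to be somewhere between 10^4 and 10^6 (Koomey-style doubling every ~1.6 years would actually give closer to 10^7). At the top of that range, the 1980-equivalent power is about a megawatt – office-building territory.

```python
# Order-of-magnitude check of the iPhone-at-1980-efficiency claim.
# Both inputs are rough assumptions, not measurements.

iphone_avg_watts = 1.0   # assumed average draw of a smartphone, ~1 W

# Assumed gain in computations-per-joule since 1980; Koomey-style
# doubling every ~1.6 years over ~38 years would give ~10^7, so this
# range is conservative.
for gain in (1e4, 1e5, 1e6):
    kw_1980 = iphone_avg_watts * gain / 1e3
    print(f"efficiency gain {gain:.0e}: {kw_1980:8,.0f} kW at 1980 efficiency")
```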
Here is Mills’ statement of Jevons’ Paradox:
Put differently: the purpose of improved efficiency in the real world, as opposed to the policy world, is to capture the benefits from an engine. So long as people and businesses want more of those benefits, the declining cost of their use increases demand, which in turn outstrips efficiency gains. Jevons understood (and logic dictates) that efficiency gains must come at the same capital cost; but magic really happens when hardware costs decline.
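One way to make Mills’ logic concrete: let efficiency ε be the service delivered per joule, so the effective price of the service falls as 1/ε, and suppose demand for the service has constant price elasticity η. Total energy use then scales as ε^(η−1), so whenever η > 1 – plausibly the case for computing – efficiency gains increase total energy use. A toy calculation, with the elasticity value purely illustrative:

```python
# Jevons-style rebound: energy use scales as efficiency**(eta - 1)
# when demand for the service has constant price elasticity eta.
# eta = 1.5 is illustrative, not an estimate.

def energy_use(efficiency, eta, demand_scale=1.0):
    """Energy needed to meet demand at a given efficiency."""
    price = 1.0 / efficiency                    # cost per unit of service
    service_demanded = demand_scale * price ** (-eta)
    return service_demanded / efficiency        # joules = service / (service/joule)

for eff in (1, 10, 100, 1000):
    print(f"efficiency x{eff:>4}: energy use {energy_use(eff, eta=1.5):7.1f}")
# with eta > 1, each tenfold efficiency gain raises energy use ~3.2x
```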
Bloom, Jones, Van Reenen, and Webb looked at research productivity in the semiconductor industry and found that, while the increase in circuit density captured in Moore’s Law has continued through the late 20th century and into the early decades of this one, the R&D effort required to sustain it has risen steadily (Figure 4, p. 17). Later in the paper they offer an observation that incorporates Jevons’ Paradox (p. 44):
Research productivity for semiconductors falls so rapidly, not because that sector has the sharpest diminishing returns — the opposite is true. It is instead because research in that sector is growing more rapidly than in any other part of the economy, pushing research productivity down. A plausible explanation for the rapid research growth in this sector is the “general purpose” nature of information technology. Demand for better computer chips is growing so fast that it is worth suffering the declines in research productivity there in order to achieve the gains associated with Moore’s Law.
As information processing technology gets cheaper, its use spreads further and deeper into social and cultural processes. The productivity of the semiconductor industry may be dropping in economic terms, but that productivity loss enables an enormous increase in the energy efficiency of information processing. The increasing costs of semiconductor R&D are easily covered.
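Their semiconductor accounting is simple enough to restate. Idea output is the steady growth rate of chip density – a doubling every two years, roughly 35% per year – and research input is industry R&D spending deflated by the high-skill wage, giving an effective number of researchers. Research productivity is the ratio, so a constant output growth rate divided by their estimated 18-fold rise in effective researchers between 1971 and 2014 is an 18-fold fall in productivity:

```python
# Bloom et al.'s semiconductor accounting, restated with their
# headline numbers (the 18x rise in effective researchers is their
# estimate for 1971-2014).

moore_growth = 0.35          # ideas output: ~35% density growth per year
effort_factor = 18           # growth in effective researchers, 1971-2014
years = 2014 - 1971

productivity_1971 = moore_growth / 1.0
productivity_2014 = moore_growth / effort_factor

annual_decline = 1 - (productivity_2014 / productivity_1971) ** (1 / years)
print(f"productivity falls {effort_factor}x overall, ~{annual_decline:.1%}/yr")
# ~6.5%/yr with these round numbers, close to the ~7%/yr in the paper
```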

On the whole I’m inclined to reconceptualize that productivity loss as the increasing cost of learning more about the world: what’s out there and how do we build things?

Energy, information, and evolution

Some years ago David Hays and I published an article entitled, “A Note on Why Natural Selection Leads to Complexity” [2]. Here’s the abstract:
While science has accepted biological evolution through natural selection, there is no generally agreed explanation for why evolution leads to ever more complex organisms. Evolution yields organismic complexity because the universe is, in its very fabric, inherently complex, as suggested by Ilya Prigogine's work on dissipative structures. Because the universe is complex, increments in organismic complexity yield survival benefits: (1) more efficient extraction of energy and matter, (2) more flexible response to vicissitudes, (3) more effective search. J.J. Gibson's ecological psychology provides a clue to the advantages of sophisticated information processing while the lore of computational theory suggests that a complex computer is needed efficiently to perform complex computations (i.e. sophisticated information processing).
What about information processing in the evolution of human culture?

In the second quarter of the previous century the anthropologist Leslie White was interested in socio-cultural complexity and put energy consumption at the center of his thinking [3]. More recently David Hays examined a large literature and devoted a chapter to the subject in his history of technology [4]. Hays was interested in the energy usage of preliterate, literate but pre-industrial, industrial, and emerging informatic forms of social and economic organization. To that end he provided estimates in each category of per capita energy flux, energy densities per square mile of inhabited land, “energy taken from the environment, energy delivered to useful purposes, human labor required, [and] material welfare produced”. And he broke down energy usage by category of use: food, domestic, agriculture, transportation, and industry.

But he didn’t segregate informatics as a specific category of energy usage. Nor do I intend to do so here – if anyone knows of someone who has done so, please let me know. But I will offer some observations.

We might begin by asking at what point in socio-cultural evolution we find individuals who are full-time specialists in ‘information processing’ – which is not a particularly good term. Of course we should specify just what that means, but, as I’m only after a quick sketch, I’ll forgo that and suggest that we’re looking for a religious specialist. It is my impression – I’m recalling a literature I haven’t read in a number of years – that such specialists appear well before the emergence of literacy, but not in the simplest societies.

With literacy came various information specialists: priests, scribes, philosophers of various kinds, lawyers, engineers, architects and, for that matter, various kinds of artists. We also see the emergence of schools. And some of these activities would require energy beyond that used by the practitioners’ brains, e.g. the production of paper and writing implements.

This brings us back to Mark Mills, Energy and the Information Infrastructure Part 1: Bitcoins & Behemoth Datacenters (RealClear Energy 11.19.2018). The Fouquet mentioned below is Roger Fouquet, whose work charts the history of energy services (heat, transport, light) across several centuries:
Society has not seen a new “energy service” vector arrive for two centuries, until now. Fouquet’s analysis doesn’t include energy in service of information. Fouquet could, in theory, have calculated energy for information services across those same five centuries. The energy cost to make paper in a single book, while far less now than centuries ago, is still equivalent to the fuel consumed driving a Prius 10 miles. There were also energy costs associated with building the libraries that housed the books, etc. But to be fair to Fouquet, those numbers were so tiny compared to energy for heating that they’d disappear from visibility.

History’s ignition point with regard to the energy cost of information becoming visible can be traced to 1946. The world’s first, and then only, datacenter was ENIAC’s room full of 20,000 burning hot vacuum tubes, which demanded 150 kW. But the proliferation of the new data infrastructures didn’t begin to explode until the Internet’s expansion started at the end of the 20th century — i.e., it began when Fouquet’s history ends. Now the power level of a single ENIAC is found in every dozen square feet inside the billions of square feet of datacenters.

There is no dispute that a new “energy service” has arrived. The core question is whether in fact the trajectory will look like all others in history. [...] The odds are the “information service” trajectory will look the same as for other services Fouquet mapped. By 2050, society will likely use more energy for data than was used for illumination in 1950.
And after that?
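Before leaving Mills: his ENIAC figure lets us close the loop on the billionfold-efficiency claim at the top of this post. The 150 kW is in the passage above; the machine’s commonly cited throughput was about 5,000 additions per second (that spec is not from Mills). If a modern phone does on the order of 10^10 simple operations per second on about a watt – a deliberately loose assumption – energy per operation has fallen by a factor of a few hundred billion:

```python
# Energy per operation: ENIAC vs. a rough model of a modern phone.
# ENIAC's 150 kW is from Mills; ~5,000 additions/sec is the usual
# spec. The phone numbers are order-of-magnitude assumptions.

eniac_j_per_op = 150_000 / 5_000     # 30 J per addition
phone_j_per_op = 1.0 / 1e10          # ~1 W doing ~1e10 ops/sec

print(f"ENIAC: {eniac_j_per_op:.0f} J/op, phone: {phone_j_per_op:.0e} J/op")
print(f"efficiency gain: {eniac_j_per_op / phone_j_per_op:.0e}x")  # ~3e11
```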

* * * * *

What happens to those productivity calculations in that world? The question, no doubt, is ill-posed. But I can live with that. As you may suspect, I think this business of stagnation is somehow ill-posed, as though it harbors a secret wish for the proverbial free lunch, where the lunch is increased knowledge of the world. It’s not free, nor is there any reason why it should be.

In any event, we seem headed for a world in which more and more energy, human and otherwise, goes to ‘information processing’, with more and more of that being embedded in artificial devices. While I’m skeptical of fantasies about super-intelligent computers – in part because I don’t see much evidence of a conception of intelligence that’s useful for engineering purposes – I’m quite sure that, if we don’t drastically degrade our world or even destroy ourselves, there are surprises over that horizon.

There is a major transformation coming. It involves computers and computation. But also our understanding of them, and, more than likely, of ourselves as well.

More later.

References

[1] Nicholas Bloom, Charles I. Jones, John Van Reenen, and Michael Webb, Are Ideas Getting Harder to Find? March 5, 2018, https://web.stanford.edu/~chadj/IdeaPF.pdf.

[2] William Benzon and David G. Hays, A Note on Why Natural Selection Leads to Complexity, Journal of Social and Biological Structures 13: 33-40, 1990, Academia: https://www.academia.edu/8488872/A_Note_on_Why_Natural_Selection_Leads_to_Complexity; SSRN: https://ssrn.com/abstract=1591788.

[3] Leslie White, Energy and the Evolution of Culture, American Anthropologist 45 (1943): 335-356, https://deepblue.lib.umich.edu/bitstream/handle/2027.42/99636/aa.1943.45.3.02a00010.pdf?sequence=1.

[4] David Hays, “Energy”, The Evolution of Technology through Four Cognitive Ranks, 1995, Metagram Press. Online (the book’s only form), http://asweknowit.ca/evcult/Tech/CHAPTER3.shtml.
