
Sunday, April 3, 2022

Analog vs. Digital: How general is the contrast?

I've been thinking about computation and the brain recently and decided to bump this post from 2016 to the top because it discusses the distinction between digital and analog. When I started reading about computers in the 1960s, every introductory discussion would mention that distinction. But by the time personal computers had become widespread, those introductory discussions no longer mentioned analog computing. See the graph below.
* * * * *
 
The distinction between analog and digital computers and, more generally, between analog and digital phenomena, has been an important one in contemporary thinking. It is central to David Golumbia’s The Cultural Logic of Computation, where Golumbia says (p. 21):
...a rough approximation of my thesis might be that most of the phenomena in each sphere [covered in this book], even if in part characterizable in computational terms, are nevertheless analog in nature. They are gradable and fuzzy; they are rarely if ever exact, even if they can achieve exactness. The brain is analog; language is analog; society and politics are analog. Such reasoning applies not merely to what we call the “human species” but to much of what we take to be life itself...
Mark Liberman took issue with that statement in a post at Language Log, Is language “analog”?, in which he argued that “crucial aspects of human speech and language are NOT ‘analog’ — are not continuously variable physical (or for that matter spiritual) quantities.” While I agree with Liberman on that point, it is not my point here; if the issue interests you, by all means read Liberman’s post. Rather, I want to recount something that came up in the discussion, which in some measure depended on just what these two terms mean.

I went on to ask when analog and digital began to be used in opposition to one another. It is easy enough to think of the slide rule as an analog device and the abacus as a digital one, but is that how they were thought of when they originated?
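A quick gloss of my own on why those classifications feel natural: a slide rule multiplies by physically adding lengths proportional to logarithms,

    log(xy) = log(x) + log(y),

so the quantity it manipulates is a continuously variable length. An abacus, by contrast, represents each digit as a discrete count of beads.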

The Wikipedia entry on analog computer lists a number of mechanical, electrical, and electronic devices from the late 19th century into the 20th, but did the people who conceived and used them explicitly conceptualize them as specifically analog in kind? I've run an ngram query on “analog,digital”:
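For the record, here's a minimal sketch of how one might rerun that query programmatically. It uses the unofficial JSON endpoint behind the Google Ngram Viewer; the endpoint, its parameters, and the response shape are undocumented assumptions on my part and may change.

    # Sketch: fetch relative frequencies for "analog" and "digital" from the
    # unofficial Ngram Viewer endpoint. Endpoint and parameters are
    # undocumented and may change without notice.
    import requests

    resp = requests.get(
        "https://books.google.com/ngrams/json",
        params={
            "content": "analog,digital",  # comma-separated ngrams
            "year_start": 1900,
            "year_end": 2008,
            "corpus": 26,                 # assumed code for the English corpus; may vary
            "smoothing": 3,
        },
    )
    resp.raise_for_status()

    for series in resp.json():
        # Each entry carries an 'ngram' label and a 'timeseries' of
        # relative frequencies, one value per year.
        peak = max(series["timeseries"])
        print(f"{series['ngram']}: peak relative frequency {peak:.2e}")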
[Google Books Ngram chart: relative frequency of “analog” and “digital” over time]
The lines for both terms hug the X axis at the bottom of the chart until about 1950 and then both turn upward, with “digital” quickly outstripping “analog.” The McCulloch-Pitts neuron dates to the early 1940s and was digital in character. Von Neumann discusses analog and digital in his 1958 The Computer and the Brain; indeed, that contrast is one of the central themes of the book, if not THE central theme. How much contrastive discussion was there before then?

There’s certainly been a lot of such discussion since then. FWIW, when I started reading elementary accounts of computing and computers in the 1960s, analog vs. digital was a standard topic. At some point during the personal-computing era I began noticing that popular articles no longer mentioned analog computing.
 
What I'm getting at is that it may be a mistake to treat the terms as having a well-settled meaning that we can take as given. Their meanings may in fact be settled for a substantial range of cases, but that need not imply that our sense of them is fully settled everywhere. Are we still working them out?

Liberman responded:
The original sense-extension of analog, as in "analogy", was in the context of one signal (for instance sound as time functions of air pressure) being represented by another (in that case sound represented analogously by voltage in a wire). And the original sense-extension of digital was in the context of a continuous time-function being represented by a sequence of numbers (= "digits"). I would have thought that in both cases, the origins were in engineering discussions of telephone technology, but Nyquist's 1928 paper [PDF] doesn't use either word in this way, nor does Claude Shannon in 1948 [PDF]. The OED's earliest citations to this sense of analog are in discussion of "analog" vs. "impulse-type" computers, e.g.
1941 J. W. Mauchly Diary 15 Aug. in Ann. Hist. Computing (1984) 6 131/2 Computing machines may be conveniently classified as either ‘analog’ or ‘impulse’ types. The analog devices use some sort of analogue or analogy, such as Ohm's Law.., to effect a solution of a given equation. [Note] I am indebted to Dr. J. V. Atanasoff of Iowa State College for the classification and terminology here explained.
Note that the “analogy” here is not between a continuous signal and a series of numbers, but between an equation to be solved in one (discrete or continuous) domain and the physics of some machine's internal operations.
Thus it seems that the contrast between analog and digital is a relatively recent one and was originally drawn in a relatively narrow technical domain. So when we're trying to figure out whether, or in what way, language is analog or digital, we're extending the contrast from a situation where it was relatively well defined to a very different one. In the case of language we don't really know what’s going on, and we’re using the analog/digital contrast as a tool for helping us figure it out. The same is certainly true for nervous systems.
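To make Liberman's "digital" sense-extension concrete, here's a toy sketch of my own (not anything from the discussion): a continuous time-function, sound as a function of time, rendered as a sequence of numbers. The sample rate and bit depth are illustrative choices, not values from the post.

    # Toy digitization: a continuous time-function (a 440 Hz tone standing in
    # for air pressure) represented by a sequence of numbers.
    import math

    SAMPLE_RATE = 8000   # samples per second (illustrative choice)
    FREQ = 440.0         # Hz; the continuous "analog" signal

    def pressure(t):
        """The continuous time-function: signal amplitude at time t."""
        return math.sin(2 * math.pi * FREQ * t)

    # Evaluate the function at discrete instants and quantize each value to a
    # 16-bit integer, yielding a sequence of "digits".
    samples = [round(pressure(n / SAMPLE_RATE) * 32767) for n in range(16)]
    print(samples)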

In the case of Golumbia’s example of celluloid film we have a technology that predates the analog/digital contrast. In the context of that distinction I find it reasonable to think of the discrete presentation of frames as digital in character, but I don't offhand see that the digital concept gives further insight into how the film technology functions. As for digital video or high-resolution digital ‘film’ (whether printed to celluloid or digitally projected), the effect on the human nervous system is pretty much the same as that of celluloid film. The frame rate may differ, but in all cases it exceeds the flicker-fusion threshold of the visual system, so what we see is continuous motion.
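As a back-of-the-envelope check on that claim, here's a small sketch. The shutter conventions (projectors flash each film frame two or three times) and the threshold figure (very roughly 50-60 Hz, lower in a dim theater) are rough textbook numbers I'm supplying, not figures from the post.

    # Rough comparison of presentation flash rates against an approximate
    # flicker-fusion threshold. All numbers are textbook approximations.
    FUSION_HZ = 50  # rough threshold under dim cinema viewing

    flash_rates = {
        "celluloid film, 24 fps, 2-blade shutter": 24 * 2,  # 48 flashes/sec
        "celluloid film, 24 fps, 3-blade shutter": 24 * 3,  # 72 flashes/sec
        "digital projection, 60 fps": 60,
    }
    for name, hz in flash_rates.items():
        status = "at or above" if hz >= FUSION_HZ else "just below"
        print(f"{name}: {hz} Hz, {status} the ~{FUSION_HZ} Hz threshold")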

Liberman responded:
FWIW, the distinction in mathematics between “discrete” and “continuous” (in various senses of both) goes back quite a ways, as does the idea of mathematical concepts as symbolically-encoded propositions. But in the end I don't think it's helpful to try to decide on a single binary global classification of issues like whether a function is differentiable, or what it means to describe a band-limited time function as a Fourier series, or whether digitally-encoded music is the same as or different from an analog tape recording, or whether words are discretely encoded in the brain as sounds or as meanings or in whatever other ways. Though all such questions are conceptually inter-related in various ways, each has its own properties, and trying to find one simple metaphor to rule them all is a recipe for confusion.
And that’s where I think we are. The use of a global contrast between analog and digital may have some value in relatively informal discussions, but it’s problematic where precision is required. In those cases we should seek terms crafted to the properties of the domain under discussion.
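One example of a term crafted to its domain is the sampling theorem that Liberman's Nyquist and Shannon references point toward: a signal band-limited to B Hz can be recovered from discrete samples taken 2B times per second, so in that narrow setting the boundary between continuous function and number sequence is perfectly precise. Here's a small illustrative sketch; the signal and the rates are my own choices, and the finite sum makes the reconstruction approximate rather than exact.

    # Whittaker-Shannon interpolation: recover a band-limited signal at an
    # arbitrary instant from its discrete samples. The finite sum truncates
    # the ideal infinite series, so the result is approximate.
    import math

    B = 100.0        # signal band-limited below 100 Hz
    FS = 2 * B       # Nyquist rate: 200 samples per second
    T = 1.0 / FS

    def x(t):
        """A band-limited test signal: two tones below B."""
        return math.sin(2 * math.pi * 30 * t) + 0.5 * math.sin(2 * math.pi * 70 * t)

    def sinc(u):
        return 1.0 if u == 0 else math.sin(math.pi * u) / (math.pi * u)

    def reconstruct(t, samples):
        """x(t) ~ sum over n of x(nT) * sinc(t/T - n)."""
        return sum(s * sinc(t / T - n) for n, s in enumerate(samples))

    samples = [x(n * T) for n in range(400)]  # 2 seconds of samples
    t_test = 1.0037                           # an instant between sample points
    print(f"true: {x(t_test):+.4f}  reconstructed: {reconstruct(t_test, samples):+.4f}")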
