Everyone knows that computers are about math. And that may be one source of humanistic resistance to computational research techniques, especially the use of corpus techniques for examining large bodies of texts of historical or literary interest. So: Computers are math, math is not language, literary texts ARE language; therefore the use of computers in analyzing literary texts is taboo, as it sullies the linguistic purity of those texts.
Except that computers aren’t about math, at least not essentially so. To equate computers with math is to mis-identify computing with just one of its uses, the calculation of numerical values. That equation likewise mis-identifies mathematics with just one of its aspects, numerical calculation.
* * * * *
The contrast between math and language is, of course, deeply embedded in the American educational system. In particular, it is built into the various standardized tests one takes on the way into college and, from there, into graduate school. One takes tests designed to measure verbal ability, one thing, and mathematical ability, a different thing. And, while some people score more or less the same on both, others do very much better on one of them. The upshot is that it is easy and natural for us to think in terms of math-like subjects and verbal-like subjects, and of people good at one but not necessarily both.
The problem is that what takes place “under the hood” in corpus linguistics is not just math (statistics) and natural language (the texts). It’s also and mostly computation, and computation is not math, though, as I said up top, the association between the two is a strong one.
When Alan Turing formalized the idea of computing in terms of an abstract machine, that abstract machine processed symbols—in very general senses of symbols and processes. That is, Turing formalized computation as a very constrained linguistic process.
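To make that point concrete, here is a minimal sketch of such an abstract machine in Python. The transition table is a hypothetical example of my own devising, not anything from Turing's paper: the machine itself knows nothing about numbers; it only reads and writes symbols on a tape according to the table. It so happens that this particular table increments a binary numeral by one—the meaning lives in our conventions, not in the machinery.

```python
def run_turing_machine(tape, rules, state="scan", blank="_"):
    """Run until the machine enters the 'halt' state; return the tape."""
    tape = list(tape)
    head = 0
    while state != "halt":
        # Read the symbol under the head (blank if we fall off the tape).
        symbol = tape[head] if 0 <= head < len(tape) else blank
        state, write, move = rules[(state, symbol)]
        # Grow the tape as needed, then write and move.
        if head == len(tape):
            tape.append(blank)
        elif head < 0:
            tape.insert(0, blank)
            head = 0
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape).strip(blank)

# Transition table: (state, symbol read) -> (next state, symbol written, move).
increment_rules = {
    # First, move right to the end of the numeral.
    ("scan", "0"): ("scan", "0", "R"),
    ("scan", "1"): ("scan", "1", "R"),
    ("scan", "_"): ("carry", "_", "L"),
    # Then move left, turning 1s to 0s until a 0 (or blank) absorbs the carry.
    ("carry", "1"): ("carry", "0", "L"),
    ("carry", "0"): ("halt", "1", "L"),
    ("carry", "_"): ("halt", "1", "L"),
}

print(run_turing_machine("1011", increment_rules))  # "1100" (11 + 1 = 12)
```

Nothing in the rules mentions arithmetic; binary increment is simply what this arrangement of symbol substitutions amounts to under one interpretation of the tape.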
Sets of symbols and processes on them can be devised to do a great many things. Ordinary arithmetic is one of them. To learn arithmetic we must first memorize tables of atomic equivalences for addition, subtraction, multiplication and division. Thus:
1 + 1 = 2
1 + 2 = 3
1 + 3 = 4
. . .
9 + 7 = 16
9 + 8 = 17
9 + 9 = 18
And so on through subtraction, multiplication, and division. To these we add a few simple little recipes (aka algorithms) for performing calculations by applying these atomic equivalences to given arrangements of numbers.
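The table-plus-recipe picture can itself be written down as a short program. The sketch below, a toy of my own rather than anything canonical, adds two numerals by doing exactly what a schoolchild does: it looks up single-digit sums in a memorized table of atomic equivalences and strings the lookups together with the familiar right-to-left carrying recipe. No step ever computes with numbers as such; everything is lookup and symbol shuffling.

```python
DIGITS = "0123456789"

# The memorized addition table: ("9", "7") -> "16", and so on.
ADD_TABLE = {(a, b): str(int(a) + int(b)) for a in DIGITS for b in DIGITS}

def add(x, y):
    """Add two numerals (strings of digits) by table lookup plus carrying."""
    width = max(len(x), len(y))
    x, y = x.rjust(width, "0"), y.rjust(width, "0")  # pad to equal length
    digits, carry = [], "0"
    for a, b in zip(reversed(x), reversed(y)):       # work right to left
        t = ADD_TABLE[(a, b)]            # atomic fact, e.g. 9 + 7 -> 16
        u = ADD_TABLE[(t[-1], carry)]    # fold in the carry the same way
        digits.append(u[-1])             # keep the ones place
        # Carry a 1 if either lookup produced a two-digit result.
        carry = "1" if len(t) == 2 or len(u) == 2 else "0"
    if carry == "1":
        digits.append("1")
    return "".join(reversed(digits)).lstrip("0") or "0"

print(add("98", "17"))  # "115"
```

Note that `int()` appears only once, in building the table; it stands in for the rote memorization, and the table could just as well have been typed out by hand.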
What we do when we do arithmetic, then, is manipulate those symbols in very constrained ways. Those symbols are mathematical by virtue of the conventions that link them to the world, as counts of objects or units of measure of this or that sort (length, temperature, weight, etc.).
And just what is mathematics? Is Euclidean geometry math? Of course it is. Is it numerical? Fundamentally, no. But then Descartes came along and created conventions by which geometric operations can be achieved through arithmetic means. And…well, I’m not a mathematician, nor a philosopher of math, nor an expert in the theory of computing. But at this moment the question of the relationship between computing and mathematics is looking rather subtle and complex, interesting if you will, and not something that can be summed up by the common association between computers and mathematics.