Thursday, December 1, 2022

Jeffrey M. Binder's "Language and the Rise of the Algorithm"

Jeffrey M. Binder is a programmer, historian, and writer based in New York City. He has developed software for natural language processing and published peer-reviewed research on such topics as eighteenth-century mathematics, computational research methods, machine-generated poetry, and expressions of emotion on social media. He has a PhD in English from the Graduate Center, City University of New York, and has taught at Hunter College and Pennsylvania State University; Binder currently works as a security researcher at Open Raven.

He applied the "Page 99 Test" to his new book, Language and the Rise of the Algorithm, and reported the following:
A reader who turns to page 99 of my book will learn about a debate between two French Enlightenment thinkers, Jean le Rond d’Alembert and the Abbé de Condillac, over whether algebra should count as a language. Overall I think the Page 99 Test worked fairly well. This particular passage might seem a bit obscure without context, but it touches on a major theme that the book traces over the centuries.

Language and the Rise of the Algorithm is about the intellectual developments that gave us the idea of the algorithm. A crucial moment in this history was the development, in the early 1600s, of modern algebraic notation—the way students now learn to write equations, as in ax + b = c. Unlike words, these symbols could apparently be understood by speakers of any language. Admirers such as G. W. Leibniz attempted to extend this power to other fields, envisioning a universal notation that could express anything whatsoever with the certainty of algebra.

By the mid-1700s, the excitement had cooled, and many thinkers were more skeptical of the idea that symbols were inherently better than words. Page 99 quotes a statement to this effect from Condillac: “We should not suppose that the sciences are exact—or that we prove rigorously—only when we use x’s, a’s, and b’s.” The power of algebra, for Condillac, stemmed not from the symbols themselves, but from the clarity of the ideas they expressed.

Condillac, however, may have been overstating this clarity. In the 1700s, algebra was plagued by theoretical problems, and much that is now drilled into our heads in high school was still controversial—in fact, some mathematicians even believed that negative numbers were unscientific nonsense. D’Alembert did not go quite so far, but he did maintain that algebra differed fundamentally from how we ordinarily think and speak.

Although the debate I discuss on page 99 is largely forgotten, it gained a new relevance in the computer age. As I show later in the book, the developers of early programming languages encountered similar issues. Computer code is, paradoxically, both clear and opaque: its symbols have precise definitions, and yet most people find it incomprehensible. The rise of computers has given a new relevance to the rift between symbols and everyday language that troubled Enlightenment mathematicians centuries ago.

Visit Jeffrey M. Binder's website.

--Marshal Zeringue