This blog post is adapted from part of the introduction to a larger paper I wrote. The full paper is available for download here. You would be better off reading the paper, but here we are.
The history of computing is a small but growing discipline. Its early works were written not by trained historians but by practitioners: computer engineers and business people. While they had an intimate relationship with the creation of the technology (and saw the value in preserving the history of a technology that many people see as inherently new), they also perpetuated the field’s own biases. As Michael Mahoney explained: “[the insider histories’] authors take as givens (often technical givens) what a more critical, outside viewer might see as choices.”
The work of the first historians of computing (the insiders) provided material for those of the following decades, allowing them to add nuance to the narratives. Practitioners writing from the 1960s through the 1980s were concerned with the contexts closest to them: particular machines and the highly trained specialists who created them.
Today, with a somewhat firmer foundation in research institutions, historians of computing have questioned the basic assumptions held by early practitioners. They emphasize the social and economic significance of the machines along with their changing technological makeup. Historians in recent decades have focused on the machines’ social context, as well as the contributions made by low- and un-skilled workers. Work by scholars like Jennifer Light, David Grier, Marie Hicks, Thomas Haigh, and others has brought the “human” element into the discussion and broadened the concerns of the discipline.
Let me give a more focused example to illustrate this problem. I am interested in the history of automatic program-control—the ability of a computer to carry out a sequence of instructions without human intervention. This is one of the main features that distinguished the first computers from prior machines. After all, before the first computers, “computer” was a term used only to describe the profession of people employed in calculation. Automatic control replaced human computers.
We cannot understand computation by seeing machines as the work of engineers isolated from larger economic circumstances. So we must answer: how did the organization of human computers affect the design of the automatic computers? Central to the history of any machine is the story of the human work which it replaced. This is understood by many computer historians today, like those I listed above. But there are still problems that arise when these historians study the boundaries between human labor and machine activity.
For example, there is a wide gap in the literature between the automatic control of Charles Babbage’s calculating “engines” of the mid-19th century and the computers of the mid-20th century, which Paul Ceruzzi refers to as “the divide.” This divide can be seen in recent scholarship, such as Haigh and Mark Priestley’s short piece on the history of automatic control. The authors make a nearly centurial “jump forward from the time of Babbage [1822-1871] directly to the early 1940s,” to the ASCC and Z3. In their understanding, there were no developments in automatic control during this time period. They explain that operators would reconfigure punched-card machines and analog computers with “wrenches and screwdrivers” to perform different operations, thus “most of the work we think of as executing a program was carried out by human operators, not by the machines themselves.”
Why is this manual work not included as part of the history of automatic control? To be fair, Haigh and Priestley’s piece is meant as a short introduction, not a comprehensive account. However, making their “jump” means ignoring decades of developments in how this manual labor was carried out. Ceruzzi, in his piece, resolves the divide by detailing earlier machines which blurred the lines between manual and automatic computing. Yet this limits him to the early- to mid-20th century, and to isolated scientific computing departments hardly representative of larger economic changes. David Grier’s book takes into account the relationship between the division of labor and the design of technology (cf. this post), but this analytic goes out of view after his section on Babbage. His subsequent discussion favors interpersonal exchanges and relationships over structural developments.
The solution to the divide is not discovering more machines, nor telling anecdotes about the human computers that existed before. Instead, we must detail how human computers were rationally organized, and how engineers incorporated this organization into the design of machines. Computers did not replace individual workers but entire systems of computing. As Ceruzzi puts it in the piece mentioned above:
Historical accounts of punched card machinery [an early popular type of computing machine] have described the functioning of the individual machines in great detail. More relevant is what went on in the entire room comprising a punched-card installation. That room—including the people in it—and not the individual machines is what the electronic computer eventually replaced.
Nonetheless, Ceruzzi here leaves developing a cohesive understanding of “the entire room” as a task for future historians. I propose a simpler solution to the divide: machine developments in automatic control were scarce between Babbage and the mid-20th century because these developments were taking place in the management of human computing labor.
The organization of human computing labor was something the early computer engineers were intimately acquainted with, having managed computing operations themselves. The molding of office clerks and mathematicians into computers which execute a program is an essential part of this story. Also important is the question of the economic changes which made automatic computers advantageous in some industries (ballistics, astronomy) while others (accounting, management) took two to three decades longer to adopt the technology.
By only focusing on machine developments, the “divide” seems ever more chasmic. Turning our attention towards the complex developments in bureaucratic management—the ways that human bodies are organized as (control) technology in calculation—clarifies the shifting meaning of control in human-machine computing systems.
Historians have to move beyond the assumptions that humans are always exceptional elements needing to be disciplined and that machinery is the rational, instrumental alternative. These are assumptions passed down by insider historians. The rationality that computers impose was developed first in a human division of labor, one later reinforced by machinery. A hybrid human-computer historical analytic is a step towards a more cohesive understanding of computing.
The history of technology is deeply entwined with the history of labor, as technology is the means through which labor acts to sustain an organism’s life. By studying tools, we inspect how an individual interacts with their environment. By studying machines, we analyze the relationship between technology and the capitalist mode of industrial production.
Here, I want to approach a definition of “machine” and “tool” more specific than common use allows. The tool is an instrument that is external to an individual’s body. Nonetheless, the individual retains a direct guiding influence on the action of the instrument. Operators have only an indirect influence on the action of machines (e.g., designing the machine, then connecting it to a motive force). While we can observe the use of tools in primates and other non-human species, machinery is a more recent phenomenon particular to the human species because it is embedded in the development of the capitalist mode of production.
In Capital Vol. 1, Karl Marx describes the origin of machinery through a procession of productive systems. The period of manufacture (in this usage, production carried out without machines) brings together laborers to work cooperatively under one roof, increasingly subjecting them to a division of labor. In English, the words “handicraft” and “manufacture,” like “manual,” have their root in the words for the human hand; in this system of production, the human hand remains the dominant guiding force. Through the division of labor, complex tasks are broken down into simpler ones, and the laborers’ work becomes more specialized.
Subsequently, industrial production mechanizes the work system already laid out by manufacture. Following Adam Smith, Marx observes that as the division of labor divides complex tasks into simple steps, increasingly more of these steps can be accomplished by existing machinery. It is not that someone develops a technology which is complex enough to replace human work, but rather that human work becomes so simple that someone can create machinery with which to perform it. Technoscientific ingenuity is secondary to the increasing simplification of work as a force guiding the introduction of machinery.
What is particular to the machine is that the tools are removed from the hands of the worker and united by some transmitting mechanism with a motive force. Under manufacture, the capitalist unites separate workers into cooperation under a division of labor. This process is extended as the machine takes the tools of the worker and unites them with one motive force. It doesn’t matter whether the motive force of the machine is electric, steam, or human. Furthermore, the machine can even use the workers’ tools as the implements by which it acts on its object. So the definition of a machine applied by Marx is not a technical one (e.g., a machine is something with cogs that runs on steam). Rather, it references the conditions of production which force it into existence. Industrial capitalism mechanizes the division of labor already laid out by manufacture.
Charles Babbage, the British mathematician and designer of the earliest automatic calculating machines, was Marx’s primary intellectual influence in his thinking about machines. The definition of machinery that Marx uses to structure his discussion is from Babbage’s On the Economy of Machinery and Manufactures: a machine is “The union of all these simple instruments, set in motion by a single motor.”
Babbage’s understanding of machinery and the division of labor was crucial to his invention of calculating machines (which I have written about here). Babbage observed that machinery merely united the simplest forms of labor and enacted them through a single motive force, rather than through multiple workers. If machinery could do this for the physical labor of pin-making, Babbage thought, it should be able to do the same for the mental labor of calculation.
It is no coincidence that Babbage, both a mathematician and a student of industrial production, led the development of the earliest automatic calculating machines. For his machines, Babbage modified the existing designs for mechanical calculators developed by Pascal and Leibniz. Those calculators already existed to assist individuals with large additions and subtractions, but Babbage’s “calculating engines” stand out against these predecessors because they are machines rather than simply tools: Babbage mechanized the larger process of production, incorporating the work of multiple laborers into one machine driven by a single motive force.
The understanding of computers today is very far from that of Charles Babbage. Because past systems of labor have been so thoroughly mechanized in today’s computers, the machine history of the computer seems strange. The computer acts as a tool for us—a means by which we individually manipulate digital objects. Today, computers are designed to give their user the sensation, if not the reality, of direct control. Douglas Engelbart’s 1962 piece on the future of computer technology, “Augmenting Human Intellect,” is a paradigmatic example of this conception. Engelbart characterized computer technology as improving the abilities that an individual’s brain already possesses.
This conception is not necessarily inaccurate with regard to the present application of technology. However, it proves problematic for the study of the history of computation. When we study the history of the computer as a tool, we see the improved cognitive capacity of individuals. When we study the computer as a machine, we can see the complex ways that once-separate forms of human labor are incorporated into a single mechanism. After all, “computer” was once a word describing a human worker, and only later became the name for a machine. By studying the computer as a machine, we can more directly understand its peculiarity to the capitalist system of production, and its relation to the forms of human labor that today remain external to computing machines.
I’ve decided, if I ever get around to posting more often, to refrain from using long-form academic writing on this blog. I hope to use this outlet to practice concise and easy-to-understand language that is accessible to a wider audience. Nonetheless, I still want to keep sharing the work that might not fit this description.
The essay that I’m linking to here was mostly written three months ago and represents one of my earliest explorations into more strictly historical research. If I were to rewrite it now, or in a few months, I think I would take a much different approach. Reading it today, it’s clear that my prose is still in some respects caught in the imprecise “purple” style of literary theory & cultural studies—something else I’m trying to move away from. However, I think I made a number of exciting developments.
The paper accomplishes two goals. The first is to evaluate the materialist methods of Hessen and Grossman, which I discussed in an earlier post. While Hessen and Grossman read Newton and Descartes, I read Babbage. The second goal is to experiment with a materialist/Marxist approach to the history of computing by identifying the material conditions that contributed to Babbage’s work. I’m excited to see much of the history of science turning in a materialist direction in the past ten years or so, but it’s rare to find more politically charged (esp. Marxist) arguments within the discipline. So this is something that could use some development.
Readers acquainted with the history of computing will find the account of Babbage familiar, but the application of Hessen and Grossman novel. For the rest of you, this may be more exciting.
See excerpts below. You can find the entire paper in downloadable PDF format on my academia.edu.
“The method of differences is one way of reducing complex functions to simpler ones, allowing the employment of workers whose labor is valued less. It is in no way necessary to solving algebraic functions. However, it demonstrates the time and space that calculation necessarily occupies. Mathematics, often seen as the most abstract of scientific disciplines, bares its real material basis in the process of calculation, where economies of time and the resources needed to sustain laboring bodies must be considered.”
“During Babbage’s time, computation was merely one form of productive labor among others, which was organized using the division of labor and subsequently mechanized. Like pin-making and textile production, computation could be broken down into simpler tasks, simple enough for machines to accomplish them. With this in mind, it seems almost obvious that the labor of computation would be mechanized as well. Perhaps to us, Babbage’s work appears to stand out from these other forms of mechanization given the persisting tendency to view intellectual labor as immaterial—something abstracted from the material conditions of everyday existence.”
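As an aside, the method of differences mentioned in the first excerpt is simple enough to sketch in a few lines of code. The sketch below is my own illustration, not a rendering of Babbage's mechanism: it tabulates a polynomial using nothing but repeated addition, which is precisely what made the method suitable both for low-skilled human computers and, later, for the Difference Engine.

```python
def difference_table(values, order):
    """Compute the initial column values (the function value plus its
    successive differences) from a few directly calculated samples."""
    rows = [list(values)]
    for _ in range(order):
        prev = rows[-1]
        rows.append([b - a for a, b in zip(prev, prev[1:])])
    return [row[0] for row in rows]


def extend(initial, n):
    """Tabulate n values using only addition, the way a difference
    engine (or a room of human computers) would: each column is
    repeatedly incremented by the column of next-higher order."""
    cols = list(initial)
    out = []
    for _ in range(n):
        out.append(cols[0])
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]
    return out


# Tabulating f(x) = x^2 from three hand-computed values:
print(extend(difference_table([0, 1, 4], 2), 6))  # → [0, 1, 4, 9, 16, 25]
```

Once the initial differences are worked out by a skilled mathematician, the rest of the table requires only addition, labor that can be divided among less-valued workers or delegated to gears.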
Product and process: texts take computing and manufacturing as their subject; these texts are computed and manufactured. The latter is, to some extent, necessary to the production of the former: all books must be made. Book production followed suit with the industrial mechanization of the 18th and 19th centuries, and for the past few decades, published print has most likely existed in a digital form even if its final material is tree pulp. The product has also contributed to the process: books about manufacturing have improved the economy and prevalence of industrial production. The double meaning of the word “production” captures the tension between these two foci—process and product. I could call this relationship between text and machine cybernetic, but this would open the system further. Instead, let us close this self-referential system by using the framework defined by Ada Lovelace—one manufacturer of computing texts. The power of production (as process and product) is operational. As she defines operation: it is the changing of a mutual relationship between two or more things (Lovelace 693).
What follows is an exploration into the bizarre economy of print (and) manufacturing from the testimony of one “father” of computing, Charles Babbage. Computer history remembers Babbage as the designer of the calculating engines that were the first attempts to mechanize computing, which was then a labor-intensive and tedious process carried out by humans. Despite being an inventor of geared mechanisms, most of his work failed to exceed the printed page. Babbage was unable to produce more than a small-scale demonstration model of his Difference Engine, and the plans for his Analytical Engine and Difference Engine No. 2 remained largely printed words, diagrams, and schematics—never finished machines. While we think of the history of computing today in terms of electric pulses and glowing screens, the works of Babbage hold import primarily in the realm of stereotypes, pulp, and ink.
Charles Babbage is primarily remembered for his work on numerous calculating machines, but he also made developments in mathematics, economics, and philosophy. His pursuits shared the common theme of the mechanization of calculation. He was less concerned with the mathematical production of the formulas to be solved or the product of calculated formulas, and more with the labor and time which performing calculation necessitated. During the time when he was producing his first calculating machine, the Difference Engine, Babbage made numerous visits to factories across Britain. He took a special interest in the division of labor, and studied the works of renowned political economist Adam Smith; this industrial tourism resulted in the book On the Economy of Machinery and Manufactures, published in 1832. The book included Babbage’s account of the improvements new machines introduced to the process of production, some technical descriptions of specific machines, and Babbage’s general analysis of the new British industrial economy.
The printed materials that Babbage developed were themselves the product of the machinic infrastructure which he studied. Babbage begins to construct the self-referential nature of his textual system in the preface to his first edition: “I have throughout this volume, wherever I could, employed as illustrations, objects of easy access to the reader; and, in accordance with that principle, I selected the volume itself” (vi). Babbage holds true to this promise, as he dedicates Chapter XI to mechanical copying. The production of “these very pages,” the pages that constitute the original editions of On the Economy of Machinery, are the subject at the end of this chapter; Babbage gives a thorough account of the six-step process of stereotyped printing which produced them (112). Later, in Chapter XXI, he provides the cost for each separate step in the same book’s production; this includes the printing labor and paper itself, as well as the “Average charge for corrections” in type composing errors, and “Expenses for advertising” (Babbage 205).
Yet Babbage presents more than just facts about print production; he also criticizes the publishing system itself in his chapter “On Combinations of Masters Against the Public.” Becoming increasingly polemical in style, Babbage writes against the various monopolies which increase the price and decrease the quality of consumer goods. The publishing monopoly forces authors to go through publishers and booksellers to produce and sell their work; while this shifts some financial risk off the author themselves, the overhead for these extra steps decreases the total share of profit that the author receives (Babbage 314-317). Furthermore, some printers publish and sell more copies of the book than they pay the author for (Babbage 322).
The publishing industry does not leave Babbage unpunished for these transgressions; his own volume falls victim to the shady policy of underselling practiced by most of the booksellers dealing in his book, particularly because he critiqued such practices. In response, he dedicates the Preface to the Second Edition to levying allegations of underselling against the publishers of his books. He explains the cause of this discipline:
“It has been objected to me [by the book publishers], that I have exposed too freely the secrets of trade. The only real secrets of trade are industry, integrity, and knowledge: to the possessor of these no exposure can be injurious; and they never fail to produce respect and wealth” (Babbage viii).
Evidently Babbage’s thorough study of manufacturing did not lead him to the same conclusions as someone like Marx—who himself was one of Babbage’s most dedicated readers. Babbage still believes that a capitalist’s virtue is itself a source of wealth, yet he finds himself under economic pressure to justify his critique of what he sees as corrupt business.
The booksellers betray Babbage again when they insert their own page into the copies of the Second Edition they sell, refuting Babbage’s claim of underselling by providing their own data table of how many books they purchased. Babbage must then dedicate the preface to the Third Edition to rebutting their claims, showing that the data they provided were merely the subscribed or anticipated purchases, not the actual purchases that the booksellers made.
Babbage’s multiple editions of On the Economy of Machinery and Manufactures tell an interesting narrative about the manufacturing of texts. Firstly, Babbage’s own writing brings his book into a self-referential loop in which it speaks of the conditions which brought itself into existence. The symbolic words (product) call attention to the material process which brought the pages into existence. The shifter which calls attention to “these very pages” may no longer actually point to the pages held by a contemporary reader of a re-publication of Babbage’s work—such as the work which I referenced while writing this paper, published in 1989. Yet while “these very pages” are no longer the pages which Babbage first spoke of, such words still call attention to the chain of commodities, capital, and labor which bring text to formation under the capitalist system of production.
Critical self-referentiality has real economic consequences for Babbage, which leads to the second crucial point that this volume demonstrates. The conversation between Babbage and the publishers, carried out within the very volume he published, establishes that the author of a text cannot completely define the text’s interpretation and use, nor even its very content. So long as the authors themselves do not control the means of production, the system of capitalism acts as a medium between the written word of the author and the text read by the reader. Capital alienates the author-as-proletariat from their product simply because their words literally become the words of a publisher rather than their own (whether those words were ever the author’s “own” is an entirely different conversation). Something as concrete as black symbols on a white page appears mutable under capital. The publisher or manufacturer can edit the text or insert their own voice into the publication itself. This exchange between author and publisher provides a material example of the slipperiness of authorial intent testified to symbolically by so much postmodern/poststructuralist literary thought (Barthes, Derrida…).
Imagine instead if the producers of Babbage’s work, the actual laborers who created the stereotypes for printing, inserted their own page into the volume. Perhaps it would say something like “Yes, Mr. Babbage has the right to advance a claim of underselling, but we workers receive an even poorer bargain. Having to sell our labor power to survive, our employers subject us to exploitation; they receive the larger portion of the value that we produce” (these workers are well-read in Marx, apparently). Such an intervention seems impossible, given the strict levels of surveillance and control to which workers are subject. Not only the symbolic interpretation, but the material content of texts is defined by the power structures which constitute their formation.
Beyond his interest in the economics of printing, Babbage was also centrally concerned with print as a medium of communication in the design of his calculating engines. After all, the tables which were the end result of the calculation were to be printed by the machine; this was the primary way of viewing the data his engines produced. His own experiences with organized efforts to calculate mathematical tables made Babbage aware of the need for efficient and accurate printing. No matter the accuracy of the mathematicians in their calculation, misprints will lead to a flawed final product. Beyond devaluing the final product, misprints could have disastrous results: mathematical tables often served as nautical navigation aids, and a misprinted number could mean shipwreck and death.
As historian Mark Priestley describes it, Babbage’s “overall vision [for the Difference Engine] was of a machine divided into two parts, each dealing with one of these fundamental processes:” calculating and printing (Priestley 22). The calculating machine was also a printing machine, and the printing mechanism for Babbage’s engine would require as much development upon existing technology as the calculating portion. Yet Priestley, primarily interested in the history of mathematics, declines to discuss the printing mechanism at any greater length. Even Babbage himself devoted significantly more of his attention to the geared adding mechanism than to the printing one. Like most of his work, Babbage never fully implemented a printing mechanism for the Difference Engine; only a crude version was assembled by his son before Babbage’s death.
While Babbage was more interested in implementing mechanized calculation than printing, he also struggled to represent his calculating machines in print. As I mentioned above, Babbage’s mechanical work mostly remained in books during his lifetime rather than constituting machines. He became skilled at reproducing diagrams of his engines, their gears and columns, but he faced the problem of representing the process they enacted. To show that the teeth of one gear fit into another was insufficient; Babbage wanted to represent the machine’s motion, how its parts interacted with one another, and how its state changed over time.
Babbage decided to create an entirely new system of mechanical notation. Machine diagrams presented the machine in only one state, drawing multiple diagrams would be impractical, and natural-language description was too ambiguous: it would not allow the precision that mechanism design required. The notation he created consisted of both textual and graphical elements in a two-dimensional table. Each part of the machine was assigned a column, and each row was a time unit. One would read down a column to understand how one part changed over time, and across a row to understand the state of the machine at one moment in time (Priestley 30).
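To make the tabular structure concrete, here is a toy rendition of the notation in code. The part names and states below are my own invention for illustration; Babbage's actual notation was far richer, mixing graphical and textual symbols.

```python
# Columns are machine parts, rows are time units, as in Babbage's
# mechanical notation. (The parts and their states here are invented
# purely for illustration.)
timeline = [
    {"figure_wheel": "advance", "carry_arm": "rest", "print_lever": "rest"},
    {"figure_wheel": "rest",    "carry_arm": "lift", "print_lever": "rest"},
    {"figure_wheel": "rest",    "carry_arm": "drop", "print_lever": "strike"},
]

def part_history(part):
    """Read down a column: how one part changes over time."""
    return [row[part] for row in timeline]

def machine_state(t):
    """Read across a row: the state of the whole machine at one moment."""
    return timeline[t]
```

Reading down a column (`part_history("carry_arm")`) recovers one part's behavior over time, while reading across a row (`machine_state(2)`) recovers the whole machine's configuration at a single instant, exactly the two readings the notation was designed to support.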
This notation responded creatively to the limitations of the existing media, and demonstrated a conflict between the emerging media of production and the traditional media of representation. Babbage required a new language to represent the machines in print because their novel complexity exceeded the capacity of existing techniques of printed mechanical representation. His notation anticipated computer code, although Babbage himself did not come to this conclusion. Both Babbage’s notation and modern computer code are symbolic descriptions of information machines. These descriptions are then read and implemented by either an engineer (wetware) or a compiler (software).
The works of Babbage are incredibly difficult to find in print today, despite his popularity as a figure in the history of computing and the texts’ importance for the history of mathematics and economics. The few published editions of Babbage’s On the Economy of Machinery available online often cost hundreds of dollars. Many of these versions, especially the cheaper ones, are reprints of a digital copy of the text. One of these reprints, however, sells for $1,981.03 (just add $3.99 for shipping). It is published by Nabu Press, a subsidiary of BiblioLife, a republishing company (formerly?) based in Charleston, South Carolina. The website for the company is presently unavailable, so it is unclear whether they still exist, but one can find many of their reprints on Amazon and other online retailers. BiblioBazaar, another member of the BiblioLife family of “non-traditional” publishers, produced 272,930 books in 2009, almost as many as the entire “traditional” publishing industry combined (Albanese). This is not the number of copies printed, but the number of distinct texts which they published.
The company’s secret? They harvest the vast number of newly digitized texts entering the public domain, format an electronic version of each text, and sell the results to printers. As their president put it in 2010, “we are really a software company that has books coming out at the end of our process” (Albanese). Most of this work is done by a computer program which extracts text from scanned books and formats it. Although Babbage acknowledged the importance of print when designing his calculating engines, there is no way he could have predicted the future of the computerized printing economy, or that his own book would be produced in this way. BiblioLife’s political economy of print involves extracting value from laborers working in universities, libraries, and archives, without having to pay these laborers, as the texts are in the public domain. The “neoliberalization of academia” (the increasing profit-orientation of higher education) remains a hot topic, and it is evident that using computational tools to process texts in the “Digital” Humanities does not stray far from the business practices of companies like BiblioLife. The considerable support that universities now give to the use of computing tools in humanities research demands critical attention, as the software developed by BiblioLife demonstrates the profitability of often low-wage university labor. What prevents the digitization of historical texts from being merely their commodification?
Interrogating the status of print at the origins of mechanized computing elucidates the relationship between two media which are often analyzed separately. The argument that “nothing has changed” throughout the centuries-long history of computing is just as naïve as the technophilic mysticism which prophesies a future of silicon. Rather we see the old in the new, both still operating on each other: such is history. A Thousand Plateaus, for all its talk of rhizomes against the arboreal, is still printed on leaves made up of tree pulp. HTML prioritizes the cultural language of print: a two-dimensional surface containing symbolic characters. After all, what makes an internet link more hyper-textual than a note in a book referring the reader to “see Chapter 8”?
Studying the history of computers challenges us to reconsider the distinction between what we would normally consider media (photography, cinema, sound, internet…), and the means of production generally. Why are print and personal computers privileged in media studies, while automated factory robots are peripheral, if studied at all? All of these exist as technical artifacts within an environment of established social practices. Computers and “digital media” are a popular topic in media studies, yet the origins of computing distinctly reveal how they are merely the mechanization of human cognitive activity. Are not the systems of production which build “media,” media themselves? Making a distinction between the media of production and representation is an absurd venture. The two are evidently indistinguishable in and as the current state of things.
Albanese, Andrew. “BiblioBazaar: How a Company Produces 272,930 Books A Year.” Publishers Weekly 15 April 2010.
Babbage, Charles. On the Economy of Machinery and Manufactures. 4th Edition. London: Charles Knight, 1835.
Lovelace, Ada. “Sketch of the Analytical Engine Invented by Charles Babbage Esq. By L.F. Menabrea of Turin, Officer of the Military Engineers.” Scientific Memoirs. Vol. 3. 1843. 666-731.
Priestley, Mark. A Science of Operations: Machines, Logic and the Invention of Programming. London: Springer, 2011.
Media studies, ever biased towards the new, has taken upon itself the project of dissecting “the digital.” Satisfied with its analysis of the digital media which have emerged within the past eight decades, it has abstracted the digital itself from media. Now we have “the digital,” an apt ontological quandary for our 21st century: the “Information Age.” The theorists of the digital contend that their subject has not received its due attention either from philosophy or from media studies. Avowedly not simply media theorists and some even taking oath against philosophy itself, these trans-disciplinary scholars locate digitality in issues of ethical, political and aesthetic concern.
Although there exist various such approaches to the digital, I want to highlight two examples. The first is the thought of Alexander Galloway, who understands the digital as the distinction that allows for distinction itself. Galloway takes a metaphysical approach to the digital, seeing it as the process of discretization—a difference not between the 1 and 0 of binary code, but between 1 and 2: a unified one that divides into two. Galloway’s thought is primarily influenced by the French philosophers Gilles Deleuze and François Laruelle, and he occasionally maneuvers into more political terrain through Althusserian Marxism, alongside Laruelle and his own teacher Fredric Jameson. Alternatively, Seb Franklin takes a more political and media-historical approach to the digital, looking at the prehistory of digital computers—various information-processing projects and machines—and analyzing their implications for capitalist production. While Galloway approaches digitality through French philosophy, Franklin understands it more from a Marxist and historical perspective. Nonetheless, they are similar in their immanentist or materialist approach and in their application of Deleuze, primarily with regard to his writings on the societies of control.
However, these abstractions of the digital and the analog rest on a number of false assumptions: (1) that computer science and mathematics have concretized the digital and the analog into uncontested, static definitions, which can then be retroactively applied to any other case that fits them; (2) that these definitions describe systems which are opposed to each other and mutually exclusive; and (3) that the digital maps directly onto the discrete while the analog pairs with the continuous.
In this essay, I will clarify how the digital and analog are a non-mutually exclusive pairing with a contested and unsettled intellectual history. I will demonstrate this with a discussion from the early decades of electronic computing: part of the 1950 Macy conference on cybernetics, in which a group of intellectuals including scientists, mathematicians, engineers, psychologists, and anthropologists debated the usage of “digital” and “analog.” I use this discussion to challenge the above assumptions about the analog/digital pairing from a primarily scientific perspective.
This reevaluation will provide a more stable foundation in order to address the concerns that Galloway and Franklin have about philosophy, computation, and politics. But where Galloway primarily applied Laruelle and Franklin, Deleuze, I hope to trace this issue earlier in intellectual history: to Spinoza. Despite their focus on digital media, these theorists are essentially trying to conceptualize the discrete and the continuous. Instead of interrogating the problems of the digital and the analog, terms recently borrowed from computer science, I will investigate the differences between the discrete and the continuous, or the transcendent and the immanent as outlined in Spinoza’s philosophy.
Ontologies of the Digital
It is difficult to give a base definition of digital and analog here, considering that I want to question the legitimacy of any definition for these concepts. While the complexity of the two terms will be developed below with the Macy conference’s analog vs. digital debates, it will suffice to provide a general outline of the digital and the analog here.
“Digital” comes from the Latin word digitus, meaning finger or toe. Because fingers served for centuries as a technology for counting, “digit” came to be used in the fifteenth century to describe an integer between zero and nine. The legacy of this usage is the near-universal base-10 numerical system, which corresponds to the ten fingers of the human hands. “Digital” was not used as a descriptor for technology until 1942, when mathematician George Stibitz suggested that a form of the Allies’ fire-control devices be described as “digital” (previously deemed “impulse” by John Mauchly) to set them apart from analog devices. These devices relied on electronic signals whose continuous range was discretized, a condition of all digital technology today. This means that the continuous variation in a range of voltage is reduced to the difference between only two digits. So a digital logic gate will, for example, propagate a range of 0 to 2.5 volts as 0 or “off,” while 2.5 to 5 volts will be propagated as 1 or “on.”
These ranges vary depending on the device, but what they all have in common is the discretization of a continuous signal into a limited number of acceptable zones: in digital logic, the 0 and 1 of a binary system. Such discretization is not exclusive to electronics; consider the abacus, one of the first forms of “digital” technology. A bead on an abacus slides continuously along its rod, but is taken to mean a certain discrete value depending on which end of the rod it sits closer to.
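The threshold logic described above can be sketched in a few lines of code. This is only a schematic illustration—the 2.5-volt threshold follows the example in the text, not any particular device:

```python
def read_digital(voltage, threshold=2.5):
    """Discretize a continuous voltage into a binary digit.

    Any value in the continuous range below the threshold is read
    as 0 ("off"); anything at or above it is read as 1 ("on"). All
    the continuous variation within each zone is simply discarded.
    """
    return 1 if voltage >= threshold else 0

# Continuously varying inputs collapse into two discrete values:
print([read_digital(v) for v in [0.3, 1.8, 2.4, 2.6, 4.9]])
# → [0, 0, 0, 1, 1]
```

The point of the sketch is that the digit is a property of the reading, not of the signal: the same continuous voltage range yields different digits only because a threshold has been imposed on it.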
“Digital,” with regard to technology, generally denotes a method of encoding information into binary digits, with one and zero respectively corresponding to the on/off state of an electric pulse. “Analog” describes a system that incorporates continuous variation; so rather than the abacus, its prototype is the slide rule. In sonic media, digital formats like the mp3 exist fundamentally as strings of bits (binary digits) read by a computer, while the groove of an analog vinyl record guides a turntable’s needle continuously. To give another example, a light dimmer adjusts the voltage level that a lightbulb receives analogically, while an on/off light switch does so digitally.
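The sonic contrast above can also be sketched computationally: digitizing a continuous signal means reading it at discrete instants and rounding each reading to one of a fixed set of levels. The sketch below is purely illustrative (the sine-wave signal, sample count, and number of levels are arbitrary choices, not any real audio format):

```python
import math

def sample_and_quantize(signal, n_samples, levels):
    """Digitize a continuous signal (a function of time on [0, 1])
    by sampling it at n_samples discrete instants and rounding each
    sample to one of `levels` evenly spaced values in [-1, 1]."""
    step = 2 / (levels - 1)
    samples = []
    for i in range(n_samples):
        t = i / n_samples
        value = signal(t)                        # continuous reading
        quantized = round(value / step) * step   # nearest discrete level
        samples.append(round(quantized, 3))
    return samples

# A continuous sine wave reduced to 8 samples at 5 quantization levels:
print(sample_and_quantize(lambda t: math.sin(2 * math.pi * t), 8, 5))
# → [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
```

Between any two of these samples, the original signal varies continuously; the digital version retains only the discrete grid of values, which is precisely the discretization the text describes.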
Galloway is one of the primary theorists attempting to extend the technical definition of the digital to other concerns in philosophy. Primarily a media scholar and a Deleuzian student of French literature and Theory, Galloway developed his conception of the digital in his talks on “Deleuze and Computers” and “10 Theses of the Digital”. However, his most rigorous analysis comes in Laruelle: Against the Digital, published in 2014. The book is principally a monograph on the contemporary French philosopher François Laruelle, whom Galloway claims moves “against the digital” through his immanent practice of non-philosophy.
Properly working within the French tradition, Laruelle finds much to disagree with in the history of Western philosophy and metaphysics. He situates himself against the “philosophical decision”: what he defines as the decision implicit in any philosophy—the decision to philosophize itself. This decision splits the world from the interpretation of the world, the “secret” and the communication of the secret. Laruelle responds with non-philosophy, a science which takes philosophy as its object in order to avoid ever having to make these divisions. This science “is always direct or radical, not reflective or mediated,” it “reveals things immediately, unilaterally, and unconditionally” (Galloway, xxiv).
Galloway draws similarities between what Laruelle critiques and the structures of digitality: “Like philosophy, the digital is also an insatiable beast, and like philosophy, the digital is also inescapable today… so-called digital thinking—the binarisms of being and other or self and world—is often synonymous with what it means to think at all” (xviii). In general, Galloway defines the digital as “‘the one dividing into two’” of analysis and the analog as “‘the two coming together as one’” of synthesis (xxix). So while the digital enforces transcendental distinctions between mind and body or Being and being, the analog “brings together heterogenous elements into identity, producing a relation of non distinction.”
Why speak of the digital instead of the discrete itself? Galloway says that defining a set of points as discrete does nothing to tell us how the points came to be discrete. Rather, Galloway is interested in the digital as the capacity to “divide things and make distinctions between them” (xxix). So digitality is a virtuality (a discretization that has not been actualized) or a process of discretization, while the discrete is an attribute statically possessed by a given system. However, Galloway does not articulate how we know whether something is discrete or continuous. Galloway’s contention that “nothing has been said as to how such points became discrete in the first place” already betrays his bias towards a continuous real that is then discretized a posteriori without further explanation or proof. I agree with Galloway here (although via Spinoza) that the real is a continuous substance that is then discretized, but I think that understanding this rather than assuming it as true is essential to analyzing a logic of “digitality” in philosophy.
Galloway grows a pair of concepts (digital/analog) with a relatively short history into a transhistorical metaphysics. He locates the digital as the primary logic of Western philosophy beginning with Plato’s essence vs. instance and continuing through Descartes’ body vs. mind. The analog philosophers include the likes of Spinoza, Henry and Deleuze, who made movements towards immanentism or materialism. Yet many of these philosophers are guilty of the “compromise of immanence,” retaining “pure multiplicity within the univocity of being” (xxxiii). For Galloway, only Laruelle abstains from the decision of philosophy. He then pairs Non-philosophy with the analog, as a radically immanent non-distinction.
Yet on the same page, Galloway claims that Laruelle “exits” from “arguing… digitality or analogicity, pro or con” and that “The digital is… Laruelle’s chief enemy” (xxxiii). Galloway leaves it unclear whether Laruelle abstains from the argument over digitality altogether, or whether his argument against the digital is the strongest yet made in the history of thought. Since Laruelle himself never explicitly spoke on the subject, Galloway vacillates between these two conclusions.
Seb Franklin, evidently inspired by Galloway’s abstraction of scientific concepts from media forms, develops his critique of the digital in Control: Digitality as Cultural Logic, published in 2015. Franklin centers his analysis on Deleuze’s later writings on the societies of control. Deleuze argues that the many institutions comprising Foucault’s societies of discipline have undergone a crisis of legitimacy. New installations of power centralized around computational technology are taking their place; they are perhaps more “decentralized” but no less entrapping: this is “control” versus “discipline.” Franklin seeks to locate the emergence of this control system in the history of computation and social control.
Rather than developing a critique of Western philosophy, Franklin shows, via Marx, how digital logic parallels the logic of capital. “Digitality, as evoked in the introduction to this book, can in part be understood as a mode of capturing individual and social behaviors for the purpose of valorization” (Franklin, 8). For example, capitalism “digitizes” labor by splitting it into quantifiable units of labor time, what Marx determines as socially-necessary labor time (SNLT), the source of exchange value. The property of exchange value allows any commodity to be exchanged with any other of equal exchange value, just as all digital media exchange strings of bits that may represent varying images or sounds but share the same binary substrate. So capital expands from the factory to the home in order to capture and exploit all behaviors as valorizing labor. Connecting this with Deleuze’s control societies, Franklin shows how the societies of discipline aggregate subjects analogically into masses such as the factory, school or prison, while control splits these subjects digitally, dividing them internally into what Deleuze calls dividuals.
Franklin then investigates numerous projects in history that are exemplary of this digitization. For example, Herman Hollerith’s machinic automation of the 1890 census demonstrates how the biopolitical project of population management involves digitizing people into discrete categories like races and genders. Similarly, he shows how Charles Babbage was influenced by Adam Smith’s thought and the structure of manufacturing in creating his computing machines. These processes extend to a digitization of the social through post-War programs like cybernetics, which applied thinking from computation, mathematics, and even defense technologies to prescribe methods of managing human psychology and social forms through feedback mechanisms. (It is precisely this discourse that I turn to below in order to understand the analog and the digital.)
As with Galloway’s work, it is difficult to tell if Franklin is talking about the digital proper, or simply about the discrete in general. However, his analysis is more consistent in that it is less eager to abstract itself from media forms, showing how media constitute digitality even before nominally digital media came about in the early 1940s. Franklin maps the digital as a cultural and social logic outside of explicit connections to media, but this is only so far as the media and systems of production are the base determination for the logic of digitality.
Yet retroactively applying the model of digitality to all capitalist forms makes it appear as if digital media were determined to come about from the very beginnings of capitalism. As Deleuze stresses in his essay on the societies of control, the early stages of capitalist development were characterized by the analog machines of industrial production and analog social management: the massing of individuals into larger class groups. Franklin disregards how the “digitization” of bodies into categories such as race and gender at the same time involves the formation of an analog between different particular bodies. With regard to the history of media, it would be inaccurate to show a strict movement from analog to digital. For example, long-distance electronic communication transitioned from telegraph (digital) to telephone (analog), before becoming digital again through digitized telephone and email. As I will show in the next section, analog/digital is an incongruent terminology, especially when applied outside of the sciences.
The Analog vs. Digital Debates
Not only does the pairing of digital and analog function inconsistently in these abstract theories, it also has a contested history within the realm of computer science and mathematics. The Macy Conferences, held between 1946 and 1953 in New York, hosted intellectuals from the “hard” and “soft” sciences to debate the significance of wartime theories of communication and control technologies and how they could be applied to animals—specifically human beings. Initially titled the “Conference on Feedback Mechanisms and Circular Causal Systems in Biology and the Social Sciences,” the name was changed to “Cybernetics” after the 1948 publication of Norbert Wiener’s book of the same name.
Although the Macy Conferences covered a rich variety of topics and created the foundations for the disciplines of cybernetics and information science, my purpose here is not to summarize or critique their history. Instead, I want to focus on the transcripts of one conversation from the Seventh Conference, held in 1950. This discussion includes engineers, psychologists, and anthropologists, and shows how the application of “analog” and “digital” is varying and unstable within their native disciplines of electrical engineering and computer science, and even more so when the terms are removed from these discourses.
The debate began around the role of models in science, specifically around whether to use an analog or digital model to explain the functioning of the brain and nervous system. Many of the members, like mathematician John von Neumann, were excited to apply the digital model of computers to nervous function—highlighting parallels between the all-or-none state of electrochemical pulses across neurons and that of binary digits. However, neurophysiologist Ralph Gerard brought up some concerns. Gerard identified that elements within the nervous system such as “chemical factors (metabolic, hormonal, and related) which influence the function of the brain are analogical, not digital” and furthermore that “digital functioning is not overwhelmingly the more important of the two [analog and digital]” (Gerard, 12). He believed that the group switched from the “as if” idiom to the “is” idiom, that they had taken their scientific models to be literal presentations of reality (11). What followed was a lengthy debate about whether the brain was a primarily analog or digital organ, how to define analog and digital, and how to apply scientific models in general.
Several of the conference members had their careers at stake in the discussion: Warren McCulloch and Walter Pitts had gained their reputations for their model of the brain as a computer made up of digital neurons, while von Neumann and Julian Bigelow applied this model to their design for the electronic computer at Princeton (Kline, 47). However, von Neumann made an important concession when he admitted that “an electrical computing machine is based on an electric current, which is an analogical concept… these ‘discrete actions’ are in reality simulated on the background of continuous processes” (Gerard, 20). The significance of this statement is that, from a scientific perspective, discrete values are only ever representations, models that exist on top of an underlying continuous substrate. Von Neumann later concludes more explicitly that “in almost all parts of physics the underlying reality is analogical… The digital procedure is just a human artifact for the sake of description” (27, my emphasis). Models only work insofar as they accurately represent a system—but as models they are never presentations of reality.
The discussion gets even more complex as anthropologist Gregory Bateson questions the parallels between analog and continuous on the one hand, and digital and discrete on the other: “It seems to me that the analogical model might be continuous or discontinuous in its function” (27). To this, von Neumann admits that the use of “‘analogical’ and ‘digital’ in science is not completely uniform.” Engineer J. C. R. Licklider agrees: “I can conceive of digital system [sic] which is the digital process and the analogue of another digital process, and therefore really analogical” (32). So digital systems can be both digital in themselves and analogical as analogs to other digital systems, because they present the nature of the other digital system directly—that is, continuously. Just as von Neumann stated that the analog/digital pair was only a model, Pitts then identifies that “continuous” and “discrete” are also both only models for representing reality. He explains that if you can take a continuous variable in a given system and accurately discretize it into a quantifiable set of values—like how a digital computer treats a voltage from 0 to x volts as a “0”—then that system can be modeled discretely. If not, it can only be modeled continuously. But the important conclusion he makes is that “It does not depend upon whether it is in its own nature continuous or discrete” (33, my emphasis). Systems can be modeled in either way so long as the model accurately represents the way the system functions. Bigelow points out that even so, it is impossible to define something as digital without referencing “a continuous process by which you are defining your digit” (35). So continuous and discrete models are not fundamentally “true” representations of the thing itself, and furthermore any discrete model always has an underlying continuous referent which it discretizes.
Bigelow deals the final blow to the analog and digital pairing by bringing up the contradiction in the digital model with a “forbidden zone.” This zone is a continuous range between discrete values that is “forbidden” since it is undeterminable as one or the other. For example, the CMOS logic gate propagates a range of 0 to 1.5 volts as “0” and 3.5 to 5 volts as “1,” with the range of 1.5 to 3.5 volts being the “forbidden zone” that is neither 1 nor 0. It emits what is termed a floating value, sometimes called “Z.” Bigelow demonstrates that this forbidden zone is not really forbidden, since it represents a value that is read and responded to by the system, albeit unpredictably. So the forbidden zone should either be treated as an acceptable value, Z, or one must assume that “there are as many values as you please…therefore a continuum of zones, in which case the digital property has really vanished and you are talking about analogical concepts” (47). The fact that most digital systems have this forbidden zone is evidence that the process of discretization can never completely abstract itself from its continuous substrate.
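The forbidden zone is easy to see when the gate's read is written out as a three-valued function. Again, this is only a sketch following the voltage ranges given in the text, not the specification of any actual CMOS device:

```python
def read_cmos(voltage):
    """Read a voltage the way the text describes the CMOS gate:
    0-1.5 V as "0", 3.5-5 V as "1", and the 1.5-3.5 V "forbidden
    zone" as the floating value "Z"."""
    if voltage <= 1.5:
        return "0"
    if voltage >= 3.5:
        return "1"
    return "Z"  # neither digit: the discretization fails here

print([read_cmos(v) for v in [0.7, 1.5, 2.5, 3.5, 4.9]])
# → ['0', '0', 'Z', '1', '1']
```

Note that the function still returns *something* for the forbidden range—exactly Bigelow's point that the zone is read and responded to rather than genuinely excluded, so the binary scheme quietly requires a third value.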
Licklider suggests doing away with the digital and analog pairing, seeing them as inaccurately defined, and to instead only use discrete and continuous. He is quickly shut down by statistician Leonard Savage, who proposes continuing to use the terminology unchanged, only because the group has already been using them for five years. The general consensus that the group arrives at is summarized by Bigelow: the usefulness of models “depends on whether or not by exploring such analogies you can come to any new insight into what goes on…whether these notions contain useful, descriptive properties of what goes on” (47).
Unfortunately, the group grows restless: no consensus is reached on how to model the brain, nor does the group redefine the analog/digital terminology. The repercussions in many fields are enormous, as this terminology began to take on wider use as a direct result of these conferences. Despite much criticism of the cybernetics movement, mainly from those in the social sciences and humanities, cognitive science and neuroscience still frequently apply the computing-brain model. Furthermore, theorists like Galloway and Franklin still base their ideas on the analog/digital model without analyzing this or other historical debates.
Given that the digital does not have a stable definition even within the discourses of computer science and mathematics, is it possible to treat it as Galloway does, to “examine digitality largely uncoupled from any sort of immediate technological referent,” treating it instead “as a strictly theoretical concept” (xix)? The digital is incomplete as a technological concept, and acts as an insufficient theoretical concept as well. We see from its technological definition that the digital is just as “immanent” as the analog—digitality is only a model of an analogicity, discrete variables are always defined against a continuous substrate, and the definition of a body as continuous or discrete depends on how it affects another body—Spinoza will be useful here.
Analog and digital are models of the real, scientific methods of representation that work insofar as they allow us to understand a system better. The danger is in taking these models to be presentations of reality—that is, as an ontology. Critiquing this misapprehension of reality in scientific models is of course part of Galloway and Franklin’s projects. However, by basing their arguments on false assumptions they also play into that misapprehension. And as these theorists show, such scientific models have real effects as the immanent logics of the computational technologies that presently facilitate capitalist domination. As such, it is necessary to investigate the ontology of discrete and continuous. But why conceptualize “digitality” as the process of distinction, when distinction itself is already this process? Why look to the “analog” to understand immanence when immanence is already immanent to itself? In order to map the discrete and the continuous as abstracted from any particular media form, it is better to do away with the terminology of analog and digital, which is internally incoherent and inextricably tied to specific media.
Spinoza: The Discrete and The Continuous
Spinoza does not attempt to redouble the world but rather the opposite: to arrive at “knowledge of the union that the mind has with the whole of Nature” (Treatise on the Intellect, §13). As such, Spinoza’s thought involves a thinking of the continuous and the discrete that results in an actual non-distinction between the two: bodies are discrete or continuous only in their affects. The movement made by Spinoza which Galloway deems the “compromise of immanence” is precisely what allows for a more complicated thinking of this pairing.
The relation between substance and attributes evinces this non-distinction between discrete and continuous. Substance is the base makeup of everything, it is that which “is in itself and is conceived through itself…that whose concept does not require the concept of another thing, from which it must be formed” (Ethics I, D3). Attributes, on the other hand, are what the intellect perceives of a substance. “All the essences, distinct in the attributes, are as one in substance, to which they are related to by the attributes” (Deleuze, 51, my emphasis). Substance in itself is continuous, but insofar as it is made up of an infinity of individual attributes perceived by the intellect, it is expressed discretely: multiple-as-one. In this way, one can think the discrete or continuous without restricting them to a mutually-exclusive relation—as is seen in the thought of Galloway and Franklin above.
Similarly, Spinoza’s thought allows for no good or bad things in themselves. Instead, there is only that which increases one’s power of acting, and that which decreases it. In parallel, the danger that Pitts recognizes in the cybernetics conference is in taking what one models as continuous or discrete—such as the brain and nervous system—to be really continuous or discrete in themselves. What one interprets as being really discrete is only so modally. The conditions for determining whether a system is discrete or continuous have nothing to do with the system itself, but rather the mode that it shares with another—its affect. As such, this distinction between the modal and the real is one of the important powers of reason.
While Laruelle is averse to distinctions, they hold an important place in Spinoza’s thought as a means of reaching divine knowledge; such a distinction is made between the first and second types of knowledge. The first type, imagination, results from the linking of inadequate ideas with passions. For example, when we experience two affects at one time, we often have inadequate ideas and may consider one affect to be the cause of another, when really they are not directly related in a chain of cause and effect. The second type of knowledge comes when we can accurately understand what is common to both us and the body affecting us—the way in which our actions are only the effects of some outside cause. This distinction allows us to seek out affects that increase our power of acting.
The distinction that the mind makes is not between philosophy and non-philosophy but between adequate and inadequate ideas. These types of thinking allow us to determine that reality is continuous: “If we attend to quantity as it is in the imagination, which we do often and more easily, it will be found to be finite, divisible, and composed of parts, but if we attend to it as it is in the intellect, and conceive it insofar as it is a substance… it will be found to be infinite, unique, and indivisible” (Ethics I, 15 Schol.). So while the underlying reality of substance is continuous, the continuous and discrete are not opposed to each other: “matter is everywhere the same… parts are distinguished in it only insofar as we conceive matter to be affected in different ways, so that its parts are distinguished only modally, but not really” (I, 15 Schol.).
But this distinction (between inadequate and adequate ideas) is never a willed distinction: “knowledge as an affirmation of the idea is distinguished… from consciousness as a reduplication of the idea” (Deleuze, 82). Unlike the philosophical decision in which a subject doubles the world by mirroring it with the interpreted world, the knowledge of something in Spinoza is the affirmation by the idea of something inside the mind—the mind is acted upon by the idea. Rather than being a transcendental difference, the distinction between true knowledge and imagination pertains to understanding the immanent relation between causes and effects. Knowledge of this distinction is the power of reason to change passive affects into active ones.
Immanentist (non-)philosophers such as Laruelle and Spinoza find common enemies in the notion of “Transcendentals” such as the general Being expressed by all beings. Laruelle determines that Being is formed out of the philosopher’s redoubling gesture—the “philosophical decision” to interpret the Being expressed in all beings, and to distinguish the two from each other. Alternatively, in Spinoza, Transcendentals are formed from the incapacity of the human mind to make as many distinctions as there are existing particulars that affect it. When the body is affected by something that it perceives as present to it, it imagines that thing—it forms an image of it. But when “the number of images the body is capable of forming distinctly in itself at once is greatly exceeded, they will all be completely confused with one another… the mind also will imagine all the bodies confusedly, without any distinction, and comprehend them as if under one attribute” such as Being or thing (Ethics II, 40 Schol. 1). In this way, Transcendentals are inadequately understood as continuous because the number of particulars exceeds the cognitive ability of the human mind to make distinctions between actually individual things. Transcendentals result from the formation of an identity out of discrete parts, but this is not the grand “analog” unification of separate parts into Laruelle’s One. Rather, it is more like the “digital” gesture that allows different voltages such as 3.6 V and 4.9 V to be read by a digital logic gate as “1.”
A more properly “analog” formation of identity is found in the amalgamation of bodies. With Spinoza, singular bodies can be conceptualized as a collection of other singular bodies that share a common relation of motion and rest. But by sharing this common relation, the singular bodies really become one body. Furthermore, Spinoza’s definition of singularity is not determined from the thing itself, but from its effect: “if a number of individuals so concur in one action that together they are all the cause of one effect, I consider them all, to that extent, as one singular thing” (II, def. 7). So the continuous can be understood as a union of parts that are virtually discrete but, insofar as they share a common relation, actually continuous.
Nevertheless, one’s power of existing lies in preserving oneself as one—a continuous (multiple in) one rather than a discrete multiple. “The object that does not agree with me jeopardizes my cohesion, and tends to divide me into subsets, which, in the extreme case, enter into relations that are incompatible with my constitutive relation (death)” (Deleuze, 21). When one is divided, and one’s parts enter into incompatible relations, one dies. Death is not a biological state in Spinoza, but the permanent discretization of a body: “no reason compels me to maintain that the body does not die unless it is changed into a corpse” (Ethics IV, 39 Schol.). Death is subtractive: one’s body is split into parts. But perseverance as one is additive so long as one enters into relations with bodies that agree with one’s own body in motion and rest (II, L5). In contrast to death, the union of the mind with all of nature constitutes the most divine knowledge: Nature as this union perseveres in existence infinitely and without duration, exceeding the body, which can only exist within duration. Ethics in Spinoza concerns which actions preserve the continuous relation between all things; Spinoza is profoundly against the discrete, but only insofar as he affirms the existence of one.
The study of digitality, both as digital media and as an abstract theoretical concept, has so far disregarded the controversial beginnings of this terminology, even as it was used within its native discourses of engineering and computer science. Given this difficulty, the use of “digital” or “analog” in cultural theory or (non-)philosophy is accurate only insofar as such thought is explicitly focused on, or related to, the material media forms from which these concepts are borrowed. The cybernetics movement (many of whose participants originated and popularized the analog/digital terminology) had its own disagreements over these terms, demonstrating that these models fail to capture the fundamentals of any system.
However, a simple change in terminology is insufficient to solve this problem, as even the opposition between continuous and discrete is non-definitive. “Analog” and “digital” as models evidently involve a mutually exclusive universalization of behavior. They do not account for the reality that discrete parts always require a continuous referent or substrate, and that the discrete and the continuous can cooperate within the same system. As opposed to the more distinct relation that the analog and digital share in Franklin’s study of capital or Galloway’s work on Laruelle, Spinoza’s thought conceptualizes the non-distinction between discrete and continuous.
Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (1992): 3-7.
—. Spinoza: Practical Philosophy. San Francisco: City Lights, 1988.
Gerard, Ralph. “Some of the Problems Concerning Digital Notions in the Central Nervous System.” Cybernetics: Circular Causal and Feedback Mechanisms in Biological and Social Systems. Ed. Heinz von Foerster. New York: Josiah Macy, Jr. Foundation, 1950. 11-57.
Kline, Ronald R. The Cybernetics Moment: Or Why We Call Our Age the Information Age. Baltimore: Johns Hopkins University Press, 2015.
Spinoza, Benedictus de, and E. M. Curley. A Spinoza Reader: The Ethics and Other Works. Princeton, N.J.: Princeton University Press, 1994.