Analog and Digital

January 2000, cybcom listserve

Ambiguities and paradoxes are revealed in this conversation about the origins as well as the historical and current distinctions between analog and digital.

Willard Uncapher

January 24, 2000

So as to shed light on a little debate, could anyone locate the origins of the distinction between analog (analogue) and digital, such as would actually be proposed using those terms? Yes, by the late 1940s, we can read about this distinction all over the place, and apply it retroactively to, for example, Charles Babbage's designs, or Pascal's. And the terms digital and analog each have their own long histories, as a cursory glance at the Oxford English Dictionary would reveal. However, where is the analog/digital distinction clearly and finally announced and argued in terms of that contrast, and thus as one potentially between continuous (analog) and discrete (digital), with all the philosophical and organizational distinctions that could follow (e.g. Bateson, Watzlawick, Wilden)?

It is clear that von Neumann, Eckert, and Mauchly were using the contrast in the early 40s (von Neumann particularly so). Many of the concepts would be inherent in existing work (e.g. neurology/math, W. Pitts). The OED lists the term 'digital' in connection with an electrical patent in 1938. Vannevar Bush's work on the 'differential analyzer' speaks of an 'analogy machine' or analog computer as early as 1930 (Journal of the Franklin Institute). Thematically, questions of control and communication can be seen in the conceptual approaches to the design of machines and to the evolution of species/essence during the 19th century, and the evolution of society during the 18th century (e.g. political economy). But who finally states and clarifies the contrast?

My own thinking is to relate the terms to someone like Bush, and the profound extension of the underlying conceptual contrast to von Neumann's meta-logical, hierarchical, and post-Gödelian analysis of communication and control in the late 1930s. Von Neumann's talent was often to take what others had merely begun and re-conceptualize it.

Origins of the analog/digital contrast, anyone?

Gary Boyd

January 25, 2000

At the engineering level the analog/digital distinction is very simple and straightforward: any continuous signal can be sampled at intervals, and each sample can be categorized in terms of two or more amplitude categories. The presence or absence of a sample in each category can be signaled by binary digits (0-1, or hexadecimal or decimal digits, though these are usually converted to binary code too). That is analog-to-digital conversion (A to D). In turn, any digital signal can be smoothly averaged over time (say by storing it in a capacitor, etc.) and thereby converted back to an analog signal. That is D-to-A conversion.
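
A minimal sketch of that round trip, assuming a simple sine-wave source and uniform quantization (the function names, the 16 amplitude categories, and the 100 samples per second are illustrative choices, not any particular converter's specification):

    import math

    def analog_source(t):
        # Stand-in for a continuous signal: a 5 Hz sine wave.
        return math.sin(2 * math.pi * 5 * t)

    def a_to_d(signal, duration, sample_rate, levels=16):
        # Sample at fixed intervals and sort each sample into one of
        # `levels` amplitude categories: analog-to-digital conversion.
        samples = []
        for i in range(int(duration * sample_rate)):
            x = signal(i / sample_rate)            # value at one instant
            q = round((x + 1) / 2 * (levels - 1))  # map [-1, 1] onto 0..levels-1
            samples.append(q)                      # stored as integers, i.e. digits
        return samples

    def d_to_a(samples, levels=16):
        # Map the stored digits back to amplitudes; a physical D-to-A stage
        # would then smooth the steps (e.g. a capacitor as a low-pass filter).
        return [2 * q / (levels - 1) - 1 for q in samples]

    digital = a_to_d(analog_source, duration=1.0, sample_rate=100)
    reconstructed = d_to_a(digital)

The reconstruction differs from the source only by the quantization steps and by whatever the sampling rate failed to capture.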

It is easier to do logic with bits, and to code digital signals to correct for noise, than to do the same with analog signals; hence the advantages of digital computers.

Beyond that, people have gotten carried away with poetical, metaphorical, and mystical notions. The simple fact is that any analogue signal can be represented digitally and vice versa, so there is nothing fundamental at stake.

 

Matthew Heaney

January 26, 2000

The values "0" and "1" are the same for hex, dec, and binary:

0 (base 16) = 0 (base 10) = 0 (base 2)

1 (base 16) = 1 (base 10) = 1 (base 2)

You have to sample the analog signal at (or above) the Nyquist rate in order to reproduce the original analog signal from the digital samples. The Nyquist rate is twice the highest frequency present in the analog wave.

This is the frequency demanded by theory. In practice, engineers usually sample at a slightly higher frequency, to build in a little fudge factor. (Just as they do when they design the maximum weight that can be carried in an elevator, say, or the maximum safe speed of a road).

Humans can hear (analog) frequencies up to about 20,000 Hz. The Nyquist theorem explains why CDs are sampled at 44,100 Hz, a bit more than the 40,000 Hz the theorem demands.
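
A quick sketch of why the factor of two matters, under idealized assumptions (pure tones, ideal sampling; the numbers are chosen only for illustration): a tone sampled below its Nyquist rate produces exactly the same samples as a lower-frequency 'alias', so the original can no longer be recovered from the digital record.

    import math

    def sample(freq_hz, rate_hz, n=8):
        # n samples of a unit-amplitude cosine at freq_hz, taken rate_hz times per second.
        return [round(math.cos(2 * math.pi * freq_hz * i / rate_hz), 6)
                for i in range(n)]

    rate = 30_000  # below the 40,000 Hz Nyquist rate for a 20,000 Hz tone

    print(sample(20_000, rate))  # the 20 kHz tone...
    print(sample(10_000, rate))  # ...yields exactly the same samples as a 10 kHz tone
    # Sampled at 44,100 Hz (the CD rate) the two lists would no longer coincide.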

klaus krippendorff

january 26, 2000

it is true that von neumann took ideas of others rather freely, e.g. from shannon the statement that "everything that is described in propositional form is also computable", and, after visiting mauchly and eckert's eniac, the proposal for digital computers.

i think the distinction between analog and digital arose with the emergence of the digital computer that worked with on/off switches in contrast to the then known simulators which worked with fluids and mechanical motions. the eniac, for example, computed complex numerical equations that actually represented continua (ballistic trajectories) which were heretofore simulated on a substitute medium. i guess this is why these simulators became computers with the attribute analogue whereas the turing machine and calculating machine remained machines.

 

Steve R. Swan

January 26, 2000

In summary, the application of the term "digital" to any process, practice or procedure that employs a computer or is computer aided is a contextual "problem" today. I liken it to the inability of some to make a distinction between Y2K (a technology term), the actual year 2000 and the ending of a named millennium.

Say an organization relies on a specific decision making model for operations. The model has been proven over and over as valid and reliable. Now the organization has the opportunity to use computers to transmit knowledge (information) that is used in the process and to report the resulting products (for example, a decision matrix). While using the electronic manipulation and transmitting methods, the organization refers to the activity as digital decision making. The tool has become the focus.

With all the above, what are the definitions of "digital" and "analogue?"

 

Willard Uncapher

January 26, 2000

Thank you for the comments.

I might point out that von Neumann, as the well-known mathematician and meta-mathematician, was at the dinner table with Gödel the night before Gödel delivered his proof in 1931. It is said that von Neumann immediately accepted Gödel's results and hailed Gödel as 'the greatest logician since Aristotle.' Von Neumann had been interested in axiomatizing mathematics *and its statements*, and had been contributing to that line of thought that included Hilbert and Russell. Now Gödel's proof of course depends not simply on a symbolic logic, but also on numbering statements.

As a von Neumann biographer says, "Computers needed the last great axiomatizer. In Johnny they got it. He even borrowed with glee from "the greatest philosopher since Aristotle." There is a parallel between Gödel’s encoding of logical statements as numbers and Johnny’s encoding of computer instructions as numbers. This continuing glee explains why it is also untrue that Johnny felt any depression at some of his stumbles in the 1920s. He felt exactly the opposite." (Macrae 1999:126)

Gödel's proofs develop from encoding activities and statements at a lower level, and representing and working with them at a purportedly higher level. The logical problems Gödel uncovers might be thought to relate to sampling (a tack I take in my own work), and I might point out that questions of encoding, modeling (sampling), error correction, and dynamics/hydraulics are a strong component of von Neumann's work during the 1930s. The notion of the 'digital' as a higher-order sampling process is very connected to this exploration of 'levels.' While the question of the digital in terms of a first-order sampling is still current, von Neumann's work would appear to have conceptualized this sampling process much more broadly even in the late 1930s.

Let me explain. The question of [digital] 'sampling' might seem ideally rather straightforward, a mere question of deciding the sampling rates, perfecting the best sampling process, calculating the tradeoffs between code storage size and fidelity to the (uncondensable) complexity of the original, etc. However, this digital translation turns out to be logically more complicated than a quasi-approximate, integral calculus might reveal, particularly when there are multiple sampling levels (e.g. hierarchies)! What is the ideal number of levels in a sampling hierarchy, eh? I point out that von Neumann in the 1930s was working with the mathematics of hydraulics, hierarchies/dimensions, complexity, and geometries with continuously varying dimensions (cf. von Neumann algebras), including dimensions with the value of pi, or any other number. The problem is then not simply one of sampling rate, code storage, etc., but also of the 'distance' between sampling levels or dimensions, and the interaction between levels. And von Neumann immediately related these problems to solutions of complexity, something that would later be taken up by Herbert Simon in the late 50s (von Neumann having died by then). Far from essentializing the analog and digital as 'independent' of one another, a key theme in my own work, and one integral to that of von Neumann, is the alternation of analog and digital. Anyone who has read Johnny von Neumann's notes for the Silliman Lectures (1958), or who has even a rudimentary knowledge of the nervous system, will understand the importance of considering the interdependence of analog and digital.

As for the notion that von Neumann somehow 'borrowed' the notion of 'digital computers' "after visiting mauchly and eckert's eniac" [at UPenn], my understanding is that this pedigree is much more complicated. Mauchly himself would later lose a lawsuit over this question when it became clear that Mauchly had borrowed key design ideas for computers from Iowa State University researcher John V. Atanasoff. As the level-headed electrical engineering historian Irwin Lebow states,

"He [Atanasoff] and Mauchly had discussed the subject often. They had met in 1940, and a few months later, Atanasoff had invited Mauchly to visit him at Iowa State to see his machine [the first electronic digital computer] and freely discussed his ideas at those meetings. Atanasoff also visited the Moore School in 1945 where he saw the almost completed ENIAC. Apparently he did not realize the extent to which the ENIAC built on his work because the two machines looked so different. For whatever reason, Eckert and Mauchly never acknowledged any indebtedness of their design to Atanasoff. Chances are that had Sperry not been so greedy [about trying to corner the market using Eckert/Mauchly ENIAC patents]- it was trying to collect large royalties even before the patent was awarded, the suit would never have arisen and Atanasoff's role in the history of the computer would have remained obscure." (Lebow 1995:149)

The 'von Neumann architecture' of dynamically storing the program at a higher level than the activity of the program can clearly be related to von Neumann's work going back to the early 1930s. And as for the contention that von Neumann was somehow limiting himself to a single 'central processing unit' (CPU), von Neumann appears to have long been interested in parallel processing, particularly in the nervous system. He felt that 'at this time' (the 1940s) it was not feasible. However, he did see the question of parallel processing, but logically broke it down into discrete processors. Not only does the analog re-enter in the actual physical limitations of digital processing, but the array of, and communication between, digital processes begins to resemble analog sharing. In this way we might find a path from von Neumann's game theory to Stuart Kauffman's boolean networks, although von Neumann deals in more dimensions than Kauffman and most network theorists.

I would agree that the analog/digital contrast appears in the context of computer and electrical design. I was just wondering if anyone had additional insight into who first states the contrast in those terms, and into the history of the early philosophical elaboration of that contrast. I'm sure we all have our own take on how this distinction might fit into the broader logical problems of the 19th and 20th centuries.

 

klaus krippendorff

january 26, 2000

just a brief comment on shannon and von neumann. shannon's master's thesis at mit developed the correspondence of propositional logic and switching algebra. this thesis is celebrated as one of mit's greatest. it showed that anything statable in propositional logic is in principle computable. this was later adopted and is now associated with von neumann.

i have no doubt that von neumann's familiarity with goedel helped him to conceptualize putting the computer program into the very space it needs for computation. but this was not the issue, was it?

as far as eniac is concerned, atanasoff never produced a workable computing machine. i know of the lawsuit in which the eckert/mauchly patents were challenged, but there were other interests involved, for example the military, which didn't want these patents to come through, and there was also a judge who could not see the newness behind digital computers, and particularly eckert's engineering innovations that made the eniac a reality, but not atanasoff's somewhat vague ideas.

 

Willard Uncapher

January 27, 2000

Indeed, as a graduate student during the 1930s, Shannon worked with Vannevar Bush at MIT on the Differential Analyzer and proposed a Boolean logical calculus of open/closed, 0/1, to simplify the working of that great electro-mechanical computer. After he got his degrees, he would develop these insights further with Weaver to propose a mathematics of information, again using binaries. However, I would like to distinguish between binary and digital. There is no reason that the 'exit values' of a heuristic or equation need to be 1s or 0s. As we all know, the 1s/0s are chosen for reasons such as logical simplicity, the state of the computing machines at that time, and so on. Since numbers in any base can be represented in base 2 (0s and 1s), why not exploit the fortuitous convergence of on/off, open/closed, 1/0?
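
A small sketch of the point that digital need not mean binary (the function and the choice of 2000 as the example value are mine, purely for illustration): the same quantity can be rendered as digits in any base, and a digital code could just as well have ten or sixteen 'exit values' as two.

    def to_base(n, base):
        # Render a non-negative integer as its list of digits in the given base.
        digits = []
        while True:
            n, d = divmod(n, base)
            digits.append(d)
            if n == 0:
                return digits[::-1]

    value = 2000
    print(to_base(value, 2))   # [1, 1, 1, 1, 1, 0, 1, 0, 0, 0, 0]
    print(to_base(value, 10))  # [2, 0, 0, 0]
    print(to_base(value, 16))  # [7, 13, 0]
    # All three are equally 'digital' codes for the same value; base 2 is simply
    # the convenient match for on/off, open/closed hardware.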

The digital result, however, can be more multi-valued. One might think of a heuristic whose activity is to take the 'spoken' word and then encode it into 'written' language. Once we more or less encode spoken sounds into written language, then we can manipulate the code independently of the original, and hopefully retranslate the code back into the original source, translate the digital back into the analog. As we all know, despite the assertion that writing is 'generally' phonetic, in fact there have been considerable changes relating coded language to spoken language, such as the great vowel shift from Middle English to Modern English.

Those of you who have taken Sanskrit (I did before my brief time at Annenberg-UPenn) will remember the complicated Sandhi rules, which prioritize spoken language: these are rules to change the spelling of words to reflect how they are ideally spoken (so you can't simply look a textual word up in the dictionary). So what are the logical and conceptual problems of having additional translations like this at work? The Sandhi rules work as a sort of post-Ptolemaic system of epicycles designed to keep the code (the description system) in line with the source. In the case of Sanskrit, there were a number of theological reasons to foreground the voice as spoken (vac), whereas a number of later traditions (Hebrew, Christian, Islamic) foregrounded the coded, written version as 'divine' and having priority (hence the interesting hermeneutics they develop instead).

These are the sorts of questions associated with analog/digital, with questions of sampling heuristics and coding/decoding processes. What are the logical problems? What happens if there is an error? These turn out to be questions of some depth which we can develop from the work of Gödel and von Neumann. And I might point out that von Neumann early on noted the relationship between complexity and coding, an insight that would be posthumously taken up by Herbert Simon, who is less interested in systems and cybernetics and more in the logic of multi-level coding. I suggest in my work that we need to reunite these fields or directions, particularly if we are to take the 'complexity sciences' and the subset of autopoietic analysis in a new direction. Questions of the decomposability of systems and the thresholds of complexity (cf. von Neumann, Simon 1962, Gottinger 1983, etc.) are strangely undertheorized in, for example, the works of Maturana/Varela or of those loosely associated with the 'Santa Fe School' of complexity.

I have long suggested that we need to return to the underlying logic, and to bring the various schools of complexity back together. I certainly wouldn't be the first to question the issues of levels and hierarchies in the work of, say, Stuart Kauffman (cf. Depew & Weber, MIT, 1997). If we take the digital to represent the discrete space between levels, as I propose, as opposed to the active (analog) 'connection' between elements, then the opening of levels using von Neumann's concept of the digital allows a much more robust view of complexity. As Gottinger says of von Neumann, "It is one of the great contributions of J.V. Neumann to have proved the (not very intuitively plausible) fact that a reliable, i.e. predictable system can be built from unreliable parts. 'Unreliability' is not to be understood in an engineering sense of 'non-functioning': rather it is meant in the logical sense of 'non-predictable.' " (Gottinger, Riedel, 1983:9).
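
To give the flavor of the Gottinger/von Neumann point with a back-of-the-envelope sketch (this is the textbook majority-vote illustration, not von Neumann's actual multiplexing construction, and the 10% error rate is an arbitrary example): if each component errs independently with probability below one half, replicating it and taking a majority vote makes the aggregate as predictable as we please.

    from math import comb

    def majority_error(p, n):
        # Probability that a majority of n independent components is wrong,
        # when each is wrong with probability p (n assumed odd).
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range(n // 2 + 1, n + 1))

    p = 0.1  # each 'unreliable' part errs 10% of the time
    for n in (1, 3, 9, 27):
        print(n, round(majority_error(p, n), 8))
    # The system's error probability falls from 0.1 toward 0 as redundancy grows.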

Klaus Krippendorff (see above) commented that von Neumann's familiarity with Gödel helped him to conceptualize putting the computer program into the very space it needs for computation. As I have said, there is more at stake here than computing machines. After all, the field of cybernetics has sought to explore more general theories and concepts of communication and control.

Regarding the Eckert/Mauchly lawsuit, it takes a judge or jury of some insight to make a good distinction between vague and clear, so the fact that Eckert and Mauchly lost the legal case can only be one factor in a larger historical analysis and judgement. After all, Columbia University's brilliant Edwin Armstrong lost the patent case over the feedback oscillator, a key to his invention of multiplexing (leading to FM radio, stereo radio, and much more), to the capital-rich RCA/De Forest lawyers in the Supreme Court (1934). While he would be bankrupted by the decision, and later commit suicide, most electrical historians believe Cardozo and the Supreme Court erred. But this again leads to the question of the settling of boundaries. That is what the judge is supposed to do: to make determinations of boundaries, and other judges can make different decisions. The Mauchly/Atanasoff argument turns on just how "vague" (KK's term) those ideas were, and I believe that judgement reflects a degree or dimension of intentionality or motivation, or power and politics. This is the problem with any 'great man' theory of history which bestows credit and agency on any one person. How do we make that boundary?

As for the role of Atanasoff, the question of 'design' is important here, as is the question of engineering the result. There was a machine, but it needed more engineering and, from a higher order, funding. When someone or something is excluded, it is fair to ask why. Was the failure to mention Atanasoff's contribution an oversight, or was it because his contribution really wasn't much of a contribution? I have no great interest in deciding the matter one way or another, of setting the boundary here... or there. While you mention the question of patents, it could be that Eckert/Mauchly's interest in patents and acclaim was also a factor. The people I have read from the IEEE are not so dismissive of Atanasoff's contribution, although Lebow concludes, "Nevertheless, however much this bitter dispute may have damaged Mauchly's reputation, the fact remains that his creation, the ENIAC, whatever its indebtedness to Atanasoff, was still a remarkable feat." (Lebow, IEEE, 1995:149). Just a thought.

Still, origins of the analog/digital contrast in those terms?

klaus krippendorff

january 27, 2000

a very brief correction. shannon did not work with weaver on the information theory. shannon had started his work in the 40s when at princeton, spent the war years at bell, and published his information theory in 1948 in two installments of the bell system technical journal, after declassification. it must have caused quite a splash, so much so that the head of the rockefeller foundation, chester barnard, asked warren weaver, who was at the foundation at that time, to review the theory and express it in simpler terms. this was published in scientific american's june 1949 issue. wilbur schramm put shannon's theory and the substance of weaver's interpretation into a book, published in 1949. things moved fast at that time.

binary and digital are not the same, i agree. although shannon had worked on switching algebra before information theory, he used binary numbers only to count alternatives. bits (binary digits) were only convenient units of measurement. the logarithm to the base 2 had no impact on the 21 theorems of information theory he developed, but those theorems presuppose distinctions, alternatives, differences which are absent in the domain of the analogue. incidentally, in his work, shannon also offered a way to apply his theory to analogue signals. it amounted to a digital approximation to continua, with which we are all familiar in digitized images and sound.
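
a quick numerical illustration of the 'units of measurement' point (a generic sketch, not anything taken from shannon's paper; the example distribution is arbitrary): computing the entropy of the same distribution with logarithms to base 2 and to base e changes only the unit (bits vs. nats), by a constant conversion factor, never the substance.

    import math

    def entropy(probs, base):
        # Entropy of a discrete distribution, in the given logarithm base.
        return -sum(p * math.log(p, base) for p in probs if p > 0)

    dist = [0.5, 0.25, 0.125, 0.125]
    h_bits = entropy(dist, 2)        # 1.75 bits
    h_nats = entropy(dist, math.e)   # the same quantity measured in nats
    print(h_bits, h_nats, h_nats / math.log(2))  # the last value equals h_bits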

i think bateson and watzlawick have used the digital/analogue distinction somewhat cavalierly, associating language with digital and behavior and emotions with analogue, as if verbal sound would switch abruptly from one level to another and nothing in-between would matter, as if speakers and listeners had nothing to say about which differences make a difference for them. the examples that bateson gives for where differences matter are mostly non-verbal, on the level of signals (in circuit). our writing system, using freely combinable characters, is digital, but speaking is of doubtful identity in these terms. your interest in translation from digital to analogue, and/or in the reverse, runs into all kinds of problems when one doesn't take into account for whom this is so.

 

Willard Uncapher

January 28, 2000

I agree that Shannon developed his ideas before meeting Weaver, and thank you for the observation. I didn't mean to imply a new take on this epic work, but conflated the dissemination with the production to fill out my picture quickly.

I am still parsing the statement: "i think bateson and watzlawick have used the digital/analogue distinction somewhat cavalierly, associating language with digital and behavior and emotions with analogue" (K. Krippendorff, see above), so I am unsure how to respond. But yes, my alarms go off wherever I come across a reductive dualism.

Dualisms are good for defining boundaries, but where do they come from, and who is in charge of them? The idea of distinguishing 'language' and 'emotion/behavior' calls out for deconstruction. That's one of the things that is so peculiar about analog and digital: while they seem like perfect contrastive terms on paper, as logical, bounded concepts, an investigation of how they work together soon stumbles into serious logical problems. While we might come up with a table of terms like continuous/discrete, there is something seriously amiss when we apply these oppositions.

In my solution, I would invert the problem and suggest that rather than essentializing things like 'emotions' and 'language' we need to consider the paradoxical nature of their opposition as manifested in them. I posit the paradox as fundamental. As I have long pointed out, Derrida's early work of deconstruction, such as is found in his "Genesis and Structure" (1959) and "Edmund Husserl's Origin of Geometry: An Introduction" (1962), explicitly connects Derrida's work with that of Gödel. Just what Derrida may have learned from Gödel is a matter I take up elsewhere, where I note the connections between the paradoxes of Gödel's method, the paradoxes von Neumann deals with in working with analog and digital as complements in complexity, and the problems of categorizing activities in Derrida. I don't know if anyone has ever published anything about this.

But yes, Klaus, there is a serious, even profound problem in trying to set analog and digital up in a normal table of contrasting terms. The problem is not doing it (metaphor/analog versus metonymy/digital, etc.) but figuring out what one has done in making those distinctions. In exploring the way analog and digital can work together, I would think that von Neumann opens a path beyond some of the essentializing dilemmas explored by Derrida, or even a number of classical ones associated with Nagarjuna, Chandrakirti, or the Pyrrhonist skeptics (including Montaigne).