CONFERENCE CONTRIBUTION:

American Society for Cybernetics
ASC 2002 Conference
June 13-16, Santa Cruz

  The Deep Structure of Communication and Control

Ern Reynolds

Cybernetician Stafford Beer penetrates and explains the deep structure of communication and control. That deep structure underlies statecraft, warfare, religion, and commerce. It is explained below in 23 principles, labeled A through W. Their sequence is not significant. This wording is adapted from Allenna D. Leonard's technical notes on cybernetics, found on pages 199-224 of Barry Clemson, Cybernetics: A New Management Tool (Abacus / Gordon & Breach, New York, 1984).

 

A. System Holism Principle: A system has holistic properties possessed by none of its parts. Each of the system parts has properties not possessed by the system as a whole. (Reminiscent of concentration of forces. An observer who gives an "unwelcome alarm" cannot escape by commenting on only a fragment of the sick system, because that fragment is connected to the whole.)

B. Darkness Principle: No system can be known completely. (Reminiscent of a chance or gamble. This is why an observer who gives an "unwelcome alarm" stirs up much more grief than anticipated.)

C. Eighty-Twenty Principle: In any large, complex system, eighty percent of the output will be produced by only twenty percent of the system. (i.e., the operations; consider concentration of force. An observer tends to inadvertently strike the tenderest of spots.)
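
As a rough sketch of how such skew arises, one can draw each part's output from a heavy-tailed distribution; the model below is assumed for illustration, not taken from the principle itself, and the exact split depends on the tail shape chosen.

    import random

    # Assumed model (illustration only): output per part drawn from a
    # heavy-tailed Pareto distribution. With tail shape near 1.16, the
    # split between the top 20% of parts and the rest lands near 80/20.
    random.seed(1)
    outputs = sorted((random.paretovariate(1.16) for _ in range(10_000)), reverse=True)

    top_fifth = outputs[: len(outputs) // 5]      # the busiest 20% of parts
    share = sum(top_fifth) / sum(outputs)         # their share of total output
    print(f"Top 20% of parts produce {share:.0%} of total output")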

D. Complementarity Law: Any two different perspectives (or models) about a system will reveal truths about that system that are neither entirely independent nor entirely compatible. (Reminiscent of the environment's properties and paradoxes, as well as Heisenberg's Uncertainty Principle. The observer and his nominal superiors each have distorted incomplete views of the system, neither of which is totally consistent with the other.)

E. Hierarchy Principle: Complex natural phenomena are organized in hierarchies with each level made up of several integral systems. (Reminiscent of overlaps. An observer tends to puncture multiple levels of recursively nested systems without consciously setting out to do so.)

F. Gödel's Incompleteness Theorem: All consistent axiomatic foundations of number theory include undecidable propositions. (Nobody can measure why retributions for an "unwelcome alarm" are so out of proportion to what the observer revealed.)

G. Entropy: The Second Law of Thermodynamics: In any closed system the differences in energy can only stay the same or decrease over time; or, in any closed-to-information system, the amount of order (or organization) can never increase and must eventually decrease. This natural law is not absolutely true but is merely statistically reliable. (An observer is a breath of negative entropy to a dying system, i.e., s/he imparts critical information that if heeded would preserve and heal.)
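
The law's statistical character can be seen in a minimal sketch, assuming the classic Ehrenfest two-chamber model (a choice made here for illustration): particles that begin fully ordered on one side drift toward the disordered 50/50 state, yet small fluctuations back toward order keep occurring.

    import math
    import random

    # Assumed model (illustration): the Ehrenfest urn. N particles sit in
    # a closed two-chamber box; each step one particle, chosen at random,
    # jumps to the other chamber. Order decays toward the 50/50
    # maximum-entropy state, but only statistically, exactly as the law's
    # hedge requires.
    random.seed(0)
    N = 100
    left = N                      # start fully ordered: everything on the left

    for step in range(1, 1001):
        if random.randrange(N) < left:
            left -= 1             # a left-chamber particle jumps right
        else:
            left += 1             # a right-chamber particle jumps left
        if step % 200 == 0:
            p = left / N          # fraction of particles in the left chamber
            h = 0.0 if p in (0.0, 1.0) else -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
            print(f"step {step:4d}: {left:3d} left, entropy {h:.3f} bits/particle")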

The disintegrative force of entropy is exactly half as strong as the one forcing coherence, which R. Buckminster Fuller labels "syntropy" (Cosmography, Macmillan, New York, 1992, pp. 56-57).

(This measurement places an observer's truth at double the strength of a reprisor's falsity.)

H. Redundancy of Information Theorem: Errors in information transmission can be protected against (to any level of confidence required) by increasing the redundancy of the messages. (Lines of communication bring reinforcement and resupply. Retaliation works against an observer so long as the intimidation can keep the "unwelcome alarm" from being repeated.)
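
A minimal sketch of the theorem, assuming the simplest redundancy scheme, a repetition code over a noisy binary channel (real codes are far more efficient, but the direction is the same): each added copy of the message lowers the residual error rate.

    import random

    # Assumed scheme (illustration): a repetition code over a binary
    # symmetric channel that flips each transmitted copy with probability
    # p. The receiver majority-votes the n copies of every message bit;
    # raising the redundancy n drives the residual error rate as low as
    # the required level of confidence demands.
    random.seed(42)
    p = 0.2                       # per-copy flip probability of the channel
    trials = 20_000               # message bits simulated per setting

    for n in (1, 3, 5, 9, 15):    # copies per message bit (odd, for a clean vote)
        errors = sum(
            sum(random.random() < p for _ in range(n)) > n // 2
            for _ in range(trials)
        )
        print(f"n = {n:2d} copies: residual error rate {errors / trials:.4f}")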

I. Redundancy of Resources Principle: Maintenance of stability under conditions of disturbance requires redundancy of resources. (Think once again of concentration of forces. The only person comfortably situated to be an observer needs to be independently wealthy. This applies emotionally as well as financially.)

J. Redundancy of Potential Command Principle: In any complex decision network, the potential to act effectively is conferred by an adequate concatenation of information. (A victory on the main objective settles all minor issues.) (An observer gathers enough information to make the command decisions that ought to be coming from above in the hierarchy but are not. Reminiscent of synergy.)

K. Relaxation Time Principle: System stability is possible only if the system's relaxation time is shorter than the mean time between disturbances. (Retaliation against an "unwelcome alarm" must be repeated to permit the observer no respite; otherwise reprisal is too weak to suppress any recurrence.)
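
A minimal sketch of the principle, assuming a first-order system with relaxation time tau that is kicked by unit disturbances arriving at random with mean spacing T: when tau exceeds T, displacement accumulates faster than it can decay.

    import random

    # Assumed model (illustration): a first-order system relaxing toward
    # equilibrium with time constant tau, struck by unit shocks at
    # Poisson-random times with mean spacing T. Mean displacement settles
    # near tau/T, so stability requires tau well below T.
    def mean_displacement(tau, T, dt=0.01, horizon=2000.0, seed=7):
        rng = random.Random(seed)
        x, t, area = 0.0, 0.0, 0.0
        next_shock = rng.expovariate(1.0 / T)
        for _ in range(int(horizon / dt)):
            t += dt
            x -= (x / tau) * dt          # exponential relaxation toward 0
            if t >= next_shock:          # a disturbance arrives
                x += 1.0
                next_shock += rng.expovariate(1.0 / T)
            area += x * dt
        return area / horizon

    T = 5.0                              # mean time between disturbances
    for tau in (1.0, 5.0, 25.0):
        print(f"tau = {tau:5.1f}, T = {T}: mean displacement {mean_displacement(tau, T):.2f}")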

L. Circular Causality Principle One: Given positive feedback (i.e., a two-part system in which each part amplifies any initial change in the other), radically different end states are possible from the same initial conditions. (Reminiscent of amplifiers in reverberation; equifinality. Repeating the "unwelcome alarm" unbalances an already teetering system.)

M. Circular Causality Principle Two: Given negative feedback (i.e., a two-part system in which each part tends to offset any change in the other), the equilibrial state is invariant over a wide range of initial conditions. (Reminiscent of attenuators. In "unwelcome alarm" situations this represents the peace-at-any-price purpose of retribution.)
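
Principles L and M can be contrasted in a minimal sketch, assuming a linear two-part system with coefficients invented for illustration: under positive coupling, starts differing by a hair diverge to opposite extremes; under negative coupling, widely separated starts converge on the same equilibrium.

    # Assumed model (illustration): a linear two-part system. Under
    # positive coupling each part reinforces change in the other
    # (principle L); under negative coupling each part offsets the other
    # (principle M).
    def positive(x, y, steps=100, c=0.6, d=0.5):
        for _ in range(steps):
            x, y = (1 - d) * x + c * y, (1 - d) * y + c * x
        return round(x, 2), round(y, 2)

    def negative(x, y, steps=100, c=0.6, d=0.5):
        for _ in range(steps):
            x, y = (1 - d) * x - c * y, (1 - d) * y + c * x
        return round(x, 2), round(y, 2)

    # L: starts differing by 0.002 end at opposite extremes.
    print("positive:", positive(0.001, 0.0), positive(-0.001, 0.0))
    # M: widely different starts converge on the same equilibrium.
    print("negative:", negative(10.0, -7.0), negative(-3.0, 5.0))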

N. Feedback Dominance Theorem: For high-gain amplifiers, the feedback dominates the output over wide variation in input. (This explains why retaliation against an observer has such bite.)
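
A minimal numeric sketch, assuming the standard closed-loop gain relation G = A / (1 + A·beta) for a negative-feedback amplifier (a formula supplied here, not stated in the source): once the open-loop gain A is high, G is pinned near 1/beta no matter how widely A varies.

    # Assumed formula (standard negative-feedback amplifier relation):
    # closed-loop gain G = A / (1 + A*beta). For high open-loop gain A,
    # G collapses to roughly 1/beta: the feedback network, not the
    # amplifier, dominates the output.
    beta = 0.1                            # feedback fraction (illustrative)
    for A in (1e2, 1e3, 1e5, 1e7):        # open-loop gain across five decades
        G = A / (1 + A * beta)
        print(f"A = {A:8.0e} -> closed-loop gain G = {G:.4f} (1/beta = {1 / beta:.1f})")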

O. Homeostasis Principle: A system survives only so long as all essential variables are maintained within their physiological limits. (Reminiscent of balance, coordination, integration, and boundaries. Quite simply, an observer inadvertently disrupts a precarious, trembly homeostasis.)

P. Steady State Principle: If a system is in a state of equilibrium (a steady state), then all sub-systems must be in equilibrium. If all sub-systems are in a state of equilibrium, then the system must be in equilibrium. (The observer does not disrupt an ongoing steady state, but merely reports that such a condition has already vanished.)

Q. Ashby's Law of Requisite Variety: The control achievable by a given regulatory sub-system over a given system is limited by (1) the variety of the regulator, and (2) by the channel capacity between the regulator and the system. Only variety in the regulator can destroy variety coming out of the environment as a disturbance. The upper limit on the amount of regulation achievable is given by the variety of the regulatory system divided by the variety of the regulated system. (Observer disruption only occurs where coping ability, or requisite variety, has already been sacrificed.)
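
The law's arithmetic can be sketched directly, taking variety as a count of distinguishable states (see the Definitions below) and inventing the counts for illustration: the best achievable outcome variety is the disturbance variety divided by the regulator variety.

    from math import ceil

    # Variety counted as numbers of distinguishable states; the
    # disturbance count is invented for illustration. Ashby's bound:
    # achievable outcome variety is at best V_disturbance / V_regulator.
    # Only variety can absorb variety.
    V_disturbance = 1000                  # distinct disturbances from the environment
    for V_regulator in (1, 10, 100, 1000):
        residual = ceil(V_disturbance / V_regulator)
        print(f"regulator variety {V_regulator:4d} -> at least {residual:4d} distinct outcomes")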

R. Conant-Ashby Theorem: Every good regulator of a system must contain a model of that system. (Reminiscent of concentration of forces upon essential variables. An observer usually reveals that the model inside is partial or fragmentary, if not missing entirely. To the extent that the master model omits something essential, the regulator will be expensive, overlarge, ineffective, or all three.)
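
A minimal sketch of the theorem's force, assuming a toy thermostat whose regulator acts from an internal model of the plant's heat-loss rate (all coefficients invented for illustration): as the model degrades, regulation fails even though the regulator keeps acting.

    # Assumed plant (coefficients invented): a room losing heat at true
    # rate k_true toward an outside temperature of 0. The regulator
    # computes its heating effort from an internal model k_model of that
    # rate. A faithful model holds the essential variable on target; a
    # degraded model leaves a persistent error.
    def regulate(k_model, k_true=0.3, target=20.0, steps=200):
        T = 0.0                                    # start at outside temperature
        for _ in range(steps):
            u = k_model * target                   # heating the model says is needed
            T += u - k_true * T                    # plant: heating minus true loss
        return T

    for k_model in (0.30, 0.15, 0.05):
        print(f"model k = {k_model:.2f}: settles at {regulate(k_model):5.1f} (target 20.0)")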

S. Self-Organizing Systems Principle: Complex systems organize themselves; the characteristic structural and behavioral patterns in a complex system are primarily a result of the interactions among the system parts. (An observer emerges spontaneously and inadvertently because a system in peril seems to cry out for rescue. "Unwelcome alarms" can only be prevented by having good, effective systems, not bad ones.)

T. Basins of Stability Principle: Complex systems have basins of stability separated by thresholds of instability. A system "parked" on a ridge will roll downhill. (Reminiscent of the properties of the environment in which the system-in-focus is embedded. It also suggests why crossing system boundaries is always potentially destabilizing. "Unwelcome alarms" are ineffective short term and medium term because even if the warning gets repeated multiple times, a rotten system will seldom crack all at once. That only happens with rotten fruit or rotten eggs.)
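
A minimal sketch, assuming the textbook double-well system dx/dt = x - x^3 (chosen here for illustration): two basins of stability at x = -1 and x = +1 are separated by a threshold of instability at x = 0, the "ridge".

    # Assumed landscape (a textbook choice): the double-well system
    # dx/dt = x - x**3, with stable basins at -1 and +1 separated by an
    # unstable threshold ("ridge") at 0.
    def settle(x, dt=0.01, steps=5000):
        for _ in range(steps):
            x += (x - x**3) * dt                   # roll downhill
        return x

    for x0 in (-2.0, -0.01, 0.0, 0.01, 2.0):
        print(f"start {x0:+.2f} -> settles at {settle(x0):+.3f}")
    # Exactly 0.00 balances on the ridge; the least nudge picks a basin.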

U. Viability Principle: Viability is a function of the balance maintained along two dimensions: (1) autonomy of sub-systems versus integration of the system as a whole, and (2) adaptation versus stability. ("Unwelcome alarms" only arise where the central regulatory model is wrongly overwhelming local autonomy, where stability is being overrewarded at the expense of healthy adaptation.)

V. Recursive System Theorem: If a viable system contains a viable system, then the organizational structure must be recursive, or, in a recursive organization structure, any viable system contains, and is contained in, a viable system. (Reminiscent of overlaps. An observer is surprised by the retribution heaped upon him/her because s/he has inadvertently needled a blister down below that is a tipoff symptom for a much larger canker above.)

W. Predictable Outcomes Principle: Every system is perfectly designed to get the results that are achieved. The purpose of a system is whatever it does, produces, or yields. (There are no real surprises about what makes "unwelcome alarms" necessary. Lack of a reliable constraint to protect an essential variable is always noticeable and determinative of "unintended consequences" in advance.)


Definitions

  • System -- any set of variables selected by an observer.

  • Variety -- all the possible states of the system.
ABOUT THIS ARCHIVED CONTRIBUTION:

This HTML transcription was generated from an electronic manuscript and/or whatever other record materials were available. The manuscript has been transcribed "as is" - i.e., with no modifications beyond those minor ones required for basic Web viewing (e.g., tagging special characters, converting graphics).

HTML transcription: Randy Whitaker, October 2002