Cybernetics

Cybernetics is defined classically as the study of "control and communication in the animal and the machine" (Wiener 1948). After the decline of classical cybernetics, the field underwent a rebirth as "second-order cybernetics" in the early 1970s. Second-order cybernetics is more closely and more obviously involved with ethics than classical cybernetics (and certainly promotes a radically different worldview), but both have important contributions to make to reflections on science, technology, and ethics. Cyberculture, an increasingly important phenomenon that includes elements as diverse as email and chat rooms, electronic commerce and gaming, virtual reality and digital politics, has its origins not just in computers but also in the lesser-known field of cybernetics (from which it takes its name).


Cybernetics

Cybernetics was originally promoted by the mathematician Norbert Wiener (1894–1964) in his 1948 book of that name (although W. Ross Ashby's 1956 book, An Introduction to Cybernetics, is considered the classic introductory text). The terms of cybernetics (including goals and purposiveness, feedback, and mechanism as metaphor) had been previously used, as was the concept of control as attaining and maintaining desired states, rather than restricting the actions of others—but not as concepts forged into a coherent field. In the development of cybernetics, two groups were particularly important: the informal association of Wiener, Arturo Rosenblueth (1900–1970), and Julian Bigelow (1913–2003) at the Massachusetts Institute of Technology (MIT); and the Josiah Macy Jr. Foundation meetings on "Circular, Causal, and Feedback Mechanisms" (which assumed the supertitle "Cybernetics" after the publication of Wiener's book), which included Warren McCulloch (1898–1969), Walter Pitts (1923–1969), Margaret Mead (1901–1978), Gregory Bateson (1904–1980), Heinz von Foerster (1911–2002), and Wiener and Rosenblueth.

The term cybernetics was derived from the Greek kybernetes, meaning "helmsman," and the field initially examined the behavior of (often complex) systems to develop models for improving system performance. The models were based on a notion of universally applicable mechanism: No essential differentiation was made between animate and inanimate systems. Examination of behaviors meant that systems which seemed impossibly complex or obscure no longer needed to remain so. If cyberneticians could not see what constituted a system, they could treat the system as a black box, which, through careful study of the inputs and consequent outputs, could be notionally "whitened" to the point that a viable mechanism relating input and output could be imagined, even if the actual mechanism remained unknown.
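
The black-box procedure can be made concrete. Below is a minimal sketch in Python (the linear "hidden" box and all numbers are invented for illustration): the observer never opens the box, but fits a hypothesized mechanism to observed input/output pairs until a viable description emerges.

```python
# Minimal sketch of "whitening" a black box: the box is never opened;
# a mechanism is only imagined that reproduces its input/output behavior.
import random

def black_box(x):
    """The hidden system -- unknown, in principle, to the observer."""
    return 3.0 * x + 1.0

# Careful study of inputs and consequent outputs.
pairs = [(x, black_box(x)) for x in (random.uniform(-10, 10) for _ in range(20))]

# Hypothesize a mechanism y = a*x + b and fit it by least squares.
n = len(pairs)
sx = sum(x for x, _ in pairs)
sy = sum(y for _, y in pairs)
sxx = sum(x * x for x, _ in pairs)
sxy = sum(x * y for x, y in pairs)
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

print(f"imagined mechanism: y = {a:.2f}*x + {b:.2f}")
# The description is viable -- it predicts the box's behavior -- yet the
# actual mechanism remains unobserved: the box is only notionally white.
```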

The intention was that systems would become controllable or better able to achieve the aims for which they were intended. The systems that cyberneticians studied were assumed to have observer-defined goals. Potential for error was understood to be omnipresent. To correct an aberration in the behavior of a system, differences between the (hypothesized) goal and behavior were examined, and the system adjusted to compensate for any difference (error). The process of error determination and correction continued until the system began to attain (and continue to attain) its goal.
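
Read as a procedure, this is the classic error-correcting feedback loop. A minimal sketch with invented numbers (a thermostat-like system driven toward a goal state):

```python
# Minimal sketch of cybernetic error correction: compare behavior with the
# goal, and adjust the system to compensate for the difference (error).
goal = 20.0     # the (observer-defined) goal, e.g. temperature in Celsius
state = 5.0     # the system's current behavior
gain = 0.5      # how strongly each correction is applied

for step in range(15):
    error = goal - state      # difference between goal and behavior
    state += gain * error     # adjust the system to compensate
    print(f"step {step:2d}: state = {state:6.3f}, error = {error:+.3f}")
# The error shrinks each cycle: the system attains, then maintains, its goal.
```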

Although the physical systems initially considered by cyberneticians were military and mechanical (starting with antiaircraft guns and developed through W. Grey Walter's electronic "tortoise" and Ashby's "homeostat," as much as through the computer and the robot), the animate quickly grew to be of equal significance. Application to social, anthropological, and psychological issues was pursued by Mead and Bateson (Bateson 1972a), especially in regard to mental health issues—a concern that Bateson shared with Ashby, who was a psychiatrist. Management cybernetics was founded by Stafford Beer (1926–2002) in the 1960s, and Gordon Pask (1928–1996) began cybernetic studies of teaching and learning in the 1950s.

There are many similarities between classical cybernetics; the slightly later mathematical theory of communication, or information theory, of Claude Shannon and Warren Weaver (1949); and general systems theory and its siblings, such as systems science, as developed by Ludwig von Bertalanffy (1950). These similarities make differentiation between the approaches difficult, and which term is used is frequently no more than a matter of personal preference or historical accident. All of these approaches made notable contributions to such scientific and technological understandings and developments as the relationship between wholes and parts, automated control systems, approaches to complexity, developments in computing and communications hardware and software, and homeostasis in biological systems—to list but a few.

Early on, Wiener recognized ethical dangers in the cybernetic approach. The conjunction of animal and machine, even used metaphorically, has ethical implications—especially when the metaphor is predominantly of the animal as machine rather than the machine as animal. Another typical (and well-known) danger is that associated with the power of the machine, as exemplified in Isaac Asimov's "Three Laws of Robotics" from his science-fiction writings, which form a strict precedence hierarchy (a minimal sketch in code follows the list) and read:

  • First Law: A robot may not injure a human being, or, through inaction, allow a human being to come to harm.
  • Second Law: A robot must obey orders given it by human beings, except where such orders would conflict with the First Law.
  • Third Law: A robot must protect its own existence as long as such protection does not conflict with the First or Second Law. (Asimov 1942)
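
Read operationally, a lower law applies only when every higher law is satisfied. A minimal, hypothetical sketch (the predicate names are invented for illustration):

```python
# Hypothetical sketch: the Three Laws read as a strict precedence hierarchy.
# An action is vetoed by the highest-priority law it violates; a lower law
# is consulted only when every higher law is satisfied.

def permitted(action):
    # First Law: an absolute veto, covering both harm and harmful inaction.
    if action["harms_human"] or action["inaction_allows_harm"]:
        return False
    # Second Law: disobeying a human order is vetoed (a conflict with the
    # First Law was already ruled out above).
    if action["disobeys_order"]:
        return False
    # Third Law: needless self-endangerment is vetoed, but only when no
    # higher law demands the risk.
    if action["endangers_self"] and not action["fulfills_order"]:
        return False
    return True

# An ordered, self-endangering act that harms no human is permitted:
print(permitted({"harms_human": False, "inaction_allows_harm": False,
                 "disobeys_order": False, "endangers_self": True,
                 "fulfills_order": True}))  # True
```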

Wiener's Human Use of Human Beings (1950) is his attempt to come to terms with the most important of these dangers. He was not alone in this awareness. These ethical considerations, however, are not peculiar to cybernetics.


Second-Order Cybernetics

The initial promise of cybernetics was more than could be delivered, and the subject fell out of favor. By 1970 its funding base had eroded (with assistance from the Mansfield Amendment, a U.S. law introduced to prevent the military from funding speculative research, or research that might not lead to an immediate military outcome). For some cyberneticians this indicated retrenchment; for others, reconsideration leading to a new beginning: second-order cybernetics. The critical insight differentiating second-order cybernetics from classical (first-order) cybernetics is that second-order cybernetics takes cybernetic circularity more seriously.

Classical cybernetics exists within a worldview in which energy considerations reign paramount. The feedback loop is understood as requiring insignificant amounts of energy, thus creating a hierarchy: the controller, using relatively (and ignorably) little energy, controls the controlled, which is the big energy-consuming part of the system. In second-order cybernetics, form and information are considered in preference to energy. In a second-order cybernetic control loop, the information passed between controller and controlled is understood to be of equal status. First-order hierarchy disappears. Each component in the loop contributes to the control of the whole. In effect, each component controls the other, and the controller/controlled distinction is seen as a matter of role. The circular form of the cybernetic system is no longer disguised.
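
The contrast can be made concrete. In the minimal sketch below (numbers invented), neither component is privileged: each takes the other's state as its reference, and stability is a property of the loop as a whole rather than of a designated controller.

```python
# Minimal sketch of a circular loop with no fixed hierarchy: each component
# adjusts itself toward the other, so "controller" is only a role.
a, b = 0.0, 10.0

for step in range(12):
    a_next = a + 0.4 * (b - a)   # A treats B's state as its reference
    b_next = b + 0.4 * (a - b)   # B treats A's state as its reference
    a, b = a_next, b_next
    print(f"step {step:2d}: a = {a:.3f}, b = {b:.3f}")
# Neither component controls the other unilaterally; the whole loop settles
# jointly (here both converge on 5.0, a property of the circle itself).
```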

The difference was not initially presented this way. The originator of second-order cybernetics, von Foerster, made the following distinction on the frontispiece of his compilation "The Cybernetics of Cybernetics" (1975):

First order cybernetics—the cybernetics of observed systems.
Second order cybernetics—the cybernetics of observing systems.

These two characterizations, however, appear similar if one treats observe and control as interchangeable verbs, and remembers that the observing/controlling system is observing/controlling the observed/controlled system in order to develop understanding, which requires feedback. Furthermore, these concerns are similar to those expressed in the involved observer of Ernst von Glasersfeld's Radical Constructivism (1987).

The circular systems of second-order cybernetics are essentially autonomous. Their stability derives from their (internal) maintenance of their circular processes. To an external observer they may appear to veer wildly. An example is the autopoietic system of Humberto Maturana, Francisco Varela, and Ricardo Uribe. This system constructs and then maintains itself, providing a model of "life"—or, rather, "living." Such systems are said to be organizationally closed but informationally open: the form of the system maintains (distinguishes) itself and is in this manner autonomous (Maturana and Varela 1992), while information enters, passes through (is processed by), and exits it. The system distinguishes itself as itself. Because these systems are autonomous, any meaning the information passing through them may have is unique, private to each system. Communication between these systems cannot be by transmission of meaning because each system builds its own meaning: meanings are not communicated. Uncoded communication may, however, occur through conversation. Pask's conversation theory (a formalized version of everyday conversation developed, initially, to support communication in learning environments) provides a structure to sustain communication that is formally equivalent to the other circular systems of second-order cybernetics (Glanville 1996).

Admitting autonomy and conversation requires a system that accepts that, individually, one sees differently and understands uniquely, while acting as though one believes the objects one observes are the same. Otherwise, one's relativism would lead to isolation because one has nothing communicable and no one to communicate with. Ranulph Glanville's theory of "Objects" (1975) provides the framework that allows individuals to believe they each make different observations of the world, yet can act as if observing the same "Object"—the essential conceptual basis making second-order cybernetics and its ethical implications viable.

Second-order cybernetics has made notable contributions in such areas of human understanding as learning, conversational communication, and the emergence of the unanticipated (often through conversational processes). In particular, through the concepts and mechanisms of autopoiesis, it has aided in the understanding of how social systems acquire stability. Nevertheless, second-order cybernetics is probably better thought of as a way of understanding than as a technology.


Ethics

There are those who would argue that, perhaps more than any other scientific or technological field, second-order cybernetics constitutes an effort to develop a scientific basis for ethics. As such it constitutes an important contribution to any discussion concerned with science, technology, and ethics. This section sketches the basis of this contribution.

Second-order cybernetics' circular systems are autonomous—the starting point for the ethical implications of second-order cybernetics. Von Foerster was among the first to register the ethical dimension in his essay, originally published in 1973, titled "On Constructing a Reality" (von Foerster 2003a); even more relevant was his 1992 essay, "Ethics and Second-Order Cybernetics." (Von Foerster's 1993 German book KybernEthik originated the term CybernEthics.)

Von Foerster proposed two imperatives:

Ethical imperative: Act always so as to increase the number of choices.
Aesthetical imperative: If you desire to see, learn how to act.

The ethical imperative insists that cybernetics has a dimension in ethics. Cybernetics implies generosity, increasing options. Von Foerster contrasted the essential meanness of morality (restrictions applied to others) with the generosity of ethics (which comes from within).

The origin of this ethical concern can be seen to lie in the age-old question of what reality, if any, we can know independent of our knowing (i.e., is there a mind-independent reality [MIR]?). Although making a strong assumption of MIR is now commonplace, the question is in principle undecidable. Von Foerster remarked, "only we can decide the undecidable," leaving responsibility for answering this question (and, hence, for determining how we act) with each individual: one pursues whichever option one chooses. One's approach to one's world starts from this choice, which can be made once, or remade at will.

In second-order cybernetics, one's understanding of the world may be said to derive from a position of essential ignorance. The black box provides a mechanism for this. The understanding an observer builds through interacting with experience is (in the black box model) tentative: a reliable description of behavior emanating from the box may suggest it has been whitened, but nothing about the black box and our relationship to it has changed. It remains unopened (and unopenable)—provisional, as black as ever. Knowledge gained from using this model is based in profound ignorance. One cannot, therefore, insist on rightness and should tread warily: the ethical implication of ignorance is respect for the different views of others, since one can never be certain oneself. The views of others are considered equal in stature to one's own—which does not mean that theirs, or one's own, are either correct or viable.

Furthermore, the relationship between the behaviors (or signals), that is, the input and the output that black boxes are taken to act on—causing input to become output—results from interaction between observers and their own black boxes. Causality and its legal counterpart, blame, are seen to arise not from mechanism but from patterns observed by observers. The value of this understanding for how one acts cannot be overemphasized, and it is confirmed in many psychotherapies that depend for their effectiveness on persuading people that the blaming causality they see is their own construction and responsibility. It is not what happens to one that matters, but how one responds to it.

The black box model requires that one distinguish: if there is no distinction between behaviors there is nothing to experience. In essence, why distinguish myself if I am alone? Distinguishing myself, I distinguish myself also from another. This act of distinguishing brings into being and implies mutualism: whatever qualities may be attributed to one side of the distinction may (but need not) be attributed to the other. What I take for myself I may give you—this is von Foerster's ethical imperative again.

Distinctions, made in observing, can be considered a basis upon which observers construct experience, including experience of themselves. In order to assume experience is not solipsistic we assume that the other constructs (its experience of) itself (and us) in a reciprocal manner—another form of mutuality. Self-construction and maintenance indicate organizational closure: There is a boundary (it distinguishes its self) and the system is autonomous. An autonomous system is responsible. It has built itself, maintains itself (is organizationally closed), while it remains informationally open (communicates with its environment, thus substantiating the claim that, in distinguishing, one both distinguishes and distinguishes from). Bateson brings these ideas together when he uses the notion of difference (distinction) to define information: the difference that makes a difference (Bateson 1972b). The acceptance of responsibility grows out of autonomy (von Foerster 2003b): Autonomous systems are responsible for their actions. Here is the source of the aesthetical imperative.

There remains communication—that is, conversation. When communication is understood as each participant's individual construction of—and responsibility for—meaning and understanding (rather than the transmission of meanings and understandings), one can see that to understand the other one trusts the other's goodwill, acting with generosity, trust, honesty, and openness to build the understandings one will map onto the other's. This is an interaction. Teaching and learning (and much else besides) are interactive—the reason Pask developed conversation theory.

In turn, this understanding reveals that all one knows requires an observer's (knower's) presence, an understanding crucial to how one treats learning. Maturana said, "Everything said is said by an observer." Von Foerster retorted, "Everything said is said to an observer" (Von Foerster in Krippendorff 1979, p. 5). Respecting the observer is an ethical behavior.



Conclusion

Second-order cybernetics implies individuals are willing to treat each other, and (other, second-order) cybernetic systems, with a goodwill and generosity that can and should be understood as ethical implications. These go against some of the meaner understandings people currently and fashionably hold about their position in the world. Second-order cybernetics provides, in the ethical arena, hope and delight: those behaviors that are often considered higher, more civilized, and better are assumed and sustained in this way of understanding—a better-than-good reason for taking its lessons seriously.

RANULPH GLANVILLE

SEE ALSO Automation; Cyberspace; Posthumanism; Science, Technology, and Literature; Wiener, Norbert.



BIBLIOGRAPHY

Ashby, W. Ross. (1956). An Introduction to Cybernetics. London: Chapman and Hall. The most complete textbook introduction to cybernetics presented with simple profundity.

Asimov, Isaac. (1942). "Runaround." Reprinted in his Robot Dreams. London: Victor Gollancz, 1989.

Bateson, Gregory. (1972a). Steps to an Ecology of Mind: Collected Essays in Anthropology, Psychiatry, Evolution, and Epistemology. San Francisco: Chandler. Collected essays and commentaries from a multidisciplinary scholar who was one of the founding fathers of cybernetics, and whose writings can be seen to anticipate second-order cybernetics in several respects.

Bateson, Gregory. (1972b). "Pathologies of Epistemology." In Steps to an Ecology of Mind. An exploration of why we need the sort of shift that second order cybernetics embodies.

Glanville, Ranulph. (1975). A Cybernetic Development of Theories of Epistemology and Observation, with Reference to Space and Time, as Seen in Architecture. Ph.D. thesis, Brunel University. Also known as The Object of Objects, the Point of Points—or Something about Things.

Glanville, Ranulph. (1996). "Communication without Coding: Cybernetics, Meaning and Language (How Language, Becoming a System, Betrays Itself)." Invited paper in Modern Language Notes 111 (3): 441–462. Explores communication, language and meaning when the mechanism is conversation rather than code.

Glanville, Ranulph. (2002). "Second-Order Cybernetics." In Systems Science and Cybernetics: The Long Road to World Sociosystemicity, ed. F. Para-Luna. Oxford: Encyclopaedia of Life Support Systems (EOLSS) Publishers. One of the clearest explanations of a second-order cybernetic position and its epistemology.

Maturana, Humberto R., and Francisco J. Varela. (1992). The Tree of Knowledge: The Biological Roots of Human Understanding, rev. edition, trans. Robert Paolucci. Boston: Shambhala.

Shannon, Claude, and Warren Weaver. (1998 [1949]). The Mathematical Theory of Communication. Urbana: University of Illinois Press. The classic text introducing what came to be called Information Theory.

Von Bertalanffy, Ludwig. (1950). "An Outline of General Systems Theory." British Journal for the Philosophy of Science 1: 134–165. One of von Bertalanffy's earlier papers discussing his proposals for a General Systems Theory.

Von Foerster, Heinz. (1979). "Cybernetics of Cybernetics." In Communication and Control in Society, ed. Klaus Krippendorff. New York: Gordon and Breach. Reprinted in Understanding Understanding: Essays on Cybernetics and Cognition, ed. Heinz Von Foerster. New York: Springer (2003).

Von Foerster, Heinz. (1991). "Ethics and Second-Order Cybernetics." In French in Systèmes, éthique, perspectives en thérapie familiale, ed. Yveline Rey and Bernard Prieur. Paris: ESF Éditeur. Also in Understanding Understanding: Essays on Cybernetics and Cognition, ed. Heinz Von Foerster. New York: Springer (2003).

Von Foerster, Heinz. (1993). KybernEthik. Berlin: Merve Verlag.

Von Foerster, Heinz. (1995 [1975]). The Cybernetics of Cybernetics. Minneapolis: Future Systems Inc. A collection of cybernetic papers treated to cybernetic analysis providing a starting point for second order cybernetics. Von Foerster compiled and edited this collection, which was assembled by students at the University of Illinois and published by the publisher of the Whole Earth Catalogue, Stewart Brand.

Von Foerster, Heinz. (2003a). "On Constructing a Reality." In his Understanding Understanding: Essays on Cybernetics and Cognition. New York: Springer. Originally published in Environmental Design Research, Vol. 2, ed. Wolfgang F. E. Preiser (Stroudsburg, PA: Dowden, Hutchinson and Ross, 1973).

Von Foerster, Heinz. (2003b). "Responsibilities of Competence." In his Understanding Understanding: Essays on Cybernetics and Cognition. New York: Springer. Originally published in Journal of Cybernetics 2, no. 2 (1972): 1–6. The first venture into ethics by the father of second-order cybernetics.

Wiener, Norbert. (1948). Cybernetics; or, Control and Communication in the Animal and the Machine. New York: Wiley. 2nd edition, Cambridge, MA: MIT Press, 1961.

Wiener, Norbert. (1950). The Human Use of Human Beings: Cybernetics and Society. Boston: Houghton Mifflin.


INTERNET RESOURCE

Ashby, W. Ross. An Introduction to Cybernetics. Available from http://pespmc1.vub.ac.be/ASHBBOOK.html.

Cybernetics

Cybernetics, in its purest definition, is the science of control and communication in the animal and the machine. The word was devised by Norbert Wiener in the 1940s and is derived from the Greek word kybernetes, meaning "steersman." In his book The Human Use of Human Beings (1950), Wiener wrote that "society can only be understood through a study of the messages and the communication facilities which belong to it; and that in the future development of these messages and communication facilities, messages between man and machines, between machines and man, and between machine and machine, are destined to play an ever-increasing part" (Wiener, p. 16). In 1957, W. Ross Ashby described this theory of machines as focusing not on what a thing is, but on what it does: "Cybernetics deals with all forms of behavior in so far as they are regular, or determinate, or reproducible. The materiality is irrelevant" (Ashby, p. 1). Recognizing that there are significant similarities in biological and mechanical systems, subsequent researchers have pursued the ideal of merging biological and mechanical/electrical systems into what Manfred Clynes and Nathan Kline termed cyborgs, or cybernetic organisms. In this sense, cybernetics has taken on the meaning of adding prostheses to the human or animal body to either replace lost function or augment biological activity.

Humans have long used tools to augment various functions, and for centuries have attached some of these tools to their bodies. Filled or artificial teeth, glasses and contact lenses, hearing aids, pacemakers, and artificial limbs are all examples of this phenomenon. By the late twentieth century, significant advances in the fields of neuroscience and computer technologies allowed the direct interface of animal or human nervous systems with electromechanical devices. Examples of this evolving field include the creation of neural-silicon junctions involving transistors and neurons to prepare neuronal circuits, the re-creation of visual images from signals transmitted in the optical pathways of a cat, the remote control of mechanical manipulator arms by implants inserted into the motor cortex of owl monkeys, and a remote control that can move rats over a directed path via implanted electrodes.

While the above are examples of direct internal interfaces between a nervous system and a cybernetic prosthesis, another approach to cybernetic augmentation is through the use of external or wearable computing devices. In this approach, prosthetic enhancement is achieved by miniaturizing traditional computing devices, interface mechanisms, and optical projection devices, and then seamlessly incorporating these devices into clothing, glasses, and jewelry. This form of cybernetic enhancement has moved from the academic to the commercial stage. Aside from allowing the user/wearer of such devices wireless access to the Internet and other databases on a continuous basis, they may also be used for augmented reality, which is the concept of supplementing traditional sensory input with augmented senses or new types of sensory data. Examples include retrograde vision (seeing to one's rear), distant or projected hearing, and infrared vision. Further, visual input may be analyzed and correlated with other information such as Global Positioning System (GPS) location identification. Buildings and streets could be labeled, hours of business accessed, and people visually identified (with demographic information provided), with all of this information directly projected on the user's retina.

While these developments may sound like something out of a Star Trek episode, cybernetic technology has developed at a rapid pace, and will no doubt continue to be a growing field of investigation, therapeutic intervention, and commercial development. In June 2002, the National Science Foundation and the U.S. Department of Commerce issued a report recommending substantial U.S. government investment in the development of cybernetic technologies, with the specific goal of augmenting human performance. These technologies will be produced by the synergistic convergence of biotechnology, nanotechnology, information technology, cognitive science, and neurotechnology through a proposed Human Cognome Project.

Healing versus Augmentation

As has already been indicated, the mechanical or prosthetic manipulation of human beings is not a new idea or practice. In the past, however, such interventions have almost always been in the context of repair or replacement of absent, diseased, or disordered function. The goal of visual lenses is to restore vision to biological norms, not to augment or improve function beyond normal. Similarly, prosthetic limbs replace those congenitally absent, malformed, or traumatically severed or injured. Pacemakers replace the electrical pacing of heart contractions lost through injury, aging, or disease. In this context, new tools to restore sight to the blind, hearing to the deaf, and movement and normal function to the lame or paralyzed are tremendous advances fully in keeping with the traditional goals of medicine (healing, restoring, palliation, and prevention of injury). Yet humans also use telescopes, microscopes, night vision, and other means of augmenting visual function for specific purposes. The difference is that these tools are not permanent fixtures of the body. Wearable computing devices and implantable brain chips, however, are being produced and marketed to enhance the normal, not necessarily heal the afflicted. This raises a number of challenging ethical questions, including whether human augmentation should even be permitted, let alone pursued.

Before the question of whether augmentation should be permitted, promoted, or prohibited can be addressed, a more basic issue must be considered: Can a distinction between healing and augmentation be delineated? This question poses equal challenges to a variety of areas in addition to cybernetics, particularly the more immediate possibilities of genetic therapy or enhancement and pharmacotherapy for behavior control, mood enhancement, and cognitive enhancement.

The difficulty lies in trying to define a clear line of demarcation between a disease state and normal structure and function. It is sometimes easy to pick out extremes of phenotype, particularly if an underlying pathophysiological mechanism for the deviation can be demonstrated. Examples include hemophilia, congenital dwarfism, and impaired vision. Other situations raise difficulties, illustrating that many times the definition of disease or abnormality is socially, rather than objectively or scientifically, determined. How much deviation from ideal body weight is within the bounds of normal variation, and when does the deviation become pathologic? While anorexia nervosa and morbid obesity are clearly pathologic in that they can influence survival and other health issues, a significant number of individuals are on the edges of the norms, where the threshold of pathology is unclear.

A striking example of the cultural variation in the definition of disease is the response of many congenitally deaf individuals to the suggestion that they are afflicted and in need of therapy. Many deaf parents of deaf children have refused to allow their children to receive cochlear implants to correct the deafness because this would remove the children from the deaf community. At a 1997 conference of deaf individuals, 16 percent of the delegates were interested in prenatal diagnosis for deafness, but, of that group, 29 percent indicated that they would use these techniques to select for deafness in the child (see Middleton et al.).

Cognitive and neurological function, the areas most impacted by cybernetics, are particularly fraught with difficulty, in part because certain deviations from the norm may impart certain functional advantages in addition to social or behavioral liabilities. For instance, while attention-deficit/hyperactivity disorder (ADHD) and autism are diseases, many of the individuals who have these conditions also manifest significant brilliance and creativity in mathematics, music, art, science and engineering. Both the positive and negative manifestations are part of the same disease entity, and what degree of negative manifestation requires treatment becomes subjective. The treatments employed may suppress the undesired manifestation, but they may also impair the desirable expressions. The situation becomes even more complex when these challenges are extended to a measure of cognitive function such as memory, mathematical calculation, musical ability or language processing. Who doesn't think of himself or herself as being deficient in cognitive abilities or able to benefit from enhancement of cognitive function?

In addition, necessary cognitive function may be very task or profession specific. Should individuals who would be considered cognitively normal be allowed to receive enhancing technologies to permit them to pursue a career otherwise beyond their intrinsic ability? And, as these technologies become available, should professions that demand high levels of cognitive excellence be allowed to require the use of enhancing technologies? Given that books and computers are forms of information-exchange enhancement that are currently required for education in the professions, one could argue that the only thing that has changed is the intimacy of the enhancing method. Because these technologies may intrinsically carry certain risks that are absent from current information technologies, however, many believe that such means should never be mandated, but only made available by free choice. The reality, however, is that competition with peers will serve as a strong coercive force to pursue enhancement.

Safety Questions

The answers to these questions require the consideration of additional issues. At the most basic, cybernetic technologies, both implantable and wearable, must demonstrate physical, emotional, and cognitive safety. While physical safety will, in general, be the most easily addressed, there are still new challenges beyond those typically encountered by medical devices. Traditional medical-device safety issues include the risk of infection, local reaction, tissue injury, and involuntary or undesired neural or muscular stimulation. Current devices, however, tend to function in isolation in the specific local environment of the recipient body. Cybernetic devices, on the other hand, will often be connected to a shifting network environment, dependent upon software and the exchange of external information as well as hardware. As such, viral code could potentially disrupt function of the device, and possibly injure the user. Even wearable devices could potentially be turned into weapons, and so need to be strongly regulated, with proof of software and hardware safeguards against injury by rogue software agents.
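
One concrete safeguard of the kind this paragraph calls for is refusing any networked command that cannot be authenticated. A minimal sketch using Python's standard hmac module (the key, command format, and function names are invented; a real device would need far more, including replay protection):

```python
# Hypothetical sketch: an implanted device rejects any network command that
# lacks a valid message authentication code, so rogue software cannot
# drive the actuator directly.
import hmac
import hashlib

DEVICE_KEY = b"secret-provisioned-at-implant-time"  # assumed pre-shared key

def sign(command: bytes) -> bytes:
    return hmac.new(DEVICE_KEY, command, hashlib.sha256).digest()

def actuate(command: bytes, tag: bytes) -> str:
    if not hmac.compare_digest(sign(command), tag):
        return "REJECTED: unauthenticated command"
    return f"EXECUTED: {command.decode()}"

good = b"stimulate:channel=3;amplitude=low"
print(actuate(good, sign(good)))                      # signed command runs
print(actuate(b"stimulate:amplitude=max", b"bogus"))  # rogue command refused
```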

The issues of emotional and cognitive safety will be more challenging to understand and regulate. In the era of the Internet there is a growing literature addressing problems with personality fragmentation, breakdown of direct personal interaction in favor of cyber relationships, increasing dissatisfaction with reality, addiction to cybersex and pornography, and other psychosocial concerns. These concerns can only increase when individuals are cybernetically connected most, if not all, of the time. The long-term consequences of virtual environments are unknown. The variability of involvement and susceptibility to dysfunctional utilization will vary tremendously between individuals, making generalized regulation difficult. However, some form of registered informed consent as to potential negative consequences, with mandatory, periodic, and long-term follow-up, may be helpful.

The Nature of Medicine

The issue of safety introduces yet another question: What sorts of individuals should be involved in implanting devices for internal cybernetic enhancements or for fitting wearable devices with optical interfaces? Because of the invasive nature of the implants, it would seem a logical requirement that physicians, particularly neurosurgeons, place these devices in humans. This certainly would be necessary for cybernetic devices of a therapeutic nature, but what about devices that are solely for enhancement purposes? Placing devices for nonmedical indications leads the physician into participating in interventions that are potentially harmful, have no therapeutic necessity, and thus are outside the traditional goals of medicine. A strong argument could be made that physicians should not participate in applying these technologies for other than therapeutic purposes. Yet few would want someone with less training than a neurosurgeon to invade their nervous system.

An analogy can be made to cosmetic surgery. Some ethicists, such as Franklin Miller and Howard Brody, contend that such interventions are outside the bounds of appropriate goals of medicine and should not be performed. Others counter that an individual should have the freedom to manipulate his or her own body, and, if a physician is willing to provide the service, restriction would be wrong and counter to the cherished goals of autonomy. Anders Sandberg takes the argument further, stating that each person has the fundamental right to pursue whatever means are available that might enhance or prolong life. The implications of this approach for medicine, however, are to change the profession from a group committed to healing (with a dominant ethos of beneficence in trust and nonmaleficence) to individuals skilled in surgical technique who are merely technicians providing whatever service may be requested.

Justice and Social Values

In the end, safety considerations may mandate that physicians and healthcare resources be used to implant cybernetic devices for nontherapeutic purposes, but justice may require that third-party healthcare dollars not be used to cover the costs of the devices or resources utilized. This raises concerns that enhancement technologies will be accessible only to those who already possess economic, educational, and technical advantages, further widening the gap between the haves and have-nots. As some members of society become incrementally enhanced and plugged in to cybernetic communities, these individuals will share less and less in common with the unenhanced, fragmenting society and potentially generating decreasingly compatible, or even competing, separate societies.

This is not necessarily a new phenomenon, for technologies have created boundaries between social groups in the past, the Amish and some Native Americans being notable examples. The difference is that the Amish have always wished to remain a distinct society, while some individuals who wish to reject personal enhancement may still desire participation in and access to the goods of the larger social structure. Deliberate efforts to maintain tolerance of individuals and groups who choose to forgo the use of certain technologies must be pursued if democratic republican ideals are to be preserved, and inclusive means of communication must remain available to all members of society.

Cognitive cybernetic devices must also be equipped with reliable means of filtering incoming information, especially against information that might be designed for repetitive or subliminal influence. Privacy is a similar critical issue, and must be deliberately and prospectively defended in the cybernetic age. Technologies such as functional magnetic-resonance imaging are being proposed to sense, process, and interpret thought patterns. Not only is the accuracy of such technology a critical requirement, but the concept of invading the mind is at issue.

To Prohibit, Permit, or Pursue?

Cybernetics offers wonderful devices of healing for significant, age-old disabilities, and it can be welcomed when utilized in that context. It is likely that using such tools to enhance normal function will be possible, but great caution is needed, as well as a commitment to the preservation of privacy and justice. Rigorous safeguards for demonstrating the safety of cybernetic devices, and requirements for government approval and licensing, need to be set in place.

The government, the academy, and industry should commit significant resources to the exploration of the ethical and social implications of these technologies, and to the development of appropriate analysis and preparation of guidelines for implementation.

C. CHRISTOPHER HOOK

SEE ALSO: Enhancement Uses of Medical Technology; Human Dignity; Human Nature; Nanotechnology; Technology; Transhumanism and Posthumanism

BIBLIOGRAPHY

Ashby, W. Ross. 1957. An Introduction to Cybernetics. London: Chapman & Hall.

Bell, David, and Kennedy, Barbara, eds. 2000. The Cybercultures Reader. New York: Routledge.

Clynes, Manfred, and Kline, Nathan S. 1960. "Cyborgs and Space." Astronautics (September 1960): 26–27, 74–75.

Fink, Jeri. 1999. Cyberseduction: Reality in the Age of Psychotechnology. Amherst, NY: Prometheus.

Geary, James. 2002. The Body Electric: An Anatomy of the New Bionic Senses. New Brunswick, NJ: Rutgers University Press.

Gray, Chris Hables. 2001. Cyborg Citizen: Politics in the Posthuman Age. New York: Routledge.

Hillis, Ken. 1999. Digital Sensations. Minneapolis: University of Minnesota Press.

Hook, C. Christopher. 2002. "Cybernetics and Nanotechnology." In Cutting-Edge Bioethics, ed. John Kilner, C. Christopher Hook, and Diann Uustal. Grand Rapids, MI: Eerdmans.

Jenkner, Martin; Müller, Bernt; and Fromherz, Peter. 2001. "Interfacing a Silicon Chip to Pairs of Snail Neurons Connected by Electrical Synapses." Biological Cybernetics 84: 239–249.

Kurzweil, Ray. 1999. The Age of Spiritual Machines. New York: Viking.

Mann, Steve. 2001. "Wearable Computing: Toward Humanistic Intelligence." IEEE Intelligent Systems (May/June 2001): 10–15.

Middleton, Anna; Hewison, J.; and Mueller, R. F. 1998. "Attitudes of Deaf Adults Toward Genetic Testing for Hereditary Deafness." American Journal of Human Genetics 63: 1175–1180.

Miller, Franklin; Brody, Howard; and Chung, Kevin. 2000. "Cosmetic Surgery and the Internal Morality of Medicine." Cambridge Quarterly of Healthcare Ethics 9: 353–364.

Parens, Erik, ed. 1998. Enhancing Human Traits: Ethical and Social Implications. Washington, D.C.: Georgetown University Press.

Stanley, Garrett; Li, Fei F.; and Dan, Yang. 1999. "Reconstruction of Natural Scenes from Ensemble Responses in the Lateral Geniculate Nucleus." Journal of Neuroscience 19: 8036–8042.

Talwar, Sanjiv; Xu, Shaohua; Hawley, Emerson; et al. 2002. "Rat Navigation Guided by Remote Control." Nature 417: 37–38.

Vassanelli, S., and Fromherz, Peter. 1997. "Neurons from Rat Brain Coupled to Transistors." Applied Physics A 65: 85–88.

Wessberg, Johan; Stambaugh, Christopher; Kralik, Jerald; et al. 2000. "Real-Time Prediction of Hand Trajectory by Ensembles of Cortical Neurons in Primates." Nature 408: 361–365.

Wiener, Norbert. 1950. The Human Use of Human Beings: Cybernetics and Society. Boston: Houghton Mifflin; reprint, 1988. New York: Da Capo.

Wiener, Norbert. 1961. Cybernetics: or, Control and Communication in the Animal and the Machine, 2nd edition. Cambridge, MA: The MIT Press.

INTERNET RESOURCES

"Accelerating Intelligence News." Available from <www.kurzweilAI.net>.

"Principia Cybernetica Electronic Library." Available from <pespmc1.vub.ac.be/LIBRARY.html>.

MIT Media Lab. "Wearable Computing." Available from <www.media.mit.edu/wearables>.

Roco, Mihail, and Bainbridge, William Sims, eds. 2002. "Converging Technologies for Enhancing Human Performance." Available from <wtec.org/ConvergingTechnologies/>.

Sandberg, Anders. "Morphological Freedom—Why We Not Just Want It, but Need It." Available from <http://www.nada.kth.se/~asa/Texts/MorphologicalFreedom.htm>.

Cybernetics

The term “cybernetics,” designating a distinct field of activity, appeared on the scientific scene at the close of World War II, with the publication of Norbert Wiener’s book Cybernetics: Or Control and Communication in the Animal and the Machine. Wiener defined the term “cybernetics” as “the entire field of control and communication theory, whether in the machine or the animal” (Wiener 1948, p. 19); he was unaware that the term had been used, in a more limited sense, a century earlier by André Ampère (1834).

Since 1948, research and publications related to cybernetics have proliferated, unfolding the content of cybernetic concepts and their impact on fields ranging from psychology and neurophysiology to sociology and philosophy of science. This continuing clarification of the meaning and implications of cybernetics has influenced attitudes toward and usage of the term, as well as our understanding of it, thereby blurring Wiener’s initial definition. A brief look at some of the forces that shaped its development will help in understanding what “cybernetics” means today.

In his personal review of the subject, Wiener recounts that while working on the theory of an automatic system for aiming antiaircraft guns he and his colleagues were impressed with the critical role of feedback in the proper functioning of a control system. This led them to conjecture that in order for a person to perform motor activities, his cerebellum must embody types of feedback and associated information processes comparable to those used in an artificial control system. If this were so, then the brain could be viewed as a complex communication, computer, and control system; and the concepts of feedback and control theory could account for internal homeostatic control (for temperature, blood-sugar concentration, heart action, etc.), as well as for control of those motor actions required for purposeful manipulation of external objects. Implicit in these notions was the further thesis that those cognitive activities involved in higher-level problem-solving behavior also could be interpreted mechanistically in terms of the flow and processing of information.

The concepts of cybernetics, emphasizing an information-processing analysis of the mechanisms that generate purposeful behavior, excited the interest of some psychologists, physiologists, and even psychiatrists. Psychologists saw a way of relating behavior to the underlying information processes that control behavior. Neurophysiologists found that the brain and nervous system could be analyzed as a special-purpose computing machine “designed” to generate adaptive, intelligent behavior. And for psychiatrists, Wiener argued that functional mental disorders in the human are primarily diseases of memory caused by errors introduced in the processing of information and are not necessarily indicative of a physiological or anatomical breakdown of the brain mechanism. Thus, Wiener’s writings suggested that problems in the psychology of behavior, the physiology of the nervous system, and the psychopathology of mental disorders could all be described in the neutral language of information processing and control.

Because of the central importance of the concept of information, a second major force behind the development of cybernetics was the publication in 1948 of Shannon’s paper “A Mathematical Theory of Communication.” Here was a theory that explicated quantitatively one measure for the amount of information conveyed by messages. The theory showed how to determine the capacity of a communication channel. One could now compute how much more information one channel could transmit than another. Shannon’s theory clarified the important concept of a code and showed how to determine the efficiency of a given coding system. The theory also demonstrated how to combat the destructive effects of noise by introducing redundancy into coding schemes. Shannon’s mathematical theory of communication not only explicated all of these key concepts but also proved some surprising mathematical relationships between noise, redundancy, channel capacity, and error-free transmission of messages [see Information theory].
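
The trade Shannon quantified between redundancy, noise, and reliable transmission can be seen in the simplest redundant code. The sketch below (an invented binary channel with a 10 percent flip probability) uses a threefold repetition code with majority voting:

```python
# Minimal sketch of redundancy combating noise: a 3x repetition code
# with majority voting corrects any single flipped bit per codeword.
import random

def encode(bits):                  # add redundancy: send each bit thrice
    return [b for bit in bits for b in (bit, bit, bit)]

def noisy_channel(bits, p=0.1):    # each transmitted bit flips with prob. p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):                  # majority vote over each triple
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

random.seed(0)
message = [random.randint(0, 1) for _ in range(1000)]
received = decode(noisy_channel(encode(message)))
errors = sum(m != r for m, r in zip(message, received))
print(f"residual errors: {errors} of 1000")
# Roughly 100 errors would be expected without coding; redundancy buys
# reliability at the price of a threefold cut in effective channel rate.
```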

Clearly, the digital computer was a third force pushing and molding the development of cybernetics. The first electronic digital computer was completed in 1946, and the following years brought swift advances in computer theory, technology, and applications. Switching speeds and memory capacities increased by several orders of magnitude. Input-output devices and information conversion equipment of great diversity were developed. Theoretical foundations emerged in the form of a theory of automata and information machines. More reliable equipment, more flexible programming languages, and a steady decline in costs all contributed to the ever widening use of computers. The application of computing machines spread from scientific calculations to automatic control and business data processing—in fact, into almost every facet of government, industrial, and military information processing. One of the most interesting applications is simulation, where the computer is used as a general-purpose research vehicle to generate the logical consequences of arbitrary assumptions about a complex process. Thus, one can get new insights about a complex process by having a computer simulate its behavior, whether the model be for some aspects of the economy or for some neurophysiological structure. In this way psychologists searching for theories of problem-solving behavior (for example, the cognitive behavior associated with proving theorems of logic) have attempted to simulate aspects of such behavior by using a computer. Similarly, neurophysiologists seeking an understanding of the neural organizational principles that give rise to pattern recognition, learning, and similar processes have simulated with computers the behavior of networks of idealized neurons [see Computation; Simulation].
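
Such simulations can be very small. A minimal sketch of a network of idealized threshold neurons in the spirit of McCulloch and Pitts (the weights are hand-chosen, not learned), computing the exclusive-or function:

```python
# Minimal sketch of simulating idealized (McCulloch-Pitts-style) neurons:
# each unit fires (outputs 1) iff its weighted input reaches a threshold.

def neuron(inputs, weights, threshold):
    return int(sum(i * w for i, w in zip(inputs, weights)) >= threshold)

def xor_net(x1, x2):
    # Two hidden units detect "x1 and not x2" and "x2 and not x1";
    # the output unit fires if either hidden unit fires.
    h1 = neuron([x1, x2], [1, -1], 1)
    h2 = neuron([x1, x2], [-1, 1], 1)
    return neuron([h1, h2], [1, 1], 1)

for x1 in (0, 1):
    for x2 in (0, 1):
        print(f"xor({x1}, {x2}) = {xor_net(x1, x2)}")
```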

Wiener’s notions about the brain and the computer, Shannon’s theory of information, and the new computing technology created optimism about new ways to attack the formidable problems of thinking and knowing. This atmosphere of excitement and ferment, accentuated by hopes of interdisciplinary unification, generated much competent work. Unfortunately, it also produced serious intellectual and semantic misunderstandings about computers, brains, and people; and confusions between information and meaning, between amount of information and entropy. These difficulties caused some nonsensical claims to be offered under the banner of cybernetics. The more responsible workers criticized this pseudoscientific fringe, thereby contributing to a reversal in attitudes. The new tendency was to regard cybernetics with suspicion and disdain.

Thus, there developed—and still exist—conflicting attitudes toward cybernetics and what its subject matter really is. Vagueness about the meaning of “cybernetics” has been compounded by the fact that since around 1955 the subject of cybernetics has enjoyed a wide acceptance and publicity in the Soviet Union, where it is now interpreted most broadly and used to describe all studies and techniques that relate even in the most remote way to computers, information processing, communications, or control systems.

Today, almost two decades after Wiener’s book, “cybernetics” still means different things to different people. For some, cybernetics is not a “new science” but merely a collection of techniques, studies, and devices clustering around information processing. For those who accept this interpretation, “cybernetics” is but a fancy name for the application of certain techniques to related fields—for example, the application of information theory to analysis of coding and redundancy in the visual sensory system.

Others equate cybernetics with automation and its accelerating thrust into all facets of human activity. They recognize that cybernetics not only favorably changes the face of our society but also initiates sociological problems of great magnitude—such as technological unemployment and social conflict resulting from the increasing replacement of men by machines.

Finally, many interpret cybernetics as a new, all-inclusive, and powerful way of analyzing complex systems, from machines to society itself, in terms of the flow and processing of information. Some of these see a deeper significance to the underlying logical structure of cybernetics. The concepts of cybernetics do, in fact, offer hope for a new unity in our understanding of those processes that underlie the activities of knowing.

For some, the real intellectual wealth of cybernetics lies not in its analogies between the computer and the brain—though these analogies are fruitful—but rather in the realization that both systems, natural and artificial, can be analyzed in terms of the same cybernetic language, the language of information and control (see MacKay 1957 for a more detailed discussion). The concepts of this language are potentially rewarding because they span the traditional gap between the psychology of behavior and the physiology of those mechanisms that generate behavior, including cognitive behavior. Thus, cybernetics offers an effective new language for analyzing those information mechanisms and processes associated with behavioral aspects of thinking and knowing. The language of cybernetics may prove rich and versatile enough to permit a theory of knowing—a science of knowledge—to be expressed in terms of cybernetic concepts.

Those concepts suggest even more than how to grasp and formulate the relationship between the information-flow organization of the brain and intelligent behavior. There is no reason to believe that the human brain and nervous system is optimally organized. One might find principles, framed in cybernetic language, that show how to design an artifact able to learn more quickly, remember and associate better, act faster and more reliably, or solve problems more ingeniously than humans. One might find design principles radically different from those embodied in the human, and build (or grow) highly intelligent artifacts. All of this presupposes, of course, a theory of thinking and knowing (as exists for engineering communications) that can be used to judge which cognitive system is optimal relative to some aspect of information processing. Be clear about what this means. Cybernetics today offers no laws describing what kinds of information-flow structures are necessary to produce various dimensions of intelligent (mindlike) behavior. There exist only the faintest outlines of such organizational principles. Nothing, however, contradicts the thesis that such design principles exist, can be described, and can be implemented. Cybernetics offers both a language and a set of concepts to use in molding these principles into a theory relating information processing to the activities of learning, thinking, knowing, and understanding (see Maron 1965).

M. E. Maron

[Directly related is the biography of Wiener. Other relevant material may be found in Computation; Information theory; Simulation.]

BIBLIOGRAPHY

Ampère, André-Marie (1834) 1856 Essai sur la philosophie des sciences: Ou exposition analytique d’une classification naturelle de toutes les connaissances humaines. 2d ed. Paris: Mallet-Bachelier.

Ashby, William R. (1956) 1961 An Introduction to Cybernetics. London: Chapman.

George, Frank H. 1961 The Brain as a Computer. New York: Pergamon.

Kybernetik. → Published since 1960, mostly in German; the emphasis is on bio-cybernetics.

MacKay, D. M. (1957) 1964 Information Theory in the Study of Man. Pages 214–235 in John Cohen (editor), Readings in Psychology. London: Allen & Unwin.

Maron, M. E. 1965 On Cybernetics, Information Processing, and Thinking. Pages 118–138 in Norbert Wiener and J. P. Schadé (editors), Cybernetics of the Nervous System. Progress in Brain Research, Vol. 17. Amsterdam: Elsevier.

Shannon, Claude E. 1948 A Mathematical Theory of Communication. Bell System Technical Journal 27: 379–423, 623–656.

Shannon, Claude E.; and Weaver, Warren (1949) 1959 Mathematical Theory of Communication. Urbana: Univ. of Illinois Press.

Wiener, Norbert (1948) 1962 Cybernetics: Or Control and Communication in the Animal and the Machine. 2d ed. Cambridge, Mass.: M.I.T. Press.

Wiener, Norbert (1950) 1954 The Human Use of Human Beings: Cybernetics and Society. 2d ed. Boston: Houghton Mifflin.

Cybernetics

A term coined by Norbert Wiener (1894–1964) from the Greek, κυβερνήτης (steersman), to designate the science of control and communication in both animals and machines. It supplies novel instruments for investigating life and mental processes that open up new avenues of research in the study of organisms, nerve impulses, sensation, memory, and even mind. Yet it also gives rise to philosophical problems by suggesting that mental processes, hitherto regarded as distinctive of man's higher faculties, can ultimately be performed by a machine. What follows is concerned with the ways in which such problems may be resolved to the mutual benefit of the cybernetician and the philosopher.

Basic Difficulties. Confusions over the implications of cybernetics can usually be traced to one of two sources. The first is an uncritical acceptance of the assumptions that underlie cybernetics and make it feasible, while the second is the analogous terminology that is employed in work on control and communications.

Suppositions. The cybernetician is committed to a program of research in which animal and human means of communication are studied through the use of electronic and mechanical devices. Even superficial examination, however, reveals vast differences between organisms and machines, including machines of the most complicated types. In order to bridge the gap, the researcher in this area must "down-grade" living phenomena until they approach the level of the nonliving, and "up-grade" mechanical and electrical phenomena to confer on them the status of vital activities.

Thus the suppositions on which the cybernetician works commit him to a monist view of reality. He may be aware of profound differences between the living and the nonliving in his ordinary experience, but when carrying out his research he must abstract from these differences. Following a procedure that is typically scientific, he first simplifies the phenomena he is studying in order to arrive at some type of idealized generalization; then, by various approximations, he attempts to reproduce the complexity found in nature by adding new elements to his simplified model. Although simplification thus becomes an integral part of his method, it need be no cause for concern if the investigator is aware that he is so simplifying.

Terminology. The second source of confusion is closely associated with the first. It concerns the use of terms hitherto reserved exclusively for animal or human activity but now applied somewhat indiscriminately to machines. Cybernetic devices are described as "learning," "adapting," "self-correcting," and "thinking." Such usage leads those who are philosophically naïve to imagine that these devices eliminate the need for distinction between living and nonliving or between the mental and the purely material.

Such an inference, admittedly alarming, fails to take account of the primitive state of cybernetics as a science. In the early stages of any scientific development, the investigator is forced to employ terms that are generally used in quite different contexts. For example, early theories of momentum had to employ the concept of impetus, which immediately implied a reference to the efficient cause of inertial motion. Similarly, when Newton first arrived at the notion of mass, he had to designate it by the expression quantitas materiae, or quantity of matter, because no other term was available. So also, in the beginnings of cybernetic research, it seems inevitable that analogy or equivocation be employed until a proper scientific terminology is developed.

The scientific mind cannot rest content with analogy or equivocation; scientists typically attach univocal meanings to the terms they employ. Thus, as cybernetics moves into its more advanced stages, it develops its own terminology; already the term programming illustrates this development. As this process continues, early confusions disappear, a precise vocabulary is adopted, and distinctions become available that can be used with profit by philosophers as well as by cyberneticians.

Particular Problems. An indication of how this may be done is given by an analysis of problems presented to the philosopher by the concepts of perception, memory, and abstraction when these concepts are applied to the machine.

Perception. When a machine "perceives," in the sense of recognizing a pattern, it compares a pattern presented to it with certain test features that serve to identify this pattern as one of several possibilities. The comparison is more than one of mere juxtaposition, such as occurs when two patterns are placed one on top of the other and viewed against a light source. Rather, there is an indeterminacy in the standard pattern that is resolved on a probabilistic basis and subsequently corrected whenever the result is found wrong. By trial and error, the machine advances toward more and more perfect identification of the pattern.
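This compare-resolve-correct cycle can be put in a few lines of code. The following is a minimal sketch only: stored prototypes stand in for the "test features," ties are resolved by chance, and each detected error nudges a prototype toward the offending example. The names (classify, correct) and the toy patterns are illustrative assumptions, not anything prescribed by the entry.

```python
import random

def classify(pattern, prototypes):
    """Pick the label whose prototype mismatches the pattern least; break ties by chance."""
    def mismatch(proto):
        return sum(abs(p - q) for p, q in zip(pattern, proto))
    best = min(mismatch(proto) for proto in prototypes.values())
    candidates = [label for label, proto in prototypes.items()
                  if mismatch(proto) == best]
    return random.choice(candidates)      # indeterminacy resolved probabilistically

def correct(pattern, true_label, prototypes, rate=0.5):
    """After a wrong result, nudge the true class's prototype toward the pattern."""
    proto = prototypes[true_label]
    prototypes[true_label] = [q + rate * (p - q) for p, q in zip(pattern, proto)]

# Trial and error: guess, then learn from each detected error.
prototypes = {"circle": [1.0, 0.0], "square": [0.0, 1.0]}
sample, truth = [0.9, 0.3], "circle"
guess = classify(sample, prototypes)
if guess != truth:
    correct(sample, truth, prototypes)
```

Repeated over many samples, the prototypes drift toward ever better identification of the patterns, which is all that machine "perception" in the above sense amounts to.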

Yet this process is not the mechanism whereby a man perceives an object or a pattern. Human perception involves more than comparison with a standard pattern. It does presuppose identification of the pattern, but above and beyond this it involves an element of signification or intentionality not found in machine perception. For example, when the eye sees a coin at a distance, the pattern presented to it is generally oval, or elliptical, because the coin is usually seen from an angle. But the coin, when perceived, is not perceived as elliptical or oval, but as a circular disc. Similarly, a square may be presented to the eye as a rhombus, but it is perceived as a square. Spatial perception is thus self-corrective for the effects of perspective, for the angle of vision, for distortions introduced by the medium, and so forth.

Human perception, on this account, involves more than pattern recognition. The additional factor is what enables man to recognize the identity of an object perceived when it is sensed in different orientations. Man perceives because he is capable of grasping the signification of a particular representation, of knowing it in an intentional way. Signification or intentionality is the synthesizing factor that unifies many possible representations and enables the perceiver to identify a particular content through any one of them. Machine perception reaches the first stage of comparing representations, but the machine is not capable of grasping the signification associated with individual representations as is the human being.

Memory. Computers are said to have memory when they store information and make it available for future use. They do this by mechanically or electronically locating "bits" of information that are present and recorded in the machine. Machine memory is thus limited to recording the past as present, i.e., it re-presents information in the present. This operation is much like that of mechanically locating an object present in a filing system.

Human memory involves more than this. It does not consist merely in recalling a representation received in the past as it is now present, but rather in perceiving the past as past, i.e., as temporally situated in a bygone present. The person who remembers filing an object does something different from another who merely locates the same object present in the file. The one remembering concentrates not on the present action, i.e., locating the material, but rather on identifying a past action, i.e., placing the matter in the file, precisely as a past action, i.e., as done several days, weeks, or months ago.

Abstraction. If machines cannot perceive and remember in the same way as human beings, even less can they form a concept that is univocally the same as the product of the mental process of abstraction. A concept is a universal idea that expresses a nature or a quiddity without including the specific characteristics that determine "this" or "that" particular realization of the idea in matter. Thus the concept is essentially immaterial, and in this respect differs from the phantasm. The concept is the product of the intellect; the phantasm is the product of the imagination.

When machines attempt to simulate abstraction and concept formation, they substitute a minimal representation for the complexity of all the different representations that might exemplify a particular concept. This minimal representation, however, remains at the level of a material symbol, and thus lacks the immateriality and universality of the concept. The machine's method of "concept formation" consists in constructing complex representations, or drawing more and more complicated pictures, with the elements present in its simplest representation. It does not form a concept in the human sense, since it cannot grasp a meaning or a universal idea in an immaterial way.

Closely related to this topic is the subject of mechanical translation. Actually, when a machine is said to "translate," the process employed is not so much translation as it is deciphering or decoding. Decipherment is concerned with the manipulation of symbols, whereas translation deals with the meaning behind these symbols. Being concerned with meaning, translation requires intelligence and cannot be performed at a merely mechanical level. Where questions of meaning are involved, a deciphering machine is only as valuable as a translator or interpreter who does not understand what he is saying in either language. This is not meant to imply that the highly complex operations performed by such machines are valueless. They have their uses, but these can be called "concept formation" or "translation" only at the risk of equivocation.

Critique. While the computing machines used in cybernetic research are marvels of human ingenuity, they are not human and as such cannot be a proper subject of predication for human attributes. If men do not commonly attribute their own personal skills to the instruments they employ, even less should they attribute their basically human powers to machines. To do so is similar to imagining, in Aristotle's words, "not only the forms of gods, but their ways of life, to be like their own" (Pol. 1252b). Rather they should recognize computers for what they are, namely, powerful instruments for research and technology, and use them to investigate the material phenomena that underlie vital and cognitional processes.

See Also: Soul, Human; Spirit; Sensation; Knowledge; Universals.

Bibliography: N. Moray, Cybernetics (New York 1963). D. Dubarle, Scientific Humanism and Christian Thought, tr. R. Trevett (New York 1956). W. A. Wallace, "Cybernetics and a Christian Philosophy of Man," Philosophy in a Technological Culture, ed. G. F. McLean (Washington 1965); "A Thomist Looks at Teaching Machines," Dominican Educational Bulletin 4 (1963) 13–23. M. A. Bunge, Metascientific Queries (Springfield, Illinois 1959).

[W. A. Wallace / R. S. Ledley]

CYBERNETICS

Cybernetics is the study of control and communication. Although it is most often associated with the control systems of machines, cybernetic theory can also be applied to biological agents, to systems composed of mechanical or biological agents, or to both. Of particular interest to cybernetics are systems that are complex, adaptive, and self-regulating through the use of feedback. Norbert Wiener coined the term in 1947 as a transliteration of the Greek kybernetes, which means "steersman," though the word was originally used in a broader sense than steering alone: Plato used it to denote the act of governing a populace as well as that of steering a boat, and the term governor derives from the same root. Both terms refer to the control and direction of complex systems.

Cybernetics describes the world in terms of systems and information. A mechanical or biological agent can be considered a hierarchy of interacting networks through which information is moved, created, or transformed. Similarly, a system of agents can also be described and studied through the same concepts of control and feedback. Cybernetics uses mathematical and logical models to describe the flow of information in a system. Since many systems are influenced by random factors, statistical methods are also used to forecast or describe information flow.
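As a toy illustration of this modeling style (a sketch under assumed conditions, not anything from the entry itself), the fragment below follows a single state variable that a simple control law pushes toward a goal while a random factor disturbs it; a running average then gives a statistical description of the flow. The function name noisy_system and all numeric constants are invented for the example.

```python
import random

def noisy_system(goal=10.0, gain=0.3, steps=200, noise=1.0):
    """A state pushed toward a goal by feedback while a random factor disturbs it."""
    state, history = 0.0, []
    for _ in range(steps):
        state += gain * (goal - state)              # deterministic control law
        state += random.uniform(-noise, noise)      # random influence on the system
        history.append(state)
    return history

history = noisy_system()
settled = history[100:]                             # discard the initial transient
print(f"long-run mean: {sum(settled) / len(settled):.2f}")   # hovers near the goal
```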

The goals of cybernetics are twofold. First, for any given system, cybernetics hopes to advance knowledge of that system by describing the processes that regulate its functioning. Second, the field of cybernetics also seeks to develop laws that describe control processes in general and that are applicable to all types of systems. Cybernetics focuses on the structure and functioning of any given system rather than on the physical makeup of its elements.

Applications

The earliest applications of cybernetics were predominantly in engineering and computer science (robotics, circuit design, aiming artillery). Early work by Wiener, Claude Shannon, and John von Neumann was closely allied with the fledgling field of artificial intelligence and machine learning. Since any system that evidences both complexity and self-adaptation can be studied using cybernetics, the basic concepts were soon applied in a variety of fields, including economics (Kenneth Boulding), political science, management and industrial theory (Jay Forrester, Stafford Beer), biology (Warren McCulloch, Humberto Maturana, William Ross Ashby), sociology and anthropology (Gregory Bateson, Stein Braten), and ethics (Valentin Turchin). As cybernetics moved into the social sciences in the 1960s and 1970s, descriptions in the field changed from those of an observer external to the system (e.g., a human observer of a mechanical system) to those of an internal participant (e.g., a human within a political or social community).

Whereas early cyberneticists thought of information as a commodity that flowed through systems, subsequent writers, such as Maturana, have viewed information as the product of a system. In a further step one can think of the system itself as consisting of information. The computer scientist Ray Kurzweil has applied this approach to his understanding of the human being, whereas the physicists Frank Tipler and Stephen Wolfram view information as the building block of the whole universe. For these writers the concept of information informs not only the system's outcomes or activities but is considered the very basis of the system itself.

Philosophical and Theological Implications

This final understanding of both mechanical and biological agents as consisting essentially of information leads to the most important philosophical and theological implications of cybernetic theory. A cybernetic view of the human person sees that person as a system composed of information. The concept of cybernetic immortality is based on the assumption that thoughts, memories, feelings, and action define the human person. These are products of consciousness, which is considered an emergent property of the complex system of the brain. In other words, to the cyberneticist, human beings are basically biological machines whose unique identity is found in the patterns that arise and are stored in the neuronal structures of the brain. If these patterns could be replicated (in sophisticated computer technology, for example), the defining characteristics of the person would be preserved. In such an anthropology the soul is considered that part of consciousness that exerts the highest level of control on the system that makes up the human being.

The ability to isolate the cognitive part of the system and preserve its viability past the death of the body is held by some researchers as an alternative to the metaphysical immortality proposed by many religions. Kurzweil suggests the future possibility of a computer-based immortality, in which the contents of the human mind are downloaded to a silicon-based platform. Tipler envisions an eschatology in which the universe will contract to an "omega point" that will contain all the information that has ever existed, including that which makes up each human being. God is essentially the highest level of control in the cybernetic system of the universe, thus becoming identical with the omega point at the final contraction. Tipler notes that this omega point could allow for something not unlike the Christian concept of resurrection of the body, in that the information that makes up any given human being would be available, thus allowing for a reinstantiation of that individual. A cybernetic view of both God and the human person provides a way to maintain belief in a reductionistic materialism without giving up the hope of immortality.

Cybernetic theories have also been used to describe the origin of religion in societies and the development of ethical systems. In general, a cybernetic view of religion sees it as an adaptive mechanism for the survival of groups as they evolve and change in an atmosphere of physical and social competition. Religion becomes one of many feedback mechanisms for regulating the functioning of individuals within the social group.

See Also

Artificial Intelligence.

Bibliography

Norbert Wiener's Cybernetics (New York, 1948) introduced the term and the field. A more popular treatment of the field is in Wiener's The Human Use of Human Beings (New York, 1988). William Ross Ashby's An Introduction to Cybernetics (New York, 1956) remains the basic textbook in cybernetic theory. The Principia Cybernetica website, constructed by Frans Heylighen and available at http://pcp.lanl.gov, provides an excellent primer in both the theory and the philosophy of cybernetic thought. Humberto R. Maturana and Francisco J. Varela, The Tree of Knowledge: The Biological Roots of Human Understanding (Boston, 1992), apply cybernetic concepts to human cognition and to human social systems. Ray Kurzweil's The Age of Spiritual Machines (New York, 1999) explores the possibility of cybernetic immortality on a computer platform, whereas Frank J. Tipler's The Physics of Immortality (New York, 1994) combines cybernetics with modern physics to present an eschatological vision. Stephen Wolfram, in A New Kind of Science (Champaign, Ill., 2002), presents another view of the universe as cybernetic system.

Noreen L. Herzfeld (2005)

Cybernetics

The term cybernetics is much misused in the popular media. Although often used to convey notions of high technology, robotics, and even computer networks like the Internet, in reality cybernetics refers to the study of communications and control in animal and machine.

Great mathematicians of the past such as Gottfried Wilhelm Leibniz (1646–1716) and Blaise Pascal (1623–1662) had been interested in the nature of computing machinery long before such machines had ever been realized. They concerned themselves with philosophizing over what special peculiarities might be present in machines that had the ability to compute. In the mid-1930s Alan Turing (1912–1954) developed the idea of an abstract machine (later to become known as the "Turing Machine"). Turing machines introduced the possibility of solving problems by mechanical processes that involved a machine stepping through a sequence of states under the guidance of a controlling element of some sort. This laid the fundamental groundwork that was then developed by Norbert Wiener (1894–1964) into what has become cybernetics.

In 1948 Wiener concluded that a new branch of science needed to be developed. This field would draw from the realms of communication, automatic control, and statistical mechanics. He chose the word cybernetics, deriving it from the Greek word for "steersman," which underlines one of the essential ingredients of this field: that of governance or control. He defined cybernetics to be "control and communication in the animal and the machine." What really makes cybernetics stand apart from other fields in science and engineering is that it focuses on what machines do rather than the details of how they actually do it.

Classically, the study of a particular piece of conventional mechanical machinery (for example, a typewriter) would not be considered complete until all of the intricacies of the physics of movement of the constituent parts had been accounted for. This constitutes a Newtonian view of systems: one that commences with a perspective of Newtonian mechanics and builds from there. Cybernetics, on the other hand, accentuates the behavior and function of the machine as a whole. The result of this stance is that cybernetics is not restricted to dealing with mechanical or perhaps electrical machines only; instead it applies to anything that might possibly be viewed in some way as a machine, including organisms. That is, cybernetics looks at all the elements that are common denominators in the class of entities that might be described as machines. Wiener concluded that for a system to be classed as cybernetic, communication between parts of a system was a necessary characteristic, as was feedback from one part to another. The presence of feedback means that a cybernetic system is able to measure or perceive a quantity of some sort, then compare this to a required or desired value, and then instigate some strategy or behavior that effects change in that quantity. This is as much true of a heater and thermostat used to regulate temperature in a house as it is of a bird that seeks refuge in a bird bath on a hot day.
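The measure-compare-act loop just described fits in a few lines of code. Here is a minimal sketch of the household-thermostat example; the function name simulate_thermostat and every constant (setpoint, heating rate, heat loss) are illustrative assumptions rather than anything fixed by cybernetic theory:

```python
def simulate_thermostat(setpoint=20.0, steps=8):
    """Measure a quantity, compare it with the desired value, act on the difference."""
    temperature = 15.0
    for _ in range(steps):
        heater_on = temperature < setpoint            # compare measurement with goal
        temperature += 1.5 if heater_on else -0.5     # act: heat, or drift and cool
        print(f"temp={temperature:4.1f}  heater={'on' if heater_on else 'off'}")

simulate_thermostat()
```

Note that nothing in the loop depends on what the "machine" is made of; the same three steps describe the bird seeking its bird bath, which is exactly the point of the cybernetic view.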

Historically, the human body, in particular the human brain, has been viewed by many as a type of machine. This perception was generated by people who were hopeful of finding a way of modeling human behavior in the same way that they could model human-made machines, an approach with which they were comfortable. Much effort was directed toward understanding the operation of the human brain in this light.

Throughout the nineteenth and early twentieth centuries, significant advances were made in understanding the physiology of the human brain. Research into the structure of the cerebral cortex, the discovery of the brain as the center of perception, and the identification of neurons and synapses were all contributors to the conclusion that the brain is the regulator, controller, and seat of behavior of the human species. Because these ideas are fundamental to cybernetics, the human brain and the notion of intelligence are also considered as subjects that are within the realm of the cybernetic field. As a consequence, a great deal of research has been carried out in the areas of biological control theory, neural modeling, artificial intelligence (AI), cognitive perception, and chaos theory from a perspective that resulted from the development of cybernetics.

With respect to computer systems, cybernetics has been prominent in two areas. The first is artificial intelligence, where computer algorithms have been developed that attempt to exhibit some traits of intelligent behavior: initially by playing games and later by processing speech and carrying out complex image and pattern manipulation operations. The second is in robotics, which frequently encompasses artificial intelligence and other cybernetic areas such as communication and automatic control using feedback. Early robotic systems were nothing more than complex servo-mechanisms that carried out manual tasks in place of a human laborer; however, the modern cybernetic approach is to attempt to construct robots that can communicate and be guided toward acting together as a team to achieve a collective goal. This has generated interest in a new type of adaptive machine that has the capacity to re-organize its strategies and behavior if its environment or mission changes.

Finally, beyond a computing context, cybernetics offers some advantages in our understanding of nature. First, it permits a unified approach to studying and understanding machine-like systems. This results from the distinct way in which the cybernetic viewpoint of systems is formulated; it is not restricted to particular machine or system types. For example, we can draw a correspondence between an electro-mechanical system like a collection of servo-motors and linkages that give a robot locomotion, and a biological system like the nervous and musculo-skeletal systems of a caterpillar. One is not required to undertake greatly differing analyses to gain an appreciation of both. Second, it offers a manageable way of dealing with the predominant type of system: one that is highly complex, non-linear, and changing over time.

See also Artificial Intelligence; Robotics; Space Travel and Exploration.

Stephen Murray

Bibliography

Arbib, Michael A. Brains, Machines, and Mathematics, 2nd ed. New York: Springer-Verlag, 1987.

Ashby, W. Ross. An Introduction to Cybernetics. London: Chapman and Hall Ltd., 1971.

Caianiello, E. R., and G. Musso, eds. Cybernetic Systems: Recognition, Learning, Self-Organisation. Letchworth: Research Studies Press Ltd., 1984.

Glorioso, Robert M. Engineering Cybernetics. Englewood Cliffs: Prentice-Hall Inc., 1975.

Wiener, Norbert. Cybernetics or Control and Communication in the Animal and the Machine, 2nd ed. Cambridge: MIT Press, 1961.

cybernetics

cybernetics is the science of control. Its name, appropriately suggested by the mathematician Norbert Wiener (1894–1964), is derived from the Greek for ‘steersman’, pointing to the essence of cybernetics as the study and design of devices for maintaining stability, or for homing in on a goal or target. Its central concept is feedback. Since the ‘devices’ may be living or man-made, cybernetics bridges biology and engineering.

Stability of the human body is achieved by its static geometry and, very differently, by its dynamic control. A statue of a human being has to have a large base or it topples over. It falls when the centre of mass is vertically outside the base of the feet. Living people make continuous corrections to maintain themselves standing. Small deviations of posture are signalled by sensory signals (proprioception) from nerve fibres in the muscles and around the joint capsules of the ankles and legs, and by the otoliths (the organs of balance in the inner ear). Corrections of posture are the result of dynamic feedback from these senses, to maintain dynamic stability. When walking towards a target, such as the door of a room, deviations from the path are noted, mainly visually, and corrected from time to time during the movement, until the goal is reached. The key to this process is continuous correction of the output system by signals representing detected errors of the output, known as ‘negative feedback’. The same principle, often called servo-control, is used in engineering, in order to maintain the stability of machinery and to seek and find goals, with many applications such as guided missiles and autopilots.

The principles of feedback apply to the body's regulation of temperature, blood pressure, and so on. Though the principles are essentially the same as in engineering, for living organisms dynamic stability by feedback is often called ‘homeostasis’, following W. B. Cannon's pioneering book The wisdom of the body (1932). In the history of engineering, there are hints of the principle back to ancient Greek devices, such as self-regulating oil lamps. From the Middle Ages, the tail vanes of windmills, continuously steering the sails into the veering wind, are well-known early examples of guidance by feedback. A more sophisticated system reduced the weight of the upper grinding stone when the wind fell, to keep the mill operating optimally in changing conditions. Servo-systems using feedback can make machines remarkably life-like. The first feedback device to be mathematically described was the rotary governor, used by James Watt to keep the rate of steam engines constant with varying loads.

Servo-systems suffer characteristic oscillations when the output overshoots the target, as occurs when the feedback is too slow, or the correction too strong, for the output to settle. Increasing the ‘loop gain’ (i.e. the magnitude of correction resulting from a particular feedback signal) increases tremor in machines and organisms alike. It is tempting to believe that ‘intention tremor’ in patients who have suffered damage to the cerebellum is caused by a change in the characteristics of servo control.
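A toy illustration of the role of loop gain (a sketch under assumed values, not drawn from the entry): the output below is corrected each step in proportion to the detected error. With a modest gain it homes in smoothly on the target; with too large a gain it overshoots and hunts about the target, the machine counterpart of tremor. The name servo and the numbers are invented for the example.

```python
def servo(gain, target=10.0, steps=6):
    """Correct the output each step in proportion to the detected error."""
    output, trace = 0.0, []
    for _ in range(steps):
        error = target - output        # the negative-feedback signal
        output += gain * error         # loop gain sets the size of the correction
        trace.append(round(output, 2))
    return trace

print(servo(gain=0.5))   # [5.0, 7.5, 8.75, ...]   smooth approach to the target
print(servo(gain=1.8))   # [18.0, 3.6, 15.12, ...] overshoot and hunting (tremor)
```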

Dynamic control requires the transmission of information. Concepts of information are included in cybernetics, especially following Claude Shannon's important mathematical analysis in 1949. It does not, however, cover digital computing. Cybernetic systems are usually analogue, and computing is described with very different concepts. Early Artificial Intelligence (AI) was analogue-based (reaching mental goals by correcting abstract errors), and there has recently been a return to analogue computing systems, with self-organizing ‘neural nets’.

A principal pioneer of cybernetic concepts of brain function was the Cambridge psychologist Kenneth Craik, who described thinking in terms of physical models analogous to physiological processes. Craik pointed to engineering examples, such as Kelvin's tide predictor, which predicted tides with a system of pulleys and levers. The essential cybernetic philosophy of neurophysiology is that the brain functions by such principles as feedback and information, represented by electro-chemical, physical activity in the nervous system. It is assumed that this creates mind: so, in principle, and no doubt in practice, machines can be fully mindful.

Richard L. Gregory

Bibliography

Cannon, W. B. (1932). The wisdom of the body. New York.
Craik, K. J. W. (1943). The nature of explanation. Cambridge.
Mayr, O. (1970). The origins of feedback control. Cambridge, Mass.
Shannon, C. E. and Weaver, W. (1949). The mathematical theory of communication. Urbana.
Wiener, N. (1948). Cybernetics. New York.


See also balance; homeostasis; proprioception; vestibular system.

Cybernetics

Cybernetics is the study of communication and control processes in living organisms and machines. Cybernetics analyzes the ability of humans, animals, and some machines to respond to or make adjustments based upon input from the environment. This process of response or adjustment is called feedback or automatic control. Feedback helps people and machines control their actions by telling them whether they are proceeding in the right direction.

For example, a household thermostat uses feedback when it turns a furnace on or off based on its measurements of temperature. A human being, on the other hand, is such a complex system that the simplest action involves complicated feedback loops. A hand picking up a glass of milk is guided continually by the brain, which receives feedback from the eyes and hand. The brain decides in an instant where to grasp the glass and how to raise it in order to avoid collisions and prevent spillage.

The earliest known feedback control mechanism, the centrifugal governor, was developed by the Scottish inventor James Watt in 1788. Watt's steam engine governor kept the engine running at a constant rate.

Systems for guiding missiles

The principles of feedback control were first clearly defined by American mathematician Norbert Wiener (1894–1964). With his colleague Julian Bigelow, Wiener worked for the U.S. government during World War II (1939–45), developing radar and missile guidance systems using automatic information processing and machine controls.

After the war, Wiener continued to work in machine and human feedback research. In 1950, he published The Human Use of Human Beings: Cybernetics and Society. In this work, Wiener cautioned that an increased reliance on machines might start a decline in human intellectual capabilities. Wiener also coined the word "cybernetics," which comes from the Greek word kybernetes, meaning "steersman."

Words to Know

Artificial intelligence (AI): The science that attempts to imitate human intelligence with computers.

Feedback: Information that tells a system what the results of its actions are.

Robotics: The science that deals with the design and construction of robots.

Cybernetics and industry

With the advent of the digital computer, cybernetic principles such as those described by Wiener were applied to increasingly complex tasks. The result was machines with the practical ability to carry out meaningful work. In 1946, Delmar S. Harder devised one of the earliest such systems to automate the manufacture of car engines at the Ford Motor Company. The system involved an element of thinking: the machines regulated themselves, without human supervision, to produce the desired results. Harder's assembly-line automation produced one car engine every 14 minutes, compared with the 21 hours it had previously taken human workers.

By the 1960s and 1970s, the fields of cybernetics, robotics, and artificial intelligence had begun to skyrocket. A large number of industrial and manufacturing plants devised and installed cybernetic systems such as robots in the workplace. In 1980, there were roughly 5,000 industrial robots in the United States. By the early twenty-first century, researchers estimated there were as many as 500,000.

Considerable research is now focused on creating computers that imitate the workings of the human mind. The eventual aim, and the continuing area of research in this field, is the production of a neural computer, in which the architecture of the human brain is reproduced. The system would be brought about by transistors and resistors acting as neurons, axons, and dendrites do in the brain. The advantage of neural computers is that they will be able to grow and adapt. They will be able to learn from past experience and recognize patterns. This will enable them to operate intuitively, at a faster rate, and in a predictable manner.
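What "learning from past experience" could look like at its very smallest is sketched below; this is an assumed illustration, not the hardware described above. A single artificial neuron adjusts its weights whenever feedback says it was wrong, and so comes to recognize a pattern, here the logical-AND pattern. The name train_neuron and all constants are invented for the example.

```python
def train_neuron(samples, epochs=10, rate=0.1):
    """A single artificial neuron that adjusts its weights after each mistake."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            fired = sum(w * x for w, x in zip(weights, inputs)) + bias > 0
            error = target - int(fired)              # feedback: was the answer wrong?
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

# "Past experience": examples of the logical-AND pattern.
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train_neuron(AND)
print(weights, bias)      # the learned weights now reproduce the AND pattern
```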

[See also Artificial intelligence; Computer, digital; Robotics]

Cybernetics

The term cybernetics was originated by the American mathematician Norbert Wiener (1894–1964) in the late 1940s. Cybernetics is the study and analysis of control and communication systems, both artificial and biological.

As Wiener explains in his 1948 book, Cybernetics: Or, Control and Communication in the Animal and the Machine, any machine that is "intelligent" must be able to modify its behavior in response to feedback from the environment. This notion has particular relevance to the field of computer science. Within modern research, considerable attention is focused on creating computers that emulate the workings of the human mind, thus improving their performance. The goal of this research is the production of computers operating on a neural network.

By the late 1990s, work had progressed to the point that a neural network could be run, but it was generally a computer software simulation run on a conventional computer. The eventual aim, and the continuing area of research in this field, is the production of a neural computer. With a neural computer, the architecture of the brain is reproduced. This system is brought about by transistors and resistors acting as neurons, axons, and dendrites. By 1998, a neural network had been produced on an integrated circuit, which contained 1024 artificial neurons. More recent work has focused primarily on the implementation of neural networks in software, not hardware, due to the greater flexibility of software systems. The advantage of these neural computers is that they are able to grow and adapt. They can learn from past experience and recognize patterns, allowing them to operate intuitively, at a faster rate, and in a predictive manner.

Another potential use of cybernetics is one much loved by science fiction authors: the replacement of ailing body parts with artificial structures and systems. If a structure, such as an organ, can take care of its own functioning, then it need not be plugged into the human nervous system, which is a very difficult operation. If the artificial organ can sense the environment around itself and act accordingly, it need only be attached to the appropriate part of the body for its correct functioning. An even more ambitious fantasy for cybernetics is the production of a fully autonomous life form, something akin to the robots often featured in popular science fiction. Such an artificial life form, with learning and deductive powers, would be able to operate in areas inhospitable to human life, including long-term space travel or areas of high radioactivity. However, such visions do not reflect the state of the art in artificial intelligence or computer programming, nor even any state of the art that can be projected for the foreseeable future.

Cybernetics


The term cybernetics is derived from the Greek word kybernetes (steersman); it was introduced in 1948 by the mathematician Norbert Wiener (1894–1964) to describe how systems of information and control function in animals and machines (steersmanship). Cybernetics is inherently interdisciplinary; it is related to systems theory, chaos theory, and complexity theory, as well as artificial intelligence, neural networks, and adaptive systems. Cybernetics was formulated by thinkers such as Wiener, Ludwig von Bertalanffy (1901–1972), W. Ross Ashby (1903–1972), and Heinz von Foerster (1911–2002). It developed as a consequence of multidisciplinary conversations among thinkers from a variety of disciplines, including economics, psychiatry, the life sciences, sociology, anthropology, engineering, chemistry, philosophy, and mathematics. Cybernetics contributed greatly to the development of information theory, artificial intelligence, and artificial life, and it foresaw much of the work in robotics and autonomous agents (hence the term cyborg, short for cybernetic organism).

After control engineering and computer science became independent disciplines, some cyberneticists felt that more attention needed to be paid to a system's autonomy, self-organization, and cognition, and the role of the observer in modeling the system. This approach became known as second-order cybernetics in the early 1970s. Second-order cybernetics emphasizes the system as an agent in its own right and investigates how observers construct models of the systems with which they interact. At times, second-order cybernetics has resulted in the formulation of philosophical approaches that, according to some critics, are in danger of losing touch with concrete phenomena.

Cybernetics moves beyond Newtonian linear physics to describe and control complex systems of mutual causalities and nonlinear time sequences involving feedback loops. It seeks to develop general theories of communication within complex artificial and natural systems. Applications of cybernetic research are widespread and can be found in computer science, politics, education, ecology, psychology, management, and other disciplines. Cybernetics has not become established as an autonomous discipline because of the difficulty of maintaining coherence among some of its more specialized forms and spin-offs. There are thus few research or academic departments devoted to it.

Because of the diffuse interdisciplinarity of cybernetics, theological, religious, and philosophical concerns and engagements are multiple. Some conversations concern the social and economic impact of computer networks, such as the internet, on culture and nature. Others concern the development of artificial life and artificial intelligence and its impact on how human intelligence and life is understood. Other theological and philosophical concerns of cybernetics include the shape of divine activity in the world, the "constructed" nature of knowledge and of ethical values, the boundaries between bodies and machines and the implications for creation, the promises of salvific technology, and a tendency to strive for a metanarrative or grand unifying theory.


See also Artificial Intelligence; Artificial Life; Chaos Theory; Complexity; Cyborg; Process Thought; Systems Theory


Bibliography

Ashby, W. Ross. An Introduction to Cybernetics (1956). London: Chapman and Hall, 1999.

Hayles, N. Katherine. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press, 1999.

Heylighen, Francis, and Joslyn, Cliff. "Cybernetics and Second-Order Cybernetics." In Encyclopedia of Physical Science and Technology, 3rd edition, ed. R. A. Meyers. New York: Academic Press, 2001.

Heylighen, Francis; Joslyn, Cliff; and Turchin, V., eds. Principia Cybernetica Web. Brussels, Belgium: Principia Cybernetica. Available from http://pespmc1.vub.ac.be.

Marion Grau