Whereas cosmology explores the boundaries of the very large, and quantum theory the nature of the very small, complexity theory aims to understand the emergence and development of orders at every level, including the medium size world. To the riddles of the macroscopic and the microscopic are added the puzzles of complex pattern formation in semistable dynamical systems known from everyday life.
Semistable systems are usually nonlinear, so small inputs may trigger dramatic changes. Examples are volcanoes and tornadoes, embryological and ecological evolution, traffic systems, and stock markets. These are not new areas of research, but the computerization of science since the 1970s has made possible new formalistic approaches to the study of dynamical systems. The question is here not so much "What are the constituents of nature (quarks, protons, electrons, etc.)?" but rather "How does nature work?"
Complexity theory, however, is not the name of a single theory comparable to, say, Albert Einstein's Theory of Relativity. There hardly exists one overarching "law of complexity" waiting to be discovered. Rather, complexity research is an umbrella term for a wide variety of studies on pattern formation, some more general, some arising under specific organizational conditions. The field builds on thermodynamics, information theory, cybernetics, evolutionary biology, economics, systems theory, and other disciplines. Since complexity research consistently crosses the boundaries between the inorganic and the organic, the natural and the cultural, it is likely to influence the science-religion dialogue significantly.
There is no consensus on a general definition of complexity. The complex is usually defined in contrast to the simple, but the distinction between simple and complex is a relative one. What is simple in one frame of reference may be complex in another. Walking downstairs, for example, is simple from the perspective of a healthy person, but physiologically it is highly complex. On the other hand, chaos theory shows that complex phenomena can be described by simple nonlinear equations.
An exact measure of algorithmic complexity has been available since the 1960s. In the Kolmogorov-Chaitin model, the complexity of a digital code consisting of 0s and 1s is measured by the length of the shortest computer program needed to describe it. Even a long series of digits (e.g., 01010101010101010101 . . . ) can be compressed into a compact description: "write 01 x times." By contrast, a complex code is a series without a discernible pattern; in the worst case, the series would simply need to be repeated by the computer program (e.g., 1001110010011000011 . . . ). Such series are by definition random. However, one can never know with certainty whether a series that one sees as random could be further compressed. This is an information-theoretical version of Gödel's incompleteness theorem discovered by Gregory Chaitin.
Similarly, C. H. Bennett suggested a measure for a structure's degree of complexity by referring to its logical depth, defined as the time needed (measured by the number of computational steps) for the shortest possible program to generate that structure. Both Chaitin and Bennett presuppose Claude Shannon's mathematical concept of information: The more disordered a message is, the higher its informational content. While Chaitin's basic definition has the advantage of being extremely economical, Bennett's definition is capable of measuring the discrete operational steps involved in problem solutions. However, neither of these formal definitions of complexity can distinguish organized complexity from pure randomness. The main interest of complexity studies, though, is to understand the self-organized complexity that arises in the creative zones between pure order and pure randomness.
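The contrast between a compressible and an incompressible series can be illustrated with a short Python sketch. Since Kolmogorov-Chaitin complexity is itself uncomputable, the sketch uses the standard zlib compressor as a computable stand-in that provides an upper bound; the function name and the choice of zlib are ours, for illustration only.

```python
import random
import zlib

def compressed_size(s):
    """Length in bytes of the zlib-compressed string: a computable
    upper-bound proxy for Kolmogorov-Chaitin complexity, which is
    itself uncomputable."""
    return len(zlib.compress(s.encode()))

regular = "01" * 500                         # "write 01 five hundred times"
rng = random.Random(0)
irregular = "".join(rng.choice("01") for _ in range(1000))

# The periodic series collapses to a tiny description; the patternless
# series of the same length barely compresses at all.
print(compressed_size(regular), compressed_size(irregular))
```

As the text notes, no such test is conclusive: a series that zlib fails to compress might still be compressible by a cleverer program.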
To catch the idea of organized complexity, it may be useful to distinguish between descriptive, causal, and ontological aspects of complexity of natural and social systems. Systems that require different sorts of analyses have been called descriptively complex (Wimsatt, 1976). Fruit flies, for example, require a variety of descriptions, such as physical descriptions of their thermal conductivity, biochemical descriptions of their constitution, morphological descriptions of their anatomical organs, functional descriptions, and so on. This idea of descriptive complexity lends support to an explanatory pluralism, which emphasizes the need for different types of explanation at different levels.
Systems, however, can also be pathway complex while simple in structure if their causal effects are highly sensitive to environmental conditions. A hormone is a natural-kind entity with an easily specifiable molecular composition, but since the effects of the hormone depend on a variety of bodily constellations (which cannot be finitely determined), the causal trajectory of the hormone is complex. Systems theory and organicist proposals in biology have focused on this aspect of complexity.
The most difficult aspect to define is ontological complexity. An element-based definition defines complexity by a system's large number of variegated elements (Bak, 1997, p. 5). This definition centers on the fact that many large-scale systems (mountains, geological plates, etc.) do not allow for an analytical approach to their microphysical states. A relation-based definition of complexity focuses instead on the multiple couplings of a system in relation to its environments (Luhmann, 1995, pp. 23–28). The human brain, with its vast number of flexible neurons, illustrates that more possibilities of coupling exist than can be actualized in a life history. Since the capacity for complex interactions with the environment is usually increased by operational subsystems, organizational features can be added to the definition of complexity. An organization-based definition of complexity thus emphasizes the hierarchical structure of interacting levels. Analogously, a performance-based definition focuses on the capacity for self-organizing activity. Systems are thus ontologically complex if they (1) consist of many variegated elements (in terms of sizes and types), (2) have a high capacity for relating to themselves and their environments, (3) are highly organized in subsystems, multilevel structures, and internal programs, and (4) can perform self-organized activities by flexible couplings to the environment. On this scheme it is possible to evaluate different aspects of complexity. A volcano will be more complex than an amoeba on (1) elements and perhaps on (2) relations, but far less complex on the scores of (3) hierarchical order and (4) self-organizing activity.
On this scheme, the complex can also be distinguished from the merely complicated (Cilliers, 1998). Even "primitive" natural entities such as genes may be ontologically more complex than sophisticated artificial systems such as airplanes. A Boeing 747 jet consists of highly specified elements, related to one another in prescribed ways, and there exists a clear recipe for how to assemble the elements into a unified system, which in turn has a predesigned purpose: being able to take off, fly, and land safely. The Boeing is a highly complicated machine but not terribly complex. In this sense, the complex is more than the simple but also more than the complicated.
Complexity studies fall into two main families of research, one more conceptual (organicism, emergentism, and systems theory) and another more formalistic (information theory, cybernetics, and computational complexity). Both types of research continue to interact in understanding complex phenomena. While a conceptual preunderstanding of complex phenomena guides the construction of computational models, these will afterwards have to be tested on real-world situations.
Complexity studies did not start with computers. The idea that the whole is more than the sum of the parts goes back to Plato's notion of divine providence (Laws 903 B-C), and in the Critique of Judgment (1790) Immanuel Kant describes a naturalized version of the same idea of self-adjustment: "parts bind themselves mutually into the unity of a whole in such a way that they are mutually cause and effect of one another" (B 292).
Embryologists from Karl Ernst von Baer and Hans Spemann up to C. H. Waddington embraced organicism as the middle course between vitalism and reductionism. In organicism a materialist ontology ("there exists nothing but matter") was combined with the observation that new properties are causally effective within higher-order wholes. Molecules are not semipermeable, but cell membranes are, and without this capacity organisms cannot survive. In the 1920s writers such as C. Lloyd Morgan, C. D. Broad, and Samuel Alexander developed organicism from an empirical research program into a metaphysical program of emergent evolutionism. The point here was that in the course of evolution higher-order levels are formed in which new properties emerge. Whereas the solidity of a table is a mere "resultant" of solid state physics, the evolution of life is rife with "emergent" properties (for example, metabolism) that require new forms of description and eventually will have real causal feedback effects on the physical system (the atmosphere) that nourished life in the first place.
After World War II, general systems theory combined organicist intuitions with cybernetics. Ludwig von Bertalanffy replaced the traditional whole/part distinction with the distinction between systems and environments. Systems are constitutionally open to environmental inputs and are bound to develop beyond equilibrium under selective pressures. Systems theory thereby established itself as a theory combining thermodynamics and evolutionary theory. Systems are structures of dynamic relations, not frozen objects.
In the 1960s Heinz von Foerster and others introduced the theory of self-referential systems, according to which all systems relate to their environments by relating to their own internal codes or programs. Brains do not respond to cats in the world, but only to the internal firings of their own neurons. "Click, click is the vocabulary of neural language" (von Foerster). In this perspective, closure is the precondition of openness, not its preclusion. In this vein, biologists Francisco J. Varela and Humberto Maturana developed a constructionist research program of autopoietic (self-productive) systems. The sociologist Niklas Luhmann has further emphasized how systems proceed by self-differentiation and can no longer be analyzed by reference to global physical features of the world-as-a-whole. In this perspective, each system needs to reduce, by its own internal operations, the complexities produced by other systems. Different systems (for example, biological, social, psychic) operate with different codes (energy, communication, consciousness), and even though they coevolve they cannot communicate with one another on neutral ground. The fleeting experience of consciousness, for example, remains coupled to physiological processes and to social communication, yet has its own irreducible life.
Computational complexity presupposes the idea of algorithmic compression and embodies the spirit of cybernetics. The dictum of Norbert Wiener that "Cybernetics is nothing if it is not mathematical" (1990, p. 88) could also be said of computational complexity.
The field of cybernetics was developed after World War II by John von Neumann, Ross Ashby, Norbert Wiener, and others. Central to cybernetics is the concept of automata, defined as machines that are open to informational input but whose output is modified by an internal program. In cybernetic learning machines, the output functions are fed back into the input functions, so that the internal program can be tested by trial and error. However, the measure of success or failure is still fixed by preset criteria.
The cybernetic automata were the direct precursors of cellular automata, used in the artificial life models designed by John Conway and Chris Langton in the 1970s and 1980s. Cellular automata use individual-based modeling: "Organisms" are placed in square cells in a two-dimensional grid, and their "actions" (die or divide) are specified by the number of living cells in their immediate neighborhood. In this way, the positive feedback of breeding can be modeled as well as the negative feedback of competition. The result is self-reproducing loops generated by very simple rules.
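The rules of Conway's Game of Life, the best-known cellular automaton, can be stated in a few lines of Python. The sketch below is a minimal illustrative implementation on an unbounded grid; the "blinker" pattern shows how a stable oscillation emerges from purely local rules.

```python
from collections import Counter

def step(live):
    """One generation of Conway's Game of Life: a dead cell with
    exactly three live neighbors is born, a live cell with two or
    three live neighbors survives; every other cell dies."""
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The "blinker", three cells in a row, oscillates with period two:
# a self-reproducing loop generated by very simple rules.
blinker = {(0, 1), (1, 1), (2, 1)}
print(step(blinker))          # the row flips to a column
print(step(step(blinker)))    # and back again
```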
With the establishment of the Santa Fe Institute in New Mexico in 1984, a multidisciplinary center for computational complexity was formed. Physicist Murray Gell-Mann, computer scientist John Holland, and others introduced the idea of complex adaptive systems. As opposed to simple adaptation (as in a thermostat), there are no preset goals for complex adaptive systems. Like cellular automata, complex adaptive systems are individually modeled systems, but they also involve "cognition." Complex adaptive systems are able to identify perceived regularities and to compress regular inputs into various schemata. Unlike cybernetic learning machines, there may be several different schemata competing simultaneously, thus simulating cognitive selection processes. In this manner self-adaptation coevolves with adaptation beyond a preset design. Whereas Gell-Mann uses complex adaptive systems as a general concept, Holland uses the term only for systems of interacting individual agents. Such agents proceed by a limited set of interaction rules, governed by simple stimulus-response mechanisms such as (1) tags (e.g., if something big, then flee; if something small, then approach), (2) internal models (or schemata), and (3) rules for connecting building blocks to one another (e.g., combining eyes and mouth into facial recognition). The result of these local mechanisms, however, is the emergence of global properties such as nonlinearity, flow, diversity, and recognition.
Insofar as complex patterns are generated by simple mechanisms, computational complexity can be seen as a reductionist research paradigm; in contradistinction to physical reductionism, however, the reduction is to interaction rules, not to physical entities. But insofar as higher-level systems can be shown to exert a "downwards" feedback influence on lower-level interaction rules, computational complexity may also count as an antireductionist research program. The issues of reductionism versus antireductionism, bottom-up versus top-down causality, are still debated within the computational complexity community. In either case, however, it is information, not physics, that matters.
Computational complexity and real-world complexity
The spirit of computational complexity is not to collect empirical evidence and "reflect" reality, but to "generate" reality and explore virtual worlds of possibility. Computational complexity is nonetheless empirically motivated and aims to understand real-world complexity by computer modeling. The aspiration is to uncover deep mathematical structures common to virtual worlds and real-world dynamical systems.
Mathematical chaos theory is an example of a computer-generated science that has succeeded in explaining many dynamical patterns in nature. Yet the relation between chaos theory and computational complexity is disputed. While chaotic systems (in the technical sense) are extremely sensitive to initial conditions, complex systems are more robust (that is, they can start from different conditions and still end up in almost the same states). Accordingly, chaos theory can predict the immediately following states but not long-term developments, whereas complex systems can reliably describe long-term prospects but cannot predict the immediately following steps. Moreover, chaotic systems do not display the kind of evolutionary ascent and learning characteristic of complex adaptive systems, but oscillate or bifurcate endlessly. It therefore seems fair to say that chaos theory is only a small pane in the much larger window of complexity studies. Chaos, in the colloquial sense of disorder, is everywhere in complex systems (and so are fractals and strange attractors), but the equations of chaos in the technical sense (specific Lyapunov exponents, etc.) cannot explain self-organized complexity.
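Sensitive dependence on initial conditions can be demonstrated with the logistic map, a standard textbook example of a simple nonlinear equation with chaotic behavior; the example is a generic illustration of ours, not a model discussed by any particular author above.

```python
def logistic(x, r=4.0):
    """The logistic map x -> r*x*(1 - x): a one-line nonlinear
    equation that is fully chaotic at r = 4."""
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9     # two starting points differing by one billionth
gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    gap = max(gap, abs(a - b))

# The tiny initial difference is amplified at every step, so the next
# state is predictable but the long-term trajectory is not.
print("maximum divergence:", gap)
```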
There are also connections between thermodynamics and complexity theory. Beginning in the 1960s, the chemist Ilya Prigogine studied the so-called dissipative structures that arise spontaneously in systems through which energy is dissipated. While classical thermodynamics described isolated systems in which nonhomogeneities tend to even out over time, Prigogine studied nonequilibrium processes of "order out of chaos" (chaos in the nontechnical sense). Famous examples are the convection patterns of Bénard cells formed spontaneously under heating, or the beautiful chemical clocks of the Belousov-Zhabotinsky reaction. While the law of entropy, formulated by Rudolf Clausius in 1865 and given statistical form by Ludwig Boltzmann, still holds for the universe as a whole, the formation of local orders is explained by nonequilibrium thermodynamics. The averaging laws of statistical mechanics are not contradicted, but they simply do not explain the specific trajectories that develop beyond thermodynamical equilibrium.
In the wake of Prigogine, a new search for the thermodynamical basis of evolution began (Wicken, 1987). The bifurcation diagrams of Prigogine showed amazing similarities to evolutionary trees. Reaching back to the seminal work of D'Arcy Wentworth Thompson in On Growth and Form (1916), many began to think that the interplay of selection and mutation is not sufficient by itself to explain the evolutionary tendency towards complexification. Evolution may be driven not only by gene selection but also by prebiotic laws of physical economy.
Since the 1970s, theoretical biologist Stuart Kauffman has constructed computational models of self-organizing principles at many levels. Motivated by the almost ubiquitous tendency of chemical systems to produce autocatalytic systems, Kauffman theorizes that life may have emerged quite suddenly through phase transitions in which chemical reactions function as catalysts for one another, far below the threshold of the RNA-DNA cycle. Kauffman uses a similar model for simulating the empirical findings of François Jacob and Jacques Monod, who showed that genes switch on and off depending on the network in which they are situated. In the simplest model of Boolean networks, each "gene" is coupled randomly to two other genes and has only two possible states, on or off (states that are determined by the states of the two other genes). Running this small system with only three genes and two activities recurrently (and later with much larger networks), Kauffman was able to show that the number of state cycles (attractors) increases with the number of genes. Moreover, their relation is roughly constant, so that the number of state cycles approximates the square root of the number of genes, and Kauffman points out in The Origins of Order (1993) that in real-world species one finds roughly the same relation between the number of genes and the number of cell types in a given species. Thus, agents in coupled systems seem to tune themselves to the optimal number of couplings with other agents. In addition, when investigating fitness landscapes of interacting species at the ecological level, Kauffman finds the principle of "order for free." Evolutionary innovations tend to happen "at the edge of chaos," between the strategy of evolutionarily stable orders and the strategy of the constant evolutionary arms race. In Investigations (2000), Kauffman pursues a search for laws by which the biosphere is coconstructed by "autonomous agents" who are able to run up against the stream of entropy.
Kauffman hereby acknowledges the impossibility of prestating finitely what will come to be within the vast configuration space of the biosphere.
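Kauffman's simplest Boolean networks can be simulated directly. The sketch below is a minimal illustrative version: every "gene" reads two randomly chosen genes and updates through a randomly drawn Boolean rule, and all 2^n states are followed into their state cycles. The setup and function names are ours, following Kauffman's published description of the model.

```python
import random

def random_boolean_network(n, k=2, seed=1):
    """A Kauffman-style random Boolean network: each of n "genes" is
    on (1) or off (0), reads k randomly chosen genes, and updates
    through a randomly drawn Boolean function of their states."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]

    def step(state):
        return tuple(
            tables[i][sum(state[g] << j for j, g in enumerate(inputs[i]))]
            for i in range(n)
        )
    return step

def count_attractors(step, n):
    """Follow every one of the 2**n states into its state cycle and
    count the distinct cycles (attractors)."""
    attractors = set()
    for s in range(2 ** n):
        state = tuple((s >> i) & 1 for i in range(n))
        seen = set()
        while state not in seen:          # iterate until the cycle is entered
            seen.add(state)
            state = step(state)
        cycle, cur = [state], step(state)
        while cur != state:
            cycle.append(cur)
            cur = step(cur)
        attractors.add(min(cycle))        # label the cycle by its smallest state
    return len(attractors)

step = random_boolean_network(8)
print(count_attractors(step, 8))  # a handful of attractors among 2**8 = 256 states
```

Any single random network is only one sample; Kauffman's square-root relation is a statistical regularity over many such networks, not a guarantee for each one.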
The theory of self-organized criticality formulated by Per Bak and his colleagues starts from empirically confirmed regularities (such as the Gutenberg-Richter law of earthquakes). Many systems show slow variation over long periods, rare catastrophic activity over short periods, and some critical phases in between. The building up of sand piles shows these phase transitions, but so do earthquakes, extinction rates, and light from quasars. Bak's point is that such systems are self-organizing, since they (1) are robust and do not depend on specific initial conditions, (2) emerge spontaneously over time (with no external designer), and (3) are governed by the same mathematical principles in stationary, critical, and catastrophic states. Bak has made both real-world experiments and simplified computer models of self-organized criticality, but he believes that self-organized criticality is only a first approximation of stronger explanations of nature's tendency to build up balances between order and disorder.
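Bak's sand pile can likewise be simulated in a few lines. The sketch below follows the Bak-Tang-Wiesenfeld model: grains are dropped one at a time, and any site holding four or more grains topples, passing one grain to each neighbor; details such as grid size and grain count are arbitrary choices of ours.

```python
import random

def topple(grid, n):
    """Relax a Bak-Tang-Wiesenfeld sandpile: any site holding 4 or
    more grains topples, sending one grain to each of its four
    neighbors (grains at the edge fall off and are lost).
    Returns the avalanche size (number of topplings)."""
    avalanche = 0
    unstable = [(i, j) for i in range(n) for j in range(n) if grid[i][j] >= 4]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < 4:
            continue                      # already relaxed by an earlier pop
        grid[i][j] -= 4
        avalanche += 1
        if grid[i][j] >= 4:               # still over threshold: topple again later
            unstable.append((i, j))
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            a, b = i + di, j + dj
            if 0 <= a < n and 0 <= b < n:
                grid[a][b] += 1
                if grid[a][b] >= 4:
                    unstable.append((a, b))
    return avalanche

random.seed(0)
n = 10
grid = [[0] * n for _ in range(n)]
sizes = []
for _ in range(5000):                     # drop grains one at a time
    i, j = random.randrange(n), random.randrange(n)
    grid[i][j] += 1
    sizes.append(topple(grid, n))

# Long quiet stretches punctuated by rare large avalanches, with no
# tuning of initial conditions: the signature of self-organized criticality.
print("largest avalanche:", max(sizes), "quiet drops:", sizes.count(0))
```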
Relevance for the science-religion discussion
While organicist programs of noncomputational complexity have played a major role in the science-religion dialogue since the seminal works of Ian Barbour, Arthur Peacocke, and others, the relevance of computational complexity for theology largely remains to be explored. The following issues are therefore to be taken more as pointers than as conclusions:
- The sciences of complexity study pattern formations in the midst of the world rather than in a hidden world beyond imagination. The features of organized complexity resonate with the experiences of being-part-of-a-whole, experiences that since Friedrich Schleiermacher's On Religion (1799) have been taken to be essential to religious intuition.
- While presupposing a robust naturalism, complexity theory suggests that "information" is as seminal to nature as are the substance and energy aspects of matter. Complexity theory may thus give further impetus to the dematerialization of the scientific idea of matter in the wake of relativity theory.
- By focusing on relations and interactions rather than on particular objects, complexity theory supports a shift in worldview from a mechanical clockwork view of the world to an emergentist view of the world as an interconnected network, where flows of information take precedence over localized entities. Complexity theory also offers a road for understanding the evolution of coevolution. By balancing the principle of individual selection with principles of self-organization, the focus on individual genes is supplemented by the importance of interconnected living organisms, a view closer to ethical and religious sentiments than the inherited view of the omnipotence of selection.
- Even though natural evils (from earthquakes to selection) remain a challenge to religions that presuppose a loving almighty God, the costs of evolutionary creativity are now being placed in a wider framework of evolution. If the same underlying dynamics of self-organized criticality produce stability, criticality, and catastrophes alike, and the constructive aspects of nature cannot exist without the destructive aspects, a theodicy of natural evils may be facilitated.
- The idea of complex adaptive systems gives biological learning and human culture (including science, ethics, and religion) a pivotal role in the understanding of what nature is, and what makes human and animal life grow and flourish. In addition, since complexity theory consistently crosses the boundaries between physics, biology, and the cultural sciences, theologians and human scientists may be prompted to rethink human culture (including religion) in terms of the creative interactions between the inorganic, the organic, and the cultural.
- From an external scientific perspective, computational complexity may be used to explain a variety of religious phenomena that arise at the critical interface between adaptation and self-adaptation, such as the interaction between religious groups, individual conversion experiences, and so on. The first computer models in this area have already been completed.
- From an internal religious perspective, complexity theory offers religious thought a new set of thought models and metaphors, which (when adopted) can stimulate the heuristics of theology when complex phenomena are redescribed from the perspective of religious symbolism. Self-organization, coupled networks, and adaptation by self-adaptation are candidates for such religious self-interpretation. The principles of complexity are in particular consonant with the idea that a divine Logos is creatively at work in the pattern formations of nature and drives nature towards further complexification.
- The computational complexity idea of self-organization is a challenge to the Enlightenment idea of a divine designer of all natural processes. Self-organization is also a challenge to the creationist Intelligent Design movement, which gives priority to the idea of "original creation" and tends to perceive novelties as perversions of pre-established designs. However, self-organizational processes never happen from scratch, but always presuppose a framework of laws and natural tendencies that could well be said to be "designed" by God. While a design of specific evolutionary outcomes is obsolete in light of self-organized complexity, the coordination of laws leading towards self-organization and coevolution may be explained by a divine metadesign.
- Since emergence takes place in the merging of coupled systems, theology may escape the alternative between an interventionist God, who acts by breaking natural laws, and a God who only sustains the laws of nature uniformly over time. In highly organized systems, new informational pathways are continuously tried out in adventure-like processes. If the local interaction rules and the overall probability patterns are constantly changed over time, the actual pathways of large-scale coupled systems are not reducible to the general laws of physics. Special divine interaction with the evolving world can thus no longer be said to "break laws" in an interventionist manner, since there are no fixed laws to break in coupled systems.
- The seminal idea of self-organization may help overcome the idea that God and nature are contraries, so that God is powerless, if nature is powerful, and vice versa. A more adequate view may be to understand God as the creator who continuously hands over creativity to nature so that natural processes are the signs of a divine self-divestment into the very heart of nature's creativity. On this view, God is at work "in, with, and under" natural and social processes, and self-organization takes place within a world already created and continuously gifted by God.
See also Automata, Cellular; Autopoiesis; Chaos Theory; Cybernetics; Emergence; Information Theory; Intelligent Design; Systems Theory; Thermodynamics, Second Law of
Bak, Per. How Nature Works: The Science of Self-Organized Criticality. Oxford: Oxford University Press, 1997.
Ball, Philip. The Self-Made Tapestry: Pattern Formation in Nature. Oxford: Oxford University Press, 1999.
Bennett, C. H. "Logical Depth and Physical Complexity." In The Universal Turing Machine: A Half-Century Survey, ed. Rolf Herken. Oxford: Oxford University Press, 1987.
Chaitin, Gregory. Algorithmic Information Theory. Cambridge: Cambridge University Press, 1987.
Cilliers, Paul. Complexity and Postmodernism: Understanding Complex Systems. London: Routledge, 1998.
Clayton, Philip. God and Contemporary Science. Grand Rapids, Mich.: Eerdmans, 1997.
Cowan, George A.; Pines, David; and Meltzer, David, eds. Complexity: Metaphors, Models, and Reality. Cambridge, Mass.: Perseus Books, 1994.
Emmeche, Claus. The Garden in the Machine: The Emerging Science of Artificial Life. Princeton, N.J.: Princeton University Press, 1994.
Gell-Mann, Murray. The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: W. H. Freeman, 1994.
Gilbert, Scott F., and Sarkar, Sahotra. "Embracing Complexity: Organicism for the 21st Century." Developmental Dynamics 219 (2000): 1–9.
Gregersen, Niels Henrik. "The Idea of Creation and the Theory of Autopoietic Processes." Zygon 33, no. 3 (1998): 333–367.
Gregersen, Niels Henrik, ed. From Complexity to Life: On the Emergence of Life and Meaning. New York: Oxford University Press, 2002.
Holland, John. Hidden Order: How Adaptation Builds Complexity. Reading, Mass.: Addison-Wesley, 1995.
Holland, John. Emergence: From Chaos to Order. Oxford: Oxford University Press, 1998.
Kauffman, Stuart. The Origins of Order: Self-Organization and Selection in Evolution. New York: Oxford University Press, 1993.
Kauffman, Stuart. At Home in the Universe: The Search for Laws of Self-Organization and Complexity. New York: Oxford University Press, 1995.
Kauffman, Stuart. Investigations. New York: Oxford University Press, 2000.
Luhmann, Niklas. Social Systems, trans. John Bednarz Jr. with Dirk Baecker. Stanford, Calif.: Stanford University Press, 1995.
Maturana, Humberto R., and Varela, Francisco. The Tree of Knowledge: The Biological Roots of Human Understanding, rev. edition. Boston: Shambhala, 1992.
Peacocke, Arthur. Theology for a Scientific Age: Being and Becoming—Natural, Divine and Human, enlarged edition. London: SCM Press, 1993.
Rasch, William, and Wolfe, Cary, eds. Observing Complexity: Systems Theory and Postmodernity. Minneapolis, Minn.: University of Minnesota Press, 2000.
Russell, Robert John; Murphy, Nancey; and Peacocke, Arthur R., eds. Chaos and Complexity: Scientific Perspectives on Divine Action. Berkeley, Calif.: Center for Theology and the Natural Sciences; Notre Dame, Ind.: Notre Dame University Press, 1995.
Russell, Robert John; Murphy, Nancey; Meyering, Theo; and Arbib, Michael A., eds. Neuroscience and the Person: Scientific Perspectives on Divine Action. Berkeley, Calif.: Center for Theology and the Natural Sciences; Notre Dame, Ind.: Notre Dame University Press, 1999.
Schmidt, Siegfried J., ed. Der Diskurs des radikalen Konstruktivismus. Frankfurt am Main, Germany: Suhrkamp, 1988.
Solé, Ricard, and Goodwin, Brian. Signs of Life: How Complexity Pervades Biology. New York: Basic Books, 2000.
Stengers, Isabelle. La vie et l'artifice: visage de l'émergence. Paris: La Découverte, 1997.
Thompson, D'Arcy Wentworth. On Growth and Form (1916). New York: Dover, 1992.
Waldrop, M. Mitchell. Complexity: The Emerging Science at the Edge of Order and Chaos. New York: Simon and Schuster, 1992.
Wicken, J. S. Evolution, Information, and Thermodynamics: Extending the Darwinian Paradigm. New York: Oxford University Press, 1987.
Wiener, Norbert. God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion (1964). Reprint, Cambridge, Mass.: MIT Press, 1990.
Wimsatt, William C. "Complexity and Organization." In Topics in the Philosophy of Biology, eds. Marjorie Grene and Everett Mendelsohn. Dordrecht, Netherlands: Reidel, 1976.
Niels Henrik Gregersen
A complex is a group of partially or totally unconscious psychic contents (representations, memories, fantasies, affects, and so on), which constitutes a more or less organized whole, such that the activation of one of its components leads to the activation of others.
Freud did not coin the term "complex." At the end of the nineteenth century, it was occasionally used in psychiatry to designate a collection of ideas belonging to a subject. Freud used it in this sense in 1892 in a sketch written in preparation for his "Preliminary Communication." He wrote that in hysteria, "the content of consciousness easily becomes temporarily dissociated and certain complexes of ideas which are not associatively connected easily fly apart" (1940-41, p. 149). Shortly after, he used the term again in Studies on Hysteria (1895d), specifically in the case of Emmy von N. Josef Breuer, coauthor of the Studies, wrote, "It is almost always a question of complexes of ideas, of recollections of external events and trains of thought of the subject's own. It may sometimes happen that every one of the individual ideas comprised in such a complex of ideas is thought of consciously, and that what is exiled from consciousness is only the particular combination of them" (1895d, p. 215n).
In the ensuing years, the idea that at the heart of a neurosis there was a collection of ideas and affects specific to the subject and organized around a traumatic sexual experience became central to the development of psychoanalysis—even though subsequently Freud rarely used the term "complex" to designate such a set of ideas. He did add an essential modification to the previous psychiatric conception in positing that, for the most part, such a collection is made up of unconscious processes and remains unconscious itself.
In 1906 he explicitly discussed the term "complex" for the first time in an article on "Psycho-Analysis and the Establishment of Facts in Legal Proceedings" (1906c). He paid homage to Eugen Bleuler, and more particularly to Carl Gustav Jung, whom he had just met, and praised the method of "word association," which was developed by Wilhelm M. Wundt and practiced by Jung. This experimental method consisted of giving a subject a series of "stimulus words" and asking the subject to react as quickly as possible. The time that it takes the person to respond and the nature of their response are assumed to indicate a "complex." Freud, in this work, defined a complex as "ideational content" charged with affect (p. 104).
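The scoring logic of the association experiment described above (a markedly delayed reaction time taken as pointing to a complex) can be sketched in a few lines. The stimulus words, timings, and factor-of-two threshold below are hypothetical illustrations, not Jung's or Wundt's actual protocol:

```python
# Sketch of word-association scoring: reaction times far above the
# subject's typical time are flagged as possible complex indicators.
# Stimulus words, timings, and the factor-of-2 threshold are invented
# for illustration; they are not from the historical experiments.
from statistics import median

def flag_complex_indicators(responses, factor=2.0):
    """Return stimulus words whose reaction time exceeds factor * median time."""
    typical = median(t for _, t in responses)
    return [word for word, t in responses if t > factor * typical]

# (stimulus word, reaction time in seconds) -- invented data
trials = [("table", 1.1), ("water", 0.9), ("father", 4.8),
          ("green", 1.0), ("ship", 1.2)]
print(flag_complex_indicators(trials))  # ['father']
```

Comparing each response to the subject's own median time, rather than to a fixed limit, mirrors the idea that the delay is meaningful only relative to that person's normal associative speed.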
From then on, he used the term frequently to designate the "nuclear complex of neurosis," that is, "the father complex" (1909d, p. 208n; p. 200ff.), which he designated as the "Oedipus complex" starting in 1910 (1910h, p. 171). Similarly, he began to speak of the "castration complex" (1909b, p. 8).
After he split with Jung, Freud withdrew the praise that he had previously bestowed upon him. Thus he wrote, in his History of the Psycho-Analytic Movement (1914d), that the "theory of complexes" proposed by Jung did not actually attain the status of a theory and had not "proved capable of easy incorporation into the context of psycho-analytic theory" (p. 29), even though the term itself had become common in psychoanalytic usage. In other words, Freud adopted the term to give meaning to his own metapsychological constructions, but rejected the theory of Jung himself.
The following points should be emphasized:
- There is an obvious difference between the popular use of the term "complex" in contemporary culture and its more specific usage in the psychoanalytic literature. In this regard, what Freud described in 1914 remains the same today: "None of the other terms coined by psycho-analysis for its own needs has achieved such widespread popularity or been so misapplied to the detriment of the construction of clearer concepts" (1914d, pp. 29-30). In contemporary psychoanalytic writings, the term is hardly ever encountered anymore except in two closely connected situations: references to the Oedipus complex and the castration complex.
- As surprising as it may seem, there has been scarcely any coherent theoretical reflection on the notion of the "complex" as such, except insofar as it is related to other terms used to designate an organized set of mental processes and activities ("structure" and "system," for example). The difficulty arises from the need to distinguish and yet coordinate two different levels. The first describes the structure of the psyche as being the same, at least in its broad outlines, in every human being; such features would be, at least in theory, constitutive of the psyche itself (this is the case with the Oedipus complex and its corollary, the castration complex). The other level is that of individual variations, that is, the particularities of such a fundamental structure taken as a function of personal history, of imagos, of the play of identification, etc. The study of such particularities is the very object of psychoanalytic treatment. But the temptation to group complexes into "families" led over time to the proliferation of "new complexes," generally named after mythological figures. There was the "Electra complex," the supposed feminine version of the Oedipus complex; the "Jocasta complex," which designated the maternal counter-Oedipus; and even the "Ajase complex," which referred to the guilt that is linked in Japanese culture to the desire to kill the mother (Kosawa, 1931/1954). Thus there is a danger of falling into a purely descriptive typology in which the coherence of the Freudian metapsychology disappears and its explanatory power is lost. But in fact, not one of these proposed complexes has survived.
- Insofar as it relates to a fundamental structure, a complex is in itself not characteristic of this or that neurosis. Only its functionally disruptive manifestations and fixations can rise to the level of pathology.
In the definitions given above, a complex is "a group of ideas." Josef Breuer correctly noted that these ideas could be or could become conscious, but that what is "exiled from consciousness" is their "particular combination." However, we cannot remain at the level of ideas in the strict sense: memory traces, fantasies (at every level, from conscious to unconscious), and imagos, for example, all enter into this "combination." Moreover, what accounts for the effect of the complex is its quota of affect, and also its drive force. Thus, the study of an individual complex in the treatment leads to a topographical consideration of all the related defenses and the retroactive reworkings that combined to set up a functional structure of this kind.
See also: Compensation (analytical psychology); Complex (analytical psychology); Ego (analytical psychology); Femininity; Nuclear complex; Oedipus complex; Oedipus complex, early; Projection and "participation mystique" (analytical psychology); Psychology of Dementia Praecox; Word association.
Freud, Sigmund. (1906c). Psycho-analysis and the establishment of facts in legal proceedings. SE, 9: 97-114.
——. (1909b). Analysis of a phobia in a five-year-old boy. SE, 10: 1-149.
——. (1909d). Notes upon a case of obsessional neurosis. SE, 10: 151-318.
——. (1910h). A special type of choice of object made by men. SE, 11: 163-175.
——. (1914d). On the history of the psycho-analytic movement. SE, 14: 1-66.
——. (1940-41). Sketches for the "Preliminary Communication" of 1893. SE, 1: 147-154.
Freud, Sigmund, and Breuer, Josef. (1895d). Studies on Hysteria. SE, 2: 48-106.
Kosawa, Heisaku. (1954). Two kinds of guilt feelings: the Ajase complex. Japanese Journal of Psychoanalysis, 11. (Original work published 1931)
Complex (Analytical Psychology)
A complex is the more or less repressed standardization of emotionally strong conflictual experiences. When these experiences are triggered, either by certain themes (such as new pieces of information) or by emotions (in which case they are called "constellations"), the complex produces a reaction, such that the individual perceives the situation in terms of the complex (with a distortion of perception) and responds with an emotional overreaction, which mobilizes the processes of stereotyped defense.
Carl Gustav Jung developed his concept of the complex at the same time as he was engaged in his experiments with association. It is within this context that the concept appeared for the first time, in 1904, in an essay called "Experimentelle Untersuchungen über Assoziationen Gesunder" ("The associations of normal subjects," with Franz Riklin). But he had already used the term, without any particular specificity, in his thesis of 1902.
When, at the turn of the century, Jung and Riklin eagerly turned to research on association in order to construct typologies, they studied what they considered normal disturbances of experience. They showed that a test subject could not uniformly form associations with ideas that were attached to highly emotionally charged experiences and personal difficulties. They went on to hypothesize that such complexes might constitute the background of consciousness, and that in any neurosis of psychical origin there would be a complex characterized by a particularly strong emotional charge.
Later, in 1907, Jung established that any event charged with affect gives rise to a complex and reinforces those that are already in place. Complexes act from the unconscious and can at any moment either inhibit or, on the contrary, activate conscious behavior. They reveal conflicts, but are also defined by Jung as crucial hot points of psychic life.
In 1934, Jung summarized his theory of complexes and emphasized that, even outside the effects of any individual constellation, complexes involve the active forces that determine the interests of everyone and thus serve as the basis for symbol formation. This conception of complexes, which he continued to develop afterwards, led him to emphasize their creative effects. From a therapeutic perspective, this is an important aspect of his psychology and his clinical work. From it he developed the idea of promoting creative development through the integration of complexes. This idea plays a large role in many of the techniques developed by the Jungian school. Finally, it is from this insight that Jung came to see archetypes at the heart of complexes.
The experiments in association, as well as the concepts of the ego-complex, of the symbol and the archetype, imagination and emotion, and transference and counter-transference, all refer to Jung's idea that the complex is caused by the painful confrontation of the individual with the "necessity to adapt." Thus the very concept of complexes takes on an even more dynamic dimension: each one appears as an effect of the condensation and generalization of experiences that might, at any moment, be associated by analogy with a new piece of information or emotion. This is why the concept takes on decisive importance for understanding what is at play in the transference and the counter-transference.
See also: Castration complex; Dead mother complex; "On the History of the Psychoanalytic Movement"; Libidinal development; Ethnopsychoanalysis; Identification; Imago; Masculine protest (analytical psychology); Penis envy; Phallus; Primal fantasies; Primitive horde; Psychanalyse et Pédiatrie (psychoanalysis and pediatrics); Psychoanalysis of Fire, The; Repression; Sexual differences; Structural theories; Word association; Word-Presentation.
Jung, Carl Gustav. (1902). On the psychology and the pathology of the so-called occult phenomena. In Coll. works, vol. I. London: Routledge & Kegan Paul, 1957.
——. (1904). The associations of normal subjects. In Coll. works, vol. II. London: Routledge & Kegan Paul.
——. (1907). The psychology of dementia præcox. In Coll. works, vol. III. London: Routledge & Kegan Paul.
——. (1934). A review of the complex theory. In Coll. works, vol. VIII. London: Routledge & Kegan Paul.
Kast, Verena. (1992). The Dynamics of Symbols: Fundamentals of Jungian Psychotherapy. (Susan A. Schwarz, Trans.). New York: Fromm International Publishing Corporation.
A complex is a species in which a central atom is surrounded by a group of Lewis bases covalently bonded to it. These Lewis bases are generally referred to as ligands. Complexes are so named because, when first studied, they seemed unusual and difficult to understand. Complexes are formed primarily by the transition metals, and their most observable property is their vivid color. The color of a transition metal complex depends on the identity and oxidation state of the central atom and on the identity of the ligands.
Acids and bases were originally defined by Swedish physical chemist Svante August Arrhenius (1859–1927) as substances that, in water, release protons (positively charged hydrogen ions) and hydroxide ions (consisting of one hydrogen atom bonded to one oxygen atom and having an overall charge of minus one), respectively. Subsequently, Danish physical chemist Johannes Nicolaus Brønsted (1879–1947) and English physical chemist Thomas Martin Lowry (1874–1936) redefined acids and bases as proton donors and proton acceptors, respectively. This broadened definition made it possible to include substances that were known to behave as bases but did not contain the hydroxide ion. Much later, American chemist Gilbert Lewis (1875–1946) defined acids as substances that can accept an electron pair and bases as substances that can donate an electron pair; this is currently the broadest definition of acids and bases, as it includes substances that have neither protons nor hydroxide ions. Thus, the Lewis bases, or ligands, in complexes have electron pairs they can share with the central atom. Covalent bonds are bonds in which a pair of electrons is shared between two atoms, as opposed to ionic bonds, in which one atom more or less appropriates the electron(s), acquiring a negative charge, while the other loses them, acquiring a positive charge.
Transition metals are the elements that appear in the central block of the periodic table (atomic numbers 21 to 30 and the elements below them in the same columns). The transition metals can adopt different oxidation states (a formal count of the electrons an atom has lost or gained in bonding). A complex having the same ligands and the same central atom but a different oxidation state will have a different color. A complex with the same ligands and the same oxidation state but a different central atom will have a different color. Similarly, a complex with the same central atom and oxidation state but different ligands will have a different color.
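The oxidation state that helps determine a complex's color is simple charge bookkeeping: the overall charge of the complex minus the sum of the ligand charges. A minimal sketch of that arithmetic, using standard textbook iron complexes that are not drawn from this entry:

```python
# Oxidation-state bookkeeping for a coordination complex:
# oxidation state of the central atom = overall charge of the complex
# minus the sum of the ligand charges. The iron complexes below are
# standard textbook examples (assumptions, not from this entry).

def oxidation_state(complex_charge, ligand_charges):
    """Return the formal oxidation state of the central atom."""
    return complex_charge - sum(ligand_charges)

# [Fe(CN)6]3-: six cyanide ligands, each carrying charge -1
print(oxidation_state(-3, [-1] * 6))  # 3 -> iron(III)

# [Fe(CN)6]4-: same ligands, different overall charge (and a different color)
print(oxidation_state(-4, [-1] * 6))  # 2 -> iron(II)

# [Fe(H2O)6]2+: six neutral water ligands
print(oxidation_state(+2, [0] * 6))   # 2 -> iron(II)
```

The first two examples show the pattern described in the text: the same central atom and ligands, but a different oxidation state, hence a different color.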
A number of biologically important molecules are dependent on the presence of transition metals, and the biological role of the transition elements usually depends on the formation of complexes. For instance, hemoglobin is an iron complex that is important in the transport of oxygen in the body. Chromium is a part of the glucose tolerance factor that, along with insulin, controls the removal of glucose from the blood. More than 300 enzymes contain zinc, one of them being the digestive enzyme that hydrolyzes protein. In addition, many synthetic dyes and pigments are transition metal complexes, such as Prussian blue. Transition metal complexes are also used as catalysts in many important industrial processes, such as the formation of aldehydes from alkenes, the extraction of gold from ore, and the purification of nickel.
com·plex • adj. /kämˈpleks; kəmˈpleks; ˈkämˌpleks/ 1. consisting of many different and connected parts: a complex network. ∎ not easy to analyze or understand; complicated or intricate: a complex personality. 2. Math. denoting or involving numbers or quantities containing both a real and an imaginary part. 3. Chem. denoting an ion or molecule in which one or more groups are linked to a metal atom by coordinate bonds. • n. /ˈkämˌpleks/ 1. a group of similar buildings or facilities on the same site: a new apartment complex. ∎ a group or system of different things that are linked in a close or complicated way; a network: a complex of mountain roads. 2. Psychoanalysis a related group of emotionally significant ideas that are completely or partly repressed and that cause psychic conflict leading to abnormal mental states or behavior. ∎ inf. a disproportionate concern or anxiety about something. 3. Chem. an ion or molecule in which one or more groups are linked to a metal atom by coordinate bonds. ∎ any loosely bonded species formed by the association of two molecules: cross-linked protein–DNA complexes. • v. /kämˈpleks; kəmˈpleks; ˈkämˌpleks/ [tr.] (usu. be complexed) Chem. make (an atom or compound) form a complex with another. ∎ [intr.] form a complex. DERIVATIVES: com·plex·a·tion /kämˌplekˈsāshən; kəm-/ n. (Chem.) com·plex·ly adv.
complexity, in science, field of study devoted to the process of self-organization. The basic concept of complexity is that all things tend to organize themselves into patterns, e.g., ant colonies, immune systems, and human cultures; further, they go through cycles of growth, mass extinction, regeneration, and evolution. Complexity looks for the mathematical equations that describe the middle ground between equilibrium (see statics) and chaos (see chaos theory), such as the interplay between supply and demand in an economy or the relationship among living organisms in an ecosystem.
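The middle ground between equilibrium and chaos that this entry describes can be seen in miniature in the logistic map, a standard one-line nonlinear equation from chaos theory (an illustration chosen here, not one the entry itself gives):

```python
# The logistic map x -> r*x*(1-x): one simple nonlinear equation whose
# long-run behavior ranges from equilibrium through cycles to chaos as
# the parameter r grows. (Standard chaos-theory example; the specific
# parameter values below are illustrative choices.)

def logistic_tail(r, x0=0.2, n=500, keep=4):
    """Iterate the logistic map n times, then return the next `keep` values."""
    x = x0
    for _ in range(n):
        x = r * x * (1 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1 - x)
        tail.append(x)
    return tail

print(logistic_tail(2.8))  # settles to a single equilibrium value (~0.643)
print(logistic_tail(3.2))  # alternates between two values: a stable cycle
print(logistic_tail(3.9))  # never settles: deterministic chaos
```

Sweeping r between the cyclic and chaotic regimes passes through the kind of structured-but-not-static behavior that complexity research takes as its subject.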
Complexity theory had its beginnings with American mathematician Norbert Wiener's development of cybernetics, Austrian-born biologist Ludwig von Bertalanffy's development of general system theory, and American computer scientist John H. Holland's development of a computerized artificial life simulation. More recent efforts are centered at the Santa Fe Institute in New Mexico, which was established in 1984, and are found in the work of multidisciplinary researchers such as American economist Kenneth Arrow and American physicist Murray Gell-Mann. Because complex systems typically cross the boundaries of traditional disciplines, the study of complexity is an interdisciplinary science. Much of the progress in the field can be attributed to advances in nonlinear dynamics, in the power of computers and in computer graphics, and in adaptive programs and fuzzy logic.
See M. M. Waldrop, Complexity: The Emerging Science at the Edge of Order and Chaos (1992); R. Lewin, Complexity: Life at the Edge of Chaos (1993); J. H. Holland, Hidden Order (1995).