2.1 Causality - determinism

Classical physics was based on the assumption that the future is determined by the present and that, consequently, an accurate knowledge of the present can reveal the future. This belief in a fundamental, limitless predictability was expressed perhaps most elegantly and comprehensibly by Laplace in 1814 through his fictional demon:

"Une intelligence qui pour un instant donné, connaîtrait toutes les forces dont la nature est animée, et la situation respective des êtres qui la composent, si d'ailleurs elle était assez vaste pour soumettre ces données à l'analyse, em­brasserait dans la même formule, les mouvements des plus grands corps de l'univers et ceux du plus léger atome: rien ne serait incertain pour elle, et l'avenir comme le passé, serait présent à ses yeux." (Laplace, 1814, p. 2)

Anyone expressing the certitude of natural laws so pointedly, who indeed interprets causality so narrowly as to mean determinism, is asking to be contradicted, particularly at a time when the consequences of quantum theory and the development of atomic physics affect not only philosophy but our very thought processes.

This confrontation makes it clear how generation-dependent the principle of causality, or the law of cause and effect, is and how it changes in the course of the evolution of thinking and language. We must conclude that the concept of causality and our understanding of it are linked to the concepts, and the meanings associated with them, prevailing at any one time. Formulations such as causality, causal law, causal explanation or causal principle remain confusing as long as their meanings are not made transparent. In this connection, it appears helpful to begin with the historical development of these concepts.

The fact that causality sets up a link between cause and effect is, historically speaking, relatively recent. For Aristotle, the originator of many scientific disciplines, the word causa had a much more general meaning than it has nowadays. He expressed his thoughts in the two pithy concepts "τὸ αἴτιον καὶ τὸ αἰτιατόν" (the cause and the caused). With direct reference to Aristotle, scholasticism, the philosophy of the Middle Ages, taught that there are four types of cause: 1. causa materialis (= material cause); 2. causa formalis (= formal cause); 3. causa finalis (= final cause); 4. causa efficiens (= effecting cause). In addition, Aristotle differentiated between internal and external causes, the first two mentioned above being internal, the second two external. The one we roughly associate with the word "cause" nowadays is the causa efficiens, the initiating or effecting cause.

With the emergence of the Renaissance and the birth of modern scientific thinking - usually associated with names such as Nicolaus Copernicus, Galileo Galilei and Johannes Kepler - a change in the concept of causa also came about. As man turned away from metaphysics and towards physics, i.e. towards quantification and measurement, the word causa came to be linked with the material event which preceded the phenomenon to be explained and somehow caused it. At first, it was held that a natural phenomenon was explained once its cause had been discovered; in this sense, cause was synonymous with explanation. Gradually, however, the understanding spread that it is the natural laws, the interconnection of cause and effect, which uniquely determine and thus explain natural processes. The culmination of this development was the epoch-making work of Isaac Newton. Immanuel Kant, who in his "Critique" equated science as such with the science of Newton, found a formulation for causality which was still valid in the 19th century: "When we experience that something happens, we always presuppose that something precedes it, upon which it follows according to a rule."

Einstein praised Newton's outstanding intellectual achievement as "perhaps the greatest progress in thinking that a single individual ever had the privilege to accomplish". His most eminent achievement was his comprehensive mathematical theory of mechanics, which remained the basis of scientific thinking until well into the 20th century. It is hardly surprising that his general laws of motion - obeyed by all objects in the solar system, from the apple falling from the tree to the planets - and their universal applicability fed the expectation that processes in nature are in principle uniquely determined, provided they are known, in whole or in part, through careful study. Newton's universe was a grand mechanical system which functioned in accordance with exact deterministic laws, so that the future motion of a system could be calculated in advance from the state of this system at any given time. If one is convinced that nature fundamentally behaves in this way, the next logical step is the one Laplace formulated for the fictional supernatural intelligence quoted above: the entire future as well as the past of our universe is calculable from the precise knowledge of the positions and velocities of all atoms at any one instant.
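In modern notation (our illustration; neither Newton nor Laplace wrote it this way), this deterministic world view can be condensed into a single initial-value problem:

\[ m\,\ddot{\mathbf{x}}(t) = \mathbf{F}\bigl(\mathbf{x}(t),\dot{\mathbf{x}}(t),t\bigr), \qquad \mathbf{x}(t_0)=\mathbf{x}_0, \quad \dot{\mathbf{x}}(t_0)=\mathbf{v}_0. \]

For sufficiently smooth forces \(\mathbf{F}\), the solution of this problem exists and is unique; the positions and velocities at one single instant therefore fix the entire past and future trajectory. This is the precise mathematical sense behind Laplace's claim.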

This concept of pre-calculability for all eternity is not consistent with 20th century physics, and particularly not with atomic physics. It is not that physics stands in fundamental opposition to man's longing for predictability, but that the two, atomic physics and determinism, are incompatible with one another in their philosophical approaches.

The atomistic ideas of Democritus and Leucippus handed down from classical antiquity assume that regular, ordered processes on the global level derive their morphology from irregular, random behaviour on the local level. Such considerations are indeed plausible, as numerous examples from everyday life substantiate. A fisherman, for example, fighting against wind and waves, only has to ascertain the rhythm of the waves in order to be able to react to them; he does not have to know the motion of each individual drop of water. If we explain the processes we can perceive with our senses by the interaction of many individual processes on the local level, is it not an inevitable consequence to regard the regularities of nature as statistical? Statistical laws can indeed provide statements of a very high degree of probability, bordering on infallibility; yet there will never be complete certainty, since exceptions are in principle always possible. Moreover, statistical laws only apply to a phenomenon as a whole and never to a single manifestation; the latter will always remain indeterminate.

In spite of such conceptual difficulties, we continually set up statistical laws in day-to-day life as a basis for our practical actions. Engineers, for example - whether engaged in the design of aircraft, buildings or machines - cannot base their work on precise loads or material data. They have to rely as a matter of course on mean, that is to say statistical, characteristic values. Nevertheless, when our attention is drawn to such "semi-exact" regularities, we feel uneasy and consider them less trustworthy. We would prefer either precisely definable processes in nature or, at the other extreme, chaotic, totally irregular ones. Is there an explanation for such an attitude? One generally uses statistical laws when the physical system in question is only partially known. The simplest example of this is the game of heads or tails. Since neither side of the coin has an advantage over the other, we have to come to terms with the fact that - over a large number of games - we can only predict either of the two results with 50% certainty. Games of dice are similar, except that the number of possible results increases to six. The likelihood of predicting a chosen number is thus reduced to 1/6. If the die is thrown often enough, the number of throws resulting in a 1 is approximately one-sixth of all the throws. This is, of course, only true if we assume a perfect die and an identical throwing technique; only then can all six possible results be considered equally probable. What we assume in practice is that these conditions are satisfied approximately.
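The dice argument can be made tangible with a few lines of code. The following sketch (ours, in Python, with arbitrarily chosen numbers of throws) merely restates the assumption of a perfect die: the relative frequency of a chosen face approaches 1/6 as the number of throws grows, while any single throw remains unpredictable.

    import random

    def relative_frequency(face, throws, seed=0):
        """Fraction of simulated throws of a fair die that show 'face'."""
        rng = random.Random(seed)
        hits = sum(1 for _ in range(throws) if rng.randint(1, 6) == face)
        return hits / throws

    # The relative frequency of a 1 approaches 1/6 = 0.1666... as the
    # number of throws grows; a statistical law, never a certainty for
    # any individual throw.
    for n in (60, 6000, 600000):
        print(n, relative_frequency(face=1, throws=n))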

In modern times, it was Robert Boyle who took up this idea from classical antiquity, not only describing the material behaviour of a gas on the macroscopic level of observation qualitatively as the result of the statistical behaviour of its molecules, but also quantifying the well-known relationship between pressure and volume. His idea was that the pressure, a quantity measurable on the macroscopic level, is built up by the numerous impacts of molecules on the wall of the vessel; this interpretation was a remarkable brainwave at the time. In a similar way, the laws of thermodynamics could be conceived once it proved possible to state in a mathematically precise form the fact that atoms move more violently in hot bodies than in cold ones.
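For reference, the relationship Boyle quantified, together with its kinetic-theory interpretation, can be written as follows (standard textbook formulas, not taken from Boyle's own writings): at constant temperature,

\[ pV = \text{const}, \qquad p = \frac{1}{3}\,\frac{N}{V}\,m\,\langle v^2 \rangle, \]

where the second expression makes Boyle's brainwave explicit: the macroscopic pressure \(p\) of \(N\) molecules of mass \(m\) in a volume \(V\) is nothing but the averaged effect of their impacts on the walls, governed by their mean squared velocity \(\langle v^2 \rangle\).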

While Laplace raised hopes that one day the whole world could, in principle, be calculated, in the second half of the 19th century the idea spread that Newton's mechanics might well be valid without restriction, but that the systems of the kinetic theory of gases could never be completely determined owing to the immense number of gas molecules. It was mainly Josiah Willard Gibbs and Ludwig Boltzmann who put this incomplete knowledge of such systems into mathematical language by means of statistical laws. Gibbs went one step further by introducing, for the first time, a physical concept - temperature - which only makes sense when the knowledge of the system is incomplete. If the velocity and the position of all the molecules in a gas were indeed known, it would be completely superfluous, or rather meaningless, to speak of the temperature of the gas.
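A standard relation of the kinetic theory of gases (a textbook result, not a formula we quote from Gibbs) illustrates this point: for a monatomic ideal gas,

\[ \tfrac{3}{2}\,k_B T = \bigl\langle \tfrac{1}{2}\,m v^2 \bigr\rangle, \]

i.e. the single macroscopic number \(T\) stands for nothing more than the mean kinetic energy of the molecules. It replaces the roughly \(10^{23}\) individual positions and velocities and is meaningful precisely because they are not known.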

The concept of temperature can only be used meaningfully when the system on the microscopic level of observation is incompletely known and one nevertheless does not want to forgo a "qualitative" statement on the macroscopic level. Using such a concept, one does not describe the behaviour of a system by taking an ever-increasing number of degrees of freedom into account - in the ideal case, infinitely many - but by proceeding to new, essential and thus considerably fewer degrees of freedom on a more general level of observation. The motto is not "more precise, more detailed and infinitely many" but "more global, fewer and nevertheless informative". In this highly simplified statement, we share the view of Max Born (1959), who says that absolute accuracy is not a physically meaningful concept and can only be found in the conceptual world of mathematicians. Felix Klein called for the application of "approximation mathematics" side by side with the usual "precision mathematics". Since his suggestion remained without response at the time, the physicists at the turn of the century solved their problems within their own conceptual framework, using methods of probability and statistical laws. What Max Born's statement basically implied was that non-linear laws and deterministic equations may supply unpredictable answers. At the same time, he in effect pointed out that non-linear equations often react with unexpected sensitivity to the slightest changes in the initial conditions and may thus suddenly supply unexpectedly differing answers. This was a revolutionary perception, first expressed by Poincaré at the turn of the century but unnoted for a long time:

"Une cause très petite, qui nous échappe, détermine un effet considérable que nous ne pouvons pas ne pas voir, et alors nous disons que cet effet est dû au hasard. Si nous connaissions exactement les lois de la nature et la situation de l'univers à l'instant initial, nour pourrions prédire exactement la situation de ce même univers à un instant ultérieur. Mais, lors même que les lois naturelles n'auraient plus de secret our nous, nous ne pourrons connaître la situation initiale qu'approximativement. Si cela nous permet de prévoir la situation ultérieure avec la même approximation, c'est tout ce qu'il nous faut, nous disons que le phénomène a été prévu, qu'il est régi par les lois; mais il n'en est pas toujours ainsi, il peut arriver que de petites différences dans les conditions initiales en engendrent de très grandes dans les phénomènes finaux; une petite erreur sur les premières produirait une erreur énorme sur les derniers. La prédiction devient impossible et nous avons le phénomène fortuit." (Poincaré, 1908)

In the first two decades of the 20th century, the attempt was initially made to explain the motion of atoms and molecules according to the basic precepts of classical mechanics, in the spirit of Newton. The result, however, was an entanglement of inextricable contradictions. For example, in accordance with classical physics, a charged electron orbiting the atomic nucleus was supposed to emit radiation constantly until, owing to its loss of energy, it collapsed into the nucleus. The electron paths devised in this model were thus unstable. It was obvious to Niels Bohr that, on the basis of Planck's theory, the paths of the electrons are stationary, that the electron, as long as it keeps to its path, does not emit any radiation, and that a change of path is accompanied by a loss of energy. Bohr resolved this contradiction not by changing his concept of the model, as might have been expected, but rather by maintaining that classical physics is not applicable to the description of the dynamic behaviour within the atom. That the loss of energy, which takes place in irregular bursts when an electron changes its path, should lead to the assumption that the radiation emission of atoms is a statistical phenomenon was acceptable. It is also possible to tolerate, at least temporarily, a new type of mechanics - quantum mechanics - albeit at the expense of determinacy, if the stability of the atoms can thus be ensured mathematically. But the bold assertion that it is fundamentally impossible to know all the defining elements necessary for a complete determination of the processes - this even Albert Einstein could not and would not accept as true, not even for atomic phenomena. For him, quantum theory was only a temporarily necessary instrument; it had emerged from our lack of knowledge of all the canonical variables of the atomic process and could be discarded as soon as all these unknowns had been clarified. His opinion of quantum mechanics culminated in the statement, "God does not play dice", to which Niels Bohr replied, "It would be presumptuous of us human beings to prescribe to the Almighty how he is to take his decisions."

What is it about quantum theory that is so challengingly unfamiliar? It is the fact that the concept of the trajectory has been banished from the mechanical theory of the atomic shell and replaced by irreducibly probabilistic elements. This was imperative in order to ensure, in a mathematical formulation for systems such as the atom, the discrete energy levels whose existence had been revealed by spectroscopy. It was Werner Heisenberg who took this radical step in 1925 in his decisive paper "Über quantentheoretische Umdeutung kinematischer und mechanischer Beziehungen" (Heisenberg, 1925). Here, the canonical variables position and momentum become non-commuting quantities; Max Born and Pascual Jordan recognised their matrix character. Although the fundamental features of this mathematical theory had already been set up, Heisenberg described the problem of atomic physics in a letter to Pauli in 1926 as completely unsolved. What caused him to make this negative statement? What was still missing was the link to the physical experiment (Heisenberg, 1969). The solution was provided, again by Heisenberg, with the indeterminacy relation of 1927 (Heisenberg, 1927). Position and velocity, the classical quantities for the determination of a particle trajectory, can each be stated individually with arbitrary accuracy; to state both simultaneously with arbitrary accuracy, however, is impossible, at least in the microworld of atomic physics. This is not an assumption of quantum mechanics but a consequence of its laws. It was thus clear that Newton's mechanics, which assumes the exact knowledge of both position and velocity in order to calculate a mechanical course of motion, is not applicable to the atomic world. Although fifty years have passed since Bohr, Heisenberg, Born and others came to the conclusion that quantum theory forces us to formulate the laws on the quantum level as statistical laws and to bid farewell to determinism, it is difficult to incorporate this into our general philosophy. For the atomic domain, this call to abandon pure determinism may be valid. But to speak of absolute randomness and to assert - merely on account of spontaneous nuclear disintegration, for which there is neither cause nor explanation - that the classical idea of predictability, so successful in its search for order, regularity and natural laws, is invalid in principle: this is something we still cannot and do not want to bring ourselves to accept.
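The indeterminacy relation mentioned above is nowadays usually written in the form later derived by Kennard (standard notation, not a quotation from the 1927 paper):

\[ \Delta x \, \Delta p \;\ge\; \frac{\hbar}{2}, \]

where \(\Delta x\) and \(\Delta p\) are the standard deviations of position and momentum and \(\hbar\) is the reduced Planck constant. Since a classical trajectory requires both quantities to be sharp simultaneously, this bound on their product is the quantitative expression of the ban on trajectories.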

In contrast to this, it should be remembered that classical physics tacitly complements the principle of "identical" causality with the principle of "similar" causality. Laplace's principle, "identical causes have identical effects", is extended by the principle "similar causes have similar effects". The reason is that, although it is a justifiable expectation to demand exactly identical causes so as to guarantee the reproducibility of physical processes, this is not feasible in practice. Every experimenter knows that, however much he is at pains to obtain the same results for repeated measurements under the same test conditions, exactly identical repetitions of the test conditions are fundamentally impossible. The accuracy of measurements is limited, even though it may be extremely high with today's technical sophistication. As long as the deviations in the measured results are of the same order as the inaccuracies of the experimental set-up, the results are not exactly the same, but similar. In spite of the statistical laws which we take as the basis of our experimenting, we still speak of the reproducibility of physical behaviour. In view of this discrepancy between abstract mathematical precision and unavoidable physical approximation, Max Born (1959) demands a reformulation of the question of determinability in mechanics and, in its place, a differentiation between stable and unstable motions.

Heisenberg's and Born's renunciation of the dogma of predictability may be valid for the field of atomic physics, but it appears strange on the macroscopic level which is directly accessible to our senses. We are, of course, aware that in weather forecasting we have to rely on probability estimates, although the motion of the earth's atmosphere follows exactly the same physical laws as the motion of the planets. The weather thus retains a random or stochastic element as long as the definite connection between cause and effect remains unknown. Until not so very long ago, there was little reason to doubt that, in this case at least, accurate predictions would in principle ultimately be possible. In the spirit of Laplace's demon, it was assumed that one only needed to gather sufficient information about the system and to process it with the necessary effort. This expectation proved unfulfillable, however, owing to the extreme sensitivity of the system to small deviations in the initial conditions.

This mechanical conception of the world, first shaken by quantum mechanics, received a second blow as a result of an astonishing discovery: even simple non-linear systems can exhibit irregular behaviour. We have to come to terms with the idea that such random behaviour is an intrinsic reality, since it does not disappear even after more information has been gathered. Random behaviour which is generated by non-linear deterministic dynamics is called "deterministic chaos". That deterministic laws without any stochastic elements should cause chaos sounds as paradoxical as the concept of deterministic chaos itself. At this point, we would like to stress that there are various manifestations of chaotic behaviour which must be carefully distinguished. In a vessel filled with gas, the atoms fly around wildly and collide; in this case, as Boltzmann already established, microscopic chaos prevails. In contrast to this, macroscopic or deterministic chaos dominates when seemingly random oscillations occur although the laws describing the dynamics are deterministic.
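The customary quantitative criterion for deterministic chaos (standard in the dynamical-systems literature; it anticipates rather than quotes the discussion above) is a positive Lyapunov exponent. For a one-dimensional map \(x_{n+1} = f(x_n)\),

\[ \lambda = \lim_{n\to\infty} \frac{1}{n} \sum_{i=0}^{n-1} \ln \bigl| f'(x_i) \bigr|, \]

and a small initial error \(\delta_0\) grows on average like \(\delta_n \approx \delta_0\, e^{\lambda n}\). A positive \(\lambda\) thus characterises exactly the situation described above: deterministic laws whose predictions are exponentially sensitive to the initial data.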

We have repeatedly mentioned two diametrically opposed conceptions of scientific knowledge: on the one hand, the idea of the atomists, who stress random collisions; on the other, the mechanistic view of the world, which is based on timeless dynamical laws. Both conceptions fail when it is a question of explaining spatial and temporal structures such as those occurring in undamped oscillations. Equilibrium thermodynamics and conventional statistical physics do not provide the methods to deal with such oscillatory behaviour. A way out of this dilemma is, however, beginning to emerge.

What is this hope based on? On the one hand, on the study of the physics of non-equilibrium states; on the other, on the theory of dynamical systems. The physics of non-equilibrium states deals with systems far from thermodynamic equilibrium, where chaos prevails on the microscopic level but is completely concealed on the macroscopic level by well-organised patterns. In the second discipline, the theory of dynamical systems, it is the instabilities that play the central role. At a point of instability, the system must "choose" between various possibilities. A small, non-predictable fluctuation decides which course the dynamic process finally takes. This occurrence of unpredictability forces us to re-think our understanding of deterministic systems and the "long-term" prognoses derived from them.
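A minimal sketch of such a "choice" at an instability is provided by the textbook normal form of the pitchfork bifurcation (our illustration, not an equation taken from the disciplines named above):

\[ \dot{x} = \mu x - x^3. \]

For \(\mu < 0\) the rest state \(x = 0\) is stable; as \(\mu\) passes through zero it loses its stability and two new stable states \(x = \pm\sqrt{\mu}\) appear. Which of the two the system settles into is decided not by the deterministic law, which is perfectly symmetric, but by an arbitrarily small fluctuation at the moment of instability.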

Processes which are sustained by a continuous flow of energy, and possibly of matter, from outside and which are characterised by ordered, self-organised, collective behaviour are called evolutionary. A significant aspect of evolutionary processes is their causal coherence, even though they may be interspersed with random outbursts. Anyone would reject the idea of regarding the random outcomes of a lottery as an evolutionary development. Intuitively, we demand that the current state of a system be fundamentally moulded, if not determined, by the preceding causes. This does not mean, however, that this demand for causality totally excludes random occurrences; evolutionary processes without mutation, symmetry breaking etc. are unthinkable. Random occurrences reduce the chance of exact predictions, they cannot be specified by laws, and they weaken the deterministic net of cause and effect; yet it would be wrong to conclude that the pattern of reality is chaotic.

Thus, it is not only quantum physics but also chaos theory which throws considerable doubt on Laplace's demon of absolute determinism. "For some, this may be a disappointment. But perhaps a world in which not everything is determined and not everything is computable is actually more human: a world in which, thanks to quantum events, there is chance, and with it also luck; in which, because not all problems can be solved algorithmically, imagination and ingenuity, guessing and trying, creativity and originality are still in demand; and in which, as chaos theory shows, one can still meaningfully search for simple fundamental laws even in chaotic behaviour." (Vollmer, 1988, p. 350)

It is certainly a result of the historical development of the sciences that the unpredictable (and thus also the chaotic state) was first explicitly suspected, and then formulated mathematically, in the microworld of atomic physics. This does not mean that such phenomena do not appear in our daily macroscopic life. Chaotic responses of non-linear dynamical systems are almost a matter of course to us today. We draw attention to the appearance of such phenomena in a pendulum with a large amplitude, particularly in the case of spatial oscillations. The well-known Duffing equation is another example. A further historical example is the well-known inorganic chemical reaction of Belousov-Zhabotinsky, which offers a memorable display of colour.

Yet the most impressive, and still the most mysterious, is the onset and development of turbulence in a flowing medium. Research into this problem - which is highly important in technology and of decisive significance in meteorology and combustion processes - has inspired many famous physicists since Osborne Reynolds to look for a physical-mathematical solution. The great Werner Heisenberg also applied his artistic intuition to explaining this process (Heisenberg, 1948). In spite of admirable contributions by many scientists, not even he succeeded in disentangling the problem. However, he was the first to express the conjecture (correctly, as we now know) that while the standard Navier-Stokes equations are pertinent to research into the initiation of turbulence, this is not true, in their present form, for fully developed turbulence. Turbulence is, of course, also a type of chaotic motion. Indeed, in recent times, remarkable mathematical and physical progress has been achieved with the aid of chaos theory in research into the onset of turbulence in an initially laminar flow. For fully developed turbulence, however, today's chaos theory is still not sufficiently developed. In the light of these remarks, we shall attempt in Chapter 8 to elucidate at least the basics of the fundamental problem of the initiation of turbulence (Argyris et al., 1991, 1993).

Apart from the difficulties involved in an adequate comprehension of a complex process, there is a further obstacle that everyone breaking new ground must surmount, even - or especially - in science. No one but Werner Heisenberg could have expressed this so memorably: "When really new territory is entered, however, it can happen that not only new content has to be absorbed, but that the structure of one's thinking has to change if one is to understand the new. Evidently, many are not prepared or not able to do this." (Heisenberg, 1969, p. 102)