by Charles Eisenstein

The Age of Separation, the Age of Reunion, and the convergence of crises that is birthing the transition


The Quest for Certainty

 

Despite the elusiveness of a Theory of Everything, all seemed well with the project of perfect understanding and perfect control up through the early 20th century. The new laws of electricity and magnetism shared with Newtonian kinetics the key features of determinism, reductionism, and objectivity. Under these laws, all that is necessary to understand and predict any phenomenon is to add up all the forces on all the parts.

In the kinetics of Newton, if we know the initial state of a system (e.g., the initial velocity and angle of a cannonball, plus such details as air resistance if you want to get technical), we also know, with mathematical certainty, its state at any time in the future. The universe of classical science is itself such a system, composed of masses large and small governed by the same deterministic laws. If you can measure the state of anything in the universe accurately enough, you will know its future (and past) with perfect certainty. If you measure the state of everything in the universe, you know the future of the whole universe! Understanding of the whole comes through the measurement of its parts. Knowing the parts, you know the whole. This is the doctrine of reductionism, a key component of what Fritjof Capra called “the Newtonian World-machine”.
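
To make the promise concrete, here is a minimal sketch in Python (illustrative only; the numbers and function name are my own, not drawn from the text) of the kind of calculation Newtonian kinetics licenses: given nothing but the initial conditions of a cannonball, its position at any future moment follows with certainty.

```python
import math

def cannonball_position(speed, angle_deg, t, g=9.81):
    """Position (x, y) of an idealized cannonball t seconds after firing.

    Given only the initial conditions (muzzle speed and launch angle),
    classical kinematics fixes the state at every future time; air
    resistance is ignored here for simplicity.
    """
    angle = math.radians(angle_deg)
    vx = speed * math.cos(angle)   # horizontal velocity, constant
    vy = speed * math.sin(angle)   # initial vertical velocity
    x = vx * t
    y = vy * t - 0.5 * g * t**2    # gravity is the only force acting
    return x, y

# The same inputs always yield the same output -- the deterministic ideal.
print(cannonball_position(speed=100.0, angle_deg=45.0, t=3.0))
```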

Perfect deterministic, reductionistic knowledge suggests the possibility of perfect control.[11] Control the initial conditions of any system, and you control the outcome. If the outcome differs from expectations, that can only be because of some unknown variable or inaccuracy in the initial measurement—our knowledge is imperfect, or our control insufficiently precise.

The Technological Program depends on the principle of determinism. If nature is inherently unpredictable and mysterious, if it somehow eludes complete description in the form of numbers, if identical setups produce different results each time, then the goal of perfect control is unattainable. The Technological Program also depends on reductionism. An irreducible whole is essentially mysterious, impervious to analysis and only conditionally subject to control. If we can understand something in terms of the interactions of its parts, then we can engineer it by altering or replacing those parts. It is under our control.

Science is the logical extension of the labeling and numbering of the world described in Chapter Two. The original conceit was that by naming and measuring nature, we make it ours. Science fleshed out this primitive intuition by saying, in essence, that if we can name and measure everything, we can apply to everything the tools of mathematics and deductive logic. The final destination of language, the categorization and naming of everything in the universe, and of number, the measurement and quantification of everything in the universe, now manifests in science as the profusion of technical vocabulary that defines every field and in the program of quantification that has virtually usurped the “soft” sciences.

In the centuries after Newton, more and more fields of human inquiry aspired to that magisterial appellation, science. The category of science expanded to include biology, medicine, archeology, anthropology, economics, sociology, and psychology, progressively annexing the territory of the qualitative and the subjective. Chemistry was to be reduced to physics, biology to chemistry, the organism to the cells, the brain to the neurons, economics to individual behavior. The goal of all this was to ground all science upon the certainty of physics, expressed as a system of axioms and therefore borrowing its infallibility from mathematics.

To this day, every field of thought that presumes to the status of a science turns to mathematics for validation, either directly or by implying through hypertrophied technical jargon an exactitude of meaning that admits of the methods of deductive reasoning. We buy into this every time we succumb to the urge to start a paper by “stating our definitions” or “stating our assumptions”—a blatant imitation of the axiomatic method. The same reductionist mentality motivates our pedagogic emphasis on “analysis”, which literally means to cut apart. To analyze a situation is to break it down into its constituent pieces. That’s what we do collectively too, with the splintering of knowledge into fields, subfields, and specializations and subspecializations within these.

When outright reduction to something more physical was impossible, the aspiring field would imitate physics instead. Thus in the “social sciences” we hear constantly of “laws”—the laws of history, the laws of economics, the laws of human behavior—as well as psychological “tensions”, historical “forces”, economic “mechanisms”, and political “momentum”. These justify an engineering approach to human institutions and foster the illusion that they, too, might be understood—and eventually controlled—through the abstract methods of scientific reason. Even in fields where outright physicalism of metaphor is absent, the quest for certainty remains in the reductionist campaign to explain, in parallel with physics, the complex in terms of the simple, the measurable, the quantifiable, the controllable.

Witness economics, which applies the “law” of rational self-interest to its atomic units, individual human beings. Higher-order laws like that of supply and demand are akin to the laws of planetary motion, in that they derive from the lower-order laws. Once the axioms are set, everything else is just mathematics. Of course, success has been notoriously elusive in economics’ quest for mathematical certainty, judging by its inability to produce accurate predictions based on “initial conditions”. But economists do not therefore conclude that their subject is impervious to mathematical method; quite the opposite, they believe that the mathematization has not gone far enough. Better data, perhaps, or more precise characterization of various uncontrolled variables of human behavior, and economics would finally live up to its scientific pretenses.
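
As a hedged illustration of that “just mathematics” (a textbook toy model of my own, not an example from the text): posit linear supply and demand curves as the axioms, and the equilibrium price and quantity fall out by simple algebra.

```python
def equilibrium(demand_intercept, demand_slope, supply_intercept, supply_slope):
    """Solve Qd = a - b*p and Qs = c + d*p for the price where Qd == Qs."""
    price = (demand_intercept - supply_intercept) / (demand_slope + supply_slope)
    quantity = demand_intercept - demand_slope * price
    return price, quantity

# Toy numbers: demand Q = 100 - 2p, supply Q = 10 + 3p
print(equilibrium(100, 2, 10, 3))   # -> (18.0, 64.0)
```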

At the bottom of the entire reductionist program is nothing less than a Theory of Everything, a modern version of Newton’s Universal Law of Gravitation that would encompass all known forces. From this would be derived, progressively, all the higher-level laws of chemistry, biology, psychology, sociology, geology, all the way up to cosmology, so that every question would become a question of science, mathematically deducible from physical laws and data. This ambition was articulated very early on in the Scientific Revolution by Leibniz, who wrote, “If controversies were to arise, there would be no more need of disputation between two philosophers than between two accountants. For it would suffice to take their pencils in their hands, and say to each other, ‘Let us calculate.’”[12]

Of course, such a reduction of nature to mathematics is only as powerful and as reliable as the math that lies beneath it. During the Scientific Revolution mathematics was considered a source of infallible knowledge: Kant’s “necessary truths”, which could not be any other way, as opposed to the “contingent truths” of empirical observation, which one could coherently imagine being otherwise. (In other words, 2+2=4 is a necessary truth that could not be otherwise, while we could imagine a world with different laws of physics, which are thus contingent truths. Even here we find an unstated assumption of objectivity, that the laws of physics are separate from our selves that ponder those laws.) Because mathematics is the bedrock upon which the entire edifice of science rests, much energy in the early 20th century went into establishing mathematics on a firm axiomatic foundation. This program hit a brick wall in the 1930s with the work of Gödel, Turing, and Church, as described in Chapter Six, but like the rest of the Newtonian World-machine it still exerts its influence in the general unspoken assumption that the most reliable, most valid knowledge is that which can be put in the form of numbers.

The world of control that determinism opens up extends far beyond science and technology. In politics too, and indeed in personal life, control rests upon a similar foundation: the application of force based on precise data about the world. Accordingly, power-hungry politicians and totalitarian governments are obsessed with controlling the flow of information because, in their view, knowledge is power. The same goes for all controlling personalities. They want to be privy to inside information; they want to know your secrets; they hate it when something happens behind their backs.

That knowledge is equivalent to information is a direct consequence of the world-view that arose with Newtonian kinetics. We try to discover all the “forces” bearing on a situation; knowing them, we can control the outcome, provided we have enough force of our own. Whether in physics or politics, force plus information equals control.

Keep that formula in mind next time you read the news. F + I = C.

Actually, this strategy only works in certain limited circumstances. It fails miserably in non-linear systems, in which effects feed back into causes. Tiny errors lead to huge uncertainties and radically unpredictable results. The situation spins out of control. Trapped in a Newtonian mindset, we are helpless to respond except with more control. The current Bush administration is a good example of this, desperately lying, hiding, and manipulating information to control the effects of earlier lies and manipulations. I also think of an addict trying to manage a life that is falling apart, or an adulterous spouse trying to hide the multiplying evidence of his infidelity. Newtonian-based logic says that this should work. It is simply a matter of being thorough, of finding all the possible causes of a breakdown. With enough information and enough force, we should be able to manage any situation.
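
A small sketch of why the strategy fails in non-linear systems (a standard chaos-theory example, the logistic map, offered here as an illustration rather than anything from the original text): two initial conditions differing by one part in a million stay close for a while, then diverge until any prediction based on the first is worthless for the second.

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a simple nonlinear system with feedback."""
    return r * x * (1 - x)

a, b = 0.200000, 0.200001   # two almost-identical initial conditions
for step in range(1, 41):
    a, b = logistic(a), logistic(b)
    if step % 10 == 0:
        print(f"step {step:2d}: {a:.6f} vs {b:.6f} (difference {abs(a - b):.6f})")
```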

If only I could “figure it out”, then maybe my life too would become manageable. If only I could apply enough force, enough willpower, maybe I could conform it to the image of my desire. Whether collectively or personally, what a fraudulent promise that has proven to be!

Another way to look at it is that trying harder will never work. Yet that is our typical response to failure, both on the individual and collective level. When we fail to fulfill our New Year’s resolutions, what do we conclude? We didn’t try hard enough, we didn’t apply enough willpower. Similarly, we act as though more technology, more laws, more vigilance will succeed where previous technology, laws, and vigilance failed. But greater effort from our present state of being only serves to reinforce that state of being. Using more force promotes the mentality of using force. Methods born of separation exacerbate separation. Does anyone today remember that World War One was fervently believed to be the “war to end all wars”? Despite that stupendous failure, equivalent logic lives on in the current “war on terror”.

Thanks to science, we have more information about the world than ever before. Thanks to technology, our ability to apply force is likewise unprecedented. Yet despite centuries of progress in the technologies of control, we have made little overall progress in improving ecological, social, and political conditions. To the contrary: our planet veers toward disaster. What is the source of this failure? We have attempted to apply the linear strategy of F + I = C beyond its proper domain. Faced with a complex problem, the engineer or the manager breaks it down into parts, each a discrete module performing one of a series of specialized functions. Organic interdependency and feedback are fatal to such systems of management and control. The performance of any part of an organic non-linear system cannot be understood or predicted in isolation from the rest, but only in relationship to the rest. Such parts are no longer freely interchangeable, and the methodologies of reductionism are impotent.

The solution for several centuries has been to attempt to make linear what is actually not. Sometimes we do so quite benignly: in mathematics, for example, we use numerical methods to approximate solutions to nonlinear differential equations, which are generally unsolvable analytically. Much more sinister is the attempt to impose linearity on human institutions and human beings, which necessitates the destruction of all that is organic, traditional, and autonomous. All is to be rationally planned out, and each person made into a discrete standardized element of a vast machine. In medicine the consequences of reducing the organic to the linear are equally horrific. These are but two facets of the totalizing “World Under Control” I describe in Chapter Five.
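
For the benign mathematical case just mentioned, here is a minimal sketch (my own illustration, assuming the simplest such method, forward Euler) of how a nonlinear differential equation is tamed by treating it as approximately linear over each tiny time step.

```python
def euler(f, x0, t_end, dt=0.01):
    """Approximate dx/dt = f(x) by stepping forward in small linear increments."""
    x, t = x0, 0.0
    while t < t_end:
        x += f(x) * dt   # local linear approximation over one step
        t += dt
    return x

# Nonlinear logistic growth dx/dt = x * (1 - x); the solution tends toward 1.
print(euler(lambda x: x * (1 - x), x0=0.1, t_end=10.0))
```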

I am not saying that reductionistic science, and the technology based upon it, is powerless. On the contrary, it has generated the entire infrastructure of our society, from bridges and skyscrapers to batteries and microchips, using standard components and according to generalized principles. The reduction of the infinitely diverse natural world into a finite set of standard inputs and processes actually works—for applications in which the inevitable remaining differences don’t matter. For these practical purposes, each electron, each iron atom, each drop of water, each block of granite, each “qualified human resource” is the same. For these practical purposes, yes, but not for all practical purposes—that is the delusion into which our success has led us. Reductionistic science and rationality itself, both based on the abstraction of regularities in nature, have allowed us to build very high indeed, an edifice reaching farther than ever before toward the heavens. Yet paradoxically, we are no closer to heaven than when we started.

We have achieved mastery of the linear domain, and attempted to expand that domain to cover the universe. Most real-world systems, however, including living organisms, are hopelessly non-linear. From this realization will arise a new approach to engineering and to problem-solving in general that does not start by breaking the problem into pieces. Our whole concept of “design” will evolve as well, away from hierarchical, modular approaches toward those based on self-organization and emergence. In the process, a certain degree of certainty and control will be lost. Our relationship to nature, to each other, and to the universe will be radically transformed. This shift can only happen as part of a profound transformation of our sense of self, of who we are in relation to the universe. We will have to let go of control and face the fear behind that compulsion. Lewis Mumford identified it this way: “Today this almost pathological fear of what cannot be directly examined and brought under control—external, preferably mechanical, electronic, or chemical control—survives as a scientific equivalent of a much older atavism, fear of the dark”.[13] Letting go of control means, then, that the Age of Fire, in which the circle of domesticity defined the human realm as the illuminated, is coming to an end. We will become once more at home in the dark: at home in mystery, in uncertainty, in unreasonableness. Or at least we will no longer fear to venture there.



 

[11] Actually, except in simple linear systems, as a practical matter neither predictability nor control follows from determinism, but this was not widely understood until the advent of chaos theory in the last half-century.

[12] This passage created a very strong impression on me when I first read it in a college philosophy class. It is not insignificant that disputes can and most certainly do arise between accountants! Propositions can be proven within a formal system, but the correspondence of that system to reality cannot be proven mathematically but only argued for empirically.

[13] Mumford, p. 72

Creative Commons Non-Commercial Copyright 2008 Charles Eisenstein