Modern perspectives
In biology
Although it was once thought by scientists that any indeterminism in quantum mechanics occurred at too small a scale to influence biological or neurological systems, there is evidence that nervous systems are indeterministic,[15] and it has been argued that "[classical] physical determinism is out: the future is not fully determined by the current facts".[16]
Cause and effect
Since the early twentieth century, when the astronomer Edwin Hubble first hypothesized that redshift shows the universe is expanding, prevailing scientific opinion has been that the current state of the universe is the result of a process described by the Big Bang. Many theists and deists claim that the universe therefore has a finite age, pointing out that something cannot come from nothing. The Big Bang does not describe where the compressed universe came from; it leaves the question open. Astrophysicists hold different views about precisely how the universe originated (cosmogony). The philosophical argument here is that the Big Bang triggered every subsequent action, and possibly every mental thought, through the system of cause and effect.
Generative processes
Some proponents of emergentist or generative philosophy, the cognitive sciences, and evolutionary psychology argue that free will does not exist.[17][18] They suggest instead that an illusion of free will is experienced because infinite behaviour is generated from the interaction of a finite, deterministic set of rules and parameters. The unpredictability of the behaviour emerging from deterministic processes thus leads to a perception of free will, even though free will as an ontological entity does not exist.[17][18]
As an illustration, the strategy board games chess and Go have rigorous rules in which no information (such as the face value of cards) is hidden from either player and no random events (such as the rolling of dice) occur within the game. Yet chess, and especially Go with its extremely simple deterministic rules, can still produce an extremely large number of unpredictable moves. By this analogy, it is suggested, the experience of free will emerges from the interaction of finite rules and deterministic parameters that generate nearly infinite and unpredictable behaviour; a minimal sketch of this idea appears below. If all these events were accounted for, however, and there were a known way to evaluate them, the seemingly unpredictable behaviour would become predictable.[17][18]
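The same point can be made with a system far smaller than Go. The following Python sketch (an illustration of my own, not drawn from the cited sources) runs Wolfram's elementary cellular automaton rule 30: a one-line, fully deterministic update rule whose output is nevertheless notoriously hard to predict without simply running it.

```python
# Rule 30: each new cell is a fixed boolean function of its neighbourhood.
# The rule is tiny and deterministic, yet the pattern it generates from a
# single live cell looks effectively random.
def rule30_step(cells):
    """Apply rule 30 to one row of 0/1 cells (periodic boundaries)."""
    n = len(cells)
    return [
        # new cell = left XOR (centre OR right)
        cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
        for i in range(n)
    ]

row = [0] * 31
row[15] = 1  # one live cell in the middle
for _ in range(16):
    print("".join("#" if c else "." for c in row))
    row = rule30_step(row)
```

Every run produces the identical pattern, yet no shortcut is known for predicting a distant cell without computing all the intermediate steps.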
In mathematical models
Many mathematical models of physical systems are deterministic. This is true of most models involving differential equations (notably, those describing rates of change over time). Mathematical models that are not deterministic because they involve randomness are called stochastic. Because of sensitive dependence on initial conditions, some deterministic models may appear to behave non-deterministically; in such cases, a deterministic interpretation of the model may not be useful, owing to numerical instability and the finite precision of measurement. Such considerations can motivate a stochastic model even though the underlying system is governed by deterministic equations.[19][20][21] A truly non-deterministic event is independent of time and observer, and is therefore called an intrinsically random event.
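Sensitive dependence is easy to demonstrate. The sketch below (my illustration, not from the cited references) iterates the logistic map, a one-parameter deterministic model that is chaotic at r = 4: two trajectories that begin 10^-10 apart soon differ by order one.

```python
# Logistic map x -> r*x*(1-x), chaotic for r = 4: a deterministic rule
# whose trajectories diverge exponentially from nearby starting points.
r = 4.0
x, y = 0.2, 0.2 + 1e-10   # two almost identical initial conditions

for step in range(1, 61):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")
```

By around step 40 the separation has saturated at order one: the model is fully deterministic, but any finite measurement error makes its long-run behaviour unpredictable in practice.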
Arguments
Compatibilism is the acceptance of both free will and determinism. The negation of determinism is called indeterminism.
Quantum mechanics and classical physics
Main article: Superdeterminism
Since the beginning of the 20th century, quantum mechanics has revealed previously concealed aspects of events. Newtonian physics, taken in isolation rather than as an approximation to quantum mechanics, depicts a universe in which objects move in perfectly determined ways. At the scale of ordinary human interaction, Newtonian mechanics makes predictions that agree with experience to within the accuracy of measurement: poorly designed and fabricated guns and ammunition scatter their shots rather widely around the center of a target, while better guns produce tighter patterns.
Absolute knowledge of the forces accelerating a bullet should produce absolutely reliable predictions of its path, or so it was thought. In practice, however, knowledge is never absolute, and the equations of Newtonian mechanics can exhibit sensitive dependence on initial conditions, meaning that small errors in knowledge of the initial conditions can result in arbitrarily large deviations from the predicted behavior.
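How quickly those deviations grow can be estimated with a back-of-the-envelope calculation (my own sketch, with an assumed growth rate, not a figure from the source): if errors grow exponentially, each enormous improvement in initial precision buys only a modest extension of the prediction horizon.

```python
import math

# Assume an initial error grows as error(t) = error(0) * exp(lam * t),
# with an assumed, purely illustrative Lyapunov exponent of 1 per second.
lam = 1.0
for e0 in (1e-6, 1e-12, 1e-18):
    t = math.log(1.0 / e0) / lam   # time until the error is of order one
    print(f"initial error {e0:.0e} -> predictions fail after ~{t:.0f} s")
```

Under these assumptions, each millionfold gain in precision adds only about 14 seconds of predictability.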
At atomic scales the paths of objects can only be predicted in a probabilistic way. The paths may not be exactly specified in a full quantum description of the particles; "path" is a classical concept which quantum particles do not exactly possess. The probability arises from the measurement of the perceived path of the particle. In some cases a quantum particle may trace an exact path, and the probability of finding the particle along that path is one. Quantum evolution is at least as predictable as classical motion, but it describes wave functions that cannot easily be expressed in ordinary language. In double-slit experiments, photons are fired singly through a double-slit apparatus at a distant screen. They do not arrive at a single point, nor do they arrive in a scattered pattern analogous to bullets fired by a fixed gun at a distant target; instead, the light arrives in varying concentrations at widely separated points, and the distribution of its collisions with the target can be calculated reliably. In that sense the behavior of light in this apparatus is deterministic, but there is no way to predict where in the resulting interference pattern an individual photon will make its contribution (see the Heisenberg uncertainty principle).
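This combination of a deterministic aggregate and unpredictable individual arrivals can be simulated directly. The Python sketch below is my illustration with assumed parameters (500 nm light, a 2.5 mm slit separation, a screen 1 m away, idealised point slits); it draws single "photon" positions at random from the fixed two-slit intensity pattern, so no individual hit is predictable, yet the histogram reliably reproduces the fringes.

```python
import math
import random

WAVELENGTH = 500e-9   # assumed: 500 nm light
SLIT_SEP   = 2.5e-3   # assumed: 2.5 mm between slits
DISTANCE   = 1.0      # assumed: 1 m from slits to screen

def intensity(x):
    """Relative two-slit intensity at screen position x (ideal point slits)."""
    phase = math.pi * SLIT_SEP * x / (WAVELENGTH * DISTANCE)
    return math.cos(phase) ** 2

def photon_arrival(width=1e-3):
    """Sample one arrival position by rejection sampling from intensity()."""
    while True:
        x = random.uniform(-width / 2, width / 2)
        if random.random() < intensity(x):
            return x

# Individually unpredictable hits...
hits = [photon_arrival() for _ in range(10000)]

# ...whose histogram is the predictable interference pattern.
bins = [0] * 20
for x in hits:
    bins[min(19, max(0, int((x + 5e-4) / 1e-3 * 20)))] += 1
for i, count in enumerate(bins):
    print(f"{(i + 0.5) * 0.05 - 0.5:+.3f} mm |{'#' * (count // 20)}")
```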
Some have argued[22] that, in addition to the conditions humans can observe and the laws we can deduce, there are hidden factors or "hidden variables" that determine absolutely in which order photons reach the detector screen. They argue that the course of the universe is absolutely determined, but that humans are screened from knowledge of the determining factors, so that things only appear to proceed in a merely probabilistic way. Although the matter is still subject to some dispute, quantum mechanics makes statistical predictions that would be violated if local hidden variables existed. A number of experiments have tested those predictions, and so far they do not appear to be violated, though many physicists believe better experiments are needed to settle the question conclusively (see Bell test experiments). It is possible, however, to augment quantum mechanics with non-local hidden variables to achieve a deterministic theory that agrees with experiment; an example is the Bohm interpretation of quantum mechanics.
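The statistical prediction at issue can be stated compactly through the CHSH form of Bell's inequality. The sketch below is my illustration using textbook quantities (the singlet-state correlation E(a, b) = -cos(a - b) and the standard measurement angles), not material from this article; it shows the quantum value |S| = 2√2 ≈ 2.83 exceeding the bound of 2 that any local hidden variable theory must obey.

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements at angles a, b (singlet state)."""
    return -math.cos(a - b)

# Measurement angles (radians) that maximise the quantum violation.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(f"quantum CHSH value |S| = {abs(S):.4f}")  # 2.8284 = 2*sqrt(2)
print("local hidden variable bound: |S| <= 2")
```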
On the macro scale it can matter very much whether a bullet arrives at a specific point at a specific time, and there are analogous quantum events that have macro-level as well as quantum-level consequences. It is easy to contrive situations in which the arrival of an electron at a screen at a certain point and time would trigger one event, while its arrival at another point would trigger an entirely different event (see Schrödinger's cat).
Even before the laws of quantum mechanics were developed to their present level, the phenomenon of radioactivity posed a challenge to determinism. A gram of uranium-238, a commonly occurring radioactive substance, contains some 2.5 × 10^21 atoms. By all tests known to science these atoms are identical and indistinguishable. Yet about 12,600 times a second, one of the atoms in that gram decays, giving off an alpha particle. This decay does not depend on any external stimulus, and no extant theory of physics predicts, from realistically obtainable knowledge, when any given atom will decay. The uranium found on Earth is thought to have been synthesized in a supernova explosion that occurred roughly 5 billion years ago. For determinism to hold, every uranium atom must contain some internal "clock" that specifies the exact time it will decay,[citation needed] and somehow the laws of physics must specify exactly how those clocks were set as each uranium atom formed during the supernova collapse.
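The quoted figures are straightforward to check. The following back-of-the-envelope Python calculation is my own, using the standard uranium-238 half-life of about 4.468 billion years (not a figure given in this article):

```python
import math

AVOGADRO   = 6.022e23                       # atoms per mole
MOLAR_MASS = 238.05                         # g/mol for uranium-238
HALF_LIFE  = 4.468e9 * 365.25 * 24 * 3600   # U-238 half-life in seconds

atoms = AVOGADRO / MOLAR_MASS               # atoms in one gram
decay_constant = math.log(2) / HALF_LIFE    # per-atom decay rate (1/s)
activity = atoms * decay_constant           # expected decays per second

print(f"atoms per gram   : {atoms:.2e}")    # ~2.5e21, matching the text
print(f"decays per second: {activity:.0f}") # ~12,400, close to 12,600
```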
Exposure to alpha radiation can cause cancer. For this to happen, at some point a specific alpha particle must alter some chemical reaction in a cell in a way that results in a mutation. Since molecules are in constant thermal motion, the exact timing of the radioactive decay that produced the fatal alpha particle matters. If probabilistically determined events do have an impact on macro events (such as when a person who could have been historically important dies in youth of a cancer caused by a random mutation), then the course of history is not predictable from the dawn of time.
The time-dependent Schrödinger equation gives the first time derivative of the quantum state; that is, it explicitly and uniquely predicts the development of the wave function with time.
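For reference, the equation in question, in its standard textbook form (not quoted from this article), is

$$ i\hbar \,\frac{\partial}{\partial t}\,\Psi(\mathbf{r},t) = \hat{H}\,\Psi(\mathbf{r},t), $$

where \Psi is the wave function and \hat{H} is the Hamiltonian operator. Given the wave function at one time, the equation fixes it at all later times.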
So if the wave function itself is reality (rather than a probability distribution over classical coordinates), quantum mechanics can be said to be deterministic. Since we have no practical way of knowing the exact magnitudes, and especially the phases, in a full quantum mechanical description of the causes of an observable event, this turns out to be philosophically similar to the "hidden variable" doctrine.[citation needed]
According to some,[citation needed] quantum mechanics is more strongly ordered than classical mechanics, because while classical mechanics is chaotic, quantum mechanics is not. For example, the classical problem of three bodies under a force such as gravity is not integrable, while the quantum mechanical three-body problem is tractable and integrable, using the Faddeev equations. This does not mean that quantum mechanics describes the world as more deterministic, unless one already considers the wave function to be the true reality. Even so, the probabilities do not disappear, because we cannot do anything without using classical descriptions; rather, they are assigned to the classical approximation instead of to the quantum reality.
Asserting that quantum mechanics is deterministic by treating the wave function itself as reality implies a single wave function for the entire universe, starting at the origin of the universe. Such a "wave function of everything" would carry the probabilities of not just the world we know, but of every other possible world that could have evolved. For example, large voids in the distribution of galaxies are believed by many cosmologists to have originated in quantum fluctuations during the Big Bang (see cosmic inflation and primordial fluctuations). If so, the "wave function of everything" would carry the possibility that the region where our Milky Way galaxy is located could have been a void, and the Earth could never have existed at all (see large-scale structure of the cosmos).
First cause
Intrinsic to the debate concerning determinism is the issue of first cause.
Deism, a philosophy articulated in the seventeenth century, holds that the universe has been deterministic since creation, but ascribes the creation to a metaphysical God or first cause outside the chain of determinism. God may have begun the process, Deism argues, but God has not influenced its progression. This perspective illustrates a puzzle underlying any conception of determinism[citation needed]:
Assume: All events have causes, and their causes are all prior events. There is no cycle of events such that an event (possibly indirectly) causes itself.
The picture this gives us is that event A_N is preceded by A_(N-1), which is preceded by A_(N-2), and so forth.[citation needed]
Under these assumptions, two possibilities seem clear, and both of them question the validity of the original assumptions:
(1) There is an event A_0 prior to which there was no other event that could serve as its cause.
(2) There is no event A_0 prior to which there was no other event; in other words, we are presented with an infinite series of causally related events, which is itself an event,[dubious - discuss] and yet there is no cause for this infinite series of events.

Under this analysis the original assumption must have something wrong with it. It can be fixed by admitting one exception: a creation event (either the creation of the original event or events, or the creation of the infinite series of events) that is not itself a caused event in the sense of "caused" used in formulating the original assumption. Some agency, which many systems of thought call God, creates space, time, and the entities found in the universe by means of some process that is analogous to causation but is not causation as we know it. This solution has led people to question whether there is any reason for there being only one divine quasi-causal act, and whether there have not been a number of events that occurred outside the ordinary sequence of events. Others[citation needed] argue that this simply redefines the question.
Another possibility is that the "last event" loops back to the "first event", creating an infinite loop. If the Big Bang is taken as the first event, then the end of the universe is the "last event", and in theory the end of the universe would be the cause of its beginning. The result is an infinite loop of time with no real beginning or end. This theory eliminates the need for a first cause, but it does not explain why there should be a loop in time.
Immanuel Kant carried this idea of Leibniz's forward in his notion of transcendental relations, and as a result it had profound effects on later philosophical attempts to sort these issues out. His most influential immediate successor, a strong critic whose ideas were nevertheless strongly influenced by Kant, was Edmund Husserl, the developer of the school of philosophy called phenomenology. But the central concern of that school was to elucidate not physics but the grounding of the information that physicists and others regard as empirical. In an indirect way, this train of investigation appears to have contributed much to the philosophy of science called logical positivism, and particularly to the thought of members of the Vienna Circle, all of whom have had much to say, at least indirectly, about ideas of determinism.