Friday, March 21, 2014

An Open Question: the Multiverse

The multiverse (or meta-universe) is the hypothetical set of possible universes, finite or infinite in number (including the universe we consistently experience), that together comprise everything that exists and can exist: the entirety of space, time, matter, and energy, as well as the physical laws and constants that describe them. The various universes within the multiverse are sometimes called parallel universes.

The structure of the multiverse, the nature of each universe within it, and the relationships between the various constituent universes depend on the specific multiverse hypothesis considered. Multiple universes have been hypothesized in cosmology, physics, astronomy, religion, philosophy, transpersonal psychology, and fiction, particularly in science fiction and fantasy. In these contexts, parallel universes are also called "alternative universes", "quantum universes", "interpenetrating dimensions", "parallel dimensions", "parallel worlds", "alternative realities", "alternative timelines", and "dimensional planes", among others. The term "multiverse" was coined in 1895 by the American philosopher and psychologist William James, in a different context.

The multiverse hypothesis is a source of disagreement within the physics community. Physicists disagree about whether the multiverse exists and whether it is a proper subject of scientific inquiry. Supporters of one or another multiverse hypothesis include Stephen Hawking, Steven Weinberg, Brian Greene, and Max Tegmark. In contrast, critics such as David Gross, Paul Steinhardt, and Paul Davies have argued that the multiverse question is philosophical rather than scientific, or even that the multiverse hypothesis is harmful or pseudoscientific.

Occam’s Razor
Proponents and critics disagree about how to apply Occam's razor. Critics argue that postulating a practically infinite number of unobservable universes just to explain our own runs contrary to Occam's razor. Proponents counter that, in terms of Kolmogorov complexity, the proposed multiverse is simpler than a single idiosyncratic universe: the rule that generates every possible universe can have a shorter description than the data needed to single out one of them, as the sketch below illustrates.
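
To make the proponents' point concrete, here is a minimal Python sketch (an illustration of the idea, not something from the quoted article): a generator that enumerates every finite binary string, standing in for "every possible universe description", fits in a few lines, whereas a program that outputs one particular irregular string must carry that string's full description with it.

    from itertools import count, islice, product

    def all_binary_strings():
        """Yield every finite binary string in order of length: a very
        short program that 'describes' the whole ensemble."""
        for n in count(1):
            for bits in product("01", repeat=n):
                yield "".join(bits)

    print(list(islice(all_binary_strings(), 6)))  # ['0', '1', '00', '01', '10', '11']

    # By contrast, a program whose only job is to print one particular
    # irregular string must embed (a description of) that string, so its
    # length grows with the string it describes:
    SPECIFIC = "01101001100101101001011001101001"  # imagine megabytes of such detail
    print(SPECIFIC)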

http://en.wikipedia.org/wiki/Multiverse

= = = = = = = = = = = = = = = = = = = = = = = = = = = = = =

Kolmogorov Complexity
In algorithmic information theory (a subfield of computer science and mathematics), the Kolmogorov complexity (also known as descriptive complexity, Kolmogorov–Chaitin complexity, algorithmic entropy, or program-size complexity) of an object, such as a piece of text, is a measure of the computational resources needed to specify the object. It is named after Andrey Kolmogorov, who first published on the subject in 1963.
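
Kolmogorov complexity itself is uncomputable, but an off-the-shelf compressor gives a rough upper-bound proxy for it. The Python sketch below (an illustration under that caveat, not an exact computation of K) compares a highly regular megabyte with a pseudo-random one of the same length:

    import random
    import zlib

    # A regular megabyte: its short description is essentially
    # "repeat '01' five hundred thousand times".
    regular = b"01" * 500_000

    # A pseudo-random megabyte: no pattern for the compressor to exploit.
    random.seed(0)
    irregular = bytes(random.getrandbits(8) for _ in range(1_000_000))

    print(len(zlib.compress(regular)))    # a few kilobytes at most
    print(len(zlib.compress(irregular)))  # close to 1,000,000 bytes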

History and Context
Algorithmic information theory is the area of computer science that studies Kolmogorov complexity and other complexity measures on strings (or other data structures).

The concept and theory of Kolmogorov complexity are based on a crucial theorem first discovered by Ray Solomonoff, who published it in 1960, describing it in "A Preliminary Report on a General Theory of Inductive Inference" as part of his invention of algorithmic probability. He gave a more complete description in his 1964 publications, "A Formal Theory of Inductive Inference," Parts 1 and 2, in Information and Control.

Andrey Kolmogorov later independently published this theorem in Problems of Information Transmission. Gregory Chaitin also presented it in the Journal of the ACM; Chaitin's paper, submitted in October 1966 and revised in December 1968, cites both Solomonoff's and Kolmogorov's papers.

The theorem says that, among algorithms that decode strings from their descriptions (codes), there exists an optimal one. For every string, this algorithm allows codes as short as those allowed by any other algorithm, up to an additive constant that depends on the two algorithms but not on the strings themselves. Solomonoff used this algorithm and the code lengths it allows to define a "universal probability" of a string, on which inductive inference of the subsequent digits of the string can be based. Kolmogorov used this theorem to define several functions of strings, including complexity, randomness, and information.
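
In symbols (a standard textbook formulation of this invariance theorem, supplied for clarity rather than quoted from the article): writing K_A(x) for the length of the shortest code for the string x under decoding algorithm A, the optimal algorithm U satisfies

    K_U(x) \le K_A(x) + c_A \qquad \text{for every algorithm } A \text{ and every string } x,

where the constant c_A reflects, in effect, the cost of describing A itself and does not depend on x.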

When Kolmogorov became aware of Solomonoff's work, he acknowledged Solomonoff's priority. For several years, Solomonoff's work was better known in the Soviet Union than in the Western World. The general consensus in the scientific community, however, was to associate this type of complexity with Kolmogorov, who was concerned with randomness of a sequence, while Algorithmic Probability became associated with Solomonoff, who focused on prediction using his invention of the universal prior probability distribution. The broader area encompassing descriptional complexity and probability is often called Kolmogorov complexity. The computer scientist Ming Li considers this an example of the Matthew effect: "... to everyone who has more will be given ..."

There are several other variants of Kolmogorov complexity or algorithmic information. The most widely used one is based on self-delimiting programs (see the sketch below) and is mainly due to Leonid Levin (1974).
An axiomatic approach to Kolmogorov complexity based on Blum axioms (Blum 1967) was introduced by Mark Burgin in a paper presented for publication by Andrey Kolmogorov (Burgin 1982).
http://en.wikipedia.org/wiki/Kolmogorov_complexity
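
As an aside on "self-delimiting": the defining property is that no valid program is a proper prefix of another, so a decoder always knows where one program ends without an external end marker. Below is a toy Python illustration using a deliberately crude unary length header (my own choice for the sketch, not Levin's actual construction):

    def encode(program: bytes) -> bytes:
        """Prefix the program with its length in unary ('1' * n + '0'),
        so that no encoding is a proper prefix of another."""
        return b"1" * len(program) + b"0" + program

    def decode(stream: bytes) -> bytes:
        """Read the unary header, then exactly that many bytes; any
        trailing data is left untouched."""
        n = stream.index(b"0")           # header ends at the first '0'
        return stream[n + 1 : n + 1 + n]

    # Codes can be concatenated and still parsed unambiguously:
    assert decode(encode(b"print 42") + b"anything after") == b"print 42"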
