- BQO - https://www.bigquestionsonline.com -

Is Information the Basis of Reality?

No, but it may be an important component. Imagine trying to spend a week without any information. No books or broadcasts, no entertainment or news. No communication with another person, whether written or spoken or visual or just touching. It would be a diminished kind of existence. That is why extended solitary confinement is such a terrible punishment. Information plays a vital role in everyday human life. But to what extent is information the basis of reality?

The significance of a communication cannot be measured by the amount of information contained in it. It takes the same number of words to say “Thank you for tea” as to say “Will you marry me?”, but they may have vastly different consequences. On 18 April 1775 Robert Newman and his friend Captain John Pulling climbed the steeple of the North Church in Boston, Massachusetts, with two lanterns. This was a one-bit message, encoded as “one if by land and two if by sea”. The lanterns were displayed for less than a minute, but this was sufficient to inform the patriots in Charlestown about the movements of the British army. A rider was dispatched to Lexington, though he never arrived and his name is lost. The message was spread by others, as immortalized in Longfellow’s poem Paul Revere’s Ride. A single bit of information had decisive significance.

In the industrial revolution, concepts of energy were crucial for building and optimising engines. A Big Question at the time might have been, “Is energy the basis of reality?” Alongside the understanding of energy there grew an explanation of the behaviour of gases in terms of the motion of molecules. In principle, if you knew at some instant the position and momentum of each molecule, then you would be able to predict the behaviour of the gas ever after. But the number of molecules is very large (about 6 × 10²³ in every 22.4 litres at standard temperature and pressure). Even with advanced computers it would be impractical to keep track of all of them. Instead scientists used statistical concepts to describe the behaviour of gases. The success of these methods convinced skeptics of the reality of molecules long before they could be seen by modern methods of microscopy. The thermodynamic concept of entropy in statistical mechanics is closely related to information.
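
A back-of-envelope sketch (my own illustration, not from the essay) shows why tracking every molecule is impractical: merely storing the positions and momenta of the molecules in 22.4 litres of gas, as 8-byte floating-point numbers, would require more memory than any conceivable computer.

```python
import math

N_A = 6.022e23           # molecules in 22.4 litres of gas at STP
floats_per_molecule = 6  # x, y, z components of position and momentum
bytes_per_float = 8

# Memory needed just to record one snapshot of the gas
bytes_needed = N_A * floats_per_molecule * bytes_per_float
print(f"{bytes_needed:.1e} bytes")  # ~2.9e25 bytes
```

That is roughly ten trillion times the storage of the entire internet, which is why statistical descriptions were, and remain, the only practical option.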

If in the nineteenth century technological progress was driven by mechanical engines, then in the twentieth century it was driven by communication and computation. In 1948 Claude Shannon published A Mathematical Theory of Communication. He formulated a theorem of source coding, which states how many bits are needed to specify an uncertain event, and of channel coding, which shows how reliable communication is possible even in the presence of random noise. This description does not depend on the significance of the message. But it gives mathematical rigor to communication, and is used by every electrical engineer designing systems for information technology. Because of the similarity of the equations to those of statistical mechanics, terms from thermodynamics such as entropy have entered the vocabulary of information engineering.
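
Shannon's source-coding idea can be made concrete in a few lines. The entropy of a source, in bits per symbol, is the lower bound on the average number of bits needed to encode its output; the probabilities below are illustrative, not from the essay.

```python
import math

def entropy(probs):
    """Shannon entropy in bits: H = -sum(p_i * log2(p_i))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly one bit per toss...
print(entropy([0.5, 0.5]))   # 1.0
# ...while a biased coin carries less, so its output is compressible.
print(entropy([0.9, 0.1]))   # ~0.469
```

Note that the entropy depends only on the probabilities, not on what the symbols mean — the mathematics is indifferent to whether the message is “Thank you for tea” or “Will you marry me?”.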

Meanwhile biologists have been discovering the value of information as a concept. Information may become even more important than evolution in some rapidly developing areas of research in the life sciences. Consider DNA: when Crick and Watson determined its structure they announced in the local pub that they had found the secret of life, though James Watson later reflected that this may have been too strong a claim. Their work revealed the mechanism whereby living organisms encode and transmit genetic information. No biologist would assert that DNA provides a self-contained set of instructions; the growth of living systems is much more subtle than that. Epigenetics studies the crucial interplay between the encoded information and the environment, and how information management plays a central role in generating biological organisation.

Classical information comes in bits; this is how the computer on which you may be reading this essay works. This year, 2012, marks the centenary of the birth of Alan Turing, who formulated the concept of a universal computing machine. In 1985 David Deutsch published an analysis of the Turing machine in what became a manifesto for quantum computing. Quantum computers will be fundamentally different from classical computers. Whereas classical bits are each either a 0 or a 1, superposition allows quantum bits to be both 0 and 1 at the same time, making possible computational operations which would be impossible in CMOS logic circuits. Entanglement extends this to allow two quantum bits to be correlated beyond anything that can be explained in terms of classical intuition, giving quantum computing exponential potential for certain applications. Remarkable progress is being made in laboratory-scale quantum computers. Many of the key components have been demonstrated in optical circuits, ions trapped in a vacuum, superconducting circuits, and in electron and nuclear spins in gallium arsenide, silicon, molecules, and diamond. It will be exciting to see which of these, and in what combination, will eventually form the heart of industrial production of quantum computers.
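
Superposition can be illustrated with a toy calculation (the representation and names here are my own sketch, not from the essay): a qubit is a pair of complex amplitudes whose squared magnitudes give the probabilities of measuring 0 or 1, and the Hadamard gate turns a definite 0 into an equal superposition.

```python
import math

zero = (1.0, 0.0)   # a qubit definitely in state |0>, like a classical bit

def hadamard(state):
    """Hadamard gate: sends |0> to (|0> + |1>)/sqrt(2), a superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

plus = hadamard(zero)
probs = [abs(amp) ** 2 for amp in plus]
print([round(p, 3) for p in probs])  # [0.5, 0.5]: equally 0 and 1 until measured
```

A classical bit has no analogue of the intermediate state `plus`; it is this extra structure, compounded across many entangled qubits, that gives quantum computing its potential.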

Alongside the remarkable development of quantum technologies, quantum information has raised fresh questions about the nature of reality. In the nineteenth century, science seemed to describe a reality which is objectively there regardless of any observations. Quantum mechanics brought all this into question; since the early days it has been an open question whether quantum states are states of fact or states of knowledge. Albert Einstein wanted science to describe what is actually the case, and was frustrated that quantum mechanics seemed to be incomplete. Niels Bohr argued that science was restricted to what we could say about an experiment and its probable outcome. In 1985, the same year that David Deutsch published his seminal paper about the universal quantum computer, Tony Leggett and Anupam Garg published a test which would allow an experimenter to determine whether an experiment was consistent with a classical macrorealism which they were careful to define. Part of their motivation was to address the apparent inconsistency between the quantum description of atoms and the classical experience of our human-sized everyday lives. More than twenty-five years later, experiments have been performed which establish a methodology for implementing their test, and which experimentally rule out a certain kind of realism at the atomic scale. Related experiments have shown cases where measurements must have disturbed the quantum state even though their effect cannot be detected. Some realist interpretations of quantum states could never be ruled out by any experimental results consistent with quantum theory, but there is growing evidence that wholly statistical interpretations are not tenable.

Rapid progress is being made in explaining how information is processed by the brain, and this may contribute to a deeper understanding of the relationship between myself and my body. The information that is integral to my experience and my memories contributes crucially to being me. One of the biggest questions anyone can ask is: what happens after I die? Answers range from annihilation to reincarnation or some nirvana-like state. For those who have confidence in the evidence of the resurrection, the concept of information may offer insights into how the self endures. It will not be in the same body, which decays following death, but neither will it be disembodied. In some way information may give continuity to the reality of a person after death.

Within the foreseeable future information may become as fundamental a concept in the natural sciences as, say, energy, though it is too early to say what form this generalisation might take. There is no law of conservation of information comparable to the conservation of energy, which is why intelligent design arguments based on information do not work, and there is no equation relating information to mass. But information as a basic category of scientific thought seems to be ripe for development, and as that happens new insight may be gained into how it is a component of the nature of reality. In so far as reality has to be understood relationally, the flow of information may play a crucial role. But we need to learn much more about the nature of information and its physical instantiation before we can say how it contributes to the basis of reality.

Most of this essay has been about information. Reality is fairly widely understood as the way the world is, irrespective of what we might think it to be. Leaving aside for a moment the issues raised by quantum theory, this leaves open the question: what is at the core of reality?

Is that a question for science alone, or are other modes of enquiry also necessary?

Is it a matter for individual choice, or should all rational people agree?

Richard Feynman once wrote that if he could pass on only one sentence to a future generation, it would be that all things are made of atoms. If we wished to pass on the greatest insight about reality in the fewest words, what would it be?

Discussion Summary

Here are two postulates about reality and information.

1.     There is an objective reality, to be equated with what is so. Whether a scientific theory is valid depends on whether it accurately corresponds to reality.

2.     Information is factive. It must be true in order to qualify as information. Although the oxymoron “false information” is comprehensible in the sense of deliberate disinformation or unintentional misinformation, to misinform is not to inform.

In the discussion three areas of the natural sciences and engineering were identified in which the concept of information proves powerful.

In information engineering we come closest to being able to equate information with a clearly fundamental quantity like energy. It is well established that the energy which is required to reset a bit in a memory, or which is dissipated when a bit of information is lost (as in a Boolean NAND gate of a classical computer), cannot be less than kT ln2. This sets a fundamental limit to how energy efficient a classical computer can be, rather as the second law of thermodynamics sets a limit to the efficiency of a gas turbine engine depending on the temperature that the materials of the turbine blades can withstand. This is a rather encouraging observation technologically, because by this criterion current computers are far less efficient than they could be, and there is plenty of scope for improvement.
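
The Landauer limit mentioned above is easy to evaluate. This sketch (my own, using standard constants and an assumed room temperature of 300 K) shows just how small the minimum erasure energy is compared with what present-day logic gates dissipate.

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, in joules per kelvin
T = 300              # assumed room temperature, in kelvin

# Minimum energy dissipated when one bit of information is lost
landauer_limit = k_B * T * math.log(2)
print(f"{landauer_limit:.2e} J per bit")  # ~2.87e-21 J
```

A switching operation in a current CMOS gate dissipates on the order of a femtojoule, several orders of magnitude above this bound — which is the "plenty of scope for improvement" the paragraph refers to.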

In the life sciences the question of how life began can be cast in informational terms. Few eminent biologists think of the information in DNA as a set of digital CAD-CAM instructions for the construction of an organism, but there is increasing awareness of the central role that information management plays in generating biological organization. It may be that fresh insights into the origin of life will arise from understanding how biologically functional information arose from materials whose state is described rather in terms of mass and energy and chemical arrangement. This will require descriptions of information which extend beyond information engineering quantities such as entropy, and include contextuality and semantic content.

Almost since the beginning of quantum mechanics it has been asked whether quantum states are ontic or epistemic, whether they describe mind-independent physical states of the world or states of knowledge of observers, whether they describe what is the case or simply what you can say. Simple descriptions in terms of local ‘hidden variables’ were long ago ruled out by experiments that demonstrated entanglement using a criterion for correlations due to the Northern Irish physicist John Bell. Alongside the development of quantum information processing, over the past ten years or so there has been progress in considering quantum states in terms of statistical distributions of underlying physical states, i.e., in considering quantum states to be states of information. Experimental and theoretical advances are progressively ruling out certain kinds of classical realism, though they are also putting limits on the applicability of these kinds of statistical interpretations.
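
Bell's criterion, in its common CHSH form, can be sketched numerically (my own illustration, using the textbook prediction for the singlet state and the standard optimal angle settings): quantum mechanics predicts a correlation E(a, b) = −cos(a − b) between spin measurements at angles a and b, while any local hidden-variable model must satisfy |S| ≤ 2.

```python
import math

def E(a, b):
    """Quantum correlation for the singlet state at analyser angles a, b."""
    return -math.cos(a - b)

# Standard CHSH angle settings that maximise the violation
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))  # ~2.83, i.e. 2*sqrt(2), exceeding the classical bound of 2
```

Experiments find correlations matching the quantum prediction, which is how local hidden-variable descriptions were ruled out.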

The biggest questions left open by this debate are, I believe, too hard to answer now, and may require a great deal of meticulous scientific investigation combined with inspirational scientific genius. Like all good questions, they demand a concise definition of the terms used.

Two New Big Questions

1.     Will information become a fundamental concept of science, and if so what form might such a generalisation take?

2.     As progress is made in understanding information, will information be found to have causal power? If so, how should the informational causal narrative relate to the familiar underlying physical causal narrative?