Coherence States

Note: While we are still in the process of describing some of the key elements of the mathematics now used to model quantum systems, it might be a good time to step back from some of this mathematical abstraction. For it is important not to lose sight of some of the wider issues associated with the current mathematical approach, at least in terms of how it might ultimately constitute any meaningful physical description of reality.

In this context, we might introduce what is known as the ‘measurement problem’ of quantum mechanics, which questions how, or even whether, wave function collapse occurs. To date, there has been no direct experimental verification of the idea of wave function collapse, such that the reality of this event remains open to many different interpretations. In line with the mathematical description being discussed, the wave function is assumed to evolve according to the Schrödinger equation into a linear superposition of different quantum states. However, any attempt to determine the quantum state via an actual measurement always finds the system in a definite physical state, i.e. an eigenstate. Any further evolution of the system must then take place from this observed state, such that we are confronted with the idea that the process of measurement did ‘something’ to the system as a whole. Whatever this ‘something’ might be, it is not directly explained by the basic theory and is only subsequently addressed in a wide variety of interpretations, which, although eventually arriving at the same probabilistic conclusion required by quantum theory, do so based on many different assumptions. Therefore, as suggested, now might not be an unreasonable point to ask:
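
As a purely illustrative aside, the short sketch below mimics this sequence for a hypothetical two-state system: a normalised superposition exists only as a list of amplitudes, a ‘measurement’ returns a single definite eigenstate with the Born-rule probability |cn|², and any further evolution would proceed from that eigenstate alone. The amplitudes and the use of Python/NumPy are assumptions made for illustration, not part of the formal theory being discussed.

```python
import numpy as np

# Hypothetical two-state system: |psi> = c0|q0> + c1|q1>
# (the amplitudes below are illustrative assumptions, not measured values)
c = np.array([1.0, 1.0j]) / np.sqrt(2.0)     # normalised superposition

# Born rule: the probability of finding eigenstate |qn> is |cn|^2
probs = np.abs(c) ** 2                        # -> [0.5, 0.5]

# A single 'measurement' returns one definite eigenstate...
rng = np.random.default_rng(seed=0)
outcome = rng.choice(len(c), p=probs)

# ...and any further evolution proceeds from that eigenstate alone,
# i.e. the original superposition has been replaced by a definite state.
post_measurement_state = np.zeros_like(c)
post_measurement_state[outcome] = 1.0

print(outcome, post_measurement_state)
```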

Why does quantum theory not appear immediately applicable to macroscopic systems?

From a descriptive perspective, we might address this question in terms of an idea called decoherence. The core of this idea is that classical macroscopic properties depend on the decoherence of quantum properties, especially when these are initially described in terms of the interference sum of a coherent quantum system. However, in many respects, it is often easier not to think of classical properties ‘emerging’ due to decoherence, but rather the reverse, such that quantum properties ‘emerge’ as we approach the quantum scale, i.e. a scale small enough to exhibit coherence. Therefore, a quantum system might be defined as one that cannot be measured without disturbance. In this context, decoherence implies that the very act of trying to measure a quantum system destroys the coherence that existed in a given quantum state. Paul Dirac defined an object to be ‘big’ when the disturbance accompanying its observation may be neglected and, conversely, ‘small’ when the disturbance cannot be neglected. In practice, however, there is always a scale at which each and every attempt to minimize the disturbance fails. To quote Dirac:

"There is a limit to the fineness of our powers of observation and the smallness of the accompanying disturbance - a limit which is inherent in the nature of things and can never be surpassed by improved techniques or skill on the part of the observer."

Therefore, if a system is ‘small’ in the quantum sense, it cannot be observed without producing a disturbance that affects the causal and deterministic connection to any measurement. As such, there is an unavoidable indeterminacy associated with any measurement of the original quantum system caused by the interaction with the measurement system at the quantum level.


So, with reference to the double-slit experiment, we might label the probability-amplitude wave functions passing through the left and right slits as [ΨL] and [ΨR] respectively. When these wave functions can be described as coherent, i.e. undisturbed, they display the characteristic interference fringes on the target screen. As discussed, this interference effect will persist even if the intensity of wave-particles is so low that only one wave-particle is fired at a time. In an attempt to verify the effect of decoherence, an experiment was carried out with rubidium atoms as the source of the ‘matter waves’, but where the left slit could also be irradiated with microwaves that would excite the hyperfine structure of any atoms passing through that slit. As the microwave intensity was increased from zero, the interference fringes diminished in proportion to the number of microwave photons falling on the left slit. As such, it was argued that the original coherent quantum wave functions were disturbed and that the interference pattern disappeared as a consequence of this disturbance, not simply because information now existed regarding which slit the atoms passed through.
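
To make the connection between coherence and the visible fringes a little more concrete, the sketch below computes a far-field double-slit intensity under an assumed, arbitrary geometry. The decoherence factor gamma is a hypothetical parameter introduced purely for illustration; it is not a value taken from the rubidium experiment described above.

```python
import numpy as np

# Far-field double-slit sketch: the relative phase between the two paths
# grows linearly across the screen (assumed geometry, arbitrary units).
x = np.linspace(-10.0, 10.0, 1001)       # screen coordinate
phase = 2.0 * x                          # assumed phase difference ~ k*d*x/L

# Probability amplitudes for the left and right slits, [psi_L] and [psi_R].
psi_L = np.exp(1j * phase / 2) / np.sqrt(2)
psi_R = np.exp(-1j * phase / 2) / np.sqrt(2)

# Fully coherent case: the amplitudes add first and are then squared,
# which produces the familiar interference fringes.
I_coherent = np.abs(psi_L + psi_R) ** 2             # = 1 + cos(phase)

# Partial decoherence: the cross (interference) term is suppressed by a
# hypothetical factor gamma in [0, 1]; gamma = 0 leaves only the sum of
# the two single-slit intensities, i.e. no fringes at all.
gamma = 0.3
I_partial = (np.abs(psi_L) ** 2 + np.abs(psi_R) ** 2
             + gamma * 2 * np.real(psi_L * np.conj(psi_R)))

print(I_coherent.max(), I_partial.max())   # fringe contrast drops with gamma
```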

So can decoherence be offered up as an explanation of some of the apparent ambiguity surrounding the quantum description?

Well, first of all, some explanation of the ‘ambiguity’ in question might be required. What is meant by ambiguity is that the mathematics of quantum theory does not necessarily assume that the ‘objects’ being described have any existence in physical reality. As such, we might characterise quantum theory as starting at some known point, i.e. state |Ψ>, which then appears to evolve through the superposition of all possible intermediate states, i.e. |qn>, to arrive at some final state |Φ>, which may ultimately be verified by observation. However, the process underlying the transition amplitude <Φ|Ψ> is essentially a mathematical abstraction, which ‘may or may not’ bear any resemblance to any ‘physical’ process taking place in the quantum realm. This is clearly an ambiguity in terms of any classical description, such that we might wish to consider the issue of decoherence from two perspectives (a small numerical sketch of this abstraction follows the list below):

  • Classical Reality:
    From the perspective of modern physics, the most fundamental ideas about the ‘nature of things’ are now based on quantum mechanics, even though the everyday world appears more conformant to classical physics, i.e. there is no obvious existence of quantum superposition states or any associated wave function collapse in the description of classical reality.

  • Quantum Reality:
    In contrast, the description of quantum reality appears to be based primarily on a mathematical formalism contained within the remit of quantum theory. While this formalism provides a methodology for calculating the probabilities of a given outcome, it does not necessarily explain how the measured outcome physically comes about.
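
To illustrate the purely mathematical nature of these intermediate states, the following sketch, assuming a small, arbitrary complex vector space and NumPy, verifies that the direct amplitude <Φ|Ψ> equals the sum of the amplitudes <Φ|qn><qn|Ψ> taken over any orthonormal set of intermediate states |qn>; the decomposition carries no commitment as to which intermediate state, if any, is physically realised.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical initial and final states |psi>, |phi> in a small state space.
dim = 4
psi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
phi = rng.normal(size=dim) + 1j * rng.normal(size=dim)
psi /= np.linalg.norm(psi)
phi /= np.linalg.norm(phi)

# An arbitrary orthonormal set of intermediate states |qn> (assumed basis),
# obtained here from a QR decomposition of a random complex matrix.
Q, _ = np.linalg.qr(rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim)))
q = Q.T    # each row q[n] plays the role of one intermediate state |qn>

# Direct amplitude <phi|psi> ...
direct = np.vdot(phi, psi)

# ... equals the sum over all intermediate amplitudes <phi|qn><qn|psi>,
# whatever orthonormal set of |qn> is inserted: the intermediate states
# act as a mathematical decomposition, not an observed physical history.
via_intermediate = sum(np.vdot(phi, q[n]) * np.vdot(q[n], psi) for n in range(dim))

print(np.allclose(direct, via_intermediate))   # True
```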

At this point, it is easy to become lost, not only in the mathematics, but also in the philosophical semantics often used to describe the ‘problem space’ itself. So, first, let us try to clarify the semantics of two perspectives that appear in many philosophical discussions of this subject, i.e. ontology and epistemology. Ontology is normally defined as the study of what exists and the nature of what exists, while epistemology is usually described as the study of knowledge and the justification of that knowledge. Within the scope of the two types of reality introduced above, classical reality may be seen to align more closely with an ontological description, if physical reality is assumed to exist. In contrast, quantum reality might be seen to align better with an epistemological description because, being constrained to mathematical knowledge and logical justification, it makes no explicit reference to any form of physical reality, at least in terms of its model of the intermediate superposition states.

So, in this context, does the idea of decoherence provide any clarification?

At one level, the idea of decoherence appears to offer a rational explanation as to why a quantum system, evolving in abstract isolation, might be ‘disturbed’ by the interaction with some measurement system. However, at another level, there does not appear to be any real explanation of how the ‘physical’ process of measurement affects the mathematical concept of the superposition of quantum states. For example, while we might see the interference effect within the double-slit experiment disappear, as described above, it is unclear whether this disappearance is still connected with the abstract concept of wave function collapse. However, the real purpose of injecting this topic into what is still primarily a mathematical discussion is simply to highlight the level of abstraction on which the current mathematical ‘epistemology’ is based.

Note: At a speculative level, we might still perceive that any measurement of the wave structure passing through either slit could disrupt that structure so completely that the interference pattern with its original ‘twin’ passing through the other slit is effectively destroyed.