page 11: Quantization: the mathematical theory of communication
Table of contents
11.1: Communication
11.2: The Nyquist-Shannon theorem and the digitization of continuous signals
11.3: Shannon's prescription for overcoming noise
11.4: The application of Shannon's work
11.5: Entropy and information
11.6: Superposition and quantum representation of information
11.1: Communication
The standard model of quantum mechanics comes in two parts. The first deals with the evolution through time of undisturbed quantum systems. This process is essentially invisible, so our knowledge of it is speculative. The second describes the evolution of systems when they are disturbed by observation or measurement. Here quantum theory differs radically from classical theory. Classical physics assumes that it is possible to observe a natural system without changing it in any way. In contrast, quantum theory imagines that the act of observation changes the observed system. Both seeing and being seen require action. We will explore this second mode on page 14: Measurement: the interface between Hilbert and Minkowski spaces.
Here we leave quantum mechanics for a moment to talk about communication in the classical world, the everyday world of experience whose behaviour is controlled by the invisible world of quantum mechanics. Communication is the copying of information existing at one point in spacetime to another. This may or may not involve the deletion of the initial information. The unavoidable flow of classical Minkowski spacetime expressed by special relativity and illustrated by the light cone diagram tells us that messages emitted from a particular source can only be received in the forward light cone of that source. Conversely, a source can only receive messages from its past light cone. You can't talk to the past. You can't listen to the future. Minkowski space - Wikipedia, Light cone - Wikipedia
This structure is determined by the velocity of light, the maximum rate at which physical information can be transmitted through physical space. Quantum theory, kinetic but prior to the emergence of spacetime, can partly avoid this classical constraint through entanglement, but this process cannot transmit specific information. Physical Minkowski space is to be introduced on page 12: The quantum creation of Minkowski space. Quantum entanglement - Wikipedia
We assume that the structure of the Universe is maintained by communications between its various components. If the Universe is to be stable, we further assume that these communications must be at least to some degree error free. The mathematical theory of communication developed by Shannon establishes that in Minkowski space, quantization and error prevention are very closely related.
11.2: The Nyquist-Shannon theorem and the digitization of continuous signals
The Nyquist-Shannon theorem tells us that continuous modulated analogue signals of limited bandwidth, like sound waves, can be accurately represented by digital signals sampled at at least twice the highest frequency present in the analogue signal. Since the upper limit of human hearing is about 20 000 Hz, digital audio is commonly sampled at 44.1 kHz. Nyquist-Shannon sampling theorem - Wikipedia, Digital audio - Wikipedia
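As a rough illustration (a sketch only, using NumPy; the 1 kHz test tone, the 44.1 kHz rate and the finite time window are choices made for this example), the Whittaker-Shannon interpolation formula recovers a band-limited signal from samples taken above the Nyquist rate:

```python
import numpy as np

fs = 44_100          # sampling rate (Hz), as used for CD audio
f = 1_000            # test tone frequency (Hz), well below fs / 2
duration = 0.01      # seconds of signal to sample

# Sample the band-limited signal x(t) = sin(2*pi*f*t) at rate fs.
n = np.arange(int(duration * fs))
samples = np.sin(2 * np.pi * f * n / fs)

# Reconstruct x(t) at arbitrary times with the Whittaker-Shannon formula:
# x(t) = sum_n x[n] * sinc(fs * t - n)
t = np.linspace(0.001, 0.009, 50)        # stay away from the ends of the window
reconstruction = np.sum(samples * np.sinc(fs * t[:, None] - n), axis=1)

exact = np.sin(2 * np.pi * f * t)
print(np.max(np.abs(reconstruction - exact)))   # small; the residue comes only
                                                # from truncating the infinite sum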
In practical engineering terms, digitization enables every message and every signal to be identified as a point in a rational function space. Shannon provides an example of the size of the space required by estimating that one hour of a 5 MHz television signal would be represented by a point in a space of 3.6 × 10¹⁰ dimensions. Function space - Wikipedia
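The count follows from the sampling theorem: a signal of bandwidth W observed for T seconds is determined by about 2WT samples, so it is a point in a space of about 2WT dimensions. A one-line check of Shannon's television example (the figures are Shannon's):

```python
# Dimension of the signal space: about 2*W*T samples specify the whole signal.
W = 5e6                      # bandwidth of the television signal in Hz
T = 3600                     # one hour in seconds
print(f"{2 * W * T:.1e}")    # 3.6e+10 dimensions
```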
11.3: Shannon's prescription for overcoming noise
In communication terms, the message to be sent is a point in message space and the signal transmitted is a point in signal space. The role of the transmitter is to encode or map the message into the signal. The receiver does the opposite. The computational procedure that does this work is called a coder-decoder, or codec. The error control is embodied in this mapping. Codec - Wikipedia
The key to distinguishing signals reliably is:
. . . two signals can be reliably distinguished if they differ by only a small amount, provided this difference is sustained over a long period of time. Each sample of the received signal then gives a small amount of statistical information concerning the transmitted signal; in combination, these statistical indications result in near certainty.
The technique is to package or quantize the message and the signal to produce units extended in time which are clearly distinguishable:
The transmitter will take long sequences of binary digits and represent this entire sequence by a particular signal function of long duration. The delay is required because the transmitter must wait for the full sequence before the signal is determined. Similarly, the receiver must wait for the full signal function before decoding into binary digits. Claude Shannon (1949): Communication in the Presence of Noise
This idea is implemented by Shannon's Theorem 2: Let P be the average transmitter power, and suppose the noise is white thermal noise of power N in the band W. By sufficiently complicated encoding systems it is possible to transmit binary digits at a rate:
C = W log2 [ (P + N) / N ]
with as small a frequency of errors as desired. It is not possible by any encoding method to send at a higher rate and have an arbitrarily low frequency of errors.
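Shannon's capacity formula is easy to evaluate. The sketch below uses illustrative figures (a 3 kHz bandwidth and a 30 dB signal-to-noise ratio, not Shannon's own numbers) to show the kind of rate the theorem permits:

```python
import math

def channel_capacity(bandwidth_hz: float, signal_power: float, noise_power: float) -> float:
    """Shannon's Theorem 2: C = W * log2((P + N) / N), in bits per second."""
    return bandwidth_hz * math.log2((signal_power + noise_power) / noise_power)

# Example: a 3 kHz telephone-like channel with a signal-to-noise ratio of 1000 (30 dB).
print(channel_capacity(3000, signal_power=1000, noise_power=1))   # ~29.9 kbit/s
```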
Shannon goes on to prove this theorem using geometric methods in a real function space. He concludes with a summary of the properties of a system that transmits without error at the limiting rate C, called an ideal system. Some features of an ideal system are embodied in quantum mechanics, particularly quantization.
1. To avoid error there must be no overlap between signals representing different messages. They must, in other words, be orthogonal, as are the basis vectors of a Hilbert space.
2. The basis signals or letters of the source alphabet may be chosen at random in the signal space, provided only that they are orthogonal. The same message may be encoded into any satisfactory basis, provided that the transformations (the codec) used by the transmitter to encode the message into the signal and by the receiver to decode the signal back into the message are inverses of one another (a minimal sketch of such a lossless codec appears after this list). Quantum processes are reversible in time in the sense that the unitary evolution of an isolated quantum system acts as though it is processed by a lossless codec. Unitary operator - Wikipedia
3. The signals transmitted by an ideal system have maximum entropy and so are indistinguishable from random noise. The fact that a set of physical observations looks like a random sequence is therefore not evidence of meaninglessness. Until the algorithms used to encode and decode such a sequence are known, little can be said about its significance. Many codecs are designed to hide the contents of messages from parties that do not possess the decoding algorithm. A widely used derivative of this approach is public key cryptography. Tamper-evident quantum methods are also used for key distribution. Public Key Cryptography - Wikipedia, Quantum cryptography - Wikipedia
4. Only in the simplest cases are the mappings used to encode and decode messages linear and topological. For practical purposes, however, they must all be computable with available machines. How this applies in quantum theory is closely related to the measurement problem and the so-called collapse of the wave function (see page 14: Measurement: the interface between Hilbert and Minkowski spaces).
5. As a system approaches the ideal, the length of the transmitted packets, the delay at the transmitter while it takes in a chunk of message for encoding, and the corresponding delay at the receiver while the message is decoded, increase indefinitely.
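The following sketch illustrates points 1 and 2 in miniature: a randomly chosen orthonormal basis serves as the signal alphabet, encoding is multiplication by the basis matrix, and decoding by its inverse (here the transpose) recovers the message exactly. The eight-dimensional space and the NumPy machinery are choices made for this illustration, not Shannon's construction:

```python
import numpy as np

rng = np.random.default_rng(0)

# Choose a random orthonormal basis for the signal space (QR of a random matrix).
# Its columns play the role of the "letters" of the signal alphabet.
dim = 8
basis, _ = np.linalg.qr(rng.normal(size=(dim, dim)))

def encode(message: np.ndarray) -> np.ndarray:
    """Map a message vector to a signal: a superposition of the basis signals."""
    return basis @ message

def decode(signal: np.ndarray) -> np.ndarray:
    """Invert the encoding; for an orthonormal basis the inverse is the transpose."""
    return basis.T @ signal

message = rng.integers(0, 2, size=dim).astype(float)   # a block of binary digits
assert np.allclose(decode(encode(message)), message)    # the codec is lossless
```

Because the encoding matrix is orthogonal, it preserves inner products, which is the same property that makes unitary quantum evolution behave like a lossless codec.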
11.4: The application of Shannon's work
The difficulty with Shannon's theory lies in the phrase 'by sufficiently complicated encoding systems'. In the seventy years since Shannon wrote, a large number of ingenious codecs have been developed, and they have established the internet as an all-purpose transporter of all forms of information, open and secret. The advent of powerful digital computers capable of executing non-linear and non-topological algorithms has taken care of the 'sufficiently complicated' aspect of the problem.
In the error free world of formal mathematics, noise is not a problem. In the physical world, however, things are not so clear cut.
11.5: Entropy and information
In his Mathematical Theory of Communication, Shannon framed the communication problem in terms of entropy, a concept derived from thermodynamics. Claude E Shannon (1948): A Mathematical Theory of Communication, Entropy in thermodynamics and information theory - Wikipedia
Shannon's paper begins with a definition of the entropy of a communication source. The entropy of a source A capable of emitting n different symbols or letters ai with probabilities pi such that Σi pi = 1 is
H = −Σi pi log2 pi.
Alexandr Y. Khinchin (1957): Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman)
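A minimal sketch of the entropy calculation (the example distributions are arbitrary):

```python
import math

def entropy(probabilities) -> float:
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits per symbol."""
    assert abs(sum(probabilities) - 1.0) < 1e-9     # a complete system of events
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy([0.5, 0.5]))            # 1.0 bit: a fair coin
print(entropy([0.25] * 4))            # 2.0 bits: four equally likely letters
print(entropy([0.9, 0.05, 0.05]))     # ~0.57 bits: a highly biased source
```

The uniform distribution gives the largest entropy for a given alphabet, which is why the signals of an ideal system described above look like random noise.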
Quantum systems described by algorithms such as the Schrödinger equation are constrained to evolve through time in a unitary and reversible manner. The Schrödinger equation defines an error free communication channel which is nevertheless invisible to us. This process is interrupted when systems interact, just as computers in a network are interrupted when they are required to deal with an incoming message. Unitarity (physics) - Wikipedia
An important role of unitarity in quantum mechanics is to constrain the potential outcomes of quantum measurements so that they form a complete system of events, identical to the output of a communication source as defined above.
11.6: Superposition and quantum representation of information
Hilbert space is a vector space. The orthonormalization of vectors in Hilbert space introduces quantization into this space, and the effect of superposition is to add vectors, changing their direction while maintaining orthonormality. Orthonormality - Wikipedia, Quantum superposition - Wikipedia
Information in Hilbert space is therefore carried by the direction of vectors. Since Hilbert space is linear, we can easily change the basis in which vectors are represented. This means that the directions of individual vectors are not significant, but the angles between vectors are.
The eigenvector of an operator is a vector whose direction or phase is unchanged by the operator. It is a stationary point in the operation of the operator. This is observable, consistent with the idea that we can only observe stationary entities. The Born rule extracts the probability amplitudes, the components of the measured state along these eigenvectors, and converts them to probabilities by computing their absolute squares. This yields the spectrum of the coincidences between the measurement operator and the operators associated with the system measured. As in a human conversation, a message is successfully shared if the people conversing speak the same language. Quantum mechanically, this means working in the same basis. Eigenvalues and eigenvectors - Wikipedia, Born rule - Wikipedia
The probabilities associated with an interaction are discrete and precise. Their sum, 1, shows that they describe a complete set of events. There is no uncertainty associated with these values in a particular case. The measurement process is uncertain insofar as it does not determine which particular eigenvalue will appear in the observation, only the probabilities of the results that will be obtained. Nature, like us, pronounces its words with precision but uses them in random sequences. Collectively exhaustive events - Wikipedia
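A minimal sketch of the Born rule as described above, using NumPy and an arbitrary two-dimensional example (the Pauli X matrix as measurement operator and a simple normalized state): the squared magnitudes of the components of the state in the operator's eigenbasis are discrete, precise probabilities that sum to 1.

```python
import numpy as np

# A Hermitian "measurement operator" (here the Pauli X matrix, chosen for illustration)
# and a normalized state vector to be measured.
operator = np.array([[0, 1], [1, 0]], dtype=complex)
state = np.array([1, 1j]) / np.sqrt(2)

eigenvalues, eigenvectors = np.linalg.eigh(operator)

# Born rule: amplitude_i = <e_i|psi>, probability_i = |amplitude_i|^2.
amplitudes = eigenvectors.conj().T @ state
probabilities = np.abs(amplitudes) ** 2

for value, p in zip(eigenvalues, probabilities):
    print(f"eigenvalue {value:+.0f}: probability {p:.2f}")
print(probabilities.sum())   # 1.0: a complete set of events
```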
By tradition this feature of quantum measurement has come to be called the 'collapse' of the wave function, but a more reasonable explanation is simply that the measurement operation is an information source consistent with the demands of the theory of communication. This theory assumes that a communication source yields only one character of its alphabet at a time, and that the sum of the probabilities of observing these characters is normalized to 1. Quantum mechanics, like the theory of communication, is not concerned with specific messages but rather with the constraints on error-free communication of all possible messages established by the statistics of a particular source. Zurek explains how this constraint follows from a simple set of axioms for quantum mechanics.
We return to this story on page 14: Measurement: the interface between Hilbert and Minkowski spaces.
Wave function collapse - Wikipedia, Wojciech Hubert Zurek (2008): Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical
We assume that every quantum communication involves a quantum of action, analogous to a packet of data on a communication link. The history of nuclear physics began with Henri Becquerel's discovery of radioactivity in 1896 while working with phosphorescent materials, soon followed by the investigations of Marie Skłodowska-Curie. Radioactive decay is a random process with a rate characteristic of each nucleus, measured by its half-life. We may think of it as the result of a coding process within the nucleus which at some moment completes the construction of a transmissible packet of information, the decay product. All quantum events have a similar structure, making it impossible to predict the precise moment at which an event will occur. Radioactive decay - Wikipedia, Half-life - Wikipedia
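A sketch of the statistics of such a random source (the half-life value is arbitrary, chosen only for illustration): each nucleus decays at an exponentially distributed random time, and after one half-life about half of a large sample remains undecayed.

```python
import numpy as np

rng = np.random.default_rng(0)

half_life = 10.0                          # illustrative half-life (same time units throughout)
decay_constant = np.log(2) / half_life

# The moment of decay of each nucleus is random, exponentially distributed.
decay_times = rng.exponential(scale=1 / decay_constant, size=100_000)

# After one half-life, about half of the nuclei should still be undecayed.
print(np.mean(decay_times > half_life))   # ~0.5
```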
(revised Tuesday 6 August 2024)
Copyright:
You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.
Notes and references
Further reading
Books
Khinchin (1957), Aleksandr Yakovlevich, Mathematical Foundations of Information Theory (translated by P A Silvermann and M D Friedman), Dover 1957 Jacket: 'The first comprehensive introduction to information theory, this book places the work begun by Shannon and continued by McMillan, Feinstein and Khinchin on a rigorous mathematical basis. For the first time, mathematicians, statisticians, physicists, cyberneticists and communications engineers are offered a lucid, comprehensive introduction to this rapidly growing field.'
back |
Links
Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, ' The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then
the measured result will be one of the eigenvalues λ of A, and
the probability of measuring a given eigenvalue λi will equal ⟨ψ|Pi|ψ⟩, where Pi is the projection onto the eigenspace of A corresponding to λi.' back |
Claude E Shannon (1948), A Mathematical Theory of Communication, ' The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back |
Claude Shannon (1949), Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon, “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.] back |
Codec - Wikipedia, Codec - Wikipedia, the free encyclopedia, 'A codec is a device or computer program that encodes or decodes a data stream or signal. Codec is a portmanteau of coder/decoder. . . .
A coder or encoder encodes a data stream or a signal for transmission or storage, possibly in encrypted form, and the decoder function reverses the encoding for playback or editing. Codecs are used in videoconferencing, streaming media, and video editing applications.
In the mid-20th century, a codec was a device that coded analog signals into digital form using pulse-code modulation (PCM). Later, the name was also applied to software for converting between digital signal formats, including companding functions. ' back |
Collectively exhaustive events - Wikipedia, Collectively exhaustive events - Wikipedia, the free encyclopedia, ' In probability theory and logic, a set of events is jointly or collectively exhaustive if at least one of the events must occur. For example, when rolling a six-sided die, the outcomes 1, 2, 3, 4, 5, and 6 are collectively exhaustive, because they encompass the entire range of possible outcomes.' back |
Digital audio - Wikipedia, Digital audio - Wikipedia, the free encyclopedia, ' Digital audio is a representation of sound recorded in, or converted into, digital form. In digital audio, the sound wave of the audio signal is typically encoded as numerical samples in a continuous sequence. For example, in CD audio, samples are taken 44,100 times per second, each with 16-bit sample depth. Digital audio is also the name for the entire technology of sound recording and reproduction using audio signals that have been encoded in digital form.' back |
Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, ' In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ, is the factor by which the eigenvector is scaled.
Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.' back |
Entropy in thermodynamics and information theory - Wikipedia, Entropy in thermodynamics and information theory - Wikipedia, the free encyclopedia, 'There are close parallels between the mathematical expressions for the thermodynamic entropy, usually denoted by S, of a physical system in the statistical thermodynamics established by Ludwig Boltzmann and J. Willard Gibbs in the 1870s, and the information-theoretic entropy, usually expressed as H, of Claude Shannon and Ralph Hartley developed in the 1940s. Shannon, although not initially aware of this similarity, commented on it upon publicizing information theory in A Mathematical Theory of Communication.
This article explores what links there are between the two concepts, and how far they can be regarded as connected.' back |
Function space - Wikipedia, Function space - Wikipedia, the free encyclopedia, 'In mathematics, a function space is a set of functions of a given kind from a set X to a set Y. It is called a space because in many applications, it is a topological space or a vector space or both' back |
Half-life - Wikipedia, Half-life - Wikipedia, the free encyclopedia, 'Half-life (symbol t½) is the time required for a quantity to reduce to half its initial value. The term is commonly used in nuclear physics to describe how quickly unstable atoms undergo, or how long stable atoms survive, radioactive decay. The term is also used more generally to characterize any type of exponential or non-exponential decay. For example, the medical sciences refer to the biological half-life of drugs and other chemicals in the human body. The converse of half-life is doubling time.' back |
Light cone - Wikipedia, Light cone - Wikipedia, the free encyclopedia, 'A Light cone is the path that a flash of light, emanating from a single event E (localized to a single point in space and a single moment in time) and traveling in all directions, would take through spacetime. Imagine the light confined to a two-dimensional plane, the light from the flash spreads out in a circle after the event E occurs—and when graphed the growing circle with the vertical axis of the graph representing time, the result is a cone, known as the future light cone (some animated diagrams depicting this concept can be seen here.) ' back |
Metric space - Wikipedia, Metric space - Wikipedia, the free encyclopedia, ' In mathematics, a metric space is a set together with a metric on the set. The metric is a function that defines a concept of distance between any two members of the set, which are usually called points. The metric satisfies a few simple properties. Informally:
the distance from A to B is zero if and only if A and B are the same point,
the distance between two distinct points is positive,
the distance from A to B is the same as the distance from B to A, and
the distance from A to B is less than or equal to the distance from A to B via any third point C.' back |
Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' By 1908 Minkowski realized that the special theory of relativity, introduced by his former student Albert Einstein in 1905 and based on the previous work of Lorentz and Poincaré, could best be understood in a four-dimensional space, since known as the "Minkowski spacetime", in which time and space are not separated entities but intermingled in a four-dimensional space–time, and in which the Lorentz geometry of special relativity can be effectively represented using the invariant interval x² + y² + z² − c²t².' back |
Nyquist-Shannon sampling theorem - Wikipedia, Nyquist-Shannon sampling theorem - Wikipedia, the free encyclopedia, ' In the field of digital signal processing, the sampling theorem is a fundamental bridge between continuous-time signals (often called "analog signals") and discrete-time signals (often called "digital signals"). It establishes a sufficient condition for a sample rate that permits a discrete sequence of samples to capture all the information from a continuous-time signal of finite bandwidth.' back |
Orthonormality - Wikipedia, Orthonormality - Wikipedia, the free encyclopedia, ' In linear algebra, two vectors in an inner product space are orthonormal if they are orthogonal (or perpendicular along a line) unit vectors. A set of vectors form an orthonormal set if all vectors in the set are mutually orthogonal and all of unit length. An orthonormal set which forms a basis is called an orthonormal basis.' back |
Planck-Einstein relation - Wikipedia, Planck-Einstein relation - Wikipedia, the free encyclopedia, 'The Planck–Einstein relation. . . refers to a formula integral to quantum mechanics, which states that the energy of a photon (E) is proportional to its frequency (ν). E = hν. The constant of proportionality, h, is known as the Planck constant.' back |
Public Key Cryptography - Wikipedia, Public Key Cryptography - Wikipedia, the free encyclopedia, 'Public-key cryptography is a form of modern cryptography which allows users to communicate securely without previously agreeing on a shared secret key. For most of the history of cryptography, a key had to be kept absolutely secret and would be agreed upon beforehand using a secure, but non-cryptographic, method; for example, a face-to-face meeting or a trusted courier. There are a number of significant practical difficulties in this approach to distributing keys. Public-key cryptography was invented to address these drawbacks — with public-key cryptography, users can communicate securely over an insecure channel without having to agree upon a key beforehand.'
back |
Quantum cryptography - Wikipedia, Quantum cryptography - Wikipedia, the free encyclopedia, ' Quantum cryptography is the science of exploiting quantum mechanical properties to perform cryptographic tasks.The best known example of quantum cryptography is quantum key distribution which offers an information-theoretically secure solution to the key exchange problem. The advantage of quantum cryptography lies in the fact that it allows the completion of various cryptographic tasks that are proven or conjectured to be impossible using only classical (i.e. non-quantum) communication. For example, it is impossible to copy data encoded in a quantum state. If one attempts to read the encoded data, the quantum state will be changed due to wave function collapse (no-cloning theorem). This could be used to detect eavesdropping in quantum key distribution (QKD).' back |
Quantum entanglement - Wikipedia, Quantum entanglement - Wikipedia, the free encyclopedia, 'Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole. . . . Entanglement is considered fundamental to quantum mechanics, even though it wasn't recognized in the beginning. Quantum entanglement has been demonstrated experimentally with photons, neutrinos, electrons, molecules as large as buckyballs, and even small diamonds. The utilization of entanglement in communication and computation is a very active area of research.' back |
Quantum superposition - Wikipedia, Quantum superposition - Wikipedia, the free encyclopedia, 'Quantum superposition is the application of the superposition principle to quantum mechanics. The superposition principle is the addition of the amplitudes of waves from interference. In quantum mechanics it is the sum of wavefunction amplitudes, or state vectors. It occurs when an object simultaneously "possesses" two or more possible values for an observable quantity (e.g. the position or energy of a particle)' back |
Radioactive decay - Wikipedia, Radioactive decay - Wikipedia, the free encyclopedia, ' Radioactive decay (also known as nuclear decay, radioactivity, radioactive disintegration, or nuclear disintegration) is the process by which an unstable atomic nucleus loses energy by radiation. A material containing unstable nuclei is considered radioactive. Three of the most common types of decay are alpha, beta, and gamma decay. The weak force is the mechanism that is responsible for beta decay, while the other two are governed by the electromagnetism and nuclear force.' back |
Schrödinger equation - Wikipedia, Schrödinger equation - Wikipedia, the free encyclopedia, ' In quantum mechanics, the Schrödinger equation is a partial differential equation that describes how the quantum state of a quantum system changes with time. It was formulated in late 1925, and published in 1926, by the Austrian physicist Erwin Schrödinger. . . .
In classical mechanics Newton's second law, (F = ma), is used to mathematically predict what a given system will do at any time after a known initial condition. In quantum mechanics, the analogue of Newton's law is Schrödinger's equation for a quantum system (usually atoms, molecules, and subatomic particles whether free, bound, or localized). It is not a simple algebraic equation, but in general a linear partial differential equation, describing the time-evolution of the system's wave function (also called a "state function").' back |
Unitarity (physics) - Wikipedia, Unitarity (physics) - Wikipedia, the free encyclopedia, ' In quantum physics, unitarity means that the sum of probabilities of all possible outcomes of any event is always 1. This is necessary for the theory to be consistent.
This implies that the operator which describes the progress of a physical system in time must be a unitary operator. This operator is e^(iHt) where H is the Hamiltonian of the system and t is [an increasing number, not necessarily time since we are in Hilbert space where there is no space-time].' back |
Unitary operator - Wikipedia, Unitary operator - Wikipedia, the free encyclopedia, ' In functional analysis, a branch of mathematics, a unitary operator . . . is a bounded linear operator U : H → H on a Hilbert space H satisfying UU* = U*U = I where U* is the adjoint of U, and I : H → H is the identity operator. This property is equivalent to the following:
1. U preserves the inner product ( , ) of the Hilbert space, ie for all vectors x and y in the Hilbert space, (Ux, Uy) = (x, y) and
2. U is surjective.'
back |
Unmoved mover - Wikipedia, Unmoved mover - Wikipedia, the free encyclopedia, ' The unmoved mover (Ancient Greek: ὃ οὐ κινούμενον κινεῖ, lit. 'that which moves without being moved') or prime mover (Latin: primum movens) is a concept advanced by Aristotle as a primary cause (or first uncaused cause) or "mover" of all the motion in the universe. As is implicit in the name, the unmoved mover moves other things, but is not itself moved by any prior action. In Book 12 (Greek: Λ) of his Metaphysics, Aristotle describes the unmoved mover as being perfectly beautiful, indivisible, and contemplating only the perfect contemplation: self-contemplation. He equates this concept also with the active intellect. This Aristotelian concept had its roots in cosmological speculations of the earliest Greek pre-Socratic philosophers and became highly influential and widely drawn upon in medieval philosophy and theology. St. Thomas Aquinas, for example, elaborated on the unmoved mover in the Quinque viae. ' back |
Wave function collapse - Wikipedia, Wave function collapse - Wikipedia, the free encyclopedia, 'In quantum mechanics, wave function collapse is said to occur when a wave function—initially in a superposition of several eigenstates—appears to reduce to a single eigenstate (by "observation"). It is the essence of measurement in quantum mechanics and connects the wave function with classical observables like position and momentum. Collapse is one of two processes by which quantum systems evolve in time; the other is continuous evolution via the Schrödinger equation.' back |
Wojciech Hubert Zurek (2008), Quantum origin of quantum jumps: breaking of unitary symmetry induced by information transfer and the transition from quantum to classical, '(Submitted on 17 Mar 2007 (v1), last revised 18 Mar 2008 (this version, v3)) Measurements transfer information about a system to the apparatus, and then further on – to observers and (often inadvertently) to the environment. I show that even imperfect copying essential in such situations restricts possible unperturbed outcomes to an orthogonal subset of all possible states of the system, thus breaking the unitary symmetry of its Hilbert space implied by the quantum superposition principle. Preferred outcome states emerge as a result. They provide framework for the “wavepacket collapse”, designating terminal points of quantum jumps, and defining the measured observable by specifying its eigenstates.' back |