This site documents the Divine Mind:
the Universe that enables us to create ourselves


page 22: Cosmic theology

A physical understanding is a completely unmathematical, imprecise, and inexact thing, but absolutely necessary to a physicist. Richard Feynman: Lectures on Physics II Chapter 2: Differential Calculus of Vector Fields

I'm not happy with all the analyses that go with just the classical theory [of computation], because nature isn't classical, dammit, and if you want to make a simulation of nature, you'd better make it quantum mechanical, and by golly it's a wonderful problem, because it doesn't look so easy. Feynman (1981): Simulating Physics with Computers

In conclusion one has to recall that one reason why the ontological interpretation of [Quantum Field Theory] is so difficult is the fact that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. And it looks as if that problem will persist for quite some time. Meinard Kuhlmann (Stanford Encyclopedia of Philosophy): Quantum Field Theory

Philosophy is written in this grand book - the universe, which stands continually open before our gaze. But the book cannot be understood unless one first learns to comprehend the language and to read the alphabet in which it is composed. It is written in the language of mathematics . . . Galileo Galilei: Discoveries and Opinions of Galileo: Including the Starry Messenger, p 238.

Table of contents

22.1: Wheeler's electron, angels, formalism and the principle of individuation.

22.2: Where do elementary particles come from?

22.3: Spacetime: fermionic networks need 3 + 1 dimensions for their existence

22.4: The exploration of space: from Galileo to Einstein

22.5: Is naked gravitation divine?

22.6: Matter ≡ spirit; physics ≡ theology; particle ≡ god

22.7: A meeting point: the methods of science and evolution

22.8: Lust for life: the power of faith and hope

22.9: Bootstrapping: quantum vacuum and the cosmological constant problem revisited

22.10: Quantum theory describes conversations between living particles: matrix, mind and language

22.1: Wheeler's electron, angels, formalism and the principle of individuation.

The mythology of physics has it that Wheeler called Feynman and explained that the reason why all electrons are the same is that there is only one electron. Nevertheless, given the size of the Universe and the number of spacelike separated electronic processes happening simultaneously within it, we assume that there are a vast number of electrons distinguished by various individuating circumstances. One-electron universe - Wikipedia, Spacetime - Wikipedia, Principle of individuation - Wikipedia

What makes all these electrons different? A similar problem was faced by Aristotle long ago when he tried to make sense of the theory of forms devised by his mentor Plato. Aristotle's answer was matter, and his theory of hylomorphism. There is only one form of sword. The world contains thousands of swords, all distinguished from one another because each is made of its own parcel of bronze. Theory of Forms - Wikipedia

A similar problem arises in a theological context. How many angels are there? Angels are imagined to be spiritual intellectual beings, somewhere between divinity and humanity. They cannot be differentiated by material bodies, since they have none. Instead, Aquinas says, every angel must be formally different from the others, every one is a different species. How are we to make sense of this? We are all one species, but we are all different. Our differences may be subtle genetic differences. Yet even identical twins, with almost identical genomes, are different. Different life experiences give them different personalities, different knowledge, different friends, perhaps different diseases and disabilities. Ellen Barry (2024)

From these examples, we might conclude that the significant difference between particles is not matter or form, but personality, the internalized sum of experience. How does this apply in quantum theory? On page 14: "Measurement": the interface between Hilbert and Minkowski spaces we introduce the idea that measurement is a conversation between two particles, whose vocabulary comprises the spectra of the two particles and whose actual results are selected by the conditions for transmitting information between the particles, following Zurek's interpretation of the "collapse of the wave function" presented on that page.

this page 22: toc

22.2: Where do elementary particles come from?

The big bang theory of cosmogenesis has been built by extrapolating backwards from high energy physics research conducted over the last century or so. At the collision energies now available, these experiments produce the standard suite of elementary particles that form the standard model. Repeated experiments produce the same particles, so we can assume that these particles are the natural decay products of the big bang once the temperature had fallen to levels that are accessible to us in our accelerators, about 10 TeV per particle. We may guess that the initial temperature of the big bang is the quantum mechanical equivalent of the Planck time calculated on page §18.2: Motion and stillness: Parmenides vs Heraclitus, ie about 10³² K. What happens between the ignition of the big bang and this point is invisible to us, but we are free to speculate.

Here we follow a different track, postulating an initial singularity of naked gravitation formally identical to the traditional Christian God. This singularity generates a Hilbert space and a version of quantum mechanics within itself, which leads through a series of evolutionary steps, beginning on page 9: The active creation of Hilbert space and reaching its first conclusion on page 17: Gravitation and quantum theory—in the beginning with the creation of real physical particles.

The hypothesis here is that, beginning with a Hilbert space with a countable infinity of dimensions, the evolutionary process selects the same initial set of elementary particles as we observe coming out of our "high" energy attempts to imitate the cooler phases of the big bang. Each of these particles, being an observable, comes with a Hermitian operator which may represent the countable infinity of stationary states (the spectrum) associated with that particle. This spectrum plays a role in the interactions of this particle with other members of the suite of elementary particles. We may speculate that the entropy of the spectra of two interacting particles is equivalent to the entropy of the full set of Feynman diagrams associated with such particle interactions in conventional quantum field theory.
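
For readers who want to see the formalism, here is a minimal numpy sketch (an illustration only, on a small truncated space) of the two standard facts this paragraph relies on: a Hermitian operator has a purely real spectrum, and its stationary states are mutually orthogonal.

```python
import numpy as np

# Build a random Hermitian operator on a small (truncated) Hilbert space.
# A real particle's operator would be infinite dimensional; four
# dimensions are enough to illustrate the point.
rng = np.random.default_rng(0)
a = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
h = (a + a.conj().T) / 2              # H = H†: Hermitian by construction

eigenvalues, eigenvectors = np.linalg.eigh(h)

# The spectrum is real: these are the only values an observation can return.
print(eigenvalues)
# The stationary states (eigenvectors) are orthonormal.
print(np.allclose(eigenvectors.conj().T @ eigenvectors, np.eye(4)))  # True
```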

At this point we have a situation rather like that of the angels described in the section above: in the absence of a principle of individuation we can only accommodate one of each species of particle. Like identical angels, two identical electrons are really only one electron. Another problem is that half of these elementary particles, apart from the massless bosons, are antiparticles of the other half, so we can expect immediate annihilation and no further progress in the evolution of the Universe. This is known as the baryon asymmetry problem and it remains unsolved. Since we are here, we can assume that the evolutionary process of the Universe has solved it, but we are yet to understand how. Baryon asymmetry - Wikipedia

22.3: Spacetime: fermionic networks need 3 + 1 dimensions for their existence

The council of bishops who composed the Nicene Creed raised a serious problem for theologians. The Hebrew divinity was resolutely one, prepared to destroy anyone who claimed otherwise. After Nicaea, however, the Christian divinity was, by dogmatic definition, a threesome: Father, Son and Spirit. The new creed lists their roles in the creation of humanity and our salvation. Nicene Creed - Wikipedia

Augustine, Aquinas and Lonergan have all explored answers to this conundrum. Their ideas are discussed on page 8: The theology of the Trinity. After Augustine wrote De Trinitate a new creed, attributed to Athanasius, appeared which tried to explain the situation in more detail. The Athanasian Creed led to a graphic illustration, the Shield of the Trinity, which tried to make the situation clear, but also had the effect of suggesting that the Trinity is self-contradictory, since all three persons are equally divine but really distinct from one another. The saving grace of the Shield lies not in its words, but in the fact that it demonstrates graphically that the role of space is to differentiate identical entities. Athanasian Creed - Wikipedia, Shield of the Trinity - Wikipedia

This idea is implicit in our modern mathematical description of spacetime, which comprises four dimensions: time, north-south (x), east-west (y), and up-down (z), all parts of one space. Einstein's special relativity tells us that if we look at an object, maybe a planet moving in the distance, space-time still has these four dimensions, but they are rotated or mixed up, so that, for instance, east-west may gain a bit of north-south and all three spatial dimensions may become mixed with time. What does not change in the distant moving view is the interval ds between two points separated by the coordinate differences dx, dy, dz and dt.

The formula for this relationship is ds² = dx² + dy² + dz² − (c dt)². This looks like a four dimensional version of the equation of a circle in two dimensions, dr² = dx² + dy². Mathematically the infinitesimal distances between the coordinates of moving bodies form a continuous group, the Lorentz group, which describes Minkowski space. Lorentz group - Wikipedia, Minkowski space - Wikipedia
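
For the numerically inclined, here is a minimal Python check of this invariance (an illustration only, in units where c = 1): a Lorentz boost changes dt and dx individually but leaves the interval ds² unchanged.

```python
import numpy as np

c = 1.0  # work in units where c = 1

def interval2(dt, dx, dy, dz):
    """Squared Minkowski interval ds² = dx² + dy² + dz² − (c dt)²."""
    return dx**2 + dy**2 + dz**2 - (c * dt)**2

def boost_x(dt, dx, v):
    """Lorentz boost along x with velocity v (|v| < 1)."""
    gamma = 1.0 / np.sqrt(1.0 - v**2)
    return gamma * (dt - v * dx), gamma * (dx - v * dt)

dt, dx, dy, dz = 2.0, 1.0, 0.5, 0.25
dt2, dx2 = boost_x(dt, dx, v=0.8)

# The interval is unchanged although dt and dx individually are not.
print(np.isclose(interval2(dt, dx, dy, dz),
                 interval2(dt2, dx2, dy, dz)))  # True
```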

The simple reason for this relationship is that everybody in inertial (that is unaccelerated) motion sees precisely the same speed of light regardless of their speed relative to the source of light, Einstein's great discovery. Does this mean that the speed of light is a property of the photon or a property of spacetime or both?

This brings us back to the evolution of the world and the relationship between Hilbert space and Minkowski space discussed on page 13: Is Hilbert space independent of Minkowski space?. There we argued that Hilbert space comes first and Minkowski space is a quantum mechanical product of Hilbert space.

How does this happen? There we attributed it to the magical powers of evolution by natural selection. Given enough random trials, the world could come up with anything. This anything may be ephemeral and collapse as soon as it appears, but the core idea of evolution is that permanent things are selected by their ability to reproduce. Some of the fundamental metabolic properties of our last universal common ancestor have existed for three billion years, which means they have been reproduced trillions and trillions of times. Can the same be true of the elementary particles of the Universe? Last universal common ancestor - Wikipedia

In the last century of experimentation we have identified about 60 elementary particles, that is, particles that appear to have no internal structure. Many of these are antiparticles of each other; many of them are apparently duplications with the same properties but greater mass. All of them fall into one of two groups: bosons, like the photon, with integer spin measured in quanta of angular momentum, and fermions, like electrons, with half integer spin, half a quantum of action. This remarkable property means that one quantum of action flips an electron from spin +½ to −½. Elementary particle - Wikipedia, Spin-1/2 - Wikipedia
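
The spin flip can be made concrete with the standard spin-½ operators. In this small numpy sketch (illustrative only, in units where ℏ = 1) the raising operator takes the spin −½ state to the spin +½ state, a change of exactly one quantum of angular momentum.

```python
import numpy as np

hbar = 1.0  # work in units of the quantum of action

# Spin operator along z for a spin-1/2 particle: eigenvalues ±hbar/2.
Sz = hbar / 2 * np.array([[1, 0], [0, -1]])
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])

print(up @ Sz @ up, down @ Sz @ down)    # +0.5, -0.5

# The raising operator flips spin-down to spin-up...
Splus = hbar * np.array([[0, 1], [0, 0]])
print(Splus @ down)                      # = hbar * up

# ...changing the angular momentum by exactly one quantum hbar.
print(up @ Sz @ up - down @ Sz @ down)   # 1.0 = hbar
```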

Another remarkable feature of fermions is that no two of them can occupy the same region of space. In standard axiomatic quantum field theory, the difference between fermions and bosons is derived from the special theory of relativity by the spin-statistics theorem. But here, where we imagine that fermions and bosons evolved in Hilbert space before the advent of Minkowski space, we wonder whether this proof can be turned around to show that the special theory of relativity and Minkowski space are consequences of the quantum mechanical evolution of bosons and fermions, pairs of particles that establish their permanent role in nature by interacting with one another. Spin-statistics theorem - Wikipedia, Streater & Wightman (2000): PCT, Spin, Statistics and All That pp 146 - 161.

We can imagine the four dimensions of Minkowski space by thinking of the differences between wired and wireless transmission of signals in practical communication networks. Wireless communication is serial (one dimensional) and channels are distinguished by frequency or energy and encoding. We find this system also in quantum mechanics and WiFi. Quantum mechanics does not see space-time, only phase or angle, the measure of distance in Hilbert space. Cellular telephone networks handle large numbers of customers with a wide spread of frequencies and a dense population of towers to serve local populations. The massless photons of electromagnetic radiation travel along null geodesics, at the speed of light and account for the single negative dimension in Minkowski space. Wi-Fi - Wikipedia, P. A. M. Dirac (1933): The Lagrangian in Quantum Mechanics

Wired networks, on the other hand, need three dimensions for their existence in order to prevent (uninsulated) wires intersecting in the same plane. We may consider the case of moving fermions by analogy with city traffic. In a two dimensional road system, time division multiplexing introduced by traffic lights enables traffic streams to cross. Three dimensional structures like overpasses and tunnels enable uninterrupted two dimensional traffic flow and separation of air traffic in three dimensional space is established by requiring vehicles travelling on intersecting paths to operate at different altitudes.

If we understand the emergence of new features in the Universe as a matter of random variation and controlled selection, we may see that 3D space is adequate for complete wired connection, so spaces with four or more dimensions have no evolutionary raison d'être and may be selected out.
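
The graph-theoretic core of this claim is familiar: five fully interconnected nodes cannot be wired in a plane without crossings, while any finite graph can be embedded without crossings in three dimensions. A minimal check, using the networkx library as an illustration:

```python
import networkx as nx

# Five fully interconnected nodes: the complete graph K5.
k4 = nx.complete_graph(4)
k5 = nx.complete_graph(5)

# K4 can be wired in a plane without crossings; K5 cannot.
print(nx.check_planarity(k4)[0])  # True
print(nx.check_planarity(k5)[0])  # False: two dimensions are not enough

# Any finite graph, however, can be embedded without crossings in three
# dimensions, which is why 3D is adequate for complete wired connection.
```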

Wired networks are like plumbing or electrical power networks. Tuning is not required to discriminate sources, but switching may be necessary for one source to connect to and disconnect from others. In each case communication sets up a one-to-one correspondence. A wired network transmitting pure unmodulated power shares three properties with gravitation: it exists in four dimensions, three of space and one of time; it can deal indiscriminately with energy in any form; and the direction of transmission of signals is determined by potentials.

back to page 22: toc

22.4: The exploration of space: from Galileo to Einstein

Galileo conducted a sustained attack on Aristotle and the Church and won in the end, although he was threatened with torture and death by the Holy Inquisition. He was sustained by his knowledge of reality, particularly his telescopic observations of the phases of Venus, which showed him without doubt that Venus orbited the Sun inside the orbit of the Earth. The Inquisition forced him to recant under threat of execution to save its face, and he spent the last part of his life under house arrest, but he knew he was right and that in reality the recantation counted for nothing. He was an observer and an instrument maker and is justly counted among the originators of empirical science. Over the centuries many were tortured and murdered for "heresy" and large scale wars have been fought over belief systems, culminating in the recent "cold war", but none of these events made such an impression on history as Galileo's trial, except perhaps the crucifixion of Jesus. Galileo Galilei: Recantation of Galileo (June 22, 1633), John Lewis Gaddis (2005): The Cold War: A New History

Galileo made three principal contributions to science. The first was practical: empiricism, careful observation and testing. The second was linguistic: he introduced mathematics into the description of nature and found a numerical expression for the acceleration of a falling body. The third was subtler: a description of inertial motion, that is, motion in which, in effect, nothing happens. We know it as Newton's first law of motion: a body in motion will continue its motion in a straight line unless it is acted upon by a force. This was a major break from the Aristotelian idea (formed as a consequence of friction) that a continual force is necessary to keep something in motion. This idea opened the way for special and general relativity.

The extent of our knowledge is to a large extent expressed by our vocabulary. In Galileo's day mathematics was principally arithmetic and geometry, but it grew enormously in the nineteenth and early twentieth centuries under the influence of Cantor, Whitehead and Russell, and Hilbert to embrace any consistent symbolic structure whatever. Progress in science has shown that this extension of mathematics into logic and cognitive science has been necessary to embrace the huge variety and complexity of the Universe, making us aware that it embraces all consistent structures and is fittingly called divine.

Isaac Newton built on Galileo's mathematical foundation by inventing calculus, bequeathing us a complete system of mathematics to describe the solar system and the large scale physics of the world with almost perfect precision. It tells us almost all that we need for engineering, mechanical and civil, the design of rockets and aeroplanes and even sports equipment. His work is half of classical physics. The other half, electricity and magnetism, was almost unknown to Newton. Electrodynamics began with the discovery of chemical batteries that could provide continuous current. Faraday studied the relationship between electric current and magnetism and set us on the way to electric generators and motors. James Clerk Maxwell used Newton's calculus to show that light is an electromagnetic phenomenon and completed the electrodynamic component of classical physics.

The year 1900 saw the beginning of a new picture of the world. Max Planck caught the first glimpse of the quantum of action and set quantum mechanics on its rocky road to existence. Planck was an old campaigner gifted with an incredible discovery after a long career. Young Einstein appeared in 1905 with a year of miracles, the most relevant of which to my story at this moment is the special theory of relativity, which reconceived Newton's dynamics in the light of the new (and still classical) electrodynamics. Albert Einstein (1905): On the Electrodynamics of Moving Bodies

The pioneers of electrodynamics, particularly Faraday, had understood that moving a wire through a magnetic field and moving a magnetic field past a wire both produced an electrical potential, and therefore a current in the wire, but they saw these two approaches to creating an electric field as different.

Einstein, already a strong believer in symmetry and invariance, saw that both techniques are physically identical. Instead of changing electrodynamics, he found that we must change our concept of time. Two clocks, moving relative to one another in inertial (unaccelerated) motion, keep different times. This is necessary to honour Einstein's principle of special relativity, that all observers see the same laws of mechanics in their own inertial frames of reference. In particular, this invariance holds for the speed of light. All observers in inertial motion see the same speed of light regardless of their motion with respect to the source of the light. In the extreme, if you were to move past me at the speed of light and I could see your clock, it would appear to me to be stopped. If I could measure your length in the direction of our relative motion, it would appear to me to be zero.

To convert what I see in your frame to what I am seeing in my frame, I need to use the Lorentz transformation. A few years later the mathematician Hermann Minkowski expressed Einstein's insight in terms of the metric (distance measure) of classical spacetime. Minkowski found a very succinct expression of the structure of four dimensional inertial space, now called Minkowski space: page 16.2: Dynamics: from Newton through electromagnetism to general relativity
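
In modern notation, and using the sign convention of the interval formula above, the Lorentz transformation for relative velocity v along x and Minkowski's invariant metric read:

```latex
t' = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad
x' = \gamma\,(x - v t), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}

ds^{2} = dx^{2} + dy^{2} + dz^{2} - c^{2}\,dt^{2}
       = \eta_{\mu\nu}\, dx^{\mu} dx^{\nu}, \qquad
\eta = \operatorname{diag}(+1, +1, +1, -1)
```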

Minkowski celebrated Einstein's work in his address to the 80th Assembly of German Natural Scientists and Physicians (21 September 1908):

The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. They are radical. Henceforth, space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality. Minkowski space - Wikipedia

A remarkable consequence of the Minkowski metric is that massless particles like photons travelling at the speed of light, although they may travel across the Universe, experience neither space nor time. The spacetime interval between their creation and annihilation is zero.

Einstein went on to make many more contributions to the development of physics and to win a Nobel Prize for his study of the photon, the particle of light. His most significant work from the point of view of this site is his formulation of the general theory of relativity in 1915. In low energy situations this theory gives almost identical results to Newton's theory, but the differences are significant. It explains the precession of the perihelion of Mercury and the bending of rays of starlight as they pass close to the Sun. Two spectacular results that apply to high energy situations are that it explains the large scale structure of the whole Universe and predicts the existence of black holes. Albert Einstein (1915): The Field Equations of Gravitation, Precession of the perihelion of Mercury, Hawking & Ellis (1975): The Large Scale Structure of Space-Time, Black hole - Wikipedia

Black holes result when the local density of matter becomes so great that gravitational potential destroys the local material structure and perhaps also the structure of spacetime itself, resulting in a structureless singularity surrounded by an event horizon. The horizon acts as a boundary between accessible and inaccessible spacetime. In their study of the large scale structure of spacetime seen through the eyes of Einstein, Hawking and Ellis speculate that the origin of the Universe may be seen as a time reversed black hole. This does not make a lot of sense. Hawking has proposed that black holes emit radiation and so might eventually evaporate, but this is a very slow process. Big bang models do not seem to attempt to explain why a primordial black hole should implode and become the Universe. Hawking radiation - Wikipedia, Event horizon - Wikipedia

At first this page was entitled "Quantum field theory", but over the last year or so that title has become a little too narrow. Initially I thought of it as a methodological essay on the construction of a theory of everything. I endorsed the common belief that quantum field theory is our best route to such a theory of everything. Cosmos - Wikipedia

Things have conspired to change my mind.

First there is Kuhlmann's essay in the Stanford Encyclopedia. Having read his story a few times I have begun to feel that quantum theory as we have it is an unholy concoction of classical, mathematical, and half digested quantum ideas, applied to a large set of very detailed and expensive observational data, whose principal purpose is to describe the simplest foundations of the Universe. It is not so much a theory of everything as an incomplete theory of elementary particles and fields that has a long way to go before we can say it tells us much about "everything". The only story we have about the elementary particles is that they emerged after the big bang went off, once the temperature had fallen sufficiently.

Since the days of Galileo, physicists have been firmly convinced that mathematics is the natural language to describe the Universe. Newton's discovery of calculus and its application in his universal theory of gravitation to the Solar System cemented mathematics into the heart of physics. Cantor's transfinite numbers and the discovery of problems like Cantor's and Russell's paradoxes suggested that there could be problems in mathematics. Whitehead and Russell produced the Principia Mathematica in order to purify mathematics of natural language, and Hilbert's formalism had the same purpose. These moves reinforced Hilbert's belief that every mathematical question could be answered. Then Gödel and Turing demonstrated the presence of incompleteness and incomputability in purely formal mathematical structures. It became clear that natural language was not the only cause of trouble in mathematics. The introduction of ideals also made mathematics a dangerous tool for physicists. Russell's paradox - Wikipedia, Gödel's incompleteness theorems - Wikipedia, Alan Turing (1936): On Computable Numbers, with an application to the Entscheidungsproblem, David Hilbert (1925): On the Infinite

Cantor's work arose from the nineteenth century effort to place calculus on a logically sound footing. This required the establishment of what looks at first glance to be a logical impossibility: the proof that a continuum can be constructed from a large number of discrete points. A standard answer is the theory of limits, which assumes that given a large enough number of points crowded into a small enough space, the result is, for all practical purposes, continuous. Cantor set out to find a representation of this number, the cardinal of the continuum, and invented set theory for this purpose. Once set theory had been axiomatized and purified of its paradoxes, Cohen was able to show that the continuum hypothesis is independent of the axioms of set theory. Cantor (1897, 1955): Contributions to the Founding of the Theory of Transfinite Numbers, Paul Cohen (1980): Set Theory and the Continuum Hypothesis
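
The theory of limits referred to here was eventually made precise in the familiar ε–N form, which replaces the intuitive talk of "crowding" with a checkable condition:

```latex
\lim_{n \to \infty} a_n = L
\quad\Longleftrightarrow\quad
\forall \varepsilon > 0 \;\; \exists N \in \mathbb{N} \;\; \forall n > N : \; |a_n - L| < \varepsilon
```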

Nevertheless, field theories continue to assume that spacetime is continuous in the sense required by calculus, with the result that their study of fields within infinitesimal distances of point-like particles gives rise to unrealistic infinities. It seems very unlikely that these infinities exist in reality. It seems far more likely that they are an artefact of the hypothesis that continuous mathematics is the ideal model to describe quantized physical reality.

The preferred solution to the infinity problem arising in physics and statistics is the renormalization group devised by Wilson. Kenneth G. Wilson (1979): Problems in Physics with many Scales of Length, Kenneth G Wilson (1982): Nobel Lecture: The Renormalisation Group and Critical Phenomena, Renormalization group - Wikipedia

Although renormalization has proved quite successful, it is a statistical process that deals with pure numbers, and it can only be applied to physical situations where the couplings between particles are pure dimensionless numbers. The dimensions of the gravitational constant G, M⁻¹L³T⁻², show that it is not a pure number, and so attempts to produce a quantum theory of gravitation using the methods proven in particle field theories have failed.
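
As a numerical aside (an illustration, not part of the argument), the contrast can be computed directly: the electromagnetic coupling is a pure number, while a dimensionless gravitational coupling can only be formed by choosing an arbitrary mass scale.

```python
import math

# Fundamental constants (SI units).
hbar = 1.054_571_817e-34   # J s
c    = 2.997_924_58e8      # m / s
e    = 1.602_176_634e-19   # C
eps0 = 8.854_187_8128e-12  # F / m
G    = 6.674_30e-11        # m^3 kg^-1 s^-2
m_e  = 9.109_383_7015e-31  # kg

# Electromagnetism: the fine-structure constant is dimensionless,
# so renormalization group methods apply.
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(f"alpha   ~ 1/{1/alpha:.1f}")   # ~ 1/137, a pure number

# Gravitation: G carries dimensions M^-1 L^3 T^-2. A dimensionless
# coupling exists only relative to a chosen mass, e.g. the electron:
alpha_G = G * m_e**2 / (hbar * c)
print(f"alpha_G ~ {alpha_G:.2e}")     # ~ 1.75e-45, scale dependent
```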

There is nevertheless a strong conviction in the physics community that gravitation should be quantized. This has led to a new industry devoted to exploring possible new ideas like supersymmetry and string theory to supersede quantum field theory. There are two problems here. From one point of view, given that the Universe has grown within a completely simple initial singularity, this complexity violates the heuristic of simplicity which I derive from the ancient idea of apophatic theology: page 26.8: the heuristic of simplicity. This idea insists that an absolutely simple divinity is beyond our ken, so all we can say about it is what it is not, not what it is. Claude E Shannon (1948): A Mathematical Theory of Communication, Supersymmetry - Wikipedia, String theory - Wikipedia, Michio Kaku (1998): Introduction to Superstrings and M-Theory, Apophatic theology - Wikipedia

The second problem is encapsulated in chapter 26.9: Principle 9: Gravitation is code free universal communication. The point of coding in communication is to address a message to the correct recipient, encoded in a language or code that the recipient can understand. Gravitation, however, simply couples energy to energy without taking into account the form of the energy. In other words it is code free. It has no specific language. It listens to and talks to everything in the world that carries energy, which means, as far as physics knows, everything that exists.

It might be objected that the gravitational signal is quite complex, as we can see from the complexity of the gravitational constant, whose dimensions are M⁻¹L³T⁻². These dimensions are computed using classical dynamics in Minkowski space, but this view of gravitation assumes the existence of classical space. In other words it does not go back far enough.

My guess, explained on page 17: Gravitation and quantum theory—in the beginning, identifies gravitation with the God of Aquinas and proceeds on the understanding that gravitation is the source of all the energy in the Universe.

this page 22: toc

22.5: Is naked gravitation divine?

We proceed therefore on the assumption that naked gravitation is divine. Page 8: The theology of the Trinity begins the development of a model of the creation of the Universe by analogy to the Christian doctrine of the Trinity. Our starting point is identical to the God of Aristotle and Aquinas, absolutely simple and structureless, the omnipotent creator of the Universe. We introduce one particular difference. Instead of being omniscient like the classical God, creating the world to a preconceived plan, this starting point carries no information or power of control, because modern communication theory demands physical marks or symbols to represent information. The only constraint upon this singularity is that anything it creates which contains inconsistent structure is eventually destined to fail and die. In other words, it embodies the algorithm of evolution: random variation, the ability to try anything; and selection, which guarantees that systems with inconsistencies that prevent reproduction fail. Things that work exist; those that do not work cannot exist.

We can formulate the ideas of evolution by random variation and selection in terms of the P versus NP problem, which these days carries a million dollar prize. The kinematic random variation in Hilbert space described on page 9.3: Hilbert space seems capable of discovering NP structures. If these are to become permanent by reproduction, however, there must be P processes available to make this reproduction reliable. Carlson, Jaffe & Wiles (2006): The Millennium Prize Problems, pp 87 sqq.

The idea here is that instead of being faced, like modern scientists, with the enormous complexity of a quasi-infinite Universe that has been evolving and adding to its complexity for fourteen billion years, we use the somewhat mystical insights of ancient theology into the nature of divinity to simplify the problem down to a manageable size.

We build the Universe, step by step, beginning with the idea that the initial singularity can be viewed as a set that meets the axioms for fixed point theorems, which we guess will establish something like Hilbert space, which in turn sets the scene for linear operators. As noted on page 14: "Measurement": the interface between Hilbert and Minkowski spaces, quantum mechanics is symmetrical with respect to complexity, so that its theoretical foundations are the same for spaces ranging from 2 to ℵ₀ dimensions.

When we look through the more or less settled physics of the Universe, naked gravitation scores well as a starting point. The points noted above which qualify it for the operation of fixed point theory, continuity, convexity and compactness, are also properties of the structureless empty set. In the conclusion to his paper on the field equations of gravitation, Einstein notes that the field equation adds nothing to gravitation that was not already implicit in Minkowski space. Conversely, we should not expect to find quantization where error is impossible, that is, in a regime where every possible message is a valid message. Since gravitation couples universally to energy alone, and is blind to the particular nature of the particles or fields associated with that energy, we can imagine that gravitation involves unconstrained and therefore intrinsically error free and non-quantized interaction, what we might call codeless communication.
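
Fixed point theorems of the Brouwer type invoked here are non-constructive, but the flavour of the idea can be seen in the simplest constructive case, a contraction mapping iterated to its fixed point. A toy Python illustration, not the cosmological construction itself:

```python
import math

# Iterate the contraction f(x) = cos(x), starting anywhere in [0, 1].
# The Banach fixed point theorem guarantees convergence to the unique
# point where f(x) = x; continuity and compactness play the analogous
# qualifying role in the non-constructive theorems mentioned above.
x = 0.0
for _ in range(100):
    x = math.cos(x)

print(x)                              # ~ 0.7390851 (the Dottie number)
print(math.isclose(x, math.cos(x)))   # True: x is a fixed point of cos
```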

Another suggestive property of naked gravitation is the fact that Einstein's gravitation is a topological continuum described by differential geometry based on the Gaussian coordinate system. As Sunny Auyang says, "the Gaussian coordinates individuate but neither relate nor measure". Sunny Y. Auyang (1995): How is Quantum Field Theory Possible?, page 27.

Using the approach taken here, the Minkowski metric is a structural addition to naked gravitation consequent upon the emergence of quantum mechanics described on page 12: The quantum creation of Minkowski space. One particular feature which suggests this origin of Minkowski space is the fact that the Minkowski metric makes possible the null geodesics that enable quantum states to be carried through spacetime by massless bosons.

When this page was first written I assumed that the hypothetical evolutionary algorithm implemented by the universal structure emerged within the initial singularity. In the next pages I explore the entropy increases made possible by the relationship between Hilbert space and Minkowski space, which I understand to be a feature of this development that provides it with a selective advantage, encouraging its establishment in the structure of the Universe. This structure underlies the "bootstrap" facilitating the reproduction of the initial singularity to be explained below. The advent of spacetime is caused by the quantum mechanical discovery of fermions and bosons, and solves the angel problem by enabling spin ½ particles to exist independently of one another while communicating through spin 1 bosons.

this page 22: toc

22.6: Matter ≡ spirit; physics ≡ theology; particle ≡ god

Physicists seem to have a habit of aspiring to God. Kaku writes about The God Equation. Davies is trying to get a grip on the mind of God. The big question is: what do we mean by God? Michio Kaku (2021): The God Equation: The Quest for a Theory of Everything, Paul Davies (1992): The Mind of God: Science and the Search for Ultimate Meaning

A large section of the learned community of the world lives in countries which were shaped by the three thousand years of theological history developed in Mediterranean Europe. Many consciously or unconsciously accept the story that the Universe was created by an omniscient, omnipotent and benevolent God who had the best interests of humanity at heart. From a scientific point of view, this assumption portrays the Universe as a puzzle set by God for the sciences to unravel.

This God has been studied extensively in the Christian tradition, which arrives at a position here considered to be mistaken: that the divinity is both omniscient and absolutely structureless, a contradiction in modern terms. We deal with this problem by assuming that a realistic initial singularity is omnipotent (since it has created the Universe) but must be ignorant, since it does not have the structure necessary to represent information. This structure, in the form of the discrete independent particles which constitute the alphabet of spirit, is what must be created to give us the Universe we inhabit.

If we are to combine the physicists' mathematical approach to studying reality with the model of God devised by Aquinas, our best model of God is the empty set, ∅. We take that view here, and our first interpretation of the empty set is the naked gravitation of the initial singularity, within which we begin to build the Universe by imagining the development of Hilbert space and quantum theory in this set. Formally, ∅ is next to nothing. From the point of view of naked potential, we interpret it as omnipotent.

This beginning establishes the initial state represented by the heading of this section: matter ≡ spirit; physics ≡ theology; particle ≡ god. The process of differentiation which has given us the world we inhabit has proceeded by breaking these symmetries, but we assume that a principle of zero sum complexification establishes that even though these symmetries are broken, the underlying symmetry remains intact, so that although we may imagine every particle and source in the Universe to be divine, we also imagine that the initial singularity at the root of the whole system remains in place. This idea is discussed in some detail on page 17: Gravitation and quantum theory—in the beginning, where I propose that the emergence of quantum mechanics breaks the initial singularity, which I identify as naked gravitation, into potential and kinetic energy.

Cosmic theology describes how we construct a Thalesian world full of gods from an initial singularity, analogous to the mechanism of the ancient theological model of the Trinity through which the Father conceives the Word and the Word becomes reality, the second person of the Trinity. This second person is also a dynamic living mind which conceives further words which become realities, and so on ad infinitum. This process is analogous to biological reproduction. The stationary eigenfunctions created by quantum mechanics play roles similar to genes in biological reproduction. Thales - Wikipedia

The mathematical foundation of quantum mechanics emerged fully formed in the abstract Hilbert space described by John von Neumann. The first applications of this formalism were interpretations of the source of the fixed values observed by spectroscopists. In more recent times we have become aware that the formalism may also serve to model various computational tasks like searching large volumes of data and factorizing large numbers. It has been widely conjectured that quantum computation may be more powerful than Turing computation, but we face two difficulties. First, quantum processes are microscopic and very delicate, so precision engineering, low temperatures and error correcting algorithms become necessary in order to establish deterministic processes. Second, the apparent power of quantum processes to deal with large vectors in high dimensional spaces implies that they can also yield a large spectrum of eigenvalues with the probabilistic distribution described by the Born rule, so it is difficult to decide which result is specific to the problem input to the system. Nielsen & Chuang (2016): Quantum Computation and Quantum Information, Xavier Waintal (2023): The Quantum House Of Cards
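
The Born rule mentioned here is easy to demonstrate numerically. A minimal sketch (illustrative only): given a normalized state vector, the probability of observing outcome k is the squared magnitude of its kth amplitude, and simulated measurements reproduce those probabilities.

```python
import numpy as np

rng = np.random.default_rng(42)

# A normalized state in a 4-dimensional Hilbert space.
psi = np.array([0.5, 0.5j, -0.5, 0.5])
assert np.isclose(np.vdot(psi, psi).real, 1.0)

# Born rule: measuring in this basis returns outcome k with
# probability |psi_k|^2.
probs = np.abs(psi)**2
outcomes = rng.choice(len(psi), size=100_000, p=probs)

# Empirical frequencies approach the Born probabilities.
print(probs)                                   # [0.25 0.25 0.25 0.25]
print(np.bincount(outcomes) / len(outcomes))   # ≈ the same
```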

We can be relatively certain, however, that the computing power of quantum mechanical processes has been sufficient to deal with all the problems the Universe faced in the creation of fundamental particles and the interactions between them; and that it provides us with a language of sufficient power to represent any theory that embraces all the physical processes in the Universe, which would meet a reasonable specification of a theory of everything.

Both the power and the wastefulness of evolution arise from its dependence on random input. This fact provides us with the solution to the theological problem of evil discussed on page 4: Theology: a new paradigm?. We do not need to propose a malevolent spirit to explain evil in the world. It simply follows from the fact that deterministic processes cannot be creative. Randomness is required. As Turing showed, deterministic processes are constrained by an algorithm which can only explore an infinitesimal subset of the possibilities. Of the ℵ₁ mappings of the natural numbers onto themselves, only ℵ₀ are computable. Martin Davis (1982): Computability and Unsolvability
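
The cardinality argument rests on diagonalization, which can be sketched in a few lines of Python (a miniature illustration of Turing's reasoning, not a proof script): given any enumeration of total functions on the natural numbers, the diagonal function escapes it, so no countable list, such as the computable functions, can exhaust all mappings.

```python
# Diagonalization in miniature: given ANY enumeration f_0, f_1, f_2, ...
# of total functions from N to N, the function g(n) = f_n(n) + 1
# differs from every f_k at input k. Hence no countable collection
# (in particular, the computable functions) covers all mappings.

def enumeration(k):
    """A sample enumeration: f_k(n) = k * n. Any other would do."""
    return lambda n: k * n

def g(n):
    return enumeration(n)(n) + 1

# g escapes the enumeration: for every k, g(k) != f_k(k).
print(all(g(k) != enumeration(k)(k) for k in range(1000)))  # True
```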

Here, although we must acknowledge that the majority of evolutionary variations are deleterious or even fatal, we concentrate on the creative power of evolution to overcome the bottlenecks in both science and the creation of the world.

The developmental stage of any evolving technology or living entity is often wasteful and painful, because creativity is bound to proceed by trials which have a random element, so that new knowledge and structure is often very costly. The upside of evolution (which Malthus saw as a potential downside) is that once a successful theory, technology or species arises from an evolutionary process it is then capable of exponential growth by reproduction. Other things being equal (which they occasionally are), this growth can ultimately amortize the expense of its discovery. In the long run it seems obvious that the process of evolution is productive despite its difficulties, because billions of new species, new technologies and new ideas have been created over the life of the Universe. Some would prefer to see the explanation of this proliferation in the work of an omnipotent and omniscient God, but the existence of such an entity seems unnecessary where experience shows that evolution in all its forms ultimately works. The ancient theology, interpreted in terms of modern information theory, shows that the ancient creator is impossible.

There appear to be many points in the evolution of the Universe where our modern understanding reveals cases where slight differences in the natural constants may have prevented the formation of a Universe suitable for life. It is true that we are here, and it seems very probable that the development of the present Universe was an evolutionary process all the way. The history of science shows that we have reached our present level of understanding after a considerable number of dead ends and retractions, so we might expect similar events in the evolution of the Universe. Although biological evolution provides us with a story, in principle, for the enormous variety of species that inhabit and have inhabited the Earth, we still seem to be relatively ignorant about how the particular set of elementary particles that we observe has emerged. Barrow & Tipler (1996): The Anthropic Cosmological Principle

this page 22: toc

22.7: A meeting point: the methods of science and evolution

Toward the end of last century cosmologists began to accept that the Universe originated from a structureless initial singularity which placed few initial constraints, if any, on the structure that we now inhabit. From a scientific point of view, the stories developed and promulgated by our ancient ancestors are hypotheses designed to explain the conditions of human life. These revolve around loves and wars, feasts and famines: early attempts at theories of everything.

Here we assume that, given the initial singularity, the world created itself. It therefore plays many of the roles attributed to the traditional God. Since the key article of scientific faith is that the Universe is a consistent whole, we must assume that theology and physics treat the same subject. A purpose of this site is to explore the advantages for both physics and theology of exploiting this assumption.

Theology, if it is to be science, must be evidence based. The physical world is our primary source of evidence. It is the repository of all information available to us, including all our own thoughts, languages and desires. The biggest shock to the traditional theology of creation has been the theory of evolution. It has shown us in explicit detail that, at least in the realm of life, the Universe is perfectly capable of creating itself. Many religious institutions are still struggling to accept this reality.

Roman Catholic theology is more a form of politics and culture than an empirical science. It is under the control of a powerful global institution, the Roman Catholic Church, which claims total and infallible control of its doctrine. It has enforced this doctrine for nearly two thousand years using a mixture of brilliant marketing, education, torture, murder and war. Its principal product is a promise to its believers of eternal post mortem bliss in a heaven whose principal pleasure is a vision of divine magnificence. Unbelievers, on the other hand, face an eternity of hell. Dante Alighieri (1982): The Divine Comedy

One of the foundations of theology that probably dates from well before the Christian era is the distinction between matter and spirit. In Aristotle's time the world beneath the Moon was made of the four material elements, Earth, Water, Air and Fire. The heavens beyond the Moon comprised a fifth, more spiritual element. This layer was later brought down to Earth in the shape of theories of aether and field. Physicists used the aether to explain the electromagnetic substrate of light waves. The special theory of relativity and the evidence that light is a flow of particles rendered this aether unnecessary. Now quantum field theories consider spacetime to be full of fields which are understood to be the source of all the observable structures that constitute the world. Aether theories - Wikipedia, Classical field theory - Wikipedia, Quantum field theory - Wikipedia

Theologically, spirit is considered to be immaterial, that is, completely unphysical. Aristotle used the idea of immateriality to explain the universality of human intellect. He worked by analogy with sight. Colour in our eyes would restrict our ability to see colours. Matter in our minds would similarly restrict our understanding of the material world. Aquinas explains the omniscience of God by equating knowledge to immateriality and concluding that since God is supremely immaterial, they are also supremely knowing. Since God's knowledge is the cause of all things, they know all things and can therefore be called omniscient. Aquinas, Summa: I, 14, 1: Is there knowledge in God?, Aquinas, Summa, I, 14, 8: Is the knowledge of God the cause of things?

The principal theological role for physics is the role particulate matter plays in the support of spirit, discussed on page 21: Matter and spirit. This is in contrast to the ancient belief mentioned above that spirit and intelligence depend on immaterial being. It is for this reason that the initial singularity introduced on page 5: God's ideas, cybernetics and singularity is understood to have no structure and no knowledge at all. The evolutionary creation of discrete particles provides the material symbols necessary to support the growing spiritual life of the evolving Universe. This includes, of course, our own minds and the recent immense growth in the electronic dissemination of information, the modern analogue of the invention of printing. History of printing - Wikipedia

Kuhlmann, in the article quoted above, lists a series of problems with the current state of quantum field theory which seem to me to be persistent echoes of its origin in classical physics and nineteenth century mathematics. I see two foundations for the trouble.

First, we clearly live in a quantized Universe. Everything we observe is a discrete object, from stars to atoms and the eigenvectors which represent particles as standing waves. Nevertheless we try to describe the world from which we derive these observations with calculus and continuous functions. This is obviously because we perceive motion as continuous. Given the almost infinitesimal size of the quantum of action, this perception has a strong foundation, which is, nevertheless, not supported by quantum mechanics.

I see quantization as a sign that the life of the Universe is in fact a stepwise logical process. From this point of view the quantum of action is a universal logical operator analogous to the classical operator NAND or Sheffer stroke. There is a difference, however, since all quantum logic is executed by reversible unitary functions while the classical NAND gate is not reversible. We cannot reconstruct its two inputs from its single output. The quantum parallel of the universal NAND gate can be constructed from the CNOT gate and single qubit gates. The demands of normalization, unitarity and reversibility, and the conjecture that all fundamental interactions occur pairwise between elementary particles, suggest that every quantum event is accompanied by the exchange of just one quantum of action. In Minkowski space the quantum of action has the dimensions of angular momentum, and Bohr's semi-classical description of the hydrogen atom took advantage of this fact when he explained the interactions between photons and electrons and described the fixed atomic orbitals as standing waves. Sheffer stroke - Wikipedia, NAND gate - Wikipedia, Controlled NOT gate - Wikipedia, Richard P. Feynman (1985): Quantum mechanical computers, Bohr model - Wikipedia
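
The contrast between the irreversible NAND and the reversible CNOT can be checked in a few lines of numpy (an illustration of the standard gates, not of the cosmological claim):

```python
import numpy as np

# Classical NAND: four input pairs map to only two outputs, so the
# inputs cannot be recovered from the output. Irreversible.
nand = {(0, 0): 1, (0, 1): 1, (1, 0): 1, (1, 1): 0}
print(len(set(nand.values())))  # 2 outputs for 4 inputs: information lost

# Quantum CNOT: a 4x4 unitary permutation of the basis states
# |00>, |01>, |10>, |11>. It is its own inverse, hence reversible.
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
print(np.array_equal(cnot @ cnot, np.eye(4, dtype=int)))  # True
print(np.allclose(cnot.T @ cnot, np.eye(4)))              # unitary
```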

All communication uses physical representations of information. Most developments in electronic information storage and communication have arisen from our quantum understanding of the Universe. Much of this technology revolves around the manipulation of individual electrons and photons, and we are continually moving toward smaller physical representations of information enabling us to store and transport more information with fewer physical resources, a boon for the forests that provide paper and for the atmosphere which has to deal with the wastes from our historical reliance on fossil fuel. We are currently working toward 2 billion transistors per square millimetre. Modha et al: Neural inference at the frontier of energy, space, and time

The histories of politics, science and evolution have a similar structure: first, a problem arises; eventually a solution emerges. In politics a common solution is war. Political leaders in difficult circumstances often start wars which serve to unite the populace and distract everybody from the problem. The result is usually a massive loss of physical and cultural capital. The ultimate peace through exhaustion usually leaves a similar problem in place so once sufficient capital is accumulated the war restarts.

this page 22: toc

22.8: Lust for life: the power of faith and hope

The look of love alarms
Because 'tis filled with fire;
But the look of soft deceit
Shall win the lover's hire.
William Blake (1966): The Complete Writings of William Blake p. 182

The essential step in evolution is to create an individual that can reproduce in its local environment. Any biological genetic advances or intellectual steps forward that are not copied die and are lost. Although science abhors final causality, it creeps in through evolution insofar as entities that exist are the result of lust in their forebears which has carried through to the current generation. In the evolutionary understanding of design, successful strategies appear to imply final causes. The struggle for existence demands that sexually reproducing organisms find a mate. Those who wish to live by their art and ideas face a similar imperative, colloquially, publish or perish.

In the passage quoted above Kuhlmann points out that it is "exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place". The answer to this problem, it seems to me, is a distinction between kinematic and dynamic processes. A kinematic process is an inert puppet, driven by forces outside itself. A dynamic process is one that has its own life and energy and drives itself.

Einstein introduces the special theory of relativity in the kinematic part of his 1905 paper. The introduction to his paper highlights a common misconception which we might call dynamic, since it concerns a perception of cause and effect in the relationship between magnetic and electrical potential. As he points out, their true relationship is kinematic: it is simply the way things are; they are puppets of the dynamic space in which they exist, connected by Maxwell's equations. Formally, mathematics is kinematic: it must be moved by a dynamic reality such as the mind of a mathematician. Aristotle was already aware of this necessity when he devised the unmoved mover to move Plato's forms. Albert Einstein (1905): On the Electrodynamics of Moving Bodies, Unmoved mover - Wikipedia

The physical equivalent of biological lust is potential. Potentials are dynamic in the sense that they will cause motion unless they are inhibited. We learn about gravitational potential from childhood. If we are not supported, we fall. From the theological point of view we equate potential with the omnipotence of the initial singularity which we take to be equivalent to gravitation. The role of gravitation in animating the kinematic structures arising through quantum mechanical selection from kinematic Hilbert space is described on page 17: Gravitation and quantum theory—in the beginning.

The three traditional theological virtues are faith, hope and love. We may see them as cultural prescriptions for dealing with the impact of our evolutionary history on our lives. Unlike the operations of an omniscient and omnipotent creator, a persistent feature of operations in our evolutionary milieu is that things rarely work well the first time. Hence the ancient recipe for success: if at first you do not succeed, try, try again. Faith and hope embody this idea and love becomes possible when a procedure is perfected and can be profitably shared. Theological virtues - Wikipedia

this page 22: toc

22.9: Bootstrapping: quantum vacuum and the cosmological constant problem revisited

The big bang theory is consistent with the current approach to particle physics. We try to arrange ever more violent collisions in the hope of eliciting new particles. It may be that the energies available during the initial instants of the big bang enabled events to be executed in a unit of Planck time, about 5 × 10⁻⁴⁴ seconds. Assuming the uncertainty relation Δt × ΔE ≈ ℏ holds, these events require about 2 billion joules of energy for their resolution. The LHC can accelerate protons to about 6.5 TeV. Since 1 TeV is about 1.6 × 10⁻⁷ J, Planck events require about 10 million billion TeV, well out of the range of any experimental accelerator. We have no chance of reproducing these conditions in a laboratory. Michelangelo L Mangano (2009): The super-LHC, Planck units - Wikipedia, Planck constant - Wikipedia
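
The arithmetic behind these estimates, taking the Planck time to be about 5.4 × 10⁻⁴⁴ s:

```latex
\Delta E \approx \frac{\hbar}{\Delta t}
       = \frac{1.05 \times 10^{-34}\ \mathrm{J\,s}}{5.4 \times 10^{-44}\ \mathrm{s}}
       \approx 2 \times 10^{9}\ \mathrm{J}
       \approx \frac{2 \times 10^{9}}{1.6 \times 10^{-7}}\ \mathrm{TeV}
       \approx 10^{16}\ \mathrm{TeV}
```

That is roughly 10¹⁵ times the 6.5 TeV the LHC can give a proton.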

The alternative proposed here is not a big bang but, as explained on page 9: The active creation of Hilbert space, a more gentle process of evolution. I see the beginning of the process in the existence of a continuous, convex, closed dynamic initial singularity which generates an internal kinematic Hilbert space consistent with mathematical fixed point theory. This Hilbert space exists prior to spacetime and its basis vectors are orthogonal. Like the angels of ancient theology, every one is a distinct "species". Although the initial singularity is assumed to be an ideal dynamic entity, the quantum mechanical operations in the Hilbert space within it are considered to be kinematic mathematical ideals, driven by the initial singularity in a manner analogous to a mathematician kinematically executing the imaginary formalism of mathematics. Fixed point theorem - Wikipedia, Aquinas, Summa I, 50, 4: Is every angel a different species?

A fundamental problem of quantum field theory seems to be that the postulated initial state, known as the vacuum, is far too complex and energetic.

First, does the vacuum really exist? There are various physical observations, like the Casimir effect and the Lamb shift, that suggest that the vacuum may have energy, which points to real existence. On the other hand, quantum field theoretical calculations attribute an absurdly high energy to the vacuum which disagrees with reality. Cosmological observations suggest that the vacuum energy of free space is actually about 10⁻⁹ joules per cubic metre, that is, about 10⁻²⁶ kilograms per cubic metre, the mass of a few hydrogen atoms. Vacuum energy - Wikipedia, Casimir effect - Wikipedia, Lamb shift - Wikipedia
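
The conversion behind the "few hydrogen atoms" figure is a one-line application of E = mc², checked here in Python as an illustration:

```python
# Convert the observed vacuum energy density to a mass density via
# E = m c^2, to check the "few hydrogen atoms per cubic metre" claim.
c = 2.998e8                  # m/s
rho_E = 1e-9                 # J / m^3, observed vacuum energy density
rho_m = rho_E / c**2         # kg / m^3
print(f"{rho_m:.1e} kg/m^3") # ~ 1.1e-26 kg per cubic metre

m_H = 1.67e-27               # kg, mass of a hydrogen atom
print(f"{rho_m / m_H:.1f} hydrogen atoms per cubic metre")  # ~ 6.7
```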

The huge difference between the measured and computed energies of the vacuum is called the cosmological constant problem. It is discussed at length on page 18.8 The cosmological constant problem. Cosmological constant problem - Wikipedia, Steven Weinberg (2000): The Cosmological Constant Problems

Let us conclude, then, that there is no infinity of actual particles, and that the precision of measurement in real space has a kinematic uncertainty of ±½ℏ with no dynamical consequences. The cosmological constant that we measure in our world of classical gravitation is a consequence of the existence of real particles. The cosmological constant problem could simply be a theoretical artefact of the opinion that Minkowski space is the domain of Hilbert space. Page 13: Is Hilbert space independent of Minkowski space? provides arguments against this assumption.

God is said to be omnipotent, which we generally understand to mean able to bring anything to be except an actual contradiction. In the case of the Christian God, we understand this to mean able to create the Universe out of nothing. Here I propose that the Universe is itself God and uses its omnipotence to create itself. This is a bootstrapping operation which depends upon the distinction between kinematic and dynamic motion. Kinematic structures in Hilbert space have no particular energy cost. The energy to produce real dynamic particles comes from increasing the depth of the gravitational potential well. The role of gravitation in transforming kinematic structures into dynamic particles is discussed on page 17: Gravitation and quantum theory—in the beginning. Aquinas, Summa I, 25, 3: Is God omnipotent?

If we attribute the same omnipotence to the initial singularity as we attribute to God, we would expect to see the Universe created in its finished form immediately. This was the ancient belief, but we can see now that it is clearly wrong. The Universe is a dynamic entity which has taken about 14 billion years of evolution to reach its present form.

The big bang does not seem to make provision for this biological form of evolution. If there is no independent creator, we would expect the first crop of elementary particles to arise spontaneously and not only be able to exist independently, but to reproduce themselves and interact in ways necessary to create more complex structures. High energy accelerator experiments provide some evidence for this scenario. Collisions of sufficiently powerful bubbles of energy yield the full spectrum of known particles. We therefore expect the violence of the big bang to be able to achieve the same result. Martinus Veltman (2003): Facts and Mysteries in Elementary Particle Physics

If this were the case we would expect the set of elementary particles to have the mathematical structure of a group, whose elements interact with one another to produce other elements of the group, so that the group could sustain its own existence. The energy of the initial singularity would multiply the initial elementary group to create the Universe as we have it. We already find much group structure in the Universe, and the discovery of a self-sufficient group of elementary fermions and bosons is in effect the goal of a theory of everything.
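
The key property here is closure: every interaction between members of the set yields another member. As a toy illustration (purely mathematical, not a physical model), the following Python sketch checks the group axioms for the four complex units under multiplication:

    # A set "sustains itself" if it is closed under its interaction,
    # contains an identity, and every element has an inverse.
    from itertools import product

    elements = [1, 1j, -1, -1j]        # the cyclic group of order 4
    interact = lambda a, b: a * b      # the "interaction" is multiplication

    closed = all(interact(a, b) in elements for a, b in product(elements, repeat=2))
    has_identity = 1 in elements
    has_inverses = all(any(interact(a, b) == 1 for b in elements) for a in elements)

    print(closed, has_identity, has_inverses)   # True True True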

On this site, I propose a slightly different scenario somewhat closer to traditional theology and modern biological evolution.

The idea that the Universe is divine, ie that the Universe is a self-sufficient system with both physical and mental aspects, implies first that physics and theology must be mutually consistent, and second, that a mechanism must exist to lead us from an initial singularity understood as an infinitely fecund quantum of action (God) to the current state of the Universe. The basic insight here is that quantum observation, paradoxically, leads to an increase in entropy, that is, in complexity. It is paradoxical (from the point of view of current theory) because some believe that the measurement process, conceived as the "collapse" of a wave function, destroys rather than creates information. John von Neumann (2014): Mathematical Foundations of Quantum Mechanics, Chapter V §3 Reversibility and Equilibrium Problems, Wave function collapse - Wikipedia

Since the quantum is a real discrete object in Minkowski space, its finite size has no dynamical implication. As a carpenter I use a measuring tape graduated in millimetres, and all my measurements lie within ±0.5 mm of a particular graduation. This is simply a formal consequence of my method of measurement. While it kinematically governs the precision of my work in the real world, it has no dynamical implications. In this context it has the same kinematic quality as the special theory of relativity communicated in terms of rods and clocks.

Most of the problems in quantum theory arise from the application of continuous mathematics to quantum phenomena and from mistaking quantum kinematics for real dynamics. What we are seeing in the creation of the Universe is in effect the inverse of fixed point theory: a vast and complex logically continuous, compact and convex world arising from a fixed point. The difference here is that the continuity is not the ideal or fictitious variety derived from point set theory, but logical continuity. The notion of logical continuity is derived from two of Aristotle's ideas. The first is his definition of continuity: things are continuous if their ends are contained in one another. The second is the logical idea that the middle term connects the two propositions in a syllogism. States in computing machines, and the machines themselves, communicate with one another by sharing memory, that is, reading from and writing to the same memory location, as we see in the Turing machine. This memory serves as the overlap or middle term between the machines. Fixed point theorem - Wikipedia, Aristotle (continuity), Middle term - Wikipedia, Alan Turing (1936): On Computable Numbers, with an application to the Entscheidungsproblem

Logical continuity is embodied in the mathematical techniques of proof and computation. The archetype of logical continuity, corresponding to a continuous function in analysis, is a proof: a logically watertight connection between some hypothesis (say Euclid's axioms) and some conclusion (eg Pythagoras' theorem) that can be executed by a suitably programmed deterministic computer. A halting Turing machine is a logical continuum, moving deterministically from an initial state to a final state. We may think of an error free communication channel as a machine whose output is identical to its input at a different point in spacetime. Turing machine - Wikipedia
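
To make the idea concrete, here is a minimal deterministic Turing machine in Python (my own sketch, not drawn from Turing's paper): it runs from an initial state to a halting state, and the whole halting computation is a single unbroken chain of state-to-state transitions.

    # A tiny Turing machine: rules map (state, symbol) -> (write, move, state).
    def run(tape, rules, state="start", pos=0, halt="halt"):
        tape = dict(enumerate(tape))            # position -> symbol
        while state != halt:
            symbol = tape.get(pos, "_")         # "_" is the blank symbol
            write, move, state = rules[(state, symbol)]
            tape[pos] = write
            pos += {"L": -1, "R": 1}[move]
        return "".join(tape[i] for i in sorted(tape))

    # Binary increment, scanning from the right: flip 1s to 0, write 1, halt.
    rules = {
        ("start", "1"): ("0", "L", "start"),
        ("start", "0"): ("1", "R", "halt"),
        ("start", "_"): ("1", "R", "halt"),
    }

    print(run("011", rules, pos=2))   # 011 + 1 = 100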

From a practical point of view, logical continuity takes precedence over classical point-set continuity. Our practical mathematical understanding of continuity is based on logical processes (like the epsilon-delta argument) that assume that the points in a continuous line are so crowded together that we can always find another point between any two. This argument is rather circular since we assume that the epsilons and deltas are themselves continuous. (ε, δ) - definition of limit - Wikipedia
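
For reference, the definition in question reads: lim f(x) = L as x → a means that for every ε > 0 there exists a δ > 0 such that 0 < |x − a| < δ implies |f(x) − L| < ε. Both the εs and the δs are drawn from the very continuum the definition is trying to characterize, which is the circularity noted above.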

Mathematical continuity by proximity motivated Cantor’s search for the cardinal of the continuum. Using logical continuity, Cohen has shown that Cantor’s approach, using theory based on sets of discrete elements, can say nothing about classical mathematical continuity. Cohen’s logically continuous argument shows that Cantor’s continuum hypothesis is independent of axiomatic set theory. Joseph Dauben (1990): Georg Cantor: His Mathematics and Philosophy of the Infinite, Paul Cohen (1980): Set Theory and the Continuum Hypothesis

Evolution is the search for the algorithms for survival, and success or failure in this search is governed by the relationship of each living creature to the environment in which it finds itself. The evolution of algorithms is very similar to the evolution of mathematics. Galileo felt that mathematics is the language of the Universe. Wigner thought it was remarkable that mathematics is so useful in science. Eugene Wigner (1960): The Unreasonable Effectiveness of Mathematics in the Natural Sciences

Hilbert, following Cantor, liberated mathematics from its attachment to physics, but in the process he introduced a pitfall which I see as related to the distinction between kinematics and dynamics. This distinction is striking in the different functions of mathematics in special and general relativity.

Einstein begins his 1905 article on the electrodynamics of moving bodies with "§1. Kinematical Part". First he defines simultaneity in Newtonian mechanics, which he calls a stationary system. He then goes on to show that:

. . . we see that we cannot attach any absolute signification to the concept of simultaneity, but that two events which, viewed from a system of co-ordinates, are simultaneous, can no longer be looked upon as simultaneous events when envisaged from a system which is in motion relatively to that system.

He then goes on to show the role of the velocity of light in explaining the relationship between stationary and moving systems. He had in effect used a kinematic expression to describe what at first sight appears to be a dynamic relationship between space and time.

A few years later Minkowski greatly simplified special relativity by describing the transformation between inertial frames as a rotation in a four dimensional spacetime whose metric is summed up in the relationship

3 × 10⁵ kilometres = i seconds

Einstein, Lorentz, Minkowski & Weyl (1924): The Principle of Relativity: A Collection of original Papers
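
Spelled out, Minkowski's bookkeeping is standard: the invariant interval between events is

s² = x² + y² + z² − c²t²

and the substitution x₄ = ict turns this into the Euclidean form s² = x₁² + x₂² + x₃² + x₄², so that a Lorentz transformation becomes an ordinary rotation in four dimensions. The exchange rate between the space and time axes is then c × (1 second) treated as an imaginary length, which is the content of the relation above.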

In the general theory of relativity, Einstein produced a formal equation with dynamic consequences for the overall shape of universal space-time.

Hilbert space is a formal mathematical ideal, kinematic rather than dynamic. The new quantum dynamics now exists in Minkowski space and communicates with Hilbert space through the quantum process which we call measurement or observation (see page 14: Measurement: The interface between Hilbert and Minkowski). All interactions in Minkowski space are executed in Hilbert space, where the wave functions corresponding to the interacting particles execute a formal computation of the outcome of the interaction. As I explain on page 15: Quantum amplitudes and logical processes are invisible §15.4, computation is a kinematic formal process made possible by the dynamics of the clock in a physical computer in Minkowski space. The physical computing processes themselves interact through their couplings to underlying Hilbert spaces (page 23: Network quantum electrodynamics). Nielsen & Chuang (2016): Quantum Computation and Quantum Information

It seems to me that another root of the problems in quantum field theory lies in this mapping of abstract Hilbert space onto real Minkowski space, making it subject to special relativity. This mapping makes it necessary to perform Lorentz transformations on Hilbert space to accommodate the union of the two spaces. The ideal properties of Hilbert space are thus carried over into real space, giving us problems like point electrons with infinite electromagnetic mass, and the infinite energies induced by the infinite frequencies and zero point properties of the quantum harmonic oscillator. Martinus Veltman (1994): Diagrammatica: The Path to the Feynman Rules (page 20)

It follows that it is unnecessary to devise a relativistic quantum mechanics, because Hilbert space with its operators and vectors lives in an abstract space where there is no distance or time and so no special relativity. Special relativity enters when Minkowski space is created by Hilbert space. It is implicit in the Minkowski metric, which makes possible the transport of quantum states through spacetime by embedding them in null geodesics. In a null geodesic the measures of space and time add up to zero, so that it is in effect a tentacle of Hilbert space reaching into Minkowski space.
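
In coordinates, a null geodesic satisfies

dx² + dy² + dz² − c²dt² = 0

so the spatial and temporal parts of the interval cancel exactly: the two ends of a photon's path are, by the Minkowski measure, at zero separation.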

22.10: Quantum theory describes conversations between particles: matrix, mind and language

I imagine a world of particles talking to one another, like people. We have used thousands of different languages over the millennia, but the fundamental particles of the Universe restrict themselves to (at most) four: gravitation, the strong interaction, the weak interaction and electromagnetism. The particular applications of these interactions are strictly local, a concept embodied in gauge or phase theory. In the quantum world, entities communicate by sharing phase information. Gauge theory - Wikipedia

The core principle of this site, symmetry with respect to complexity, connects all these languages together through Heisenberg's discovery that the observables in quantum mechanics are matrices, linear operators on vectors in Hilbert space. Principle 7: Symmetry with respect to complexity. Quantum mechanics itself is also symmetrical with respect to complexity, the basic structure being the same for all dimensions of Hilbert space. As the world complexifies, layer upon layer, larger vectors representing more complex structures are assembled from smaller vectors. On pages 23 and 24 we will notice also that the total Hilbert space of the Universe is divided into subspaces corresponding to distinct particles and systems of particles. These subspaces are loosely connected by photons travelling on null geodesics. The overall layered spatial structure, in both the abstract Hilbert domains and the real Minkowski domain, grows like the transfinite numbers.
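
The assembly of larger vectors from smaller ones is the tensor product. A short numpy sketch (my illustration, using two arbitrary 2-dimensional states):

    # Composite state spaces are built by the tensor (Kronecker) product:
    # two 2-dimensional vectors combine into one 4-dimensional vector, and
    # the formalism (normalized vectors, linear operators) is unchanged.
    import numpy as np

    up = np.array([1, 0], dtype=complex)
    plus = np.array([1, 1], dtype=complex) / np.sqrt(2)

    pair = np.kron(up, plus)                    # a vector in the 4-dimensional space
    print(pair)                                 # [0.707 0.707 0 0]
    print(round(np.vdot(pair, pair).real, 12))  # still normalized: 1.0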

Every passage of human speech is unitary in a communication theoretical sense insofar as the sum of the probabilities Σᵢ pᵢ of the phonemes aᵢ in any speech or any message is 1. Each speech is an element of some group SU(n), and we may think of all the observable information available in the Universe as the speech of the Universe; in a divine Universe, the word of god. Among the many difficulties of quantum field theory is the difficulty of maintaining unitarity in Yang-Mills theory. Feynman, and then Faddeev and Popov, found a need to introduce "ghosts" in order to maintain unitarity in this theory. This trick, like renormalization, suggests the theory is missing some ingredient. We can assume that evolution has created this ingredient, just as it invented space and solved the baryon asymmetry problem. It is now a puzzle for scientists to solve, like the creation and decoding of birdsong and all other forms of animal signalling. Yang-Mills theory - Wikipedia, Unitarity (physics) - Wikipedia, Gerardus 't Hooft (2005): 50 Years of Yang Mills Theory (page 40: Popov & Faddeev: Perturbation Theory for Gauge-Invariant Fields)
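
Unitarity in this communication-theoretic sense can be checked mechanically. A numpy sketch (an illustration of the definition, nothing more): the squared amplitudes of a state sum to 1, and any unitary transformation preserves that sum.

    import numpy as np

    amps = np.array([0.6, 0.8j, 0.0])           # amplitudes over three "phonemes"
    print(round(np.sum(np.abs(amps) ** 2), 12)) # probabilities sum to 1.0

    # A random unitary matrix, obtained from the QR decomposition.
    U, _ = np.linalg.qr(np.random.randn(3, 3) + 1j * np.random.randn(3, 3))
    print(round(np.sum(np.abs(U @ amps) ** 2), 12))   # still 1.0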

The notion that quantum mechanics is a dialogue between particles, each of whose vocabulary is a spectrum of quantum states, enables us to create an analogy between quantum mechanics and conversation. Each person in conversation has a vocabulary of words ("basis states") in their language which serve to encode the meaning in the conversation which engages them. This edges us closer to a cognitive model of quantum mechanics, built on quantum computation playing the role of human intelligence in our 'scientific' investigations of one another. Conversation enables particular couples to move toward stable social bonds like marriage and friendship or, in the opposite direction, toward indifference or lethal hate.

Like Kuhlmann, quoted at the top of this page, I am unhappy with the current state of quantum field theory (and of theology also). Much of my unhappiness lies in the notions of screening and antiscreening, which are processes developed to give substance to the process of renormalization. Renormalization is a theoretical answer to the appearance of infinities in quantum calculations. Since I feel, on intuitive grounds, that these infinities are fictitious, the struggle against them may also lack substance. I agree with Aristotle that there is no actual infinity.

Kuhlmann points out that it is exceptionally unclear which parts of the formalism should be taken to represent anything physical in the first place. The answer to this problem, it seems to me, is a distinction between kinematic ("mental, cognitive") and dynamic ("real, physical") processes. A kinematic process is a puppet, driven by forces outside itself. A dynamic process is one that has its own energy and drives itself.

Mathematics is kinematic. It does not do itself; it is done by physical entities, computers and human minds. Hilbert's formal program suggests we can make up any mathematical expressions we like provided they do not lead to contradictions. Cantor demonstrated to his contemporaries that the concept of infinity is not essentially inconsistent. In fact, since the infinite is by definition undefined, it is not easy to make provable statements about it. The infinities that arise from the self energy of the electron and so on are, in this mathematical category, illusory: a misapplication of continuity and division by zero.

Contrariwise, as I point out in the discussion of the cosmological constant problem, the quantum of action is a real dynamic entity, which we can best understand as the intermediary between creation and annihilation. Taken formally, it is the logical operator that transforms some p into some not-p and some not-p into some p. In the binary world of two valued logic not-not-p ≡ p, but in the wider world the list of not-ps can be quasi infinite. My world contains 7 billion not-mes.
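
The two-valued case can be written as an operator: the quantum NOT gate X satisfies X² = I, so not-not-p ≡ p, while in n dimensions a cyclic shift gives each state n − 1 distinct not-ps before returning to p. A numpy sketch of this gloss (my illustration, not the author's formalism):

    import numpy as np

    X = np.array([[0, 1], [1, 0]])                      # binary NOT
    print(np.array_equal(X @ X, np.eye(2, dtype=int)))  # not-not-p = p: True

    n = 5
    shift = np.roll(np.eye(n, dtype=int), 1, axis=0)    # cyclic "not" on n states
    print(np.array_equal(np.linalg.matrix_power(shift, n),
                         np.eye(n, dtype=int)))         # back to p after n steps: True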

The history of science is quite similar to the history of evolution (see page 6: Evolution: genetic memory, variation and selection). We are familiar, particularly in physics, with apparently impenetrable impasses which stymie us for years. And then a solution is found. The "decade of uncertainty" before the discovery of renormalization is an example of this, as are the 40 years between Kirchhoff's hypothesis about black body radiation and Planck's discovery of the relevant law. Planck's work opened a new world for us. Renormalization, like war in politics, seems to have broken through an impasse but left the real problem unsolved, demanding further work.

It is clear that evolution has solved all the problems that have hindered physics, so the evolutionary approach is worth following: to create a line of mutations that lead from some reasonable starting point to the conclusion we seek. One such example is the quantum mechanical explanation of the Minkowski metric on page 12: The quantum creation of Minkowski space. A logical account of how this actually happens is beyond me, but it seems certain that it happens. I wish to get from the initial singularity to a proton. A question: do we go via electrodynamics, or is electrodynamics a consequence of chromodynamics?

The evolutionary approach may also give us a logical alternative to the current doctrine that envisages infinities caused by fluctuations being extinguished by renormalizations that average the fluctuations out of existence. Kerson Huang (2013): A Critical History of Renormalization

A genome is a static representation of a process that can be copied kinematically into messenger RNA and then processed kinematically by ribosomes to produce dynamic proteins that can take part in the processes of the cell. We can see a similar algorithm in quantum theory: the analogue of the messenger RNA is an eigenvector selected by quantum mechanics and converted into a fundamental particle by energy obtained by increasing gravitational potential. Ribosome - Wikipedia

These biological processes are partly facilitated by the random motion of small molecules and executed by protein motions driven by ATP, the cellular energy currency. The quantum mechanical analogues may serve to achieve similar outcomes using gravitational energy without any input from quantum fluctuations. These processes are analogous to the operations envisaged in quantum computation with specific operations defined by specific quantum operators. Overall, this approach suggests that screening is on the whole not necessary because the infinitesimal processes imagined to be necessary in Minkowski space (and consequently introducing infinities) are actually executed by kinematic logic in Hilbert space where they are outside the constraints of space and time.

While the concepts of screening and renormalization are connected, the specific idea of antiscreening is to explain the observation of asymptotic freedom and confinement in the constitution of hadrons. We will talk about this problem on the page on chromodynamics. The complexities of chromodynamics suggest that the problems arising there are much better suited to execution by computation rather than random fluctuations. In the meantime we turn to discussion of the roles of phases and operators in quantum computation and communication.

Quantum mechanics is a child of classical mechanics. Isaac Newton laid the foundations of classical mechanics in terms of mass, acceleration and force. On a more philosophical level, Maupertuis (1698-1759) speculated that a wise creator would have made the world as efficient as possible. This started a train of thought that led to Hamilton's principle, the amalgamation of the calculus of variations with Lagrangian mechanics. We might also imagine that the effectiveness of Hamilton's principle is the result of natural evolution selecting the most efficient mechanical processes to dominate the physical world. Hamilton's principle - Wikipedia

Joseph-Louis Lagrange sought to express classical Newtonian mechanics in a form that would make it easier to study many body problems like the solar system. His work, Mécanique analytique, placed mechanics on an algebraic rather than a geometric foundation. Mécanique analytique Volume 1

In the Lagrangian approach the action S associated with an event x that takes place between times t₁ and t₂ is expressed by the action functional

S(x) = ∫ L dt, the integral being taken from t₁ to t₂

The classical Lagrangian L = T(t) − V(t), where T and V are functions of the kinetic and potential energy of the system. Lagrangian mechanics postulates Hamilton's principle: the actual trajectory taken by a particle whose motion is constrained by T and V coincides with a stationary value of S (a fixed point in the action), which may be found using Euler's calculus of variations. Lagrangian mechanics - Wikipedia, Calculus of variations - Wikipedia

Lagrangian mechanics has been found to be very versatile and serves as a bridge between classical and quantum mechanics, quantum field theory and physical problems in general. On this basis, we might understand classical mechanics in spacetime as the study of action in the relationship between kinetic and potential energy.
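
Hamilton's principle can be demonstrated numerically: discretize the action for a harmonic oscillator and compare the classical path with nearby perturbed paths sharing the same endpoints. The following Python sketch (my illustration, with unit mass and spring constant assumed) shows that the classical path yields the stationary, here minimal, value of S.

    import numpy as np

    m = k = 1.0
    t = np.linspace(0.0, 1.0, 201)    # a short interval (less than half a period)
    dt = t[1] - t[0]

    def action(x):
        v = np.diff(x) / dt                          # velocity on each step
        T = 0.5 * m * v**2                           # kinetic energy
        V = 0.5 * k * ((x[:-1] + x[1:]) / 2) ** 2    # potential at midpoints
        return np.sum((T - V) * dt)                  # S = integral of (T - V)

    classical = np.sin(t)                    # exact solution with x(0) = 0, v(0) = 1
    bump = np.sin(np.pi * t / t[-1])         # perturbation vanishing at the endpoints

    print(action(classical))                 # stationary value
    print(action(classical + 0.05 * bump))   # larger
    print(action(classical - 0.05 * bump))   # larger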

Dirac found that the Lagrangian deals with phase and time alone, not the dimensions of space: in his 1933 paper the classical action contributes the phase exp(iS/ℏ) to the quantum amplitude. Feynman used this idea to develop the path integral version of quantum theory. P. A. M. Dirac (1933): The Lagrangian in Quantum Mechanics, Path integral formulation - Wikipedia

Entirely new theories may be needed to make sense of the general problem of communication in the Universe. Logical theories executed by quantum computation in Hilbert space do not care about energy or distance, only about logical states which are encoded in the phase of vectors in Hilbert space, which phases are constructed by the superposition of basis states.

(Revised Monday 11 March 2024)

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Notes and references

Further reading

Books

Alighieri (1982), Dante, and Henry Francis Cary (translator), Gustave Doré (illustrator), Dante: The Divine Comedy, Omega Books / Ebeling 1982 'The great Italian poet Dante was born in Florence in 1265, into a Guelph family of the lower nobility. As an adolescent he met Beatrice, generally identified with Beatrice Portinari, for whom he had a sublime and mystical love which was immortalized in his works. . . . Gustave Doré's illustrations for The Divine Comedy first appeared with this translation by Henry Francis Cary in the eighteen sixties. The famous French illustrator, who was born in 1832 in Strasbourg, had already completed editions of Rabelais and Balzac in 1854 and 1855 and published Dante in 1861.'
  back

Auyang (1995), Sunny Y., How is Quantum Field Theory Possible?, Oxford University Press 1995 Jacket: 'Quantum field theory (QFT) combines quantum mechanics with Einstein's special theory of relativity and underlies elementary particle physics. This book presents a philosophical analysis of QFT. It is the first treatise in which the philosophies of space-time, quantum phenomena and particle interactions are encompassed in a unified framework.' 
  back

Barrow (1996), John D., and Frank J. Tipler, The Anthropic Cosmological Principle, Oxford University Press 1986, 1996 'This wide-ranging and detailed book explores the many ramifications of the Anthropic Cosmological Principle, covering the whole spectrum of human inquiry from Aristotle to Z bosons. Bringing a unique combination of skills and knowledge to the subject, John D. Barrow and Frank J. Tipler - two of the world's leading cosmologists - cover the definition and nature of life, the search for extraterrestrial intelligence, and the interpretation of the quantum theory in relation to the existence of observers.' 
  back

Blake (1966), William, and Geoffrey Keynes (ed), The Complete Writings of William Blake with variant readings, Oxford Standard Authors, Oxford UP 1966 ' The Complete Works of William Blake, with all the variant readings is edited by Geoffrey Keynes. Some of the works that appear in this volume include Blake's poetical sketches, Songs of Innocence, The French Revolution and A Song of Liberty. William Blake was an English poet, painter and printmaker who is now considered a seminal figure in the history of the poetry and visual art of the Romantic Age.' back

Cantor (1897, 1955), Georg, Contributions to the Founding of the Theory of Transfinite Numbers (Translated, with Introduction and Notes by Philip E B Jourdain), Dover 1895, 1897, 1955 Jacket: 'One of the greatest mathematical classics of all time, this work established a new field of mathematics which was to be of incalculable importance in topology, number theory, analysis, theory of functions, etc, as well as the entire field of modern logic.' 
  back

Carlson (2006), James, and Arthur Jaffe & Andrew Wiles, The Millennium Prize Problems, Clay Mathematics Institute and American Mathematical Society 2006
1: The Birch and Swinnerton-Dyer Conjecture: Andrew Wiles
2: The Hodge Conjecture: Pierre Deligne
3: The Existence and Smoothness of the Navier-Stokes Equation: Charles L Fefferman
4: The Poincare Conjecture: John Milnor
5: The P versus NP Problem: Stephen Cook
6: The Riemann Hypothesis: Enrico Bombieri
7: Quantum Yang-Mills Theory: Arthur Jaffe and Edward Witten 
  back

Cohen (1980), Paul J, Set Theory and the Continuum Hypothesis, Benjamin/Cummings 1966-1980 Preface: 'The notes that follow are based on a course given at Harvard University, Spring 1965. The main objective was to give the proof of the independence of the continuum hypothesis [from the Zermelo-Fraenkel axioms for set theory with the axiom of choice included]. To keep the course as self contained as possible we included background materials in logic and axiomatic set theory as well as an account of Gödel's proof of the consistency of the continuum hypothesis. . . .'  
  back

Dauben (1990), Joseph Warren, Georg Cantor: His Mathematics and Philosophy of the Infinite, Princeton University Press 1990 Jacket: 'One of the greatest revolutions in mathematics occurred when Georg Cantor (1843-1918) promulgated his theory of transfinite sets. . . . Set theory has been widely adopted in mathematics and philosophy, but the controversy surrounding it at the turn of the century remains of great interest. Cantor's own faith in his theory was partly theological. His religious beliefs led him to expect paradox in any concept of the infinite, and he always retained his belief in the utter veracity of transfinite set theory. Later in his life, he was troubled by attacks of severe depression. Dauben shows that these played an integral part in his understanding and defense of set theory.' 
  back

Davies (1992), Paul, The Mind of God: Science and the Search for Ultimate Meaning, Penguin Books 1992 'Paul Davies' "The Mind of God: Science and the Search for Ultimate Meaning" explores how modern science is beginning to shed light on the mysteries of our existence. Is the universe - and our place in it - the result of random chance, or is there an ultimate meaning to existence? Where did the laws of nature come from? Were they created by a higher force, or can they be explained in some other way? How, for example, could a mechanism as complex as an eye have evolved without a creator? Paul Davies argues that the achievement of science and mathematics in unlocking the secrets of nature mean that there must be a deep and significant link between the human mind and the organization of the physical world. . . . ' 
  back

Davis (1982), Martin, Computability and Unsolvability, Dover 1982 Preface: 'This book is an introduction to the theory of computability and non-computability usually referred to as the theory of recursive functions. The subject is concerned with the existence of purely mechanical procedures for solving problems. . . . The existence of absolutely unsolvable problems and the Goedel incompleteness theorem are among the results in the theory of computability that have philosophical significance.'
  back

Einstein (1924), Albert, and Lorentz, Weyl, Minkowski. Translated by W. Perrett & G. B. Jeffery, Notes by A. Sommerfeld, The Principle of Relativity: A Collection of original Papers, Dover 1924 Amazon.com reader review: 'This book is a collection of the most important lectures given by Einstein, Lorentz, Minkowski and Weyl that led to the formulation of the theory of relativity in its two parts. The first part is the special theory, which studies the inertial and moving reference frames without considering the effects of gravity. The second part, the general theory, explains the nature of gravity.' Reinaldo Olivares 
  back

Farmelo (2009), Graham, The Strangest Man: The Hidden Life of Paul Dirac, Mystic of the Atom, Basic Books 2009 Jacket: 'Paul Dirac was among the greatest scientific geniuses of the modern age. One of the discoverers of quantum mechanics, the most revolutionary theory of the last century, his contributions had a unique insight, eloquence, clarity and mathematical power. His prediction of antimatter was one of the greatest triumphs in the history of physics. One of Einstein's most admired colleagues, Dirac was in 1933 the youngest theoretician ever to win a Nobel Prize in physics. . . . Based on previously undiscovered archives, The Strangest Man reveals the many facets of Dirac's brilliantly original mind. The Strangest Man also depicts a spectacularly exciting era in scientific discovery.' 
  back

Gaddis (2005), John Lewis, The Cold War: A New History, The Penguin Press 2005 Jacket: 'Many will remember what it was like to live under the shadow of the Cold War: the ever-present anxiety that at some point, because of some miscalculation or act of hubris, we might find ourselves in the middle of a nuclear holocaust . . . How did this terrible conflict arise? How did wartime allies so quickly become deadly foes after 1945 and divide the world into opposing camps, each armed to the teeth? And how, suddenly, did it all come to an end? Only now that the Cold War has been over for fifteen years can we begin to find a convincing perspective on it. John Lewis Gaddis's masterly book is the first full, major history of the whole conflict and explains not just what happened, but why it happened . . . Gaddis has synthesized all the most recent scholarship, but has also used minutes from Politburo meetings, startling information from recently opened Soviet and Asian archives, ... and above all the words of the leading participants themselves -- showing what was really on the mind of each, with a very dramatic immediacy. . . .' 
  back

Galilei (1957), Galileo, and Stillman Drake (translator), Discoveries and Opinions of Galileo: Including the Starry Messenger (1610 Letter to the Grand Duchess Christina), Doubleday Anchor 1957 Amazon: 'Although the introductory sections are a bit dated, this book contains some of the best translations available of Galileo's works in English. It includes a broad range of his theories (both those we recognize as "correct" and those in which he was "in error"). Both types indicate his creativity. The reproductions of his sketches of the moons of Jupiter (in "The Starry Messenger") are accurate enough to match to modern computer programs which show the positions of the moons for any date in history. The appendix with a chronological summary of Galileo's life is very useful in placing the readings in context.' A Reader. 
  back

Hallett (1984), Michael, Cantorian Set Theory and Limitation of Size, Oxford UP 1984 Jacket: 'This book will be of use to a wide audience, from beginning students of set theory (who can gain from it a sense of how the subject reached its present form), to mathematical set theorists (who will find an expert guide to the early literature), and for anyone concerned with the philosophy of mathematics (who will be interested by the extensive and perceptive discussion of the set concept).' Daniel Isaacson. 
  back

Hawking (1975), Stephen W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: 'Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a 'black hole' which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.'
  back

Kaku (1998), Michio, Introduction to Superstrings and M-Theory (Graduate Texts in Contemporary Physics), Springer 1998 ' Called by some "the theory of everything," superstrings may solve a problem which has eluded physicists for the past 50 years -- the final unification of the two great theories of the twentieth century, general relativity and quantum field theory. This is a course-tested comprehensive introductory graduate text on superstrings which stresses the most current areas of interest, not covered in other presentation, including: string field theory, multi loops, Teichmueller spaces, conformal field theory, and four-dimensional strings. The book begins with a simple discussion of point particle theory, and uses the Feynman path integral technique to unify the presentation of superstrings. Prerequisites are an acquaintance with quantum mechanics and relativity. This second edition has been revised and updated throughout.' 
  back

Kaku (2021), Michio, The God Equation: The Quest for a Theory of Everything, Doubleday 2021 ' This is the story of a quest: to find a Theory of Everything. Einstein dedicated his life to seeking this elusive Holy Grail, a single, revolutionary 'god equation' which would tie all the forces in the universe together, yet never found it. Some of the greatest minds in physics took up the search, from Stephen Hawking to Brian Greene. None have yet succeeded. In The God Equation, renowned theoretical physicist Michio Kaku takes the reader on a mind-bending ride through the twists and turns of this epic journey: a mystery that has fascinated him for most of his life. He guides us through the key debates in modern physics, from Newton's law of gravity via relativity and quantum mechanics to the latest developments in string theory.'
  back

Nielsen (2016), Michael A., and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2016 Review: 'A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
  back

Péter (1961), Rózsa, and Z. P Dienes (translator), Playing with Infinity: Mathematical Explorations and Excursions, Dover 1961 ' This popular account of the many mathematical concepts relating to infinity is one of the best introductions to this subject and to the entire field of mathematics. Dividing her book into three parts—The Sorcerer's Apprentice, The Creative Role of Form, and the Self-Critique of Pure Reason—Péter develops her material in twenty-two chapters that sound almost too appealing to be true; playing with fingers, coloring the grey number series, we catch infinity again, the line is filled up, some workshop secrets, the building rocks and so on.'
  back

Pais (1986), Abraham, Inward Bound: Of Matter and Forces in the Physical World, Clarendon Press, Oxford University Press 1986 Preface: 'I will attempt to describe what has been discovered and understood about the constituents of matter, the laws to which they are subject and the forces that act on them [in the period 1895-1983]. . . . I will attempt to convey that these have been times of progress and stagnation, of order and chaos, of belief and incredulity, of the conventional and the bizarre; also of revolutionaries and conservatives, of science by individuals and by consortia, of little gadgets and big machines, and of modest funds and big moneys.' AP 
  back

Rovelli (2017), Carlo, and Simon Carnell & Erica Segre (translators), Reality is Not What it Seems: The Journey to Quantum Gravity, Allen Lane Penguin 2017 ' Be prepared for your intellectual foundations to be vaporized . . . Carlo Rovelli will melt your synapses with this exploration of physical reality and what the universe is formed of at the very deepest level . . . Quantum gravity is so new that there aren't many popular books about it. You couldn't be in better hands than Rovelli, a world expert.' Tara Shears, Times Higher Education 
  back

Streater (2000), Raymond F, and Arthur S Wightman, PCT, Spin, Statistics and All That, Princeton University Press 2000 Amazon product description: 'PCT, Spin and Statistics, and All That is the classic summary of and introduction to the achievements of Axiomatic Quantum Field Theory. This theory gives precise mathematical responses to questions like: What is a quantized field? What are the physically indispensable attributes of a quantized field? Furthermore, Axiomatic Field Theory shows that a number of physically important predictions of quantum field theory are mathematical consequences of the axioms. Here Raymond Streater and Arthur Wightman treat only results that can be rigorously proved, and these are presented in an elegant style that makes them available to a broad range of physics and theoretical mathematics.' 
  back

't Hooft (2005), Gerardus, 50 Years of Yang Mills Theory, World Scientific 2005 ' On the 50th anniversary of Yang-Mills theory, this invaluable volume looks back at the developments and achievements in elementary particle physics that ensued from that beautiful idea. During the last five decades, Yang-Mills theory, which is undeniably the most important cornerstone of theoretical physics, has expanded widely. It has been investigated from many perspectives, and many new and unexpected features have been uncovered from this theory. In recent decades, apart from high energy physics, the theory has been actively applied in other branches of physics, such as statistical physics, condensed matter physics, nonlinear systems, etc. This makes the theory an indispensable topic for all who are involved in physics. An international team of experts, each of whom has left his mark on the developments of this remarkable theory, contribute essays or more detailed technical accounts to this volume.'
  back

Veltman (1994), Martinus, Diagrammatica: The Path to the Feynman Rules, Cambridge University Press 1994 Jacket: 'This book provides an easily accessible introduction to quantum field theory via Feynman rules and calculations in particle physics. The aim is to make clear what the physical foundations of present-day field theory are, to clarify the physical content of Feynman rules, and to outline their domain of applicability. ... The book includes valuable appendices that review some essential mathematics, including complex spaces, matrices, the CBH equation, traces and dimensional regularization. . . .' 
  back

Veltman (2003), Martinus, Facts and Mysteries in Elementary Particle Physics, World Scientific 2003 'Introduction: The twentieth century has seen an enormous progress in physics. The fundamental physics of the first half of the century was dominated by the theory of relativity, Einstein's theory of gravitation and the theory of quantum mechanics. The second half of the century saw the rise of elementary particle physics. . . . Through this development there has been a subtle change in point of view. In Einstein's theory space and time play an overwhelming dominant role. . . . The view that we would like to defend can perhaps best be explained by an analogy. To us, space-time and the laws of quantum mechanics are like the decor, the setting of a play. The elementary particles are the actors, and physics is what they do. . . . Thus in this book the elementary particles are the central objects.' 
  back

Links

2019 redefinition of the SI base units - Wikipedia, 2019 redefinition of the SI base units - Wikipedia - the free encyclopedia, ' Effective 20 May 2019, the 144th anniversary of the Metre Convention, the SI base units were redefined in agreement with the International System of Quantities. In the redefinition, four of the seven SI base units – the kilogram, ampere, kelvin, and mole – were redefined by setting exact numerical values when expressed in SI units for the Planck constant (h), the elementary electric charge (e), the Boltzmann constant (kB), and the Avogadro constant (NA), respectively. The second, metre, and candela were already defined by physical constants and were not subject to correction to their definitions.' back

Abhay Ashtekar (2013), Loop Quantum Gravity and the Planck Regime of Cosmology, ' The very early universe provides the best arena we currently have to test quantum gravity theories. The success of the inflationary paradigm in accounting for the observed inhomogeneities in the cosmic microwave background already illustrates this point to a certain extent because the paradigm is based on quantum field theory on the curved cosmological space-times. However, this analysis excludes the Planck era because the background space-time satisfies Einstein's equations all the way back to the big bang singularity. Using techniques from loop quantum gravity, the paradigm has now been extended to a self-consistent theory from the Planck regime to the onset of inflation, covering some 11 orders of magnitude in curvature. In addition, for a narrow window of initial conditions, there are departures from the standard paradigm, with novel effects, such as a modification of the consistency relation involving the scalar and tensor power spectra and a new source for non-Gaussianities. Thus, the genesis of the large scale structure of the universe can be traced back to quantum gravity fluctuations in the Planck regime. This report provides a bird's eye view of these developments for the general relativity community.' back

Aether theories - Wikipedia, Aether theories - Wikipedia, the free encyclopedia, ' In physics, aether theories (also known as ether theories) propose the existence of a medium, a space-filling substance or field as a transmission medium for the propagation of electromagnetic or gravitational forces. Since the development of special relativity, theories using a substantial aether fell out of use in modern physics, and are now replaced by more abstract models. This early modern aether has little in common with the aether of classical elements from which the name was borrowed. The assorted theories embody the various conceptions of this medium and substance.' back

Alan Turing (1936), On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable of a real or computable variable, computable predicates and so forth. . . . ' back

Albert Einstein (1905), On the Electrodynamics of Moving Bodies, ' It is well known that Maxwell's electrodynamics-as usually understood at present—when applied to moving bodies, leads to asymmetries that do not seem to attach to the phenomena. Let us recall, for example, the electro-dynamic interaction between a magnet and a conductor. The observable phenomenon depends here only on the relative motion of conductor and magnet, while according to the customary conception the two cases, in which, respectively, either the one or the other of the two bodies is the one in motion, are to be strictly differentiated from each other. For if the magnet is in motion and the conductor is at rest, there arises in the surroundings of the magnet an electric field endowed with a certain energy value that produces a current in the places where parts of the conductor are located. But if the magnet is at rest and the conductor is in motion, no electric field arises in the surroundings of the magnet, while in the conductor an electromotive force will arise, to which in itself there does not correspond any energy, but which, provided that the relative motion in the two cases considered is the same, gives rise to electrical currents that have the same magnitude and the same course as those produced by the electric forces in the first-mentioned case.' back

Albert Einstein (1905c), On a heuristic point of view concerning the production and transformation of light, ' The wave theory of light, which operates with continuous spatial functions, has proved itself splendidly in describing purely optical phenomena and will probably never be replaced by another theory. One should keep in mind, however, that optical observations apply to time averages and not to momentary values, and it is conceivable that despite the complete confirmation of the theories of diffraction, reflection, refraction, dispersion, etc., by experiment, the theory of light, which operates with continuous spatial functions, may lead to contradictions with experience when it is applied to the phenomena of production and transformation of light. Indeed, it seems to me that the observations regarding "black-body" light, and other groups of phenomena associated with the production or conversion of light can be understood better if one assumes that the energy of light is discontinuously distributed in space.' back

Albert Einstein (1913), "Zero point energy is now as dead as a doornail" , ' Debije's work on the influence of temperature is still of great importance. Hopefully Debije will soon show us the inconsistency of the hypothesis of zero-point energy, the theoretical untenability. Soon after the publication of the work I had written together with Mr. Stern, it became terribly clear to me that it was untenable. It remains the case that the momentum & energy fluctuations are in the radiation.' back

Albert Einstein (1915), The Field Equations of Gravitation, ' In two recently published papers I have shown how to obtain field equations of gravitation that comply with the postulate of general relativity, i.e., which in their general formulation are covariant under arbitrary substitutions of space-time variables. . . . With this, we have finally completed the general theory of relativity as a logical structure. The postulate of relativity in its most general formulation (which makes space-time coordinates into physically meaningless parameters) leads with compelling necessity to a very specific theory of gravitation that also explains the movement of the perihelion of Mercury. However, the postulate of general relativity cannot reveal to us anything new and different about the essence of the various processes in nature than what the special theory of relativity taught us already. The opinions I recently voiced here in this regard have been in error. Every physical theory that complies with the special theory of relativity can, by means of the absolute differential calculus, be integrated into the system of general relativity theory-without the latter providing any criteria about the admissibility of such physical theory.' back

Albert Einstein (1933), On the Method of Theoretical Physics: Herbert Spencer Lecture 1933, ' It can scarcely be denied that the supreme goal of all theory is to make the irreducible basic elements as simple and as few as possible without having to surrender the adequate representation of a single datum of experience.' back

Algebraic closure - Wikipedia, Algebraic closure - Wikipedia, the free encyclopedia, ' In mathematics, particularly abstract algebra, an algebraic closure of a field K is an algebraic extension of K that is algebraically closed. It is one of many closures in mathematics. Using Zorn's lemma, it can be shown that every field has an algebraic closure, and that the algebraic closure of a field K is unique up to an isomorphism that fixes every member of K. Because of this essential uniqueness, we often speak of the algebraic closure of K, rather than an algebraic closure of K. . . . The fundamental theorem of algebra states that the algebraic closure of the field of real numbers is the field of complex numbers.' back

Apophatic theology - Wikipedia, Apophatic theology - Wikipedia, the free encyclopedia, 'Apophatic theology (from Greek ἀπόφασις from ἀπόφημι - apophēmi, "to deny")—also known as negative theology or via negativa (Latin for "negative way")—is a theology that attempts to describe God, the Divine Good, by negation, to speak only in terms of what may not be said about the perfect goodness that is God. It stands in contrast with cataphatic theology.' back

Aquinas, Summa I, 25, 3, Is God omnipotent?, '. . . God is called omnipotent because He can do all things that are possible absolutely; which is the second way of saying a thing is possible. For a thing is said to be possible or impossible absolutely, according to the relation in which the very terms stand to one another, possible if the predicate is not incompatible with the subject, as that Socrates sits; and absolutely impossible when the predicate is altogether incompatible with the subject, as, for instance, that a man is a donkey.' back

Aquinas, Summa, I, 14, 8, Is the knowledge of God the cause of things?, 'Now it is manifest that God causes things by His intellect, since His being is His act of understanding; and hence His knowledge must be the cause of things, in so far as His will is joined to it. Hence the knowledge of God as the cause of things is usually called the "knowledge of approbation".' back

Aquinas, Summa, I, 3, 7, Is God altogether simple?, 'I answer that, The absolute simplicity of God may be shown in many ways. First, from the previous articles of this question. For there is neither composition of quantitative parts in God, since He is not a body; nor composition of matter and form; nor does His nature differ from His "suppositum"; nor His essence from His existence; neither is there in Him composition of genus and difference, nor of subject and accident. Therefore, it is clear that God is nowise composite, but is altogether simple. . . . ' back

Aquinas, Summa: I, 14, 1, Is there knowledge in God?, ' I answer that, In God there exists the most perfect knowledge. . . . it is clear that the immateriality of a thing is the reason why it is cognitive; and according to the mode of immateriality is the mode of knowledge. Hence it is said in De Anima ii that plants do not know, because they are wholly material. But sense is cognitive because it can receive images free from matter, and the intellect is still further cognitive, because it is more separated from matter and unmixed, as said in De Anima iii. Since therefore God is in the highest degree of immateriality as stated above (Question 7, Article 1), it follows that He occupies the highest place in knowledge.' back

Aristotle (continuity), Physics V, iii, 'A thing that is in succession and touches is 'contiguous'. The 'continuous' is a subdivision of the contiguous: things are called continuous when the touching limits of each become one and the same and are, as the word implies, contained in each other: continuity is impossible if these extremities are two. This definition makes it plain that continuity belongs to things that naturally in virtue of their mutual contact form a unity. And in whatever way that which holds them together is one, so too will the whole be one, e.g. by a rivet or glue or contact or organic union. ' 227a10 sqq. back

Athanasian Creed - Wikipedia, Athanasian Creed - Wikipedia, the free encyclopedia, ' The Athanasian Creed, also called the Pseudo-Athanasian Creed and sometimes known as Quicunque Vult (or Quicumque Vult), which is both its Latin name and its opening words, meaning "Whosoever wishes", is a Christian statement of belief focused on Trinitarian doctrine and Christology. Used by Christian churches since the sixth century, it was the first creed to explicitly state the equality of the three hypostases of the Trinity.' back

Baryon asymmetry - Wikipedia, Baryon asymmetry - Wikipedia, the free encyclopedia, ' In physical cosmology, the baryon asymmetry problem, also known as the matter asymmetry problem or the matter–antimatter asymmetry problem, is the observed imbalance in baryonic matter (the type of matter experienced in everyday life) and antibaryonic matter in the observable universe. Neither the standard model of particle physics, nor the theory of general relativity provides a known explanation for why this should be so, and it is a natural assumption that the universe is neutral with all conserved charges. The Big Bang should have produced equal amounts of matter and antimatter. Since this does not seem to have been the case, it is likely some physical laws must have acted differently or did not exist for matter and antimatter. Several competing hypotheses exist to explain the imbalance of matter and antimatter that resulted in baryogenesis. However, there is as of yet no consensus theory to explain the phenomenon, which has been described as "one of the great mysteries in physics".' back

Black hole - Wikipedia, Black hole - Wikipedia, the free encyclopedia, ' A black hole is a region of spacetime where gravity is so strong that nothing, including light and other electromagnetic waves, has enough energy to escape it. The theory of general relativity predicts that a sufficiently compact mass can deform spacetime to form a black hole. The boundary of no escape is called the event horizon. Although it has a great effect on the fate and circumstances of an object crossing it, it has no locally detectable features according to general relativity. In many ways, a black hole acts like an ideal black body, as it reflects no light. Moreover, quantum field theory in curved spacetime predicts that event horizons emit Hawking radiation, with the same spectrum as a black body of a temperature inversely proportional to its mass. This temperature is of the order of billionths of a kelvin for stellar black holes, making it essentially impossible to observe directly. ' back

Black-body radiation - Wikipedia, Black-body radiation - Wikipedia, the free encyclopedia, 'Black-body radiation is the type of electromagnetic radiation within or surrounding a body in thermodynamic equilibrium with its environment, or emitted by a black body (an opaque and non-reflective body) held at constant, uniform temperature. The radiation has a specific spectrum and intensity that depends only on the temperature of the body.' back

Bohr model - Wikipedia, Bohr model - Wikipedia, the free encyclopedia, 'In atomic physics, the Rutherford–Bohr model or Bohr model, introduced by Niels Bohr in 1913, depicts the atom as a small, positively charged nucleus surrounded by electrons that travel in circular orbits around the nucleus—similar in structure to the solar system, but with attraction provided by electrostatic forces rather than gravity.' back

Bose-Einstein statistics - Wikipedia, Bose-Einstein statistics - Wikipedia, the free encyclopedia, 'In statistical mechanics, Bose–Einstein statistics (or more colloquially B–E statistics) determines the statistical distribution of identical indistinguishable bosons over the energy states in thermal equilibrium.' back
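
For reference, the distribution this entry names can be written out explicitly; this is the standard textbook form, not part of the quoted text. The mean number of identical bosons occupying a single-particle state with energy εi is n̄(εi) = 1 / (e^((εi − μ)/kB T) − 1), where μ is the chemical potential, T the absolute temperature and kB the Boltzmann constant; the −1 in the denominator is what allows unboundedly many bosons to crowd into one state.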

Boson - Wikipedia, Boson - Wikipedia, the free encyclopedia, 'In particle physics, bosons are particles with an integer spin, as opposed to fermions which have half-integer spin. From a behaviour point of view, fermions are particles that obey the Fermi-Dirac statistics while bosons are particles that obey the Bose-Einstein statistics. They may be either elementary, like the photon, or composite, as mesons. All force carrier particles are bosons. They are named after Satyendra Nath Bose. In contrast to fermions, several bosons can occupy the same quantum state. Thus, bosons with the same energy can occupy the same place in space.' back

Calculus of variations - Wikipedia, Calculus of variations - Wikipedia, the free encyclopedia, ' The calculus of variations may be said to begin with Newton's minimal resistance problem in 1687, followed by the brachistochrone curve problem raised by Johann Bernoulli (1696). It immediately occupied the attention of Jakob Bernoulli and the Marquis de l'Hôpital, but Leonhard Euler first elaborated the subject, beginning in 1733. Lagrange was influenced by Euler's work to contribute significantly to the theory. After Euler saw the 1755 work of the 19-year-old Lagrange, Euler dropped his own partly geometric approach in favor of Lagrange's purely analytic approach and renamed the subject the calculus of variations in his 1756 lecture Elementa Calculi Variationum.' back

Casimir effect - Wikipedia, Casimir effect - Wikipedia, the free encyclopedia, ' In physics, the Casimir effect or Casimir-Polder force is a physical force arising from a quantized field. The typical example is of two uncharged metallic plates in a vacuum, placed a few micrometers apart, without any external electromagnetic field. In a classical description, the lack of an external field also means that there is no field between the plates, and no force would be measured between them. When this field is instead studied using quantum mechanics, it is seen that the plates do affect the virtual photons which constitute the field, and generate a net force—either an attraction or a repulsion depending on the specific arrangement of the two plates. This force has been measured, and is a striking example of an effect purely due to second quantization.' back

Classical field theory - Wikipedia, Classical field theory - Wikipedia, the free encyclopedia, ' A classical field theory is a physical theory that predicts how one or more physical fields interact with matter through field equations, without considering effects of quantization; theories that incorporate quantum mechanics are called quantum field theories. In most contexts, 'classical field theory' is specifically intended to describe electromagnetism and gravitation, two of the fundamental forces of nature. A physical field can be thought of as the assignment of a physical quantity at each point of space and time. . . . The first field theories, Newtonian gravitation and Maxwell's equations of electromagnetic fields were developed in classical physics before the advent of relativity theory in 1905, and had to be revised to be consistent with that theory. Consequently, classical field theories are usually categorized as non-relativistic and relativistic. Modern field theories are usually expressed using the mathematics of tensor calculus. A more recent alternative mathematical formalism describes classical fields as sections of mathematical objects called fiber bundles.' back

Claude E Shannon (1948), A Mathematical Theory of Communication, ' The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back

Claude Shannon (1949), Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon , “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.] back
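
The headline result of this paper is the capacity formula for a band-limited channel with Gaussian noise, C = B log2(1 + S/N). A minimal Python sketch of the arithmetic (the bandwidth and signal-to-noise figures below are illustrative assumptions, not Shannon's):

    import math

    B = 3000.0    # channel bandwidth in hertz (illustrative telephone-line figure)
    snr = 1000.0  # signal-to-noise power ratio, i.e. 30 dB (also illustrative)

    # Shannon-Hartley capacity: the maximum error-free rate in bits per second
    C = B * math.log2(1 + snr)
    print(round(C))  # about 29902 bits per second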

Collectively exhaustive events - Wikipedia, Collectively exhaustive events - Wikipedia, the free encyclopedia, ' In probability theory and logic, a set of events is jointly or collectively exhaustive if at least one of the events must occur. For example, when rolling a six-sided die, the outcomes 1, 2, 3, 4, 5, and 6 are collectively exhaustive, because they encompass the entire range of possible outcomes.' back

Controlled NOT gate - Wikipedia, Controlled NOT gate - Wikipedia, the free encyclopedia, ' In computer science, the controlled NOT gate (also C-NOT or CNOT), controlled-X gate, controlled-bit-flip gate, Feynman gate or controlled Pauli-X is a quantum logic gate that is an essential component in the construction of a gate-based quantum computer. It can be used to entangle and disentangle Bell states. Any quantum circuit can be simulated to an arbitrary degree of accuracy using a combination of CNOT gates and single qubit rotations. The gate is sometimes named after Richard Feynman who developed an early notation for quantum gate diagrams in 1986.' back
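
To make the entanglement claim concrete, here is a minimal numpy sketch (numpy assumed available; the matrices are the standard computational-basis forms) that applies a Hadamard gate to the control qubit and then a CNOT, turning |00⟩ into the Bell state (|00⟩ + |11⟩)/√2:

    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    I = np.eye(2)
    CNOT = np.array([[1, 0, 0, 0],
                     [0, 1, 0, 0],
                     [0, 0, 0, 1],
                     [0, 0, 1, 0]])               # flips the target when the control is 1

    ket00 = np.array([1.0, 0.0, 0.0, 0.0])        # the state |00>
    bell = CNOT @ (np.kron(H, I) @ ket00)         # (|00> + |11>)/sqrt(2)
    print(bell)                                   # [0.7071 0. 0. 0.7071]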

Cosmological constant - Wikipedia, Cosmological constant - Wikipedia, the free encyclopedia, 'In physical cosmology, the cosmological constant (usually denoted by the Greek capital letter lambda) was proposed by Albert Einstein as a modification of his original theory of general relativity to achieve a stationary universe. Einstein abandoned the concept after the observation of the Hubble redshift indicated that the universe might not be stationary, as he had based his theory off the idea that the universe is unchanging. However, the discovery of cosmic acceleration in the 1990s has renewed interest in a cosmological constant.' back

Cosmological constant problem - Wikipedia, Cosmological constant problem - Wikipedia, the free encyclopedia, ' In cosmology, the cosmological constant problem or vacuum catastrophe is the disagreement between the observed values of vacuum energy density (the small value of the cosmological constant) and theoretical large value of zero-point energy suggested by quantum field theory. Depending on the Planck energy cutoff and other factors, the discrepancy is as high as 120 orders of magnitude, a state of affairs described by physicists as "the largest discrepancy between theory and experiment in all of science" and "the worst theoretical prediction in the history of physics".' back

Cosmos - Wikipedia, Cosmos - Wikipedia, the free encyclopedia, ' The verb κοσμεῖν (kosmein) meant generally "to dispose, prepare", but especially "to order and arrange (troops for battle), to set (an army) in array"; also "to establish (a government or regime)", "to adorn, dress" (especially of women). Thus kosmos meant "ornaments, decoration" (compare kosmokomes "dressing the hair," and cosmetic). The philosopher Pythagoras used the term kosmos (Ancient Greek: κόσμος, Latinized kósmos) for the order of the universe. Anaxagoras further introduced the concept of a Cosmic Mind (Nous) ordering all things. The modern Greek κόσμος "order, good order, orderly arrangement" is a word with several main senses rooted in those notions. κόσμος has developed, along with primary "the universe, the world", the meaning of "people" (collectively).' back

David Hilbert (1925), On the Infinite, ' We encounter a completely different and quite unique conception of the notion of infinity in the important and fruitful method of ideal elements. The method of ideal elements is used even in elementary plane geometry. The points and straight lines of the plane originally are real, actually existent objects. One of the axioms that hold for them is the axiom of connection: one and only one straight line passes through two points. It follows from this axiom that two straight lines intersect at most at one point. There is no theorem that two straight lines always intersect at some point, however, for the two straight lines might well be parallel. Still we know that by introducing ideal elements, viz., infinitely long lines and points at infinity, we can make the theorem that two straight lines always intersect at one and only one point come out universally true. These ideal "infinite" elements have the advantage of making the system of connection laws as simple and perspicuous as possible. Another example of the use of ideal elements are the familiar complex-imaginary magnitudes of algebra which serve to simplify theorems about the existence and number of the roots of an equation.' back

Differentiable manifold - Wikipedia, Differentiable manifold - Wikipedia, the free encyclopedia, ' In mathematics, a differentiable manifold (also differential manifold) is a type of manifold that is locally similar enough to a vector space to allow one to apply calculus. Any manifold can be described by a collection of charts (atlas). One may then apply ideas from calculus while working within the individual charts, since each chart lies within a vector space to which the usual rules of calculus apply. If the charts are suitably compatible (namely, the transition from one chart to another is differentiable), then computations done in one chart are valid in any other differentiable chart. ' back

Electron - Wikipedia, Electron - Wikipedia, the free encyclopedia, ' The electron (e⁻ or β⁻) is a subatomic particle with a negative one elementary electric charge. Electrons belong to the first generation of the lepton particle family, and are generally thought to be elementary particles because they have no known components or substructure. The electron's mass is approximately 1/1836 that of the proton. Quantum mechanical properties of the electron include an intrinsic angular momentum (spin) of a half-integer value, expressed in units of the reduced Planck constant, ℏ. Being fermions, no two electrons can occupy the same quantum state, in accordance with the Pauli exclusion principle.' back

Elementary particle - Wikipedia, Elementary particle - Wikipedia, the free encyclopedia, ' In particle physics, an elementary particle or fundamental particle is a subatomic particle that is not composed of other particles. Particles currently thought to be elementary include the fundamental fermions (quarks, leptons, antiquarks, and antileptons), which generally are "matter particles" and "antimatter particles", as well as the fundamental bosons (gauge bosons and the Higgs boson), which generally are "force particles" that mediate interactions among fermions. A particle containing two or more elementary particles is a composite particle.' back

Ellen Barry (2024), One Twin Was Hurt, the Other Was Not. Their Adult Mental Health Diverged., ' Twins are a bonanza for research psychologists. In a field perpetually seeking to tease out the effects of genetics, environment and life experience, they provide a natural controlled experiment as their paths diverge, subtly or dramatically, through adulthood. Take Dennis and Douglas. In high school, they were so alike that friends told them apart by the cars they drove, they told researchers in a study of twins in Virginia. Most of their childhood experiences were shared — except that Dennis endured an attempted molestation when he was 13. At 18, Douglas married his high school girlfriend. He raised three children and became deeply religious. Dennis cycled through short-term relationships and was twice divorced, plunging into bouts of despair after each split. By their 50s, Dennis had a history of major depression, and his brother did not. Why do twins, who share so many genetic and environmental inputs, diverge as adults in their experience of mental illness? On Wednesday, a team of researchers from the University of Iceland and Karolinska Institutet in Sweden reported new findings on the role played by childhood trauma.' back

Entropy - Wikipedia, Entropy - Wikipedia, the free encyclopedia, 'In statistical mechanics, entropy is an extensive property of a thermodynamic system. It is closely related to the number Ω of microscopic configurations (known as microstates) that are consistent with the macroscopic quantities that characterize the system (such as its volume, pressure and temperature). Under the assumption that each microstate is equally probable, the entropy S is the natural logarithm of the number of microstates, multiplied by the Boltzmann constant kB. Formally (assuming equiprobable microstates), S = kB ln Ω.' back
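
A worked instance of the quoted formula (an editorial illustration, not from the cited article): a register of N independent fair bits has Ω = 2^N equally probable microstates, so S = kB ln Ω = N kB ln 2; each bit of missing information therefore contributes kB ln 2 ≈ 9.57 × 10^-24 joules per kelvin of entropy.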

(ε, δ)-definition of limit - Wikipedia, (ε, δ)-definition of limit - Wikipedia, the free encyclopedia, 'In calculus, the (ε, δ)-definition of limit ("epsilon-delta definition of limit") is a formalization of the notion of limit. It was first given by Bernard Bolzano in 1817. Augustin-Louis Cauchy never gave an (ε, δ) definition of limit in his Cours d'Analyse, but occasionally used ε, δ arguments in proofs. The definitive modern statement was ultimately provided by Karl Weierstrass.' back
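
To see the definition at work, a standard textbook example (not from the cited article): to show that the limit of 3x − 1 as x → 2 is 5, let ε > 0 be given and choose δ = ε/3; then 0 < |x − 2| < δ implies |(3x − 1) − 5| = 3|x − 2| < 3δ = ε, which is exactly what the definition demands.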

Eugene Wigner (1960), The Unreasonable Effectiveness of Mathematics in the Natural Sciences, 'The first point is that the enormous usefulness of mathematics in the natural sciences is something bordering on the mysterious and that there is no rational explanation for it. Second, it is just this uncanny usefulness of mathematical concepts that raises the question of the uniqueness of our physical theories.' back

Event horizon - Wikipedia, Event horizon - Wikipedia, the free encyclopedia, 'In general relativity, an event horizon is a boundary in spacetime beyond which events cannot affect an outside observer. In layman's terms, it is defined as the shell of "points of no return", i.e., the points at which the gravitational pull becomes so great as to make escape impossible, even for light.' back

Fermion - Wikipedia, Fermion - Wikipedia, the free encyclopedia, 'In particle physics, fermions are particles with a half-integer spin, such as protons and electrons. They obey the Fermi-Dirac statistics and are named after Enrico Fermi. In the Standard Model there are two types of elementary fermions: quarks and leptons. . . . In contrast to bosons, only one fermion can occupy a quantum state at a given time (they obey the Pauli Exclusion Principle). Thus, if more than one fermion occupies the same place in space, the properties of each fermion (e.g. its spin) must be different from the rest. Therefore fermions are usually related with matter while bosons are related with radiation, though the separation between the two is not clear in quantum physics.' back

Feynman, Leighton and Sands FLP II_02, Chapter 2: Differential Calculus of Vector Fields, ' Ideas such as the field lines, capacitance, resistance, and inductance are, for such purposes, very useful. So we will spend much of our time analyzing them. In this way we will get a feel as to what should happen in different electromagnetic situations. On the other hand, none of the heuristic models, such as field lines, is really adequate and accurate for all situations. There is only one precise way of presenting the laws, and that is by means of differential equations. They have the advantage of being fundamental and, so far as we know, precise. If you have learned the differential equations you can always go back to them. There is nothing to unlearn.' back

Fixed point theorem - Wikipedia, Fixed point theorem - Wikipedia, the free encyclopedia, 'In mathematics, a fixed point theorem is a result saying that a function F will have at least one fixed point (a point x for which F(x) = x), under some conditions on F that can be stated in general terms. Results of this kind are amongst the most generally useful in mathematics. The Banach fixed point theorem gives a general criterion guaranteeing that, if it is satisfied, the procedure of iterating a function yields a fixed point. By contrast, the Brouwer fixed point theorem is a non-constructive result: it says that any continuous function from the closed unit ball in n-dimensional Euclidean space to itself must have a fixed point, but it doesn't describe how to find the fixed point (See also Sperner's lemma).' back
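
The Banach criterion is constructive, so the iteration it guarantees can simply be run. A minimal Python sketch (the map x ↦ cos x is a contraction on [0, 1], since |sin x| ≤ sin 1 < 1 there, so the iteration must converge to the unique fixed point):

    import math

    x = 1.0
    for _ in range(100):
        x = math.cos(x)  # iterate the contraction x -> cos(x)
    print(x)             # ~0.73908513, the unique solution of cos(x) = x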

Frank Wilczek (2004), Nobel lecture: Asymptotic Freedom: from Paradox to Paradigm, ' Frank Wilczek held his Nobel lecture December 8, 2004, at Aula Magna, Stockholm University. He was presented by Professor Sune Svanberg, Chairman of the Nobel Committee for Physics. Summary: The idea that quarks are born free yet confined, so that they can’t be pulled apart, was once considered a paradox. The emerging theory of strong interactions, Quantum Chromo Dynamics (QCD), predicts the existence of gluons, which together with quarks can be seen indirectly as jets from hard scattering reactions between particles. Quantum Chromo Dynamics predicts that the forces between quarks are feeble for small separations but are powerful far away, which explains confinement. Many experiments have confirmed this property of the strong interaction.' back

Galileo Galilei, Recantation of Galileo (June 22, 1633), ' Therefore, desiring to remove from the minds of your Eminences, and of all faithful Christians, this vehement suspicion, justly conceived against me, with sincere heart and unfeigned faith I abjure, curse, and detest the aforesaid errors and heresies, and generally every other error, heresy, and sect whatsoever contrary to the said Holy Church, and I swear that in the future I will never again say or assert, verbally or in writing, anything that might furnish occasion for a similar suspicion regarding me; ' back

Gauge theory - Wikipedia, Gauge theory - Wikipedia, the free encyclopedia, ' In physics, a gauge theory is a type of field theory in which the Lagrangian does not change (is invariant) under local transformations from certain Lie groups. The term gauge refers to any specific mathematical formalism to regulate redundant degrees of freedom in the Lagrangian. The transformations between possible gauges, called gauge transformations, form a Lie group—referred to as the symmetry group or the gauge group of the theory. Associated with any Lie group is the Lie algebra of group generators. For each group generator there necessarily arises a corresponding field (usually a vector field) called the gauge field. Gauge fields are included in the Lagrangian to ensure its invariance under the local group transformations (called gauge invariance). When such a theory is quantized, the quanta of the gauge fields are called gauge bosons. If the symmetry group is non-commutative, then the gauge theory is referred to as non-abelian gauge theory, the usual example being the Yang–Mills theory.' back

General covariance - Wikipedia, General covariance - Wikipedia, the free encyclopedia, ' In theoretical physics, general covariance (also known as diffeomorphism covariance or general invariance) is the invariance of the form of physical laws under arbitrary differentiable coordinate transformations. The essential idea is that coordinates do not exist a priori in nature, but are only artifices used in describing nature, and hence should play no role in the formulation of fundamental physical laws.' back

Gerolamo Cardano - Wikipedia, Gerolamo Cardano - Wikipedia, the free encyclopedia, 'Gerolamo Cardano (Hieronymus Cardanus; 24 September 1501 – 21 September 1576) was an Italian polymath whose interests and proficiencies ranged through those of mathematician, physician, biologist, physicist, chemist, astrologer, astronomer, philosopher, writer, and gambler. He became one of the most influential mathematicians of the Renaissance and one of the key figures in the foundation of probability; he introduced the binomial coefficients and the binomial theorem in the Western world. He wrote more than 200 works on science. . . . Today, Cardano is well known for his achievements in algebra. In his 1545 book Ars Magna he made the first systematic use of negative numbers in Europe, published (with attribution) the solutions of other mathematicians for cubic and quartic equations, and acknowledged the existence of imaginary numbers.' back

Gödel's incompleteness theorems - Wikipedia, Gödel's incompleteness theorems - Wikipedia, 'Gödel's incompleteness theorems are two theorems of mathematical logic that establish inherent limitations of all but the most trivial axiomatic systems capable of doing arithmetic. The theorems, proven by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The two results are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible, giving a negative answer to Hilbert's second problem. The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an "effective procedure" (i.e., any sort of algorithm) is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.' back

Hamilton's principle - Wikipedia, Hamilton's principle - Wikipedia, the free encyclopedia, 'In physics, Hamilton's principle is William Rowan Hamilton's formulation of the principle of stationary action . . . It states that the dynamics of a physical system is determined by a variational problem for a functional based on a single function, the Lagrangian, which contains all physical information concerning the system and the forces acting on it.' back

Hawking radiation - Wikipedia, Hawking radiation - Wikipedia, the free encyclopedia, ' Hawking radiation is dependent on the Unruh effect and the equivalence principle applied to black-hole horizons. Close to the event horizon of a black hole, a local observer must accelerate to keep from falling in. An accelerating observer sees a thermal bath of particles that pop out of the local acceleration horizon, turn around, and free-fall back in. The condition of local thermal equilibrium implies that the consistent extension of this local thermal bath has a finite temperature at infinity, which implies that some of these particles emitted by the horizon are not reabsorbed and become outgoing Hawking radiation.' back

Hebrew Bible - Wikipedia, Hebrew Bible - Wikipedia, the free encyclopedia, ' The Hebrew Bible . . . is a term referring to the books of the Jewish Bible as originally written mostly in Biblical Hebrew with some Biblical Aramaic. The term closely corresponds to contents of the Jewish Tanakh and the Protestant Old Testament (see also Judeo-Christian) but does not include the deuterocanonical portions of the Roman Catholic or the Anagignoskomena portions of the Eastern Orthodox Old Testaments. The term does not imply naming, numbering or ordering of books, which varies (see also Biblical canon).' back

Hermann Minkowski (1908), Hermann Minkowski, Wikiquote, ' The views of space and time which I wish to lay before you have sprung from the soil of experimental physics, and therein lies their strength. They are radical. Henceforth, space by itself, and time by itself, are doomed to fade away into mere shadows, and only a kind of union of the two will preserve an independent reality.' Address to the 80th Assembly of German Natural Scientists and Physicians, (Sep 21, 1908) back

Hilbert's program - Wikipedia, Hilbert's program - Wikipedia, the free encyclopedia, ' In mathematics, Hilbert's program, formulated by German mathematician David Hilbert, was a proposed solution to the foundational crisis of mathematics, when early attempts to clarify the foundations of mathematics were found to suffer from paradoxes and inconsistencies. As a solution, Hilbert proposed to ground all existing theories to a finite, complete set of axioms, and provide a proof that these axioms were consistent. Hilbert proposed that the consistency of more complicated systems, such as real analysis, could be proven in terms of simpler systems. Ultimately, the consistency of all of mathematics could be reduced to basic arithmetic.' back

History of printing - Wikipedia, History of printing - Wikipedia, the free encyclopedia, ' The history of printing starts as early as 3000 BCE, when the proto-Elamite and Sumerian civilizations used cylinder seals to certify documents written in clay tablets. . . .. Woodblock printing was also used in Europe until the mid-15th century. Late medieval German inventor Johannes Gutenberg created the first printing press based on previously known mechanical presses and a process for mass-producing metal type. By the end of the 15th century, his invention and widescale circulation of the Gutenberg Bible became responsible for a burgeoning economical book publishing industry across Renaissance Europe and eventually among the colonial publishers and printers that emerged in the British-American colonies. This industry enabled the communication of ideas and the sharing of knowledge on an unprecedented scale, leading to the global spread of the printing press during the early modern period. Alongside the development of text printing, new and lower-cost methods of image reproduction were developed, including lithography, screen printing and photocopying.' back

Hulse-Taylor pulsar - Wikipedia, Hulse-Taylor pulsar - Wikipedia, the free encyclopedia, ' The Hulse–Taylor pulsar . . . is a binary star system composed of a neutron star and a pulsar which orbit around their common center of mass. It is the first binary pulsar ever discovered. The pulsar was discovered by Russell Alan Hulse and Joseph Hooton Taylor Jr., of the University of Massachusetts Amherst in 1974. . . . The orbit has decayed since the binary system was initially discovered, in precise agreement with the loss of energy due to gravitational waves described by Albert Einstein's general theory of relativity. The ratio of observed to predicted rate of orbital decay is calculated to be 0.997 ± 0.002. The total power of the gravitational waves emitted by this system presently is calculated to be 7.35 × 10^24 watts. For comparison, this is 1.9% of the power radiated in light by the Sun.' back

Initial singularity - Wikipedia, Initial singularity - Wikipedia, the free encyclopedia, ' The initial singularity is a singularity predicted by some models of the Big Bang theory to have existed before the Big Bang. The instant immediately following the initial singularity is part of the Planck epoch, the earliest period of time in the history of our universe. The use of only general relativity to predict what happened in the beginnings of the Universe has been heavily criticized, as quantum mechanics becomes a significant factor in the high-energy environment of the earliest Universe, and general relativity on its own fails to make accurate predictions. Although there is no direct evidence for a singularity of infinite density, the cosmic microwave background is evidence that the universe expanded from a very hot, dense state. ' back

Internet Protocol - Wikipedia, Internet Protocol - Wikipedia, the free encyclopedia, ' The Internet Protocol (IP) is the network layer communications protocol in the Internet protocol suite for relaying datagrams across network boundaries. Its routing function enables internetworking, and essentially establishes the Internet. IP has the task of delivering packets from the source host to the destination host solely based on the IP addresses in the packet headers. For this purpose, IP defines packet structures that encapsulate the data to be delivered. It also defines addressing methods that are used to label the datagram with source and destination information.' back

Jeffrey Nicholls (1987), A theory of Peace, ' The argument: I began to think about peace in a very practical way during the Viet Nam war. I was the right age to be called up. I was exempted because I was a clergyman, but despite the terrors that war held for me, I think I might have gone. It was my first whiff of the force of patriotism. To my amazement, it was strong enough to make even me face death.
In the Church, I became embroiled in a deeper war. Not a war between goodies and baddies, but the war between good and evil that lies at the heart of all human consciousness. Existence is a struggle. We need all the help we can get. Religion is part of that help and theology is the scientific foundation of religion.' back

John von Neumann (2014), Mathematical Foundations of Quantum Mechanics, ' Mathematical Foundations of Quantum Mechanics by John von Neumann, translated from the German by Robert T. Beyer (New Edition), edited by Nicholas A. Wheeler. Princeton UP, Princeton & Oxford. Preface: This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer’s authorized English translation (Princeton University Press, 1955) of John von Neumann’s classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932).' back

Joseph-Louis Lagrange (1811), Mécanique analytique Volume 1, (trans.) ' We already have several treatises on Mechanics, but the plan of this one is entirely new. I have set myself the task of reducing the theory of this science, and the art of solving the problems that belong to it, to general formulas whose simple development gives all the equations necessary for the solution of each problem.' back

Kenneth G Wilson (1982), Nobel Lecture: The Renormalisation Group and Critical Phenomena, Nobel Prize Lecture, 8 December 1982: ' This paper has three parts. The first part is a simplified presentation of the basic ideas of the renormalization group and the ε expansion applied to critical phenomena, following roughly a summary exposition given in 1972. The second part is an account of the history (as I remember it) of work leading up to the papers in 1971-1972 on the renormalization group. Finally, some of the developments since 1971 will be summarized, and an assessment for the future given.' back

Kenneth G. Wilson (1979), Problems in Physics with many Scales of Length, ' One of the more conspicuous properties of nature is the great diversity of size or length scales in the structure of the world. An ocean, for example, has currents that persist for thousands of kilometers and has tides of global extent; it also has waves that range in size from less than a centimeter to several meters; at much finer resolution seawater must be regarded as an aggregate of molecules whose characteristic scale of length is roughly 10^-8 centimeter. From the smallest structure to the largest is a span of some 17 orders of magnitude.' back

Kerson Huang (2013), A Critical History of Renormalization, ' The history of renormalization is reviewed with a critical eye, starting with Lorentz's theory of radiation damping, through perturbative QED with Dyson, Gell‐Mann & Low, and others, to Wilson's formulation and Polchinski's functional equation, and applications to "triviality", and dark energy in cosmology.
Dedication: Renormalization, that astounding mathematical trick that enabled one to tame divergences in Feynman diagrams, led to the triumph of quantum electrodynamics. Ken Wilson made it physics, by uncovering its deep connection with scale transformations. The idea that scale determines the perception of the world seems obvious. When one examines an oil painting, for example, what one sees depends on the resolution of the instrument one uses for the examination. At resolutions of the naked eye, one sees art, perhaps, but upon greater and greater magnifications, one sees pigments, then molecules and atoms, and so forth. What is non‐trivial is to formulate this mathematically, as a physical theory, and this is what Ken Wilson had achieved.' back

Kibble balance - Wikipedia, Kibble balance - Wikipedia, the free encyclopedia, ' Prior to 2019, the definition of the kilogram was based on a physical object known as the International Prototype of the Kilogram (IPK). After considering alternatives, in 2013 the General Conference on Weights and Measures (CGPM) agreed on accuracy criteria for replacing this definition with one based on the use of a Kibble balance. After these criteria had been achieved, the CGPM voted unanimously on November 16, 2018 to change the definition of the kilogram and several other units, effective May 20, 2019, to coincide with World Metrology Day.' back

Kurt Gödel I (1931), On formally undecidable propositions of Principia Mathematica and related systems I, '1 Introduction The development of mathematics towards greater exactness has, as is well known, led to the formalization of large areas of it such that you can carry out proofs by following a few mechanical rules. The most comprehensive current formal systems are the system of Principia Mathematica (PM) on the one hand, the Zermelo-Fraenkelian axiom-system of set theory on the other hand. These two systems are so far developed that you can formalize in them all proof methods that are currently in use in mathematics, i.e. you can reduce these proof methods to a few axioms and deduction rules. Therefore, the conclusion seems plausible that these deduction rules are sufficient to decide all mathematical questions expressible in those systems. We will show that this is not true, but that there are even relatively easy problems in the theory of ordinary whole numbers that can not be decided from the axioms. This is not due to the nature of these systems, but it is true for a very wide class of formal systems, which in particular includes all those that you get by adding a finite number of axioms to the above mentioned systems, provided the additional axioms don’t make false theorems provable.' back

Lagrangian mechanics - Wikipedia, Lagrangian mechanics - Wikipedia, the free encyclopedia, ' Introduced by the Italian-French mathematician and astronomer Joseph-Louis Lagrange in 1788, Lagrangian mechanics is a formulation of classical mechanics and is founded on the stationary action principle. Given a system of point masses and a pair of times t1 and t2, Lagrangian mechanics postulates that the system's trajectory (describing evolution of the system over time) . . . must be a stationary point of the action functional S = ∫ L dt. By convention, L = T − V, where T and V are the kinetic and potential energy of the system, respectively.' back
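
A one-line worked case of the convention quoted above (an editorial illustration): for a mass m falling in uniform gravity, L = T − V = (1/2) m v^2 − m g x, with v = dx/dt; the Euler–Lagrange equation d/dt(∂L/∂v) − ∂L/∂x = 0 then gives m dv/dt + m g = 0, i.e. acceleration −g, recovering Newton's result from the stationary-action principle.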

Lamb shift - Wikipedia, Lamb shift - Wikipedia, the free encyclopedia, 'In physics, the Lamb shift, named after Willis Lamb (1913–2008), is a difference in energy between two energy levels 2S½ and 2P½ (in term symbol notation) of the hydrogen atom which was not predicted by the Dirac equation, according to which these states should have the same energy. Interaction between vacuum energy fluctuations and the hydrogen electron in these different orbitals is the cause of the Lamb Shift, as was shown subsequent to its discovery.' back

Lambda-CDM model - Wikipedia, Lambda-CDM model - Wikipedia, the free encyclopedia, ' The ΛCDM (Lambda cold dark matter) or Lambda-CDM model is a parameterization of the Big Bang cosmological model in which the universe contains three major components: first, a cosmological constant denoted by Lambda (Greek Λ) associated with dark energy; second, the postulated cold dark matter (abbreviated CDM); and third, ordinary matter. It is frequently referred to as the standard model of Big Bang cosmology because it is the simplest model that provides a reasonably good account of the following properties of the cosmos: the existence and structure of the cosmic microwave background; the large-scale structure in the distribution of galaxies; the observed abundances of hydrogen (including deuterium), helium, and lithium; and the accelerating expansion of the universe observed in the light from distant galaxies and supernovae.' back

Last universal common ancestor - Wikipedia, Last universal common ancestor - Wikipedia, the free encyclopedia, ' The last universal common ancestor (LUCA) is the most recent population from which all organisms now living on Earth share common descent—the most recent common ancestor of all current life on Earth. . . . While no specific fossil evidence of the LUCA exists, the detailed biochemical similarity of all current life makes it plausible. Its characteristics can be inferred from shared features of modern genomes. These genes describe a complex life form with many co-adapted features, including transcription and translation mechanisms to convert information from DNA to mRNA to proteins. The LUCA probably lived in the high-temperature water of deep sea vents near ocean-floor magma flows around 4 billion years ago.' back

Lorentz group - Wikipedia, Lorentz group - Wikipedia, the free encyclopedia, 'In physics (and mathematics), the Lorentz group is the group of all Lorentz transformations of Minkowski spacetime, the classical setting for all (nongravitational) physical phenomena. The Lorentz group is named for the Dutch physicist Hendrik Lorentz. The mathematical form of the kinematical laws of special relativity, Maxwell's field equations in the theory of electromagnetism, the Dirac equation in the theory of the electron, are each invariant under the Lorentz transformations. Therefore the Lorentz group is said to express the fundamental symmetry of many of the known fundamental Laws of Nature.' back

Max Planck (1901), On the Law of Distribution of Energy in the Normal Spectrum, Annalen der Physik, vol. 4, p. 553 ff (1901) 'The recent spectral measurements made by O. Lummer and E. Pringsheim and even more notable those by H. Rubens and F. Kurlbaum which together confirmed an earlier result obtained by H. Beckmann show that the law of energy distribution in the normal spectrum, first derived by W. Wien from molecular-kinetic considerations and later by me from the theory of electromagnetic radiation, is not valid generally.' back

Meinard Kuhlmann (Stanford Encyclopedia of Philosophy), Quantum Field Theory, ' Quantum Field Theory (QFT) is the mathematical and conceptual framework for contemporary elementary particle physics. In a rather informal sense QFT is the extension of quantum mechanics (QM), dealing with particles, over to fields, i.e. systems with an infinite number of degrees of freedom. (See the entry on quantum mechanics.) In the last few years QFT has become a more widely discussed topic in philosophy of science, with questions ranging from methodology and semantics to ontology. QFT taken seriously in its metaphysical implications seems to give a picture of the world which is at variance with central classical conceptions of particles and fields, and even with some features of QM.' back

Michelangelo L Mangano (2009), The super-LHC, ' We review here the prospects of a long-term upgrade programme for the Large Hadron Collider (LHC), CERN laboratory's new proton-proton collider. The super-LHC, which is currently under evaluation and design, is expected to deliver of the order of ten times the statistics of the LHC. In addition to a non-technical summary of the principal physics arguments for the upgrade, I present a pedagogical introduction to the technological challenges on the accelerator and experimental fronts, and a review of the current status of the planning. ' back

Middle term - Wikipedia, Middle term - Wikipedia, the free encyclopedia, 'In logic, a middle term is a term that appears (as a subject or predicate of a categorical proposition) in both premises but not in the conclusion of a categorical syllogism. The middle term (in bold below) must be distributed in at least one premise but not in the conclusion. The major term and the minor terms, also called the end terms, do appear in the conclusion.' back

Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' By 1908 Minkowski realized that the special theory of relativity, introduced by his former student Albert Einstein in 1905 and based on the previous work of Lorentz and Poincaré, could best be understood in a four-dimensional space, since known as the "Minkowski spacetime", in which time and space are not separated entities but intermingled in a four-dimensional space–time, and in which the Lorentz geometry of special relativity can be effectively represented using the invariant interval x^2 + y^2 + z^2 − c^2 t^2.' back

Modha et al., Neural inference at the frontier of energy, space, and time, ' Computing, since its inception, has been processor-centric, with memory separated from compute. Inspired by the organic brain and optimized for inorganic silicon, NorthPole is a neural inference architecture that blurs this boundary by eliminating off-chip memory, intertwining compute with memory on-chip, and appearing externally as an active memory chip. NorthPole is a low-precision, massively parallel, densely interconnected, energy-efficient, and spatial computing architecture with a co-optimized, high-utilization programming model. On the ResNet50 benchmark image classification network, relative to a graphics processing unit (GPU) that uses a comparable 12-nanometer technology process, NorthPole achieves a 25 times higher energy metric of frames per second (FPS) per watt, a 5 times higher space metric of FPS per transistor, and a 22 times lower time metric of latency. Similar results are reported for the Yolo-v4 detection network. NorthPole outperforms all prevalent architectures, even those that use more-advanced technology processes.' back

NAND gate - Wikipedia, NAND gate - Wikipedia, the free encyclopedia, ' In digital electronics, a NAND gate (NOT-AND) is a logic gate which produces an output which is false only if all its inputs are true; thus its output is complement to that of an AND gate. A LOW (0) output results only if all the inputs to the gate are HIGH (1); if any input is LOW (0), a HIGH (1) output results. A NAND gate is made using transistors and junction diodes. . . .. The NAND gate is significant because any Boolean function can be implemented by using a combination of NAND gates. This property is called "functional completeness". It shares this property with the NOR gate. Digital systems employing certain logic circuits take advantage of NAND's functional completeness.' back
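
The functional completeness mentioned here is easy to exhibit. A minimal Python sketch (Booleans standing in for HIGH and LOW) that builds NOT, AND and OR out of NAND alone:

    def nand(a, b):
        return not (a and b)

    def not_(a):
        return nand(a, a)              # NAND of a signal with itself negates it

    def and_(a, b):
        return not_(nand(a, b))        # negating NAND recovers AND

    def or_(a, b):
        return nand(not_(a), not_(b))  # De Morgan: a OR b = NOT(NOT a AND NOT b)

    print(or_(False, True))            # True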

Nicene Creed - Wikipedia, Nicene Creed - Wikipedia, the free encyclopedia, ' The Nicene Creed (Greek: Σύμβολον τῆς Νίκαιας, Latin: Symbolum Nicaenum) is the profession of faith or creed that is most widely used in Christian liturgy. It forms the mainstream definition of Christianity for most Christians. It is called Nicene because, in its original form, it was adopted in the city of Nicaea (present day Iznik in Turkey) by the first ecumenical council, which met there in the year 325. The Nicene Creed has been normative for the Catholic Church, the Eastern Orthodox Church, the Church of the East, the Oriental Orthodox churches, the Anglican Communion, and the great majority of Protestant denominations.' back

One-electron universe - Wikipedia, One-electron universe - Wikipedia, the free encyclopedia, ' The one-electron universe postulate, proposed by John Wheeler in a telephone call to Richard Feynman in the spring of 1940, is the hypothesis that all electrons and positrons are actually manifestations of a single entity moving backwards and forwards in time. According to Feynman: I received a telephone call one day at the graduate college at Princeton from Professor Wheeler, in which he said, "Feynman, I know why all electrons have the same charge and the same mass" "Why?" "Because, they are all the same electron!" ' back

P versus NP problem - Wikipedia, P versus NP problem - Wikipedia, the free encyclopedia, ' The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified (technically, verified in polynomial time) can also be solved quickly (again, in polynomial time). The underlying issues were first discussed in the 1950s, in letters from John Forbes Nash Jr. to the National Security Agency, and from Kurt Gödel to John von Neumann. The precise statement of the P versus NP problem was introduced in 1971 by Stephen Cook in his seminal paper "The complexity of theorem proving procedures" and is considered by many to be the most important open problem in the field.' back

P. A. M. Dirac (1933), The Lagrangian in Quantum Mechanics, ' . . . there is an alternative formulation [to the Hamiltonian] in classical dynamics, provided by the Lagrangian. This requires one to work in terms of coordinates and velocities instead of coordinates and momenta. The two formulations are closely related but there are reasons for believing that the Lagrangian one is more fundamental. . . . Secondly, the Lagrangian method can easily be expressed relativistically, on account of the action function being a relativistic invariant; . . .. ' [This article was first published in Physikalische Zeitschrift der Sowjetunion, Band 3, Heft 1 (1933), pp. 64–72.] back

Path integral formulation - Wikipedia, Path integral formulation - Wikipedia, the free encyclopedia, 'The path integral formulation of quantum mechanics is a description of quantum theory which generalizes the action principle of classical mechanics. It replaces the classical notion of a single, unique trajectory for a system with a sum, or functional integral, over an infinity of possible trajectories to compute a quantum amplitude. . . . This formulation has proved crucial to the subsequent development of theoretical physics, since it provided the basis for the grand synthesis of the 1970s which unified quantum field theory with statistical mechanics. . . . ' back

Planck constant - Wikipedia, Planck constant - Wikipedia, the free encyclopedia, ' Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units, that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. . . . For this new definition of the kilogram, the Planck constant, as defined by the ISO standard, was set to 6.626 070 150 × 10^-34 J⋅s exactly.' back

Planck units - Wikipedia, Planck units - Wikipedia, the free encyclopedia, ' In particle physics and physical cosmology, Planck units are a set of units of measurement defined exclusively in terms of four universal physical constants, in such a manner that these physical constants take on the numerical value of 1 when expressed in terms of these units. Originally proposed in 1899 by German physicist Max Planck, these units are also known as natural units because the origin of their definition comes only from properties of nature and not from any human construct. Planck units are only one system of several systems of natural units, but Planck units are not based on properties of any prototype object or particle (that would be arbitrarily chosen), but rather on only the properties of free space.' back

Precession of the perihelion of Mercury, Precession of the perihelion of Mercury, ' The precession of the orbit is not peculiar to Mercury, all the planetary orbits precess. In fact, Newton's theory predicts these effects, as being produced by the pull of the planets on one another. The question is whether Newton's predictions agree with the amount an orbit precesses; it is not enough to understand qualitatively what is the origin of an effect, such arguments must be backed by hard numbers to give them credence. The precession of the orbits of all planets except for Mercury's can, in fact, be understood using Newton's equations. But Mercury seemed to be an exception. . . . An early success of Einstein's theory was an explanation for the 43'' per century precession of the perihelion of Mercury. In a curved spacetime a planet does not orbit the Sun in a static elliptical orbit, as in Newton's theory. Rather, the orbit is obliged to precess because of the curvature of spacetime. When Einstein calculated the magnitude of this effect for Mercury he got precisely the previously unexplained 43''. He correctly took the view that this was an important confirmation of his theory.' back

Principle of individuation - Wikipedia, Principle of individuation - Wikipedia, the free encyclopedia, ' The principle of individuation is a criterion that individuates or numerically distinguishes the members of the kind for which it is given, that is by which we can supposedly determine, regarding any kind of thing, when we have more than one of them or not. It is also known as a 'criterion of identity' or 'indiscernibility principle'. The history of the consideration of such a principle begins with Aristotle. It was much discussed by the medieval philosopher Duns Scotus (c. 1266–1308) with his "haecceity" and later, during Renaissance, by Francisco Suárez (1548–1617), Bonaventure Baron (1610–1696) and Leibniz (1646–1716).' back

Problem of evil - Wikipedia, Problem of evil - Wikipedia, the free encyclopedia, 'In the philosophy of religion, the problem of evil is the question of how to reconcile the existence of evil with that of a deity who is omnipotent, omniscient and omnibenevolent (see theism). An argument from evil attempts to show that the co-existence of evil and such a deity is unlikely or impossible, and attempts to show the contrary have been traditionally known as theodicies.' back

QED vacuum - Wikipedia, QED vacuum - Wikipedia, the free encyclopedia, ' The QED vacuum is the field-theoretic vacuum of quantum electrodynamics. It is the lowest energy state (the ground state) of the electromagnetic field when the fields are quantized. When Planck's constant is hypothetically allowed to approach zero, QED vacuum is converted to classical vacuum, which is to say, the vacuum of classical electromagnetism. Another field-theoretic vacuum is the QCD vacuum of the Standard Model.' back

Quantum field theory - Wikipedia, Quantum field theory - Wikipedia, the free encyclopedia, 'Quantum field theory (QFT) provides a theoretical framework for constructing quantum mechanical models of systems classically described by fields or (especially in a condensed matter context) of many-body systems. . . . In QFT photons are not thought of as 'little billiard balls', they are considered to be field quanta - necessarily chunked ripples in a field that 'look like' particles. Fermions, like the electron, can also be described as ripples in a field, where each kind of fermion has its own field. In summary, the classical visualisation of "everything is particles and fields", in quantum field theory, resolves into "everything is particles", which then resolves into "everything is fields". In the end, particles are regarded as excited states of a field (field quanta).' back

Renormalization - Wikipedia, Renormalization - Wikipedia, the free encyclopedia, ' Renormalization is a collection of techniques in quantum field theory, the statistical mechanics of fields, and the theory of self-similar geometric structures, that are used to treat infinities arising in calculated quantities by altering values of quantities to compensate for effects of their self-interactions. But even if it were the case that no infinities arose in loop diagrams in quantum field theory, it could be shown that renormalization of mass and fields appearing in the original Lagrangian is necessary.' back

Renormalization group - Wikipedia, Renormalization group - Wikipedia, the free encyclopedia, ' In theoretical physics, renormalization group (RG) refers to a mathematical apparatus that allows one to investigate the changes of a physical system as one views it at different distance scales. In particle physics it reflects the changes in the underlying force laws as one varies the energy scale at which physical processes occur. A change in scale is called a "scale transformation" or "conformal transformation." The renormalization group is intimately related to "conformal invariance" or "scale invariance," a symmetry by which the system appears the same at all scales (so-called self-similarity).' back

Ribosome - Wikipedia, Ribosome - Wikipedia, the free encyclopedia, ' Ribosomes are macromolecular machines, found within all living cells, that perform biological protein synthesis (mRNA translation). Ribosomes link amino acids together in the order specified by the codons of messenger RNA molecules to form polypeptide chains. Ribosomes consist of two major components: the small and large ribosomal subunits. Each subunit consists of one or more ribosomal RNA molecules and many ribosomal proteins.' back

Richard P. Feynman (1981), Simulating Physics with Computers, 'I want to talk about the possibility that there is to be an exact simulation, that the computer will do exactly the same as nature. If this is to be proved and the type of computer is as I've already explained, then it's going to be necessary that everything that happens in a finite volume of space and time would have to be exactly analyzable with a finite number of logical operations. The present theory of physics is not that way, apparently. It allows space to go down into infinitesimal distances, wavelengths to get infinitely great, terms to be summed in infinite order, and so forth; and therefore if this proposition is right, physical law is wrong.' International Journal of Theoretical Physics, Vol. 21, Nos. 6/7, 1982 back

Richard P. Feynman (1985), Quantum mechanical computers, ' We are here considering ideal machines; the effects of imperfections will be considered later. This study is one of principle; our aim is to exhibit some Hamiltonian for a system which could serve as a computer. We are not concerned with whether we have the most efficient system, nor how we could best implement it.' back

Russell's paradox - Wikipedia, Russell's paradox - Wikipedia, the free encyclopedia, 'According to naive set theory, any definable collection is a set. Let R be the set of all sets that are not members of themselves. If R is not a member of itself, then its definition dictates that it must contain itself, and if it contains itself, then it contradicts its own definition as the set of all sets that are not members of themselves. This contradiction is Russell's paradox.' back

Sheffer stroke - Wikipedia, Sheffer stroke - Wikipedia, the free encyclopedia, 'In Boolean functions and propositional calculus, the Sheffer stroke, named after Henry M. Sheffer, written "|" . . . denotes a logical operation that is equivalent to the negation of the conjunction operation, expressed in ordinary language as "not both". It is also called nand ("not and") or the alternative denial, since it says in effect that at least one of its operands is false.' back

Shield of the Trinity - Wikipedia, Shield of the Trinity - Wikipedia, the free encyclopedia, 'The Shield of the Trinity or Scutum Fidei is a traditional Christian visual symbol which expresses many aspects of the doctrine of the Trinity, summarizing the first part of the Athanasian Creed in a compact diagram. In late medieval England and France, this emblem was considered to be the heraldic arms of God (and of the Trinity).' back

Spacetime - Wikipedia, Spacetime - Wikipedia, the free encyclopedia, ' In physics, spacetime is any mathematical model which fuses the three dimensions of space and the one dimension of time into a single four-dimensional manifold. Spacetime diagrams can be used to visualize relativistic effects, such as why different observers perceive differently where and when events occur.' back

Spin-1/2 - Wikipedia, Spin-1/2 - Wikipedia, the free encyclopedia, ' In quantum mechanics, spin is an intrinsic property of all elementary particles. All known fermions, the particles that constitute ordinary matter, have a spin of 1/2. The spin number describes how many symmetrical facets a particle has in one full rotation; a spin of 1/2 means that the particle must be rotated by two full turns (through 720°) before it has the same configuration as when it started. Particles having net spin 1/2 include the proton, neutron, electron, neutrino, and quarks. The dynamics of spin-1/2 objects cannot be accurately described using classical physics; they are among the simplest systems which require quantum mechanics to describe them. As such, the study of the behavior of spin-1/2 systems forms a central part of quantum mechanics.' back
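
The 720° claim can be verified directly. In the sketch below (standard quantum mechanics; the choice of the z-axis is arbitrary), rotating a spin-up state through one full turn multiplies it by −1, and only a second full turn restores it:

```python
import numpy as np

sigma_z = np.array([[1, 0], [0, -1]], dtype=complex)

def rotate(theta):
    """SU(2) rotation of a spin-1/2 state about z: exp(-i*theta*sigma_z/2)."""
    return np.cos(theta / 2) * np.eye(2) - 1j * np.sin(theta / 2) * sigma_z

up = np.array([1, 0], dtype=complex)       # spin-up spinor
print(rotate(2 * np.pi) @ up)   # one full turn (360°):  [-1, 0] — sign flipped
print(rotate(4 * np.pi) @ up)   # two full turns (720°): [ 1, 0] — back to start
```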

Steven Weinberg (2000), The Cosmological Constant Problems, 'Abstract. The old cosmological constant problem is to understand why the vacuum energy is so small; the new problem is to understand why it is comparable to the present mass density. Several approaches to these problems are reviewed. Quintessence does not help with either; anthropic considerations offer a possibility of solving both. In theories with a scalar field that takes random initial values, the anthropic principle may apply to the cosmological constant, but probably to nothing else.' back

String theory - Wikipedia, String theory - Wikipedia, the free encyclopedia, 'In physics, string theory is a theoretical framework in which the point-like particles of particle physics are replaced by one-dimensional objects called strings. String theory describes how these strings propagate through space and interact with each other. On distance scales larger than the string scale, a string looks just like an ordinary particle, with its mass, charge, and other properties determined by the vibrational state of the string. In string theory, one of the many vibrational states of the string corresponds to the graviton, a quantum mechanical particle that carries gravitational force. Thus string theory is a theory of quantum gravity. ' back

Supersymmetry - Wikipedia, Supersymmetry - Wikipedia, the free encyclopedia, ' In a supersymmetric theory the equations for force and the equations for matter are identical. In theoretical and mathematical physics, any theory with this property has the principle of supersymmetry (SUSY). Dozens of supersymmetric theories exist. Supersymmetry is a spacetime symmetry between two basic classes of particles: bosons, which have an integer-valued spin and follow Bose–Einstein statistics, and fermions, which have a half-integer-valued spin and follow Fermi-Dirac statistics. In supersymmetry, each particle from one class would have an associated particle in the other, known as its superpartner, the spin of which differs by a half-integer.' back

Thales - Wikipedia, Thales - Wikipedia, the free encyclopedia, 'Thales of Miletus (Greek: Θαλῆς (ὁ Μιλήσιος), Thalēs; c. 624 – c. 546 BC) was a pre-Socratic Greek philosopher from Miletus in Asia Minor, and one of the Seven Sages of Greece. Many, most notably Aristotle, regard him as the first philosopher in the Greek tradition. Aristotle reported Thales' hypothesis about the nature of matter – that the originating principle of nature was a single material substance: water.
According to Bertrand Russell, "Western philosophy begins with Thales." Thales attempted to explain natural phenomena without reference to mythology and was tremendously influential in this respect.' back

Theological virtues - Wikipedia, Theological virtues - Wikipedia, the free encyclopedia, ' Theological virtues are virtues associated in Christian theology and philosophy with salvation resulting from the grace of God. Virtues are traits or qualities which dispose one to conduct oneself in a morally good manner. Traditionally the theological virtues have been named Faith, Hope, and Charity (Love). The medieval Catholic philosopher Thomas Aquinas explained that these virtues are called theological virtues "first, because their object is God, inasmuch as they direct us aright to God: secondly, because they are infused in us by God alone: thirdly, because these virtues are not made known to us, save by Divine revelation, contained in Holy Writ".' back

Theory of Forms - Wikipedia, Theory of Forms - Wikipedia, the free encyclopedia, 'Plato's theory of Forms or theory of Ideas asserts that non-material abstract (but substantial) forms (or ideas), and not the material world of change known to us through sensation, possess the highest and most fundamental kind of reality. When used in this sense, the word form or idea is often capitalized. Plato speaks of these entities only through the characters (primarily Socrates) of his dialogues who sometimes suggest that these Forms are the only true objects of study that can provide us with genuine knowledge; thus even apart from the very controversial status of the theory, Plato's own views are much in doubt. Plato spoke of Forms in formulating a possible solution to the problem of universals.' back

Turing machine - Wikipedia, Turing machine - Wikipedia, the free encyclopedia, ' A Turing machine is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The "machine" was invented in 1936 by Alan Turing, who called it an "a-machine" (automatic machine). The Turing machine is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.' back
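
A Turing machine is simple enough to simulate in a few lines. The Python sketch below is illustrative only — the state names and the sample machine (which complements every bit on its tape and halts at the first blank) are invented for this example:

```python
# A minimal Turing machine simulator: a rule table maps
# (state, symbol) -> (symbol to write, head move, next state).
def run(table, tape, state="start", pos=0, blank="_"):
    cells = dict(enumerate(tape))
    while state != "halt":
        symbol = cells.get(pos, blank)
        write, move, state = table[(state, symbol)]
        cells[pos] = write
        pos += {"R": 1, "L": -1}[move]
    return "".join(cells[i] for i in sorted(cells))

flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run(flip, "10110"))  # -> 01001_
```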

Unitarity (physics) - Wikipedia, Unitarity (physics) - Wikipedia, the free encyclopedia, ' In quantum physics, unitarity means that the sum of probabilities of all possible outcomes of any event is always 1. This is necessary for the theory to be consistent. This implies that the operator which describes the progress of a physical system in time must be a unitary operator. This operator is e^{iHt} where H is the Hamiltonian of the system and t is [an increasing number, not necessarily time since we are in Hilbert space where there is no space-time].' back
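
Both claims — that e^{iHt} is unitary whenever H is Hermitian, and that a unitary operator preserves total probability — can be checked numerically. A sketch using a randomly generated Hermitian H (the matrix size, seed and t = 0.7 are arbitrary choices for illustration):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
H = (A + A.conj().T) / 2             # any Hermitian matrix will do
U = expm(1j * H * 0.7)               # e^{iHt} with t = 0.7

print(np.allclose(U.conj().T @ U, np.eye(4)))   # True: U is unitary

psi = np.array([1, 1j, 0, 2], dtype=complex)
psi /= np.linalg.norm(psi)           # normalized state
print(np.linalg.norm(U @ psi))       # ~1.0: probabilities still sum to 1
```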

Unmoved mover - Wikipedia, Unmoved mover - Wikipedia, the free encyclopedia, ' The unmoved mover (Ancient Greek: ὃ οὐ κινούμενον κινεῖ, lit. 'that which moves without being moved') or prime mover (Latin: primum movens) is a concept advanced by Aristotle as a primary cause (or first uncaused cause) or "mover" of all the motion in the universe. As is implicit in the name, the unmoved mover moves other things, but is not itself moved by any prior action. In Book 12 (Greek: Λ) of his Metaphysics, Aristotle describes the unmoved mover as being perfectly beautiful, indivisible, and contemplating only the perfect contemplation: self-contemplation. He equates this concept also with the active intellect. This Aristotelian concept had its roots in cosmological speculations of the earliest Greek pre-Socratic philosophers and became highly influential and widely drawn upon in medieval philosophy and theology. St. Thomas Aquinas, for example, elaborated on the unmoved mover in the Quinque viae.' back

Vacuum energy - Wikipedia, Vacuum energy - Wikipedia, the free encyclopedia, 'The effects of vacuum energy can be experimentally observed in various phenomena such as spontaneous emission, the Casimir effect and the Lamb shift, and are thought to influence the behavior of the Universe on cosmological scales. Using the upper limit of the cosmological constant, the vacuum energy of free space has been estimated to be 10^−9 joules . . . per cubic meter. However, in both quantum electrodynamics (QED) and stochastic electrodynamics (SED), consistency with the principle of Lorentz covariance and with the magnitude of the Planck constant requires it to have a much larger value of 10^113 joules per cubic meter. This huge discrepancy is known as the vacuum catastrophe.' back
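
The size of the mismatch follows directly from the two figures quoted above:

\[
\frac{10^{113}\ \mathrm{J/m^3}}{10^{-9}\ \mathrm{J/m^3}} = 10^{113-(-9)} = 10^{122},
\]

so the field-theoretic estimate exceeds the observed value by roughly 122 orders of magnitude, which is why the vacuum catastrophe is sometimes described as the worst prediction in theoretical physics.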

Wave function collapse - Wikipedia, Wave function collapse - Wikipedia, the free encyclopedia, 'In quantum mechanics, wave function collapse is said to occur when a wave function—initially in a superposition of several eigenstates—appears to reduce to a single eigenstate (by "observation"). It is the essence of measurement in quantum mechanics and connects the wave function with classical observables like position and momentum. Collapse is one of two processes by which quantum systems evolve in time; the other is continuous evolution via the Schrödinger equation.' back

Whitehead and Russell (1910), Principia Mathematica, Jacket: 'Principia Mathematica was first published in 1910-1913; this is the fifth impression of the second edition of 1925-7. The Principia has long been recognized as one of the intellectual landmarks of the century. It was the first book to show clearly the close relationship between mathematics and formal logic. Starting with a minimal number of axioms, Whitehead and Russell display the structure of both kinds of thought. No other book has had such an influence on the subsequent history of mathematical philosophy.' back

Wi-Fi - Wikipedia, Wi-Fi - Wikipedia, the free encyclopedia, ' Wi-Fi is a family of wireless network protocols, based on the IEEE 802.11 family of standards, which are commonly used for local area networking of devices and Internet access, allowing nearby digital devices to exchange data by radio waves. These are the most widely used computer networks in the world. Wi‑Fi is a trademark of the non-profit Wi-Fi Alliance, which restricts the use of the term Wi-Fi Certified to products that successfully complete interoperability certification testing. As of 2017, the Wi-Fi Alliance consisted of more than 800 companies from around the world. As of 2019, over 3.05 billion Wi-Fi enabled devices are shipped globally each year.' back

Xavier Waintal (2023), The Quantum House Of Cards, ' Quantum computers have been proposed to solve a number of important problems such as discovering new drugs, new catalysts for fertilizer production, breaking encryption protocols, optimizing financial portfolios, or implementing new artificial intelligence applications. Yet, to date, a simple task such as multiplying 3 by 5 is beyond existing quantum hardware. This article examines the difficulties that would need to be solved for quantum computers to live up to their promises. I discuss the whole stack of technologies that has been envisioned to build a quantum computer from the top layers (the actual algorithms and associated applications) down to the very bottom ones (the quantum hardware, its control electronics, cryogeny, etc.) while not forgetting the crucial intermediate layer of quantum error correction.' back

Yang-Mills theory - Wikipedia, Yang-Mills theory - Wikipedia, the free encyclopedia, 'Yang–Mills theory is a gauge theory based on the SU(N) group, or more generally any compact, reductive Lie algebra. Yang–Mills theory seeks to describe the behavior of elementary particles using these non-Abelian Lie groups and is at the core of the unification of the electromagnetic force and weak forces (i.e. U(1) × SU(2)) as well as quantum chromodynamics, the theory of the strong force (based on SU(3)). Thus it forms the basis of our understanding of the Standard Model of particle physics.' back
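
'Non-Abelian' simply means that the group's generators do not commute. For the SU(2) factor this is a one-line numerical check on the Pauli matrices, which generate the group (a standard textbook fact, included here only as illustration):

```python
import numpy as np

# Pauli matrices generate SU(2), the gauge group of the weak force.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

commutator = sx @ sy - sy @ sx
print(np.allclose(commutator, 2j * sz))   # True: [sx, sy] = 2i*sz != 0
# A non-zero commutator is what "non-Abelian" means: the order of two
# gauge transformations matters, which is why Yang-Mills fields interact
# with themselves, unlike the Abelian U(1) photon.
```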

https://www.cognitivecosmology.com is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2024 © Jeffrey Nicholls