This site documents the Divine Mind:
the Universe that enables us to create ourselves

page 15: Quantum amplitudes and logical processes are invisible

Table of contents

15.1: Limited resolution

15.2: Symmetry

15.3: One must act to be seen

15.4: Mathematical formalism and physical dynamics

15.5: Applied science

15.1: Limited resolution

We are continually amazed by the extraordinary properties of matter that are routinely being revealed by quantum mechanics and we hold out high hopes for quantum computation, but are somewhat stymied by the fact that the intimate details of quantum processes are invisible to us. Matter is no longer the passive reality imagined by the ancients, the absolute opposite to spirit, but it remains very hard to understand. Baggott & Heilbron (2024): Quantum Drama: From the Bohr-Einstein Debate to the Riddle of Entanglement

Understanding the hidden quantum information is a question that we grapple with for much of this book and which lies at the heart of what makes quantum mechanics a powerful tool for information processing. Nielsen & Chuang (2000): Quantum Computation and Quantum Information, page 16

The operations of quantum fields both inside and outside fundamental particles are invisible. We can to some degree see inside larger composite particles, like baryons and atoms. We see quarks and gluons in baryons, but since their binding energy is greater than their mass, we cannot isolate them as we can photons, electrons, protons, neutrons and their antiparticles, which are capable of independent existence.

We use fields to explain the behaviour of the observable particles. To do this we subject particles to all sorts of violence and see what results emerge. Our leading microscope, the Large Hadron Collider, can accelerate a proton until its apparent relativistic mass is about 7000 times greater than its rest mass: 6.5 TeV / 0.94 GeV. This enormous energy enables us to image the particles inside hadrons, but we still know very little about how they behave. Large Hadron Collider - Wikipedia

Much of what goes on at the smallest levels in the universe is invisible to us for three main reasons:

1. Limited resolution - many features are too small to be seen;
2. Symmetry means that there is very little or nothing to be seen;
3. A system must act to be seen - it must transmit information to us.

When we measure the diagonal of a unit square with a relatively precise instrument like a micrometer, we may find that it is 1.4142 units, a rational approximation to the real number √2. With a ruler graduated in units, on the other hand, the best we can say is that it is somewhere between 1 and 2. What we can and cannot see depends partly on the resolution of the instrument we use. Measurement uncertainty - Wikipedia
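The point can be sketched in a few lines of Python (an illustrative toy, not a model of any real instrument):

```python
import math

diagonal = math.sqrt(2)  # true length of the unit square's diagonal: irrational

# A micrometer resolving ten-thousandths of a unit reads a rational approximation:
print(round(diagonal, 4))  # 1.4142

# A ruler graduated in whole units can only bracket the value:
print(math.floor(diagonal), "< diagonal <", math.ceil(diagonal))
```

The graduation of the instrument, not the quantity itself, fixes how much of the real number we can see.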

The dynamic resolution of the universe is limited to one quantum of action. Although in the logical world of Hilbert space this may be a precisely defined event, we find that in spacetime measurements quantization, like a ruler marked in units, establishes a fixed degree of uncertainty proportional to Planck's constant. Precision in measurement of position is traded off against precision in the measurement of momentum, and precision in the measurement of time against precision in the measurement of energy, expressed by the equations:

Δ x × Δ p ≈ ℏ
Δ t × Δ E ≈ ℏ

Consequently an exact measurement of momentum (if it were possible) would give us no information about position; an exact measurement of energy (equally impossible) would require a stationary state lasting an infinite period of time. On the other hand, exact measurements of position require high momentum. This is why an electron microscope, using electrons with high momentum, can resolve distances invisible to a light microscope using low momentum photons of visible light.
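A rough numerical sketch of this trade-off in Python, using standard values for the constants; the figures are order-of-magnitude only:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J·s
H = 6.62607015e-34      # Planck constant, J·s

def position_uncertainty(delta_p):
    """Smallest resolvable position spread for a given momentum spread (kg·m/s)."""
    return HBAR / delta_p

# A visible-light photon: p = h / wavelength, with wavelength ~ 500 nm
p_photon = H / 500e-9

# An electron accelerated through 100 kV (non-relativistic estimate): p = sqrt(2meV)
p_electron = math.sqrt(2 * 9.109e-31 * 1.602e-19 * 1e5)

print(position_uncertainty(p_photon))    # ~8e-8 m: of the order of 100 nanometres
print(position_uncertainty(p_electron))  # ~6e-13 m: far below atomic dimensions
```

The electron's much larger momentum buys a far smaller resolvable distance, which is the working principle of the electron microscope.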

Similarly, if we wish to measure short intervals of time precisely, we need high energy. It may be that the smallest actual interval in the universe is the Planck time, about 5 × 10−44 second. This time may represent the period of the highest energy virtual particles in some quantum vacuum. A brief, rough calculation yields an approximate value for this energy. Planck's constant in metric units is approximately 10−33 Joule seconds, so to resolve a period of one Planck time we need roughly E = h / tp ≈ 10−33 / (5 × 10−44) ≈ 2 × 1010 Joules, some twenty billion Joules. Planck units - Wikipedia, Planck constant - Wikipedia

This energy is equivalent to a mass of roughly 0.2 milligram, the mass of about 1020 protons. Since the Large Hadron Collider can accelerate a proton to about 7000 times its rest mass, we can guess that measuring the Planck time would take an accelerator roughly 1016 times more powerful than the LHC. We might conclude that the Planck time is not experimentally accessible. It is too small to see with any machinery we could possibly devise, and so it is practically invisible, a figment of theoretical imagination. Brian Greene (1999): The Elegant Universe: Superstrings, hidden dimensions and the quest for the ultimate theory, chapter 5
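The Planck-time arithmetic can be checked in a few lines of Python. The constants are standard values, and the results are order-of-magnitude estimates that may differ slightly from the rounded figures quoted in the text:

```python
H = 6.626e-34        # Planck's constant, J·s
T_PLANCK = 5.4e-44   # Planck time, s
C = 3.0e8            # speed of light, m/s
M_PROTON = 1.67e-27  # proton mass, kg

energy = H / T_PLANCK      # energy needed to resolve one Planck time
mass = energy / C**2       # its mass equivalent via E = mc^2
protons = mass / M_PROTON  # how many protons weigh that much

print(energy)   # ~1e10 J: tens of billions of joules
print(mass)     # ~1e-7 kg: of the order of a tenth of a milligram
print(protons)  # ~1e20 protons
```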

This is an extreme example of the fact that, given the fixed quantum of action, high energy events are very quick. At the opposite end of the scale we find eternity, the zero of energy, where nothing happens. Conventional cosmology envisages a big bang where everything happens very quickly. Here we are more interested in the divine source of the Universe, an eternal entity of pure unstructured action identical to the god of Aristotle and Aquinas. Initial singularity - Wikipedia, Unmoved mover - Wikipedia

The machinery of quantum transformation through time is represented by the energy (or Hamiltonian) operator. This operator creates a time sequence of mappings of a Hilbert space onto itself. This mapping fulfills the hypotheses for mathematical fixed point theorems like that found by Brouwer: any continuous function f mapping a compact convex set into itself has a point x0 such that f (x0) = x0. The results of quantum observations may be understood as the fixed points predicted both by this theorem in general and by the more specific theorems of quantum mechanics. The most important of these is the eigenvalue equation which identifies operators that leave the phase of a quantum state unchanged.

T v = λ v

where T is a linear operator, v a vector (the eigenvector) and λ a scalar (the eigenvalue), real when T represents an observable. Brouwer fixed point theorem - Wikipedia, Eigenvalues and eigenvectors - Wikipedia
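A minimal numerical illustration (a hypothetical 2 × 2 example, not drawn from any particular physical system): the operator changes the direction of most vectors, but leaves the direction of its eigenvectors fixed, scaling them by λ.

```python
def matvec(T, v):
    """Apply a 2x2 matrix T to a 2-vector v."""
    return [T[0][0] * v[0] + T[0][1] * v[1],
            T[1][0] * v[0] + T[1][1] * v[1]]

T = [[2, 1],
     [1, 2]]  # a symmetric (Hermitian) operator, so its eigenvalues are real

v = [1, 1]   # eigenvector with eigenvalue 3
w = [1, -1]  # eigenvector with eigenvalue 1

print(matvec(T, v))  # [3, 3]  = 3 x v: the direction is a fixed point
print(matvec(T, w))  # [1, -1] = 1 x w: literally unchanged
```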

The evolution of quantum wave functions is invisible. We see only the fixed points revealed by measurement. We presume here that measurement is not something specific to physicists, but that elements of the Universe represented in Hilbert space continually interact with one another, that is measure one another, and communicate with one another through the interaction of the states in Hilbert space associated with particles in Minkowski space.

The creation and annihilation of particles is a reflection of the evolution of wave functions and also controls this evolution, so that we consider the two processes to be duals of one another, carrying the same information in different forms across the boundary between the quantum and classical worlds.

A similar level of uncertainty exists at all other scales, greater and smaller than the human individual. An uncertainty principle operates, for instance, between what we think, what we say and what we do. This arises because there are many different ways of expressing a thought and many different ways of putting words into action. The uncertainty principle is one of the most surprising discoveries of twentieth century physics, although it has always been hiding in plain sight, like the quantum: everything we see is a discrete object and made of discrete objects. The sea is a mass of salt and water molecules, a wide range of other molecules, and living creatures. Born rule - Wikipedia

Until the advent of quantum mechanics, physicists were generally inclined to believe that the world was deterministic. Some still attribute determinism to the invisible computation process that underlies quantum observations, but they now have to accept that even though this process may be deterministic, it does not determine the actual outcome of events. This uncertainty led Einstein to believe that quantum mechanics is an incomplete theory. All we can calculate are the relative frequencies of elements of the spectrum of outcomes that may result from a particular interaction between quantum states. Laplace's demon - Wikipedia, Einstein, Podolsky & Rosen (1935): Can the Quantum Mechanical Description of Physical Reality be Considered Complete?

On page 14: "Measurement": the interface between Hilbert and Minkowski spaces we have interpreted quantum measurement as a conversation between particles. We define a particle as any discrete entity, like a human person. We are aware that in everyday conversations with one another it is often impossible to predict where a conversation will lead since our minds, like quantum particles, have quasi-infinite degrees of freedom.

We know that the dynamic observable Universe is graduated or pixellated in units of Planck's constant, the measure of the smallest action, the minimum possible event. This is the limit of the resolution at which we can study action in the world.

What is not uncertain is the exact magnitude of a quantum of action, because it is coupled to Planck's constant. When an electron moves from one orbital to another in an atom it emits or absorbs a photon with one quantum of angular momentum, and the electron involved changes its angular momentum by one unit also, in the opposite direction, because angular momentum, or action, is conserved. Using quantum electrodynamics, we can sometimes compute the energy change, the eigenvalue λ in the equation above, with great precision.

There is no reason to suspect that this value is not an exact constant of nature. This is why we can construct atomic clocks accurate to one second in the age of the universe. Photon - Wikipedia, W. F. McGrew et al: Atomic clock performance enabling geodesy below the centimetre level
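For example, the SI second is defined by fixing the caesium-133 hyperfine transition frequency, and the energy of one clock photon follows from the standard relation E = hν. A quick sketch in Python:

```python
H = 6.62607015e-34     # Planck's constant, J·s (exact by definition since 2019)
NU_CS = 9_192_631_770  # caesium-133 hyperfine frequency, Hz (defines the SI second)

energy = H * NU_CS     # energy of one clock photon, via E = h * nu

print(energy)  # ~6.1e-24 J per photon
```

Because both constants in this product are exact by definition, the photon energy inherits their precision; this is the sense in which the fixed quantum of action anchors our best clocks.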

this page 15: toc

15.2: Symmetry

The second source of invisibility is symmetry. A perfect snowflake is symmetrical, with six identical 'arms'. Because they are identical we cannot tell which is which. If we look away and someone rotates the snowflake, we have no way of telling how far it was turned, or whether it was turned at all.
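This indistinguishability is easy to model. In the hypothetical sketch below the six arms are unit vectors in the complex plane; a 60° rotation maps the set onto itself, so no observation of the arms alone can reveal that the turn happened:

```python
import cmath
import math

# Six identical arms at 60-degree intervals, as points on the unit circle.
arms = [cmath.exp(1j * k * math.pi / 3) for k in range(6)]

def rotate(points, degrees):
    """Rotate every point about the origin by the given angle."""
    z = cmath.exp(1j * math.radians(degrees))
    return [p * z for p in points]

def same_set(a, b, tol=1e-9):
    """True if the two collections contain the same points, up to rounding."""
    return all(any(abs(p - q) < tol for q in b) for p in a)

print(same_set(rotate(arms, 60), arms))  # True: the rotation is invisible
print(same_set(rotate(arms, 30), arms))  # False: a 30-degree turn breaks the match
```

Marking one arm, like marking the spinning wheel, breaks the symmetry and makes the rotation observable.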

Traditional theology holds that God is completely mysterious to us and beyond our ken:

Having established the existence of something, the next question is how it exists in order that we learn its nature. But because we are unable to know the nature of God, but only what God is not, we are not able to study how God exists, but rather how God does not exist. . . .

We can show how God does not exist by removing from him inappropriate features such as composition, motion and other similar things. . . .

From this discussion, Aquinas concludes that God is completely simple, omnino simplex. Thomas Aquinas, Summa, I, 3: Introduction, Fundación Tomás de Aquino: Corpus Thomisticum: Summa I, 3 (Proemium)

This is the famous via negativa.

Symmetries are situations where changes produce no observable effects, like a perfectly smooth and balanced spinning wheel. To show that it is spinning we must break the symmetry by marking it in some way. Symmetries are the practical stationary boundaries of the dynamic Universe. We may picture this to a degree by imagining the string of a piano or guitar. When struck, the string vibrates at every point except at the ends, which are fixed by the structure of the instrument, and the nodes, which are fixed by the symmetrical motion of the overtones. In modern terms the absolute simplicity of God amounts to total symmetry: no matter how we look at this God, we see no detail. The initial symmetry at the foundation of this site, if we could see it, would look like this vision of God. Later we will identify this symmetry, which sees all structures identically as energy, with gravitation: page 17: Gravitation and quantum theory—in the beginning. Symmetry - Wikipedia, Dwight Neuenschwander (2011): Emmy Noether's Wonderful Theorem

When we consider the Universe as divine, we can imagine the symmetries discovered by physics as the boundaries of the divinity. From a logical point of view, the dynamics of the Universe is consistent. The boundaries of the dynamics are the points beyond which it would become inconsistent, that is non-existent: outside the Universe.

All our experience is experience of God, and all our experiences are in effect measurements of God, that is events that we see as fixed points in the divine dynamics. We can learn a lot more about the natural God than the traditional theologies can learn about their imagined Gods. The natural God is only partially visible, but since we are continually in contact with it, we have a good chance of learning how it works. We can know nothing about completely invisible Gods but what their prophets have chosen to tell us.

Nevertheless true knowledge of God is necessary for survival. In a divine world everything can be learnt from the world itself without any special revelation. The principal advantage of religious fictions, if they are widely believed, is that they form a basis for cooperation which is the most powerful force for survival. The old dictum says united we stand, divided we fall. Our religions unite us, but they may also divide us from groups who follow different religions. These two tendencies working together often form a ground for war. Extremists at all scales from the Papacy, Xi and Putin to individuals who consider their own views to be sovereign despite evidence to the contrary, have the effect of taking the world out of the region of symmetry and consistency into fatal inconsistencies. Polanyi (1966, 2009): The Tacit Dimension

One aim of this project is to enable us to develop a scientific theology based on observations of the divine reality in which we live. This will establish a symmetry in theology just as biology establishes a symmetry in health care. Apart from minor individual differences, we all share basically the same anatomy and physiology, so that a doctor can approach every patient with a basic knowledge of the probable cause of a particular set of symptoms. A scientific theology would play a similar role in care of the soul, since we all share the same human spirit by common descent from our ancestors, whose lives all go back to the last universal common ancestor. Last universal common ancestor - Wikipedia

Although we are all unique individuals we share a fundamental genetic symmetry, which is reflected in the fact that there is no specific restraint on our choosing partners for reproduction anywhere in the world. This is the symmetry which we share with the perfect wheel. Because of our individuality we can all get to know one another as unique, and we have constructed many social distinctions which put us into specific categories of race, language, colour and culture, which can often be divisive and the source of social problems. These differences have no ground in our shared human symmetry, any more than they are grounds for the medical profession to differentiate health care on social grounds. Audrey Smedley & Brian D. Smedley (2005): Race as biology is fiction, racism as a social problem is real: Anthropological and historical perspectives on the social construction of race

this page 15: toc

15.3: One must act to be seen

The third source of invisibility is that we can only see something if it is actively emitting a signal that we can receive. Attention seeking, from childhood to superstardom, is a consistent feature of human behaviour. We signal because we want to be seen, to be acknowledged, to have a voice in our society.

Here we model all communication in the universe with a computer network. We cannot tell what a computer is doing unless it communicates with us, and communication is itself a computation. We cannot therefore see every move that a computer makes. It would have to stop what it was doing to explain itself after every operation and since this explanation is also a computation it would also be required to explain the operations of explaining itself, creating an endless recursion which would stop it from getting anywhere. For this reason we see only the results of halted processes. In effect, we can only read stopped clocks.

We therefore assume that the observable fixed points in the Universe are the states of halted computers and the invisible dynamics of the Universe are executed by invisible computers, either quantum or classical. We suspect the presence of deterministic digital computation because of the precision with which nature determines the eigenvalues of various observations. Bastin & Kilmister (1995): Combinatorial Physics

This phenomenon appears at all scales. If you ask someone what they are doing, particularly if it requires close attention, they must stop what they are doing to explain themselves. Ultimately the inner workings of any system must be partially hidden from us, a situation that prevails in both human relationships and quantum mechanics. This situation lies at the heart of the difference between classical and quantum physics and is closely related to the measurement problem discussed on page 14: Measurement: the interface between Hilbert and Minkowski spaces.

Hilbert space in physics plays a role analogous to that of the mind or brain in human conversations. In both the human and the physical world there is a large number of things that could be said. At each moment only one can be said. This is what is meant by the "collapse of the wavefunction". It is also a source of difficulty in the interpretation of Ludwig Wittgenstein's multifaceted philosophy. The physical system transmits just one of a possible infinity of messages. When we are talking to one another we have a headful of things to say and consciously or unconsciously choose from these possibilities. Part of our evolutionary heritage is the power of deception, which plays an important role in survival. This is stock in trade for spies and confidence tricksters, but even with the best will in the world it is very hard for two people to share a precise understanding of what they are thinking. Anat Biletski and Anat Matar: Ludwig Wittgenstein (Stanford Encyclopedia of Philosophy)

There is, therefore, a logical limit to our knowledge of the world and each other which is implicit in the scientific method of hypothesis and testing. We make hypotheses about what is happening in regions where we cannot see, and test them by exploring their consequences to see if they correctly predict phenomena that we can see. A successful test does not necessarily guarantee a correct hypothesis but a failed test tells us that the hypothesis must be revised. The arts of science, fiction and evolution all explore this space, and this creative exploration is ultimately made possible by the uncertainty in observation of invisible features of ourselves and our world, that is the uncertainty in measurement. If it weren't for the variety provided by this uncertainty the world could not have evolved from the initial singularity to its present state. Jules and Jim - Wikipedia, Adam Sisman (2015): John le Carré: The Biography

It took physicists nearly thirty years, from 1900 to the late 1920s, to bring non-relativistic quantum mechanics to its definitive form. An important step forward was made by Heisenberg who pointed out that our only task is to explain the observable phenomena. We need not be bound by the classical historical picture of how the world works but are free to explore all possibilities to find satisfactory explanations. Werner Heisenberg (1925): Quantum-theoretical re-interpretation of kinematic and mechanical relations

this page 15: toc

15.4: Mathematical formalism and physical dynamics

The clock in a computer has two complementary roles. First, it determines the rate of processing in cycles per second, typically billions per second in modern machines. In the physical world the rate of processing varies from zero (ie eternity) to an upper bound which represents the total kinetic energy of the Universe.

Second, the clock hides the actual physical kinematics of logical processing. A clock signal has two 'edges', which we may call up and down. The up edge sets the process in motion: transistors switch and voltages change. The down edge shuts the process down, so everything halts and the machine comes to rest in a stable stationary state until the next up edge. The effect of this is to make the computer behave like a purely formal logical machine. This is similar to the operation of old mechanical motion picture cameras and projectors. A shutter comes down, blocking the light. The film is stepped forward one frame. The shutter opens and the camera takes a snap or the projector puts an image on the screen. The shutter closes, the film advances, and the cycle repeats. Turing machine - Wikipedia
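The idea can be caricatured in a few lines of Python (a toy model, not a hardware description): an observer who samples only at the clock ticks sees a clean sequence of stable states and nothing of the switching in between.

```python
def clocked_run(state, step, cycles):
    """Run a hidden process for a number of clock cycles.

    Between edges the machine is in flight and invisible; we record only
    the stable state that exists when each cycle has halted.
    """
    observed = []
    for _ in range(cycles):
        state = step(state)     # up edge: the invisible switching happens here
        observed.append(state)  # down edge: the machine rests; this is all we see
    return observed

# A trivial counter stands in for the hidden dynamics:
print(clocked_run(0, lambda s: s + 1, 5))  # [1, 2, 3, 4, 5]
```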

It has been found that the illusion of movement is almost perfect for most people if movies are filmed and projected at 24 frames per second. Other animals, like flies which have much more detailed time resolution in their vision, might see a motion picture as a sequence of still images.

The clock hides the physics to reveal the formal logic. The inverse process, performed by mathematical physicists, is to try to fit formal logical and mathematical ideas to the world. This leads to a number of extreme positions which are often implicitly criticized in these pages. Carlo Rovelli (2017): Reality is Not What it Seems: The Journey to Quantum Gravity

this page 15: toc

15.5: Applied science

The macroscopic shape of crystals gives us clues to their atomic arrangements, which are studied in detail by x-ray diffraction. Initial information about the electronic structure of atoms came from spectroscopy, which yielded comprehensive information about energy levels but revealed few clues to spatial organization. Rutherford studied the scattering of alpha particles by gold foil and concluded that the gold atom had a heavy central element many thousands of times smaller than the atom itself. Rutherford model - Wikipedia, X-ray crystallography - Wikipedia

Niels Bohr produced a structural model of atomic electrons which accounts quite closely for the spectrum of the hydrogen atom. This model required rejection of the classical belief that moving electrons would radiate energy so that the atom would collapse. Instead it suggested that the electrons move in orbits whose angular momentum is measured in integral multiples of Planck's constant, and that the energy changes that occur when an electron moves from one orbit to another account for the atomic spectrum. It took twenty more years for quantum mechanics to develop sufficiently to explain the structure of atoms with many electrons. Bohr model - Wikipedia

Initially quantum mechanics was driven by the study of atomic spectra which are generated by the electronic structure of atoms. The study of the nucleus began with the discovery of radioactivity by Henri Becquerel in 1896. The energy of nuclear events is typically millions of times greater than that of electronic events. Nuclear physics led to a steady stream of discoveries of new particles. Until the 1930s the nucleus was thought to comprise protons and electrons, but in 1932 Chadwick discovered the neutron which became a very useful tool for studying nuclear reactions. Nuclear Physics - Wikipedia

Until the 1960s the known massive elementary particles were generally considered to have no internal structure, and many of them, like the electron, were supposed to be pointlike, having zero size. High energy experiments began to suggest that the proton, neutron and other heavy massive particles (hadrons) have internal structure, which is now explained in terms of simpler quarks and gluons. Hadron - Wikipedia

Almost all our information about fundamental particles is collected by scattering them off one another in particle accelerators. Our knowledge of cells began with light microscopes, which use photons of visible light whose energy is a few electron volts. Electron microscopes operate with electrons accelerated through high voltages to give them much higher momentum and much shorter wavelength. This enables them to magnify small structures up to 10 million times, enough to make an atom appear about a millimetre in diameter. Electron microscope - Wikipedia
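The connection between accelerating voltage and resolving power follows from the de Broglie relation λ = h / p. A rough non-relativistic sketch in Python (at 100 kV the relativistic correction already matters, so the figure is approximate):

```python
import math

H = 6.626e-34         # Planck's constant, J·s
M_E = 9.109e-31       # electron mass, kg
E_CHARGE = 1.602e-19  # elementary charge, C

def electron_wavelength(volts):
    """De Broglie wavelength of an electron accelerated through `volts` (non-relativistic)."""
    p = math.sqrt(2 * M_E * E_CHARGE * volts)  # kinetic energy eV = p^2 / 2m
    return H / p

print(electron_wavelength(100_000))  # ~4e-12 m, against ~5e-7 m for visible light
```

A wavelength five orders of magnitude shorter than visible light is what lets the electron microscope resolve structures far below the reach of optical instruments.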

This high resolution is a consequence of the uncertainty principle quoted in section 15.1 above: high momentum implies short wavelength and high resolution. Particle accelerators function as microscopes. Electrons accelerated to 50 billion electron volts by the Stanford Linear Accelerator revealed the inner structure of hadrons. This work led to quantum chromodynamics, which will be treated on page 25: Physical theology. Richard E. Taylor (1990): Nobel Lecture: I. Deep Inelastic Scattering: The Early Years, Deep inelastic scattering - Wikipedia

this page 15: toc

(Revised Friday 9 August 2024)

Back to site toc

Copyright:

You may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

Notes and references

Further reading

Books

Baggott (2024), Jim, and John L. Heilbron, Quantum Drama: From the Bohr-Einstein Debate to the Riddle of Entanglement, Oxford UP 2024 'The definitive account of the great Bohr-Einstein debate and its continuing legacy. In 1927, Niels Bohr and Albert Einstein began a debate about the interpretation and meaning of the new quantum theory. This would become one of the most famous debates in the history of science. At stake were an understanding of the purpose, and defense of the integrity, of science. What (if any) limits should we place on our expectations for what science can tell us about physical reality? Our protagonists slowly disappeared from the vanguard of physics, as its centre of gravity shifted from a war-ravaged Continental Europe to a bold, pragmatic, post-war America. What Einstein and Bohr had considered to be matters of the utmost importance were now set aside. Their debate was regarded either as settled in Bohr's favour or as superfluous to real physics. But the debate was not resolved. The problems of interpretation and meaning persisted, at least in the minds of a few stubborn physicists, such as David Bohm and John Bell, who refused to stop asking awkward questions. The Bohr-Einstein debate was rejoined, now with a new set of protagonists, on a small scale at first. Through their efforts, the debate was revealed to be about physics after all. Their questions did indeed have answers that could be found in a laboratory. As quantum entanglement became a real physical phenomenon, whole new disciplines were established, such as quantum computing, teleportation, and cryptography. The efforts of the experimentalists were rewarded with shares in the 2022 Nobel prize in physics. As Quantum Drama reveals, science owes a large debt to those who kept the discussions going against the apathy and indifference of most physicists before definitive experimental inquiries became possible.
Although experiment moved the Bohr-Einstein debate to a new level and drew many into foundational research, it has by no means removed or resolved the fundamental question. There will be no Nobel prize for an answer. That will not shut off discussion. Our Drama will continue beyond our telling of it and is unlikely to reach its final scene before science ceases or the world ends. Quantum mechanics is an extraordinarily successful scientific theory. It is also completely baffling. From the moment of its inception, its founders struggled to understand its meaning. This struggle was most famously encapsulated in the debate between Niels Bohr and Albert Einstein; Quantum Drama tells the story of their engagement and its legacy.' 

Bastin (1995), Ted, and C W Kilmister, Combinatorial Physics, World Scientific 1995 About this book (World Scientific) 'The authors aim to reinstate a spirit of philosophical enquiry in physics. They abandon the intuitive continuum concepts and build up constructively a combinatorial mathematics of process. This radical change alone makes it possible to calculate the coupling constants of the fundamental fields which — via high energy scattering — are the bridge from the combinatorial world into dynamics. The untenable distinction between what is ‘observed’, or measured, and what is not, upon which current quantum theory is based, is not needed. If we are to speak of mind, this has to be present — albeit in primitive form — at the most basic level, and not to be dragged in at one arbitrary point to avoid the difficulties about quantum observation. There is a growing literature on information-theoretic models for physics, but hitherto the two disciplines have gone in parallel. In this book they interact vitally.' 

Crotty (2016), Robert, Jesus, His Mother, Her Sister Mary and Mary Magdalene: The Gnostic Background to the Gospel of John, David Lovell Publishing 2016 ' The Gospel of John has always been a difficult book to interpret. The differences between John and the Synoptics have always been a stumbling block for students. . . . This book takes up these problems. It demonstrates that the present text has followed a long and tortured journey from Jewish Gnosticism to a Christian Gnostic compendium, later extensively edited by Roman Christianity. The result is a surprising re-reading. The book throws light on a different Jesus to the canonical one (he is not human). . . The Roman Christians disagreed on all these interpretations and heavily edited the gospel in order to silence its Gnostic statement. This book will show how the gospel of John should be read at the present time to take account of this complex tradition history.' 

Greene (1999), Brian, The Elegant Universe: Superstrings, hidden dimensions and the quest for the ultimate theory, W W Norton and Company 1999 Jacket: 'Brian Greene has come forth with a beautifully crafted account of string theory - a theory that appears to be a most promising way station to an ultimate theory of everything. His book gives a clear, simple, yet masterful account that makes a complex theory very accessible to nonscientists but is also a delightful read for the professional.' David M Lee 

Neuenschwander (2011), Dwight E, Emmy Noether's Wonderful Theorem, Johns Hopkins University Press 2011 Jacket: A beautiful piece of mathematics, Noether's theorem touches on every aspect of physics. Emmy Noether proved her theorem in 1915 and published it in 1918. This profound concept demonstrates the connection between conservation laws and symmetries. For instance, the theorem shows that a system invariant under translations of time, space or rotation will obey the laws of conservation of energy, linear momentum or angular momentum respectively. This exciting result offers a rich unifying principle for all of physics.' 

Nielsen (2000), Michael A, and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2000 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
Amazon
  back

Polanyi (1966, 2009), Michael, and Amartya Sen (foreword), The Tacit Dimension, University Of Chicago Press 1966, 2009 ' “I shall reconsider human knowledge by starting from the fact that we can know more than we can tell,” writes Michael Polanyi, whose work paved the way for the likes of Thomas Kuhn and Karl Popper. The Tacit Dimension argues that tacit knowledge—tradition, inherited practices, implied values, and prejudgments—is a crucial part of scientific knowledge. Back in print for a new generation of students and scholars, this volume challenges the assumption that skepticism, rather than established belief, lies at the heart of scientific discovery.' 
Amazon
  back

Rovelli (2017), Carlo, and Simon Carnell & Erica Segre (Translators), Reality is Not What it Seems: The Journey to Quantum Gravity, Allen Lane Penguin 2017 ' Be prepared for your intellectual foundations to be vaporized . . . Carlo Rovelli will melt your synapses with this exploration of physical reality and what the universe is formed of at the very deepest level . . . Quantum gravity is so new that there aren't many popular books about it. You couldn't be in better hands than Rovelli, a world expert.' Tara Shears, The Times Higher Education 
Amazon
  back

Sisman (2015), Adam, John le Carré: The Biography, Bloomsbury 2015 ' Written with exclusive access to David Cornwell himself, to his private archive and to the most important people in his life – family, friends, enemies, intelligence ex-colleagues and ex-lovers – and featuring a wealth of previously unseen photographic material, Adam Sisman's extraordinarily insightful and constantly revealing biography brings in from the cold a man whose own life has been as complex and confounding and filled with treachery as any of his novels. "I'm a liar," Cornwell has written. "Born to lying, bred to it, trained to it by an industry that lies for a living, practised in it as a novelist." ' 
Amazon
  back

Links

Anat Biletzki and Anat Matar, Ludwig Wittgenstein (Stanford Encyclopedia of Philosophy), 'Considered by some to be the greatest philosopher of the 20th century, Ludwig Wittgenstein played a central, if controversial, role in 20th-century analytic philosophy. He continues to influence current philosophical thought in topics as diverse as logic and language, perception and intention, ethics and religion, aesthetics and culture.' back

Anatomy - Wikipedia, Anatomy - Wikipedia, the free encyclopedia, ' Anatomy (from Ancient Greek ἀνατομή (anatomḗ) 'dissection') is the branch of biology concerned with the study of the structure of organisms and their parts. Anatomy is a branch of natural science which deals with the structural organization of living things. It is an old science, having its beginnings in prehistoric times. Anatomy is inherently tied to developmental biology, embryology, comparative anatomy, evolutionary biology, and phylogeny, as these are the processes by which anatomy is generated, both over immediate and long-term timescales.' back

Audrey Smedley & Brian D. Smedley (2005), Race as biology is fiction, racism as a social problem is real: Anthropological and historical perspectives on the social construction of race, ' Racialized science seeks to explain human population differences in health, intelligence, education, and wealth as the consequence of immutable, biologically based differences between "racial" groups. Recent advances in the sequencing of the human genome and in an understanding of biological correlates of behavior have fueled racialized science, despite evidence that racial groups are not genetically discrete, reliably measured, or scientifically meaningful. Yet even these counterarguments often fail to take into account the origin and history of the idea of race. This article reviews the origins of the concept of race, placing the contemporary discussion of racial differences in an anthropological and historical context.' back

Bohr model - Wikipedia, Bohr model - Wikipedia, the free encyclopedia, 'In atomic physics, the Rutherford–Bohr model or Bohr model, introduced by Niels Bohr in 1913, depicts the atom as a small, positively charged nucleus surrounded by electrons that travel in circular orbits around the nucleus—similar in structure to the solar system, but with attraction provided by electrostatic forces rather than gravity.' back

Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, ' The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λᵢ will equal ⟨ψ|Pᵢ|ψ⟩, where Pᵢ is the projection onto the eigenspace of A corresponding to λᵢ.' back
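The quoted rule ⟨ψ|Pᵢ|ψ⟩ can be checked numerically. The following is a minimal plain-Python sketch, not part of the cited entry; the qubit state and projectors are assumed example values.

```python
import math

def born_probability(psi, projector):
    """Return <psi|P|psi> for a normalized state vector psi and projector matrix P."""
    n = len(psi)
    # Compute P|psi>.
    p_psi = [sum(projector[r][c] * psi[c] for c in range(n)) for r in range(n)]
    # Compute <psi|P|psi>; for a projector this is real and non-negative.
    amplitude = sum(psi[r].conjugate() * p_psi[r] for r in range(n))
    return amplitude.real

# Example: the qubit state (|0> + |1>)/sqrt(2) measured in the computational basis.
psi = [1 / math.sqrt(2), 1 / math.sqrt(2)]
P0 = [[1, 0], [0, 0]]  # projector onto |0>
P1 = [[0, 0], [0, 1]]  # projector onto |1>
# Each outcome has probability 1/2, and the two probabilities sum to 1.
```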

Brouwer fixed point theorem - Wikipedia, Brouwer fixed point theorem - Wikipedia, the free encyclopedia, 'Among hundreds of fixed-point theorems, Brouwer's is particularly well known, due in part to its use across numerous fields of mathematics. In its original field, this result is one of the key theorems characterizing the topology of Euclidean spaces, along with the Jordan curve theorem, the hairy ball theorem, the invariance of dimension and the Borsuk–Ulam theorem. This gives it a place among the fundamental theorems of topology.' back

Deep inelastic scattering - Wikipedia, Deep inelastic scattering - Wikipedia, the free encyclopedia, ' Deep inelastic scattering is the name given to a process used to probe the insides of hadrons (particularly the baryons, such as protons and neutrons), using electrons, muons and neutrinos. It provided the first convincing evidence of the reality of quarks, which up until that point had been considered by many to be a purely mathematical phenomenon. . . . Henry Way Kendall, Jerome Isaac Friedman and Richard E. Taylor were joint recipients of the Nobel Prize of 1990 "for their pioneering investigations concerning deep inelastic scattering of electrons on protons and bound neutrons, which have been of essential importance for the development of the quark model in particle physics".' back

Discovery of the neutron - Wikipedia, Discovery of the neutron - Wikipedia, the free encyclopedia, ' The essential nature of the atomic nucleus was established with the discovery of the neutron by James Chadwick in 1932 and the determination that it was a new elementary particle, distinct from the proton. The uncharged neutron was immediately exploited as a new means to probe nuclear structure, leading to such discoveries as the creation of new radioactive elements by neutron irradiation (1934) and the fission of uranium atoms by neutrons (1938). The discovery of fission led to the creation of both nuclear power and nuclear weapons by the end of World War II. Both the proton and the neutron were presumed to be elementary particles until the 1960s, when they were determined to be composite particles built from quarks.' back

Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, ' In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.' back
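The scaling behaviour described in this entry can be shown with a tiny plain-Python sketch; the 2×2 matrix is an assumed example, not taken from the cited article.

```python
def apply(matrix, vector):
    """Apply a 2x2 matrix to a 2-vector (plain lists, no libraries)."""
    return [matrix[0][0] * vector[0] + matrix[0][1] * vector[1],
            matrix[1][0] * vector[0] + matrix[1][1] * vector[1]]

# This matrix stretches the x-axis by 2 and the y-axis by 3, so the basis
# vectors are eigenvectors with eigenvalues 2 and 3 respectively: each is
# scaled by its eigenvalue and not rotated.
A = [[2, 0], [0, 3]]
# apply(A, [1, 0]) gives [2, 0]; apply(A, [0, 1]) gives [0, 3].
```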

Einstein, Podolsky & Rosen (1935), Can the Quantum Mechanical Description of Physical Reality be Considered Complete?, A PDF of the classic paper. 'In a complete theory there is an element corresponding to each element of reality. A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other. Then either (1) the description of reality given by the wave function in quantum mechanics is not complete or (2) these two quantities cannot have simultaneous reality. Consideration of the problem of making predictions concerning a system on the basis of measurements made on another system that had previously interacted with it leads to the result that if (1) is false then (2) is also false. One is thus led to conclude that the description of reality given by the wave function is not complete.' back

Electron microscope - Wikipedia, Electron microscope - Wikipedia, the free encyclopedia, ' An electron microscope is a microscope that uses a beam of accelerated electrons as a source of illumination. As the wavelength of an electron can be up to 100,000 times shorter than that of visible light photons, electron microscopes have a higher resolving power than light microscopes and can reveal the structure of smaller objects. A scanning transmission electron microscope has achieved better than 50 pm resolution in annular dark-field imaging mode and magnifications of up to about 10,000,000× whereas most light microscopes are limited by diffraction to about 200 nm resolution and useful magnifications below 2000×.' back

Fundación Tomás de Aquino, Corpus Thomisticum: Summa I, 3 (Proemium), ' Question 3, Prologue: Once we know that something exists, it remains to inquire how it exists, so that we may know what it is. But because we cannot know of God what He is, but only what He is not, we cannot consider how God is, but rather how He is not. First, then, we must consider how He is not; second, how He is known by us; third, how He is named. Now it can be shown how God is not by removing from Him whatever does not befit Him, such as composition, motion, and the like. First, therefore, we inquire into His simplicity, by which composition is removed from Him. And because among corporeal things the simple are imperfect and are parts, we inquire second into His perfection; third, into His infinity; fourth, into His immutability; fifth, into His unity. Concerning the first there are eight points of inquiry: first, whether God is a body; second, whether there is in Him composition of form and matter; third, whether there is in Him composition of quiddity, or essence, or nature, and subject; fourth, whether there is in Him composition of essence and existence; fifth, whether there is in Him composition of genus and difference; sixth, whether there is in Him composition of subject and accident; seventh, whether He is composite in any way, or wholly simple; eighth, whether He enters into composition with other things.' back

Hadron - Wikipedia, Hadron - Wikipedia, the free encyclopedia, ' In particle physics, a hadron is a composite subatomic particle made of two or more quarks held together by the strong interaction. They are analogous to molecules that are held together by the electric force. Most of the mass of ordinary matter comes from two hadrons: the proton and the neutron, while most of the mass of the protons and neutrons is in turn due to the binding energy of their constituent quarks, due to the strong force.' back

Initial singularity - Wikipedia, Initial singularity - Wikipedia, the free encyclopedia, ' The initial singularity is a singularity predicted by some models of the Big Bang theory to have existed before the Big Bang. The instant immediately following the initial singularity is part of the Planck epoch, the earliest period of time in the history of our universe. The use of only general relativity to predict what happened in the beginnings of the Universe has been heavily criticized, as quantum mechanics becomes a significant factor in the high-energy environment of the earliest Universe, and general relativity on its own fails to make accurate predictions. Although there is no direct evidence for a singularity of infinite density, the cosmic microwave background is evidence that the universe expanded from a very hot, dense state. ' back

Jules and Jim - Wikipedia, Jules and Jim - Wikipedia, the free encyclopedia, ' Jules and Jim (French: Jules et Jim) is a 1962 French New Wave romantic drama film, directed, produced and written by François Truffaut. Set before and after World War I, it describes a tragic love triangle involving French Bohemian Jim (Henri Serre), his shy Austrian friend Jules (Oskar Werner), and Jules's girlfriend and later wife Catherine (Jeanne Moreau). The film is based on Henri-Pierre Roché's 1953 semi-autobiographical novel describing his relationship with young writer Franz Hessel and Helen Grund, whom Hessel married. . . . The film won the 1962 Grand Prix of French film prizes, the Étoile de Cristal, and Jeanne Moreau won that year's prize for best actress. The film ranked 46 in Empire magazine's "The 100 Best Films Of World Cinema" in 2010.' back

Laplace's demon - Wikipedia, Laplace's demon - Wikipedia, the free encyclopedia, ' We may regard the present state of the universe as the effect of its past and the cause of its future. An intellect which at a certain moment would know all forces that set nature in motion, and all positions of all items of which nature is composed, if this intellect were also vast enough to submit these data to analysis, it would embrace in a single formula the movements of the greatest bodies of the universe and those of the tiniest atom; for such an intellect nothing would be uncertain and the future just like the past would be present before its eyes.' A Philosophical Essay on Probabilities, Essai philosophique sur les probabilités, introduction to the second edition of Théorie analytique des probabilités, based on a lecture given in 1794. back

Large Hadron Collider - Wikipedia, Large Hadron Collider - Wikipedia, the free encyclopedia, ' The Large Hadron Collider (LHC) is the world's largest and highest-energy particle collider. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories, as well as more than 100 countries. It lies in a tunnel 27 kilometres (17 mi) in circumference and as deep as 175 metres (574 ft) beneath the France–Switzerland border near Geneva. The first collisions were achieved in 2010 at an energy of 3.5 teraelectronvolts (TeV) per beam, about four times the previous world record. After upgrades it reached 6.5 TeV per beam (13 TeV total collision energy, the present world record). At the end of 2018, it was shut down for three years for further upgrades.' back

Measurement uncertainty - Wikipedia, Measurement uncertainty - Wikipedia, the free encyclopedia, 'In metrology, measurement uncertainty is a non-negative parameter characterizing the dispersion of the values attributed to a measured quantity. The uncertainty has a probabilistic basis and reflects incomplete knowledge of the quantity. All measurements are subject to uncertainty and a measured value is only complete if it is accompanied by a statement of the associated uncertainty.' back

Nuclear Physics - Wikipedia, Nuclear Physics - Wikipedia, the free encyclopedia, ' Nuclear physics is the field of physics that studies atomic nuclei and their constituents and interactions, in addition to the study of other forms of nuclear matter. . . . Particle physics evolved out of nuclear physics and the two fields are typically taught in close association. Nuclear astrophysics, the application of nuclear physics to astrophysics, is crucial in explaining the inner workings of stars and the origin of the chemical elements.' back

Photon - Wikipedia, Photon - Wikipedia, the free encyclopedia, ' A photon (from Ancient Greek φῶς, φωτός (phôs, phōtós) 'light') is an elementary particle that is a quantum of the electromagnetic field, including electromagnetic radiation such as light and radio waves, and the force carrier for the electromagnetic force. Photons are massless, so they always move at the speed of light in vacuum, 299 792 458 m/s. . . . The photon belongs to the class of bosons.' back

Planck constant - Wikipedia, Planck constant - Wikipedia, the free encyclopedia, ' Since energy and mass are equivalent, the Planck constant also relates mass to frequency. By 2017, the Planck constant had been measured with sufficient accuracy in terms of the SI base units that it was central to replacing the metal cylinder, called the International Prototype of the Kilogram (IPK), that had defined the kilogram since 1889. . . . For this new definition of the kilogram, the Planck constant, as defined by the ISO standard, was set to 6.626 070 15 × 10⁻³⁴ J⋅s exactly.' back
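The exact value quoted in this entry is what relates photon frequency to energy via E = hf. A small sketch, where the frequency is an assumed example value rather than anything from the cited article:

```python
h = 6.62607015e-34  # J*s, the exact value fixed by the 2019 SI redefinition
f = 5.0e14          # Hz, an assumed example frequency (roughly green light)
E = h * f           # energy of one photon, in joules
# E is about 3.3e-19 J, illustrating how small a single quantum of light is.
```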

Planck units - Wikipedia, Planck units - Wikipedia, the free encyclopedia, ' In particle physics and physical cosmology, Planck units are a set of units of measurement defined exclusively in terms of four universal physical constants, in such a manner that these physical constants take on the numerical value of 1 when expressed in terms of these units. Originally proposed in 1899 by German physicist Max Planck, these units are also known as natural units because the origin of their definition comes only from properties of nature and not from any human construct. Planck units are only one system of several systems of natural units, but Planck units are not based on properties of any prototype object or particle (that would be arbitrarily chosen), but rather on only the properties of free space.' back

Richard E. Taylor (1990), Nobel Lecture: I. Deep Inelastic Scattering: The Early Years, ' Soon after the 1990 Nobel Prize in Physics was announced Henry Kendall, Jerry Friedman and I agreed that we would each describe a part of the deep inelastic experiments in our Nobel lectures. The division we agreed upon was roughly chronological. I would cover the early times, describing some of the work that led to the establishment of the Stanford Linear Accelerator Center where the experiments were performed, followed by a brief account of the construction of the experimental apparatus used in the experiments and the commissioning of the spectrometer facility in early elastic scattering experiments at the Center.' back

Rutherford model - Wikipedia, Rutherford model - Wikipedia, the free encyclopedia, ' The Rutherford model was devised by the New Zealand-born physicist Ernest Rutherford to describe an atom. Rutherford directed the Geiger–Marsden experiment in 1909, which suggested, upon Rutherford's 1911 analysis, that J. J. Thomson's plum pudding model of the atom was incorrect. Rutherford's new model for the atom, based on the experimental results, contained new features of a relatively high central charge concentrated into a very small volume in comparison to the rest of the atom and with this central volume also containing the bulk of the atomic mass of the atom. This region would be known as the "nucleus" of the atom.' back

Symmetry - Wikipedia, Symmetry - Wikipedia, the free encyclopedia, ' Symmetry (from Ancient Greek: συμμετρία symmetria "agreement in dimensions, due proportion, arrangement") in everyday language refers to a sense of harmonious and beautiful proportion and balance. In mathematics, "symmetry" has a more precise definition, and is usually used to refer to an object that is invariant under some transformations; including translation, reflection, rotation or scaling. Although these two meanings of "symmetry" can sometimes be told apart, they are intricately related, and hence are discussed together in this article. ' back

Thomas Aquinas, Summa, I, 3, Introduction, ' When the existence of a thing has been ascertained there remains the further question of the manner of its existence, in order that we may know its essence. Now, because we cannot know what God is, but rather what He is not, we have no means for considering how God is, but rather how He is not. . . . ' back

Turing machine - Wikipedia, Turing machine - Wikipedia, the free encyclopedia, ' A Turing machine is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The "machine" was invented in 1936 by Alan Turing, who called it an "a-machine" (automatic machine). The Turing machine is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.' back

Unmoved mover - Wikipedia, Unmoved mover - Wikipedia, the free encyclopedia, ' The unmoved mover (Ancient Greek: ὃ οὐ κινούμενον κινεῖ, lit. 'that which moves without being moved') or prime mover (Latin: primum movens) is a concept advanced by Aristotle as a primary cause (or first uncaused cause) or "mover" of all the motion in the universe. As is implicit in the name, the unmoved mover moves other things, but is not itself moved by any prior action. In Book 12 (Greek: Λ) of his Metaphysics, Aristotle describes the unmoved mover as being perfectly beautiful, indivisible, and contemplating only the perfect contemplation: self-contemplation. He equates this concept also with the active intellect. This Aristotelian concept had its roots in cosmological speculations of the earliest Greek pre-Socratic philosophers and became highly influential and widely drawn upon in medieval philosophy and theology. St. Thomas Aquinas, for example, elaborated on the unmoved mover in the Quinque viae. ' back

W. F. McGrew et al, Atomic clock performance enabling geodesy below the centimetre level, ' The passage of time is tracked by counting oscillations of a frequency reference, such as Earth’s revolutions or swings of a pendulum. By referencing atomic transitions, frequency (and thus time) can be measured more precisely than any other physical quantity, with the current generation of optical atomic clocks reporting fractional performance below the 10⁻¹⁷ level. However, the theory of relativity prescribes that the passage of time is not absolute, but is affected by an observer’s reference frame. Consequently, clock measurements exhibit sensitivity to relative velocity, acceleration and gravity potential. Here we demonstrate local optical clock measurements that surpass the current ability to account for the gravitational distortion of space-time across the surface of Earth. In two independent ytterbium optical lattice clocks, we demonstrate unprecedented values of three fundamental benchmarks of clock performance. In units of the clock frequency, we report systematic uncertainty of 1.4 × 10⁻¹⁸, measurement instability of 3.2 × 10⁻¹⁹ and reproducibility characterized by ten blinded frequency comparisons, yielding a frequency difference of [−7 ± (5)stat ± (8)sys] × 10⁻¹⁹, where ‘stat’ and ‘sys’ indicate statistical and systematic uncertainty, respectively. Although sensitivity to differences in gravity potential could degrade the performance of the clocks as terrestrial standards of time, this same sensitivity can be used as a very sensitive probe of geopotential. Near the surface of Earth, clock comparisons at the 1 × 10⁻¹⁸ level provide a resolution of one centimetre along the direction of gravity, so the performance of these clocks should enable geodesy beyond the state-of-the-art level. These optical clocks could further be used to explore geophysical phenomena, detect gravitational waves, test general relativity and search for dark matter.' back
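The centimetre-level geodesy claim in this abstract follows from the gravitational redshift: near Earth's surface two clocks separated in height by Δh differ in rate by a fraction of roughly gΔh/c². A rough plain-Python check (the value of g is an assumed nominal figure, not from the paper):

```python
c = 299_792_458.0         # speed of light, m/s (exact)
g = 9.81                  # assumed nominal surface gravity, m/s^2
fractional_shift = 1e-18  # clock comparison level quoted in the abstract

# Height difference producing this fractional shift: dh = (df/f) * c^2 / g.
dh = fractional_shift * c ** 2 / g
# dh comes out near 0.009 m, i.e. about one centimetre, consistent with
# the abstract's claim of centimetre-level height resolution.
```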

Werner Heisenberg (1925), Quantum-theoretical re-interpretation of kinematic and mechanical relations , 'The present paper seeks to establish a basis for theoretical quantum mechanics founded exclusively upon relationships between quantities which in principle are observable.' [From Sources of Quantum Mechanics, edited by B. L. van der Waerden (Amsterdam, North-Holland, 1967)] back

X-ray crystallography - Wikipedia, X-ray crystallography - Wikipedia, the free encyclopedia, ' X-ray crystallography is the experimental science determining the atomic and molecular structure of a crystal, in which the crystalline structure causes a beam of incident X-rays to diffract into many specific directions. By measuring the angles and intensities of these diffracted beams, a crystallographer can produce a three-dimensional picture of the density of electrons within the crystal. From this electron density, the mean positions of the atoms in the crystal can be determined, as well as their chemical bonds, their crystallographic disorder, and various other information.' back

 
 

https://www.cognitivecosmology.com is maintained by The Theology Company Proprietary Limited ACN 097 887 075 ABN 74 097 887 075 Copyright 2000-2024 © Jeffrey Nicholls