This site documents the Divine Mind:
the Universe that enables us to create ourselves


Page 19: Networks: cooperation and bonding

Table of contents

19.1: Love, entropy and a great leap forward

19.2: Quantum mechanics is a language

19.3: Elementary particles: bosons and fermions

19.4: The quantum evolution of Minkowski space: fermionic networks need 4D for their existence

19.5: Transfinite networks in Minkowski space

19.6: Senses, brains and muscles: living networks

19.7: Communities, theologies, religions and politics

19.8: Some properties of classical networks

19.1: Love, entropy and a great leap forward

Through divine revelation, God chose to show forth and communicate himself and the eternal decisions of his will regarding the salvation of men. That is to say, he chose to share with them those divine treasures which totally transcend the understanding of the human mind. Papal Encyclicals Online First Vatican Council: Dogmatic Constitution on the Catholic Faith

6. As a sacred synod has affirmed, God, the beginning and end of all things, can be known with certainty from created reality by the light of human reason; but teaches that it is through his revelation that those religious truths which are by their nature accessible to human reason can be known by all men with ease, with solid certitude and with no trace of error, even in this present state of the human race. Second Vatican Council: Dogmatic Constitution on Divine Revelation—Dei Verbum

When Pope John XXIII opened the Second Vatican Council in October 1962 I was about to join the Dominican Order to become a Catholic priest. By the time Pope Paul VI closed the council in 1965 I had spent three years studying the work of Thomas Aquinas and seeds of heretical dissent had sprouted in my mind. Second Vatican Council - Wikipedia

Aquinas was a Dominican theologian who lived and worked in the thirteenth century when works of Aristotle translated from Arabic and Greek were beginning to ignite a revolution in European theology. During my years in the order I read the work of Aquinas closely and slowly formed the opinion that if one could replace the rather rudimentary Aristotelian science that was then available to him with modern science, one could make a good case for establishing theology as a science in the modern sense of the word. I circulated an essay explaining my idea. Unfortunately, since the time of Galileo, the Catholic Church had divorced itself from science and established itself as an infallible autocracy. My view was deemed heretical. After five years I was expelled from the order and my solemn vows of poverty, chastity and obedience were annulled by the Pope. Aquinas, Summa, I, 1, 2: Is sacred doctrine a science?, Fortun & Bernstein (1998): Muddling Through: Pursuing science and truths in the twenty-first century, Jeffrey Nicholls (1967): How universal is the universe?

That was nearly 60 years ago. Since then I have learnt that the academic mainstream considers theology and religion, despite their massive global influence as persistent sources of war, to be mythological dreams. They are studied by historians, sociologists, propagandists and political scientists, but are of no interest to real science per se.

As this site demonstrates, I remain convinced that ignoring theology is a massive blunder on a trumplike scale. All that is needed is the radical new point of view outlined on page 4: Theology: a new paradigm?. If we assume that the Universe is divine (playing all the roles attributed to the traditional gods) theology assumes its traditional role as the apex theory of everything and all our scientific discoveries become input to the theological mill. In particular, since physics and theology study the same divine Universe, we should expect them to work toward mutual consistency. Robyn J. Whitaker (2024-09-06): For God and country: why the choice of next US president is a religious choice, too

Traditional theology divides reality into two layers: an invisible divine layer, the heaven from which God rules; and the visible world in which we live. Theologies, churches and religions in our human world claim to know what is happening in the divine world, but from my point of view they have very little evidence for anything they say. Most of it is incredible.

In the last few centuries our scientific view of the world based on physical observation has developed a similar structure. We have the world of visible events in which we live our everyday lives; and a less visible world of ideas encoded in words, languages and feelings which we use to explain what is going on around us. My key to peace and sanity is to believe that these two pictures are two sides of the same elephantine universe. They must therefore fit together.

The invisible foundation of the modern scientific world is quantum mechanics, discussed on Pages 9: The active creation of Hilbert space and 10: The emergence of quantum mechanics. Quantum mechanics is essentially a theory of complex periodic functions which we experience mostly as sound and music. The enormous variety of sounds in our world is a measure of the versatility of quantum theory. On the other hand we do not see it in space and time, nor do we see it in the everyday classical physics of daily life, the motions of our bodies, our vehicles and all the other things that fill our lives. It is rather like the old god, acting behind the scenes.

It is invisible but it is there. Quantum mechanics is responsible for absolutely everything that happens in the world, and unlike the old gods it is completely reliable and trustworthy. In the last century we have learnt an enormous amount about it and we see it everywhere in the amazing new sources of electric light which are spreading through our world. We have come to understand it quite well by careful study and experimentation. One of the most powerful motivations for the study of quantum physics is the central role it plays in the development of nuclear weapons. SIPRI: Stockholm International Peace Research Institute: Role of nuclear weapons grows as geopolitical relations deteriorate—new SIPRI Yearbook out now

We can trust the world, as we can trust all the other results of properly managed science. My feeling is illustrated by the impact of science on health care. Doctors no longer have borders. Modern biology has taught us that the body is no longer the enemy of the spirit: physically and spiritually we are each one person. The doctrines of the Catholic Church debasing the body are drawn from letters of the gnostic propagandist Paul of Tarsus. They blighted my youth. These ideas are still powerful, but hopefully they will eventually fade away. Roy Porter (2003): Flesh in the Age of Reason, Gnosticism - Wikipedia, Paul Of Tarsus: Galatians, 5:16-24

We know the Universe is one for two reasons. First, because it began from a single structureless point, the initial singularity; and second because all its parts communicate with one another through the fundamental link of gravitation which guides its overall structure. On this site I see gravitation as our first contact with divinity. Misner, Thorne and Wheeler (1973): Gravitation

The key to scientific theology is the paradigm change built on the idea that since the sciences of physics and theology study the same divine universe, we should expect them to become mutually consistent. At first sight this idea would seem to be theologically impossible. Since the time of Plato the material world, the subject of physics, has been strongly disparaged by theologians, cast as the absolute antithesis of spirituality. Aristotle appears to have embraced this idea when he hinted that intellect is immaterial. Aquinas argued that his god is omniscient, maximally intelligent, because he is maximally immaterial. Christopher Shields (Stanford Encyclopedia of Philosophy b): The active mind of De Anima iii 5, Aquinas, Summa: I, 14, 1: Is there knowledge in God?

back to page 19: toc

19.2: Quantum mechanics is a language

Fortunately for my project, quantum mechanics is a deeper network of communication beneath the structure of space and time and matter. Quantum mechanics emerged into the world of science just as classical mechanics was perfected by Einstein's theories of relativity. History of quantum mechanics - Wikipedia

The quantum processes which underlie all the events we see in our classical world are invisible to us, but we know that they are there and all our lives and technologies depend on them. Evolution, constructing the world at the quantum scale, has created all forms of life, from elementary particles to galaxies and the Universe as a whole. Almost everything we know about our own lives and all the life surrounding us is implicit in the theory of evolution, the simplest and most powerful theory in all of science: the theory of creation.

Newton saw space and time as fundamental manifestations of god's creation and Einstein perfected classical Newtonian physics with special relativity and Minkowski space. As quantum mechanics developed, it became clear that its proper mathematical foundation is Hilbert space. At first quantum mechanics was conceived as a layer of structure built on Minkowski space. Streater and Wightman write:

since in quantum mechanics observables are represented by hermitian operators which act on the Hilbert space of state vectors, one expects the analogue in relativistic quantum mechanics of a classical observable field to be a set of hermitian operators defined for each point in space-time and having a well defined transformation law under the appropriate group.
Streater & Wightman (2000): PCT, Spin, Statistics and All That

The key insight that encouraged me toward the conception of the Universe as divine was to reverse the ontological order of Hilbert and Minkowski spaces. This reversal became possible when Feynman and others began to see quantum mechanics as a language of computation and communication analogous to the classical languages of mathematics and logic. Nielsen & Chuang (2016): Quantum Computation and Quantum Information

There is now a growing consensus that quantum computers may be more powerful than classical Turing machines because hermitian operators may be able to process a whole spectrum of values in parallel, conducting a search of a large database in "one look" rather than working through it step by step. This power carries with it the downside known as the measurement problem, since it is difficult to confine a quantum system to deterministic behaviour. I discuss this problem on page 14: Measurement—the interface between the Minkowski and Hilbert spaces. Lov Grover (1996): A fast quantum mechanical algorithm for database search, Xavier Waintal (2023-12-29): The Quantum House Of Cards
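Grover's database search can be sketched classically by simulating the statevector directly. This is only an illustration of the quadratic speedup, assuming numpy; the choice of 4 qubits and the marked index are arbitrary.

```python
import numpy as np

def grover_search(n_qubits, marked):
    """Statevector sketch of Grover's algorithm: amplify the amplitude of
    one marked item among N = 2**n_qubits using about sqrt(N) iterations."""
    N = 2 ** n_qubits
    state = np.full(N, 1 / np.sqrt(N))       # uniform superposition: "one look"
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1                   # oracle: flip the marked phase
        state = 2 * state.mean() - state      # diffusion: invert about the mean
    return state

state = grover_search(4, marked=3)            # N = 16 items, 3 iterations
print(np.argmax(np.abs(state)))               # -> 3, the marked item
print(np.abs(state[3]) ** 2)                  # success probability near 1
```

Three iterations suffice for sixteen items, where a step-by-step classical search needs on average eight looks.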

In the evolutionary context, the lack of determinism is a feature rather than a defect, since the creative essence of evolution is variation. We may understand Hilbert space as a space of periodic functions like sounds and music, and the role of the linear operators in quantum mechanics is to select spectra of stationary states out of superpositions of periodic functions by solving the eigenvalue problem. Quantum theory contains the essence of evolution. Eigenvalues and eigenvectors - Wikipedia, Wayne Enright: The eigenvalue problem
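The eigenvalue problem mentioned above is easy to exhibit numerically. A minimal sketch, assuming numpy and an arbitrary randomly generated hermitian operator: the operator's real eigenvalues form the spectrum, and each eigenvector is a stationary state.

```python
import numpy as np

# Build a random hermitian "observable" H = (A + A†)/2.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (A + A.conj().T) / 2

# eigh solves the eigenvalue problem for hermitian matrices:
# it selects the spectrum of stationary states out of the superposition space.
eigenvalues, eigenvectors = np.linalg.eigh(H)
print(eigenvalues)                       # the real-valued spectrum

# Each eigenvector v satisfies H v = lambda v: a stationary state.
v, lam = eigenvectors[:, 0], eigenvalues[0]
print(np.allclose(H @ v, lam * v))       # -> True
```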

We may define a person, source or particle in Minkowski space as an entity capable of sending and receiving the messages described by Shannon's theory of communication. The mathematical basis of the interface between the Hilbert and Minkowski worlds is normalization. This requirement, that the sum of the probabilities of the potential outcomes of any event be 1, is the foundation of von Neumann's identification of the wave and matrix approaches to quantum theory. Claude Shannon (1948): A mathematical theory of communication, John von Neumann (2014): Mathematical Foundations of Quantum Mechanics
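Normalization is the simplest possible calculation to demonstrate. A sketch, assuming numpy; the component values of the state are arbitrary.

```python
import numpy as np

# An arbitrary superposition of four basis states...
psi = np.array([3 + 4j, 1 - 2j, 0.5, -1j])

# ...normalized so that the Born-rule probabilities of the
# potential outcomes sum to 1, as von Neumann's treatment requires.
psi = psi / np.linalg.norm(psi)
probabilities = np.abs(psi) ** 2
print(round(probabilities.sum(), 12))    # -> 1.0
```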

We may imagine the classical world and all its components, including ourselves, as sources. The quantum world is also a source, but Einstein, Podolsky and Rosen, seeking to prove the incompleteness of quantum mechanics, established that it describes a world that predates space and time and enables entanglement, the establishment of apparently instantaneous correlations between spacelike separated points in Minkowski space. Quantum entanglement - Wikipedia

On the assumption that this is the case, we are led to understand Minkowski spacetime as a quantum mechanical product. On page 17: Gravitation and quantum theory—in the beginning and its predecessors we have traced a path from the ancient theological doctrines of Catholicism to the creation of particles and the endowment of these formal structures with energy by splitting naked gravitation into potential and kinetic energy.

back to page 19: toc

19.3: Elementary particles: bosons and fermions

On page 12: the quantum creation of Minkowski space we invoke the evolutionary principle that uncontrolled action can try everything, and that consequences of these trials that are self-sustaining are selected and may become fixed with lives of varying length. Darwin discovered that this process is responsible for the origin of new species. Here I would like to take evolution right back to the beginning, to the random actions of the initial singularity I have called naked gravitation.

One very important empirical foundation of quantum mechanics has been the study of elementary particles. The first particle discovered was the electron. In 1905 Einstein proposed that light is also a stream of particles which came to be called photons. At that time most physicists were firmly convinced that light is a wave in the ether that permeates the Universe. In 1924 Louis de Broglie proposed that electrons have a wavelike nature and that waves could be used to explain the interactions between particles. Particle physics took on the mathematics of waves. Since that time dozens of elementary particles have been discovered and huge machines like the Large Hadron Collider have been built to study them by colliding them with one another and looking at what comes out. Electron - Wikipedia, Louis de Broglie - Wikipedia, Large Hadron Collider - Wikipedia

Quantum theory divides all elementary particles into two exclusive classes, fermions and bosons. From a network point of view, fermions are sources and bosons are the messengers that carry information between these sources. Bosons are distinguished by having integral spin while the spins of fermions like electrons are half-integral. Fermion - Wikipedia, Boson - Wikipedia, Spin (physics) - Wikipedia

The difference between bosons and fermions is quite radical and is responsible for the fundamental structure of the world. While bosons tend to flock together, fermions obey the Pauli exclusion principle. Apart from a minor exception connected to superconductivity, no two fermions can occupy the same point in Minkowski space. This observation is considered to account for all spacetime structure. If electrons were not fermions there would be no electronic structure in atoms, perhaps no distinct elements, and no chemistry or life. As Duck and Sudarshan write in their article on the spin-statistics theorem:

. . .we do not know why anticommuting particles are required to exist. We are glad to see that they do, the point being that none of us would be here, if there was a here, to worry about it if they didn't [page 295].
Spin-statistics theorem - Wikipedia, Ian Duck and E.C.G. Sudarshan: Toward an understanding of the spin-statistics theorem
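The exclusion principle can be seen directly in the algebra of two-particle states. A sketch, assuming numpy: symmetrized (bosonic) and antisymmetrized (fermionic) products of single-particle states, where the antisymmetric combination of two identical states vanishes identically.

```python
import numpy as np

def two_particle(a, b, sign):
    """Symmetrized (sign=+1, bosons) or antisymmetrized (sign=-1, fermions)
    two-particle state built from single-particle states a and b."""
    return np.kron(a, b) + sign * np.kron(b, a)

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# Two fermions in distinct states: a nonzero antisymmetric state exists.
print(np.linalg.norm(two_particle(up, down, -1)))   # nonzero

# Two fermions in the same state: the state vanishes — Pauli exclusion.
print(np.linalg.norm(two_particle(up, up, -1)))     # -> 0.0

# Bosons happily share a state.
print(np.linalg.norm(two_particle(up, up, +1)))     # nonzero: flocking allowed
```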

Many quantum theorists, beginning with Pauli and his contemporaries, have tried to prove the spin-statistics theorem. An essential ingredient in many proofs of the spin-statistics relation is relativity, that the physical laws do not change under Lorentz transformations. From this point of view, the spin-statistics theorem is a consequence of the metric of Minkowski space. Here, however, we are operating in abstract Hilbert space prior to the emergence of Minkowski space and seeking to discover whether there are operators in this space that can distinguish fermionic and bosonic behaviour. This leads us to think about the Dirac equation and the nature of measurement in quantum mechanics discussed on Page 14: Measurement—the interface between the Minkowski and Hilbert spaces.

It may turn out that the dichotomy between fermions and bosons is a necessary consequence of the mathematics of quantum theory, but such necessity may not be required for the dichotomy to exist. It may simply be a consequence of evolution. We do not expect to find a quantum mechanical argument to prove the existence of rabbits, but it is clear that rabbits are one of the infinite spectrum of possibilities within reach of quantum mechanics. Rabbits have come to be and survive; maybe spin-statistics is in the same category.

A key to understanding the Universe is the second law of thermodynamics, which tells us that entropy, that is complexity, always tends to increase. Einstein used this law to explain the atomic emission and absorption of photons. As Feynman explains, entropy serves as an attractor. An atomic electron at a high energy level will emit a photon because this will increase the entropy of the world. A photon in the radiation field has much greater entropy than a photon hidden in an atom. Einstein coefficients - Wikipedia

back to page 19: toc

19.4: The quantum evolution of Minkowski space: fermionic networks need 4D for their existence

The creation and survival of Minkowski space may be supported by the fact that it enables an attractive increase in the entropy of the universe. Matter, in Aristotle's hylomorphic picture of the universe, is the principle of individuation, enabling the existence of discrete copies of identical forms. Aquinas notes that in the matter-free spiritual world of angels, every angel must be a different species. In formal Hilbert space in the absence of energy we can imagine photons as being rather like angels, differentiated only by quantum states. Aquinas, Summa I 50, 4: Is every angel a different species?

Theologians face a similar problem with the Trinity. How are we to differentiate three identically divine persons? The traditional answer is by their relationships to one another. From a modern quantum mechanical point of view we might say that they are fermions and obey the Pauli exclusion principle. In other words, the Trinity is a space holding three distinct identical entities. This makes sense of the Shield of the Trinity: Page 8: the theology of the Trinity.

Bosons in the Universe play the role of messengers. Interaction is local. In an abstract Hilbert space before space-time enters the world, contact is inevitable and quantum systems can evolve unitarily without interruption. To correlate their evolution, spatially separated systems must communicate to maintain contact. The metric of Minkowski space enables the existence of null geodesics whose endpoints are in contact because the space-time interval between them is zero. The unitary contact of spatially separated systems can thus be maintained if the messenger travelling between them travels at the speed of light. In other words, in Minkowski space the speed of light enables the existence of space by maintaining the integrity of the contact and unitarity that is essential to the work of quantum mechanics. This "trick" explains the relationship between space and time in the Minkowski metric.
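The null-geodesic point can be made concrete with the Minkowski interval itself. A sketch, using the metric signature (+, -, -, -): a messenger travelling at exactly the speed of light has zero separation from its destination, however far away it is in space.

```python
# Minkowski interval: s^2 = (c*t)^2 - x^2 - y^2 - z^2 (signature +---).
C = 299_792_458.0  # speed of light in metres per second

def interval_squared(t, x, y=0.0, z=0.0):
    """Squared space-time interval between the origin and event (t, x, y, z)."""
    return (C * t) ** 2 - x ** 2 - y ** 2 - z ** 2

# A photon covering 299,792,458 m in one second: null separation, so its
# endpoints remain "in contact" despite the spatial distance between them.
print(interval_squared(1.0, C))            # -> 0.0

# Any slower messenger leaves a timelike (positive) interval instead.
print(interval_squared(1.0, 0.5 * C) > 0)  # -> True
```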

Fermions play the role of persons. All the objects we can see and touch in our world are fermions. They need at least three dimensions if they are to move freely.

The Schrödinger wave equation

iℏ ∂|ψ⟩/∂t = H|ψ⟩

is at home in Hilbert space, as we describe on Page 14: Measurement—the interface between the Minkowski and Hilbert spaces.

Here the variable t and the operator H are often read as time and energy, but in primordial Hilbert space they are simply mathematical symbols that describe the kinematic rate of change of the waves represented by the ket vector |ψ⟩. Neither dynamic time nor energy exist in abstract Hilbert space. There is no relativity or mass shell either. In theological terms this picture is spiritual, abstract or kinematic. How do we make this into the real world?

The answer here, explained on Page 17: Gravitation and quantum theory—in the beginning, is that gravitation gives energy and physical reality to quantum mechanical "angels". Quantum mechanics is linear: the Schrödinger equation is linear and, due to Dirac's gamma magic, the Dirac equation is also linear. The four dimensions it describes are not Minkowski spacetime; they are two pairs of spinors, two-component wave functions. When gravity gives them energy and makes them dynamic, they become fermions. These are not necessarily charged, not necessarily electrons or positrons, not necessarily massive. These finer details will emerge when we move into Minkowski space. The 3D component of Minkowski space is a direct consequence of the exclusion principle, just as air traffic control operates in three dimensions. Like Hilbert space and quantum mechanics, spacetime begins as an abstract formal reality.
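The linear, kinematic character of the Schrödinger picture can be illustrated numerically. A minimal sketch, assuming numpy, units with ℏ = 1, and an arbitrary 2×2 hermitian "Hamiltonian": solving the equation gives the unitary exp(-iHt), which rotates the state without ever changing its norm.

```python
import numpy as np

# An arbitrary 2x2 hermitian operator standing in for H.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])
energies, modes = np.linalg.eigh(H)      # spectrum and stationary states

def evolve(psi0, t):
    """Solve i d|psi>/dt = H|psi> via the unitary U(t) = exp(-iHt),
    built from the eigendecomposition of H."""
    U = modes @ np.diag(np.exp(-1j * energies * t)) @ modes.conj().T
    return U @ psi0

psi0 = np.array([1.0, 0.0])
psi_t = evolve(psi0, t=2.0)
print(round(np.linalg.norm(psi_t), 12))  # -> 1.0: unitarity preserves the norm
```

Here t is just a parameter of the rotation, not yet physical time, which matches the kinematic reading given above.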

back to page 19: toc

19.5: Transfinite networks in Minkowski space

We are now in Minkowski space, our natural home, full of all sorts of people and objects acting in communication with one another using classical channels. The actual execution of every event, we know, is managed in the quantum realm. An important class of agents is mathematicians, who enjoy going to infinite limits. Rózsa Péter (1976): Playing with Infinity: Mathematical Explorations and Excursions

The basic task of classical networks is to carry information and other goods through space and time. The scientific study of the communication of information began with the invention of telegraphy. The basic problem is to send signals over long distance communication channels without them being corrupted by noise. The theory was perfected in the 1940s by Claude Shannon working for Bell Telephone Laboratories. We now use computers to implement Shannon's coding ideas. Claude Shannon (1949): Communication in the Presence of Noise, Alan Turing - Wikipedia
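The simplest of Shannon-style error-control codes makes the idea concrete: send each bit three times and decode by majority vote, so any single flip per triplet is corrected. This is only an illustrative sketch; real codecs use far more efficient codes.

```python
from collections import Counter

def encode(bits):
    """Triple-repetition code: each bit is sent three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each triplet corrects any single flipped bit."""
    return [Counter(received[i:i + 3]).most_common(1)[0][0]
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
channel = encode(message)
channel[4] ^= 1                      # noise flips one bit in transit
print(decode(channel) == message)    # -> True: the error is corrected
```

The cost is a threefold reduction in channel capacity; Shannon's theorem shows that cleverer codes can approach error-free communication at much higher rates.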

The precision and clarity of mathematics is closely associated with the precision of logic and methods of proof. At the turn of the twentieth century Whitehead and Russell explained the close relationship between these two disciplines in their book Principia Mathematica. Their idea was to develop a linguistic code for mathematics which bypassed as far as possible the ambiguities of natural language. This opened the way to a mechanical algorithmic approach to proof. Whitehead and Russell (1910): Principia Mathematica

Their work was motivated by the appearance of paradoxes or inconsistencies in mathematics, which arose ultimately from the need to perfect the real numbers in order to make calculus logically consistent. A by-product of this work was Georg Cantor's invention of transfinite numbers. Joseph Warren Dauben (1990): Georg Cantor: His Mathematics and Philosophy of the Infinite

David Hilbert saw that Cantor had circumvented Aristotle's rejection of the reality of actual infinity by taking a formal Platonic approach which severed the connection between mathematics and physics and established logical consistency as the foundation for mathematical truth. This is consistent with the Platonic idea that the physical world is but a poor reflection of an ideal heaven. Hilbert proposed that consistent mathematics would be both complete and computable. Hilbert's program - Wikipedia

Kurt Gödel and Alan Turing exploited Whitehead and Russell's work to give us insight into the incompleteness and incomputability of formal mathematics. Gödel's incompleteness theorems - Wikipedia, Alan Turing (1936): on computable numbers, with an application to the Entscheidungsproblem, Entscheidungsproblem - Wikipedia


Formal reality turned out to be far more interesting than Hilbert suspected. In 1931 Kurt Gödel, using Whitehead and Russell's algorithmic language, proved that a consistent formal system is necessarily incomplete. In 1982 Gregory Chaitin connected Gödel's theorems to the cybernetic principle of requisite variety. Gödel's theorems also suggest that an omnipotent God or Universe bound by consistency cannot have complete deterministic control over everything. Gregory J. Chaitin (1982): Gödel's Theorem and Information

Turing's work placed an absolute bound on computability and established that there is a countable infinity of computable functions. Within the set of computable functions, however, there are many subsidiary degrees of difficulty, one of which is illustrated by the "P versus NP" problem. P versus NP problem - Wikipedia, Carlson, Jaffe & Wiles (2006): The Millennium Prize Problems, Stephen Cook: The P versus NP Problem

This problem appears to be relevant to our understanding of evolution, both in the Minkowski and the Hilbert regime. Stephen Cook states the problem as follows:

The P versus NP problem is to determine whether every language accepted by some nondeterministic algorithm in polynomial time is also accepted by some deterministic algorithm in polynomial time [my italics].

The standard definition of a deterministic algorithm is a Turing machine. For information theoretical purposes, a non-deterministic (NP) algorithm is one for which, for each input, there exists a run that produces the desired result, even when other runs produce incorrect results. Turing machine - Wikipedia, Nondeterministic algorithm - Wikipedia
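The asymmetry between finding and checking can be sketched with the subset-sum problem, a standard NP example. The names and weights below are illustrative: a nondeterministic machine "guesses" a certificate in one lucky run, while deterministic search must in the worst case try every subset; verifying a given certificate, by contrast, takes only polynomial time.

```python
from itertools import combinations

def verify(weights, certificate, target):
    """Deterministic polynomial-time check of a proposed certificate."""
    return (all(w in weights for w in certificate)
            and sum(certificate) == target)

def solve(weights, target):
    """Deterministic brute-force search: worst case exponential in len(weights)."""
    for r in range(len(weights) + 1):
        for subset in combinations(weights, r):
            if sum(subset) == target:
                return list(subset)
    return None

weights = [3, 9, 8, 4, 5, 7]
certificate = solve(weights, 15)             # slow search finds [8, 7]...
print(verify(weights, certificate, 15))      # ...but checking it is fast -> True
```

Whether every such problem also has a fast deterministic solver is exactly the open P versus NP question.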

We may consider the variation phase of evolution as an example of a non-deterministic algorithm insofar as occasional random genetic variations produce viable offspring. We may consider that a deterministic P process checks that the NP process is correct by following, without fatal error, the steps required for an individual born into a random environment to reproduce itself. The existence of life on Earth (even given the existence of a divinity constrained by consistency) seems to suggest that the P versus NP problem is soluble given enough trials.

The emergence of Minkowski space enables a vast increase in the number of independent fundamental particles in the Universe, and consequently a vast increase in its entropy. No longer are photons, electrons and other particles differentiated by their quantum states alone; they can also be differentiated by their location in space and time.

Quantum states in Hilbert space are not observable per se until they have been endowed with energy and become particles. In standard quantum field theory this energy is understood to come from vacuum fluctuations arising from quantum uncertainty. Here, as described on page 17, we imagine that such states are actualized, become observable in Minkowski space and become subject to the Minkowski metric by drawing energy from gravitation. Such real particles and their associated quantum states can be manipulated by electromagnetic fields and detected by various measuring instruments in experimental devices and manufacturing processes.

The ability of particles to multiply and bond to one another by communication enables the construction of atoms, molecules and larger aggregates such as planets and stars. Mathematically, these processes can be understood as combinations and permutations of particles whose variety increases in a manner analogous to the formation of Cantor's transfinite numbers. Although Cantor began with the infinite set of natural numbers to model transfinite growth, we may apply the principle of symmetry with respect to complexity discussed on page 27.7 to draw analogies between the growth of the transfinite numbers and the construction of ever more complex organisms in Minkowski space made possible by the multiplication of fundamental particles.
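The combinatorial explosion behind this analogy is easy to quantify. A sketch using Python's standard library: the number of arrangements of n distinguishable particles is n!, which outgrows any exponential, so each new layer of combination vastly multiplies the variety available for selection.

```python
from math import factorial, comb

# n distinguishable particles admit n! orderings: already astronomical
# for modest n, echoing the layered growth of Cantor's hierarchy.
for n in (4, 16, 64):
    print(n, factorial(n))

# The number of ways to pick half of 64 particles exceeds 10**18.
print(comb(64, 32))
```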

These structures may be understood as classical networks in Minkowski space, well researched and widely implemented. Such networks use Turing's theory of computation to implement Shannon's theory of communication and so achieve error-free communication. Globally their most extensive implementation is the internet. The specific network roles of computation are to implement the complex codecs necessary to apply Shannon's theory of error control and to handle the details of routing and connection between any number of individual users. Andrew Tanenbaum (1996): Computer Networks

The scale invariance of classical networks (derived from the scale invariance of quantum Hilbert spaces) means that they are useful models for everything from the simplest logical structures to the transfinite layers of the universe described on Page 21: matter and spirit, which deals with the relationship between the material and spiritual worlds, and Page 22: cosmic theology, which interfaces the quantum world with the visible world via the "measurement" described on Page 14: "measurement": the interface between Hilbert and Minkowski spaces.

Our most common and intuitive forms of networking are our face to face interactions with one another. We love to see our children learning to speak and understand us. It is here, in domestic, family and social systems, that the network paradigm of love receives its first and most potent expressions. Here also we learn and experience the dimensions of trust and honesty that serve as the tacit foundation of meaningful communication.

Engineered networks are a very old idea built around the construction of houses, footpaths, roads, bridges, canals and aqueducts to enable human contact, commerce, agriculture and military mobility. The invention of writing and ocean going ships enabled all literate people around the world to communicate with one another. It also enabled central powers to send authoritative messages over long distances, making it possible to manage empires.

The details of networking computers raise a wide range of issues which are dealt with using network protocols embodied in the software shared by the machines in a network. Three principal issues are the control of transmission errors, which is dealt with by applications of Shannon's theory of communication; the processes of addressing, connection and disconnection that enable machines to find one another and make and break secure contact; and the establishment and maintenance of security, which plays the role of an immune system to prevent malicious actors from damaging systems or stealing valuable information and property. History of the Internet - Wikipedia, History of computing hardware - Wikipedia

Computers and computer networks are built up in layers of complexity, starting with simple binary logical operations and building up step by step to systems like the internet with billions of machines and billions of users connected together by thousands of languages and protocols. This immense system provides us with a relatively transparent and systematic network model to understand the structure of the Universe. The internet is self-documenting in the sense that one can use it to search for and find documentation of almost every element of network technology except those made private by their owners. Internet - Wikipedia

The well developed formalism of computer networking throws light on all other forms of networking between particles, people, businesses and nations. In every case successful communication depends on shared language and culture, often unspoken, a tacit dimension. The bounding tacit dimensions of interest throughout this site are theology and physics, the traditional locations of theories of everything. The communications involved in our human affairs occupy all the space within these boundaries. Michael Polanyi (1966, 2009): The Tacit Dimension, Theology - Wikipedia

back to page 19: toc

19.6: Senses, brains and muscles: living networks

We connect through the internet and other telecommunications mainly by speech and vision, which includes writing and all sorts of imagery. Our immediate communications with one another are mediated by our five senses, touch, smell, sight, hearing, and taste, and the spectrum of modes of our action and sensation ranges from pure love and support to violence and injury, even murder.

All of these modes of communication have roots deep in our evolutionary past. We share our chemical senses of smell and taste with creatures as far back as the first motile bacteria that can sense attractive and repulsive chemical environments and move toward or away from them. As more complex creatures evolved, so the spectrum of sensations and actions expanded.

As animal bodies became more complex, they developed the power of proprioception necessary to manage their internal processes. Much of this is mediated chemically at the cellular level in both plants and animals.

One of the disadvantages of membership in the living community is that we are all, in different ways, food and other resources for one another. This has led to the development of mechanisms for defence and predation, ranging from our internal molecular immune systems and the evolution of poisons and offensive chemical sprays to the development of muscles and nerves, teeth, claws and defensive armour, not to mention deceptive colouring and behaviour, all of which involve communication by close contact.

The design of all these systems is controlled by the need for error-free communication and by the limits on activity identified by cybernetics, in much the same way as telecommunication systems are controlled. We understand that this information is implicitly built into living systems by the selective processes of evolution, because evolution favours systems that work. We find that healthy sight, hearing and other modes of sensation approach the limits on their functionality defined by quantum mechanics. William Bialek & Allan Schweitzer (1985): Quantum noise and the threshold of hearing

From an abstract point of view, every living creature is a very complex network of interactions beginning at the molecular level in single cells and increasing in complexity with networks of molecular intercellular communication in multicellular creatures. Larger scale physical organs and networks take care of the collection and distribution of oxygen, food and information to all the cells in a complex body through circulatory systems of blood vessels, lymph systems, nerves and other specialized structures. These systems in living bodies may serve us as paradigms for the effective management of large communities of people.

Although the ancients thought that our intelligence is the product of an immaterial intellect which is part of our spiritual soul, there can be no doubt that the huge neural network in our heads, our brain, is the seat of all our intelligence, emotion and culture. Recent extensive studies have revealed much more detail in the anatomy of our brains, and artificial intelligence systems with specialized structures modelled on the neural structure of animal brains are now beginning to play a role in many industries, not least in computer games. It seems probable that our brains are more complex than the internet in its present state, and more intelligent.

The structure of our brains mimics quantum superposition. Each neuron receives and integrates inputs from sense transducers and other neurons and transmits the result of this integration to other neurons or to effectors such as muscle fibres or cells that secrete hormones or other metabolic substances. This system learns by adjusting the strength and sign of the synaptic inputs to neurons. Gemma Conroy: This is the largest map of the human brain ever made, Brain - Wikipedia, Artificial Intelligence - Wikipedia
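
As a rough illustration of this learning mechanism, the sketch below implements a single artificial neuron (a classic perceptron, not a model of any real neuron) that sums weighted inputs, fires when the total crosses a threshold, and learns the logical AND function by adjusting the strength and sign of its weights:

```python
# A single artificial neuron learning by weight adjustment. The learning
# rate, epoch count and AND task are purely illustrative.

def fire(weights, bias, inputs):
    """Fire (output 1) if the weighted sum of inputs crosses the threshold."""
    return 1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0

def train(examples, rate=0.1, epochs=50):
    """Perceptron rule: nudge each weight in proportion to the error."""
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in examples:
            error = target - fire(weights, bias, inputs)
            weights = [w + rate * error * x for w, x in zip(weights, inputs)]
            bias += rate * error
    return weights, bias

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train(AND)
assert all(fire(weights, bias, x) == y for x, y in AND)
```

The point is only the principle stated above: the knowledge the system acquires is stored nowhere but in the strengths and signs of its connections.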

back to page 19: toc

19.7: Communities, theologies, religions and politics

My main interest here is in theology and religion and their roles in human communities. Since our modern species Homo sapiens evolved some 300 thousand years ago, we have spread around the planet to all inhabitable regions and developed a vast array of languages and cultures. Different human groups have often evolved in isolation for tens of thousands of years, remaining largely ignorant of one another. In the last ten thousand years or so these diverse cultures have gradually been connected through travel, empire building and marine navigation. This contact often causes friction resulting from predation. People with guns and horses pillage, rape and murder people with spears, bows and arrows. Warlords have gradually expanded their powers into large empires, often killing and enslaving many of the people they conquer. Augustin Fuentes (2019): Why we believe: evolution and the human way of being, Zahiru'd-din Mohammad Babur (2020): The Babur Nama

The answer to these problems proposed here relies on the fact that despite our diversity we remain one species on one planet. All our cultures and languages have developed through adaptation to particular habitats. The fact that we share a global habitat must now be exploited to deal with global problems of resource distribution and the physical disease and mental dissonance that arises from the damage we are doing to the global ecology that sustains us. We face common problems which must be solved by common knowledge and action.

The first step toward the solution proposed here is the development of scientific understanding of ourselves and our world. A key to this approach is the shared culture based on the nature of our world and ourselves which I call here scientific theology. The key to this theology, as ancient religions have long recognized, may be summed up in three terms: faith, that is shared knowledge; hope, that is a shared vision of our future; and charity, that is cooperative action to realize our hope.

Global communication seems likely to promote this development. Once technical knowledge and the mass production of the necessary physical components became available, the global network grew with extraordinary speed. It has been an evolutionary event analogous to the explosive growth of populations of locusts, mice and rabbits that occurs when conditions are favourable. Many forces have combined to speed the propagation of networks and have led to some repetitions of the boom and bust cycles associated with events like the discovery of gold.

A comparable event was the invention of printing, which caused an explosive growth in literacy and the propagation of knowledge. It takes time to learn to read and write, but the rapid development of network interfaces like mobile phones and computers, which work with sound and image and do not need much in the way of special skill to operate, has made it possible for anyone with adequate resources to connect to the network. The most potent interfaces with electronic networks are mobile phones, which have grown into powerful hand-held computers in little more than a decade.

Other common features of evolution have also grown with networks. They have provided new fields for predation, deception, advertisement and profit making, spying, and government control of populations. These tendencies have led to immune responses like encryption and government legislation to protect privacy and enable victims to get justice and recompense. As has occurred many times before in gold rushes and other social paradigm changes, government and the law have to a large extent been left behind, and many see the networking of the world as a new form of imperialism by powerful corporations which have been able to take control of and profit from the new resources. By enabling the rapid propagation of information and giving voice to those who have previously been silent, however, the network should gradually erode the initial advantages gained by the imperialist invasion of this new space.

If the approach taken on this site is anywhere near correct, we have seen the Universe grow from zero entropy to its current enormous complexity through an evolutionary process wrought by the action of an initially blind omnipotent power, controlled only by the constraint that contradictions cannot exist. This growth is an example of the second law of thermodynamics: entropy has an irresistible tendency to increase.

This tendency, explained by our evolutionary model of divinity, is a radical source of hope and, as I see it, a foundation for ethics, that is, an example of metaethics. I explain my idea in the essay Entropy and Metaethics. Thomas Piketty (2022): A Brief History of Equality, Jeffrey Nicholls (2019a): Entropy and Metaethics

Piketty sees that, despite a multitude of ups and downs, human freedom and welfare are gradually increasing. I see this as an implicit feature of our divine world. The increase of entropy is equivalent to the decrease of energy per state. In other words, our world is becoming gentler. The idea that nature is red in tooth and claw, which inspires so many of the warmongers among us, is false, as a scientific theology demonstrates.

back to page 19: toc

19.8: Some properties of classical networks

1: Networks create real relationships and are scale invariant. We create and maintain our human relationships by communicating with one another. It seems that all other relationships, bonds and structures in the universe are built on networked communication. Their scale invariance is based on the notion that the "atom" of a network comprises two sources and a channel between them. Since Shannon's proof of the capacity of error-free communication channels is based on real function spaces, we might expect it to hold across the whole spectrum of Cantor's transfinite numbers, so there is no limit to the complexity of sources or of the messages exchanged. Claude Shannon (1949): Communication in the presence of noise
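
Shannon's result for a band-limited channel with Gaussian noise can be stated concretely: the capacity is C = B log2(1 + S/N) bits per second, where B is the bandwidth and S/N the signal-to-noise ratio. A brief sketch, with purely illustrative figures for bandwidth and noise:

```python
# Shannon-Hartley capacity of a noisy channel. The telephone-line figures
# below are illustrative only.

import math

def channel_capacity(bandwidth_hz, signal_to_noise):
    """Maximum error-free bit rate of a band-limited Gaussian channel."""
    return bandwidth_hz * math.log2(1 + signal_to_noise)

# e.g. a notional 3 kHz line with S/N = 1000 (30 dB)
capacity = channel_capacity(3000, 1000)   # ≈ 29902 bits per second
```

Below this rate, Shannon showed, suitable coding can make the error rate arbitrarily small; above it, error-free communication is impossible.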

2: Computer networks are logically continuous. We may understand a halting Turing machine as a logically continuous mechanism: given a certain initial state, it moves by a series of deterministic logical steps to a final state which is the logical consequence of the initial state. It executes a proof. Modern digital computers and error-free digital communication channels are logically continuous in the same sense. The output of an error-free transmission channel is identical to its input, so the channel may be considered as a machine that does nothing other than copy information from one point in space-time to another.

The mathematical definitions of continuity developed in the nineteenth century using point set theory have been judged by the mathematical community to provide a safe working basis for differential and integral calculus. They are nevertheless cognitively dissonant in that they explain continuity in terms of the proximity of discrete points. Aristotle states that things are called continuous when the touching limits of each become one and the same and are, as the word implies, contained in each other. This definition is fulfilled by logical continuity. Processes communicate by reading from and writing to shared memory locations as we see in the use of a "tape" in Turing's original model of computation. Aristotle: Physics, v, iii, 227a10 sqq, Alan Turing (1936): On computable numbers, with an application to the Entscheidungsproblem

3: Networks may be made redundant and fail safe. The road network of a country or city normally provides many distinct routes from A to B. If one route is blocked, another is most likely to be available. This property may be built into any network, so that a route may be found between any two sources, at least until the network is so badly damaged that it is divided into two completely disconnected sections.

The internet works like a postal system, since each packet contains the address of its destination. Routers may direct each packet along a different path to its destination to avoid congestion or other problems on particular network links. Router (computing) - Wikipedia
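
Redundant routing of this kind can be sketched with a breadth-first search over a small notional network: when one link fails, an alternative route is still found. The network and link names below are purely illustrative:

```python
# Route-finding in a redundant network. If a link fails, BFS still finds
# a path as long as the network remains connected.

from collections import deque

def find_route(links, start, goal):
    """Return a shortest route from start to goal, or None if disconnected."""
    graph = {}
    for a, b in links:
        graph.setdefault(a, []).append(b)
        graph.setdefault(b, []).append(a)
    queue, seen = deque([[start]]), {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph.get(path[-1], []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

links = [("A", "B"), ("B", "D"), ("A", "C"), ("C", "D")]
assert find_route(links, "A", "D") == ["A", "B", "D"]

broken = [l for l in links if l != ("B", "D")]   # one link fails
assert find_route(broken, "A", "D") == ["A", "C", "D"]
```

Real routers maintain such maps dynamically, exchanging reachability information with their neighbours rather than searching a static graph.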

4: Networks can be logically mapped. A problem facing any network user (which may be a machine) is to find the best route to take between two points. The usual way to do this is to consult a map of the network. Networks like the internet achieve this with routers, but a similar ability is available to living creatures that know their environment. Many plants manage their communication statistically by releasing large quantities of pollen some of which may reach a receptive female through a sequence of random events.

5: Networks embody both certainty and uncertainty. They therefore facilitate evolution by variation and selection. Networks, like the internet, can be designed to be deterministic, but uncertainty can enter in three ways. The first is error, although error detection and correction is possible in a well designed network. The second is lossy encoding, where some loss of detail is acceptable in order to reduce the size and cost of transmission. The third is by the interruption and redirection of processes by interaction with other processes. Our lives, for instance, are regularly redirected by our accidental meetings with other people and events.

The effect of this uncertainty is to introduce randomness into a system, enabling it to explore the whole probability space of possible connections. This is particularly relevant in the search for suitable mates by sexually reproducing organisms. This situation lays the foundation for an evolutionary process. Many potential connections will lead to inconsistency and failure. Others will establish stable connections. In the Darwinian model, as in quantum theory, stable relationships are the ones selected de facto for survival.

6: The future is uncertain but the past is fixed. There may be many different routes through a network, and the actual course of processing may depend on random events like the exact moment at which a machine receives a message to interrupt and redirect its process. In retrospect, however, the actual path taken is determinate and, in the case of the whole universe, can (in principle) be traced back to the initial singularity. This suggests that Lonergan's empirical residue is mistaken, since every event has a history stretching back toward its source. Lonergan introduces the empirical residue to prove that the universe is not divine. The failure of his argument opens the way for the hypothesis informing this work, that the Universe is divine. Bernard Lonergan (1992): Insight: a study of human understanding

7: Networks increase complexity by copying. The function of a communication network is to copy information from one point to another in spacetime. This increases the overall complexity of the system. This tendency is opposed by erasure. In biological networks the death and decay of individuals release resources for use by other organisms. In atmospheric, hydrological and geological networks physical materials are continually being rearranged on the Earth at all scales.

8: They reflect the layered structure of the universe. Networks are layered, beginning with the physical network. Each layer uses the services provided by the layer beneath it to serve the layer above it. We see a similar structure reflected in the world. Fundamental particles communicate with one another to form atoms. Atoms network to form molecules, molecules cells, cells multicellular creatures. These creatures network to form ecologies and so on. Each of these layers can exist without the layers above it, but relies for its existence on the layers beneath it. It is in the interest of higher layers to maintain the layers beneath them upon which they depend. Our existence on earth requires that we protect and maintain the ecological systems upon which our lives depend.

9: Coding and decoding: language and meaning. The mathematical theory of communication is not concerned with the meaning of messages, but with accurate transmission. The embodiment of meaning in messages is the work of the users (sources) in the network. In the case of the internet, the users are predominantly human, although machines (‘things’) also use it, usually in the service of human users. As in face to face communication, the meanings of messages are produced by the users and transmitted in a shared language.

Like the traditional God, the initial singularity, taken by itself, means nothing. Formally, meaning is established by correspondence, as in a dictionary. Physically, correspondence is established by bonding, which is a relationship established by communication.

Atoms, although minute, are already quite complex structures. Each new layer of structure in the universe adds a new layer of meaning which is encoded in the syntax of the structure. This relationship between layers is universal, so I may consider myself as an element of the human layer on Earth, relying on many ecological and economic layers for my survival and contributing to all the networks within which I am a source.

This structure establishes that the meaning of my life can be traced back to the initial singularity, and my life contributes to the meaning of all the systems of which I am part.

10: There is no logical upper bound to the complexity of networks. Nevertheless, communications are limited by the velocity at which messages can travel. In the physical universe as a whole this is limited by the velocity of light. Even if there are civilizations on other planets, given the size of the universe, communications with them may take hundreds, thousands or millions of years, if they are possible at all.

11: Networks provide an unbounded basis for a theory of everything. I have introduced the network model as a modern equivalent of Aquinas's application of Aristotle's matter, form, agent, end, potency and act to describe processes in the world. I suggest that a network paradigm is a promising candidate for a description of the Universe as a whole, a theory of everything. Universe - Wikipedia

12: Networks provide a means to avoid evil. To cooperate effectively, we require a common understanding of the problems facing us and of the solutions made possible by the world we inhabit. Our basic problem is "malthusian": the limitation of material resources in the face of our aspiration to grow. An answer seems to lie in exploring the enormous creative space opened to us by sharing information about how to live well by exploiting the almost unlimited energy made available to us by the Sun. This is the path that nature has exploited to create the variety of life around us. Malthusianism - Wikipedia, Jeffrey Nicholls (1987): A theory of peace

We feel the full and painful force of natural selection when "zero sum" situations arise. In such cases the lives of some may depend upon the deaths of others, a cause of predation, pillage, rape and war. The strategy required is to avoid such situations through sufficient effective cooperation: planetary networks for the prudent and economical use and sharing of limited resources. Ultimately local troubles can be relieved with resources delivered over the network from more prosperous and better organized sources. Zero-sum game - Wikipedia, Jared Diamond (2004): Collapse: how societies choose to fail or succeed

back to page 19: toc

(revised Monday 9 September 2024)

back to site toc

copyright:

you may copy this material freely provided only that you quote fairly and provide a link (or reference) to your source.

notes and references

Further reading

Books

Babur (2020), Zahiru'd-din Mohammad, and Annette Susannah Beveridge (Translator) and William Dalrymple (Introduction), The Babur Nama, Knopf / Penguin Random House 2020 Jacket: 'Zahiru'd-din Mohammad Babu (1485-1530) a poet-prince from central Asia, was the first Mughal emperor and author of one of the most remarkable autobiographies in world Literature. The Babur Nama reveals its author as not only a military genius but also a ruler unusually magnanimous for his time, cultured, witty and possessing a talent for poetry, an adventurous spirit and an acute eye for natural beauty.' 
Amazon
  back

Carlson (2006), James, and Arthur Jaffe & Andrew Wiles, The Millennium Prize Problems, Clay Mathematics Institute and American Mathematical Society 2006
1: The Birch and Swinnerton-Dyer Conjecture: Andrew Wiles
2: The Hodge Conjecture: Pierre Deligne
3: The Existence and Smoothness of the Navier-Stokes Equation: Charles L Fefferman
4: The Poincare Conjecture: John Milnor
5: The P versus NP Problem: Stephen Cook
6: The Riemann Hypothesis: Enrico Bombieri
7: Quantum Yang-Mills Theory: Arthur Jaffe and Edward Witten 
Amazon
  back

Dauben (1990), Joseph Warren, Georg Cantor: His Mathematics and Philosophy of the Infinite, Princeton University Press 1990 Jacket: 'One of the greatest revolutions in mathematics occurred when Georg Cantor (1843-1918) promulgated his theory of transfinite sets. . . . Set theory has been widely adopted in mathematics and philosophy, but the controversy surrounding it at the turn of the century remains of great interest. Cantor's own faith in his theory was partly theological. His religious beliefs led him to expect paradox in any concept of the infinite, and he always retained his belief in the utter veracity of transfinite set theory. Later in his life, he was troubled by attacks of severe depression. Dauben shows that these played an integral part in his understanding and defense of set theory.' 
Amazon
  back

Diamond (2004), Jared, Collapse: How Societies Choose to Fail or Succeed, Viking Adult 2004 'As suggested by its title, this book is about societal collapses - past, present and future - and the factors that cause human societies to fail. ... [Diamond's] primary mission is to determine the ecological, political and cultural conditions that lead to collapse and to contrast these with the conditions that favour success. ... Collapse is based on a series of detailed case studies. ... Diamond then provides a fuller exploration of the many rich parallels between these historic cases and select modern societies. ... What emerges most clearly from [his] analysis is the central role played by environmental decay in undermining human societies. ... In the end, [his] painstaking toil in the deep mines of history rewards him with sufficient nuggets of hope that he emerges 'cautiously optimistic' about the human prospect. ... The most important lesson to be drawn from Collapse is that resilient societies are nimble ones, capable of long term planning and of abandoning deeply entrenched but ultimately destructive core values and beliefs. This, in turn, requires a well informed public, inspired leadership and the political will to go against the established order of things. ... ' William Rees, Nature 433:15, 6 January 2005. 
Amazon
  back

Fortun (1998), Mike, and Herbert J Bernstein, Muddling Through: Pursuing Science and Truths in the Twenty-First Century, Counterpoint 1998 Jacket: ' Messy. Clumsy. Volatile. Exciting. These words are not often associated with the science, which for most people still connote exactitude, elegance, reliability and a rather plodding certainty. But the real story is something quite different. The sciences are less about the ability to know and to control than they are about the unleashing of new forces, new capacities for changing the world. The sciences as practised exist not in some pristine world of "objectivity," but in what Mike Fortnum and Herbert Bernstein call "the Muddled Middle". This book explores the way science makes sense of the world and how the world makes sense of science. It is also about politics and culture—how these forces shape the sciences and are shaped by them in return.' 
Amazon
  back

Fuentes (2019), Augustin, Why We Believe: Evolution and the Human Way of Being, Yale University Press 2019 ' Agustín Fuentes, a Professor of Anthropology at Princeton University, focuses on the biosocial, delving into the entanglement of biological systems with the social and cultural lives of humans, our ancestors, and a few of the other animals with whom humanity shares close relations. Earning his BA/BS in Anthropology and Zoology and his MA and PhD in Anthropology from UC Berkeley, he has conducted research across four continents, multiple species, and two-million years of human history. His current projects include exploring cooperation, creativity, and belief in human evolution, multispecies anthropologies, evolutionary theory and processes, and engaging race and racism. Fuentes’ books include “Race, Monogamy, and other lies they told you: busting myths about human nature” (U of California), “The Creative Spark: how imagination made humans exceptional" (Dutton), and “Why We Believe: evolution and the human way of being” (Yale). ' 
Amazon
  back

Hawking (1975), Stephen W, and G F R Ellis, The Large Scale Structure of Space-Time, Cambridge UP 1975 Preface: Einstein's General Theory of Relativity . . . leads to two remarkable predictions about the universe: first that the final fate of massive stars is to collapse behind an event horizon to form a 'black hole' which will contain a singularity; and secondly that there is a singularity in our past which constitutes, in some sense, a beginning to our universe. Our discussion is principally aimed at developing these two results.' 
Amazon
  back

Lonergan (1992), Bernard J F, Insight: A Study of Human Understanding (Collected Works of Bernard Lonergan : Volume 3), University of Toronto Press 1992 '. . . Bernard Lonergan's masterwork. Its aim is nothing less than insight into insight itself, an understanding of understanding' 
Amazon
  back

Misner (1973), Charles W, and Kip S Thorne, John Archibald Wheeler, Gravitation, Freeman 1973 Jacket: 'Einstein's description of gravitation as curvature of spacetime led directly to that greatest of all predictions of his theory, that the universe itself is dynamic. Physics still has far to go to come to terms with this amazing fact and what it means for man and his relation to the universe. John Archibald Wheeler. . . . this is a book on Einstein's theory of gravity. . . . ' 
Amazon
  back

Nielsen (2016), Michael A., and Isaac L Chuang, Quantum Computation and Quantum Information, Cambridge University Press 2016 Review: A rigorous, comprehensive text on quantum information is timely. The study of quantum information and computation represents a particularly direct route to understanding quantum mechanics. Unlike the traditional route to quantum mechanics via Schroedinger's equation and the hydrogen atom, the study of quantum information requires no calculus, merely a knowledge of complex numbers and matrix multiplication. In addition, quantum information processing gives direct access to the traditionally advanced topics of measurement of quantum systems and decoherence.' Seth Lloyd, Department of Quantum Mechanical Engineering, MIT, Nature 6876: vol 416 page 19, 7 March 2002. 
Amazon
  back

Péter (1976), Rózsa, and Z. P Dienes (translator), Playing with Infinity: Mathematical Explorations and Excursions, Dover 1976 ' This popular account of the many mathematical concepts relating to infinity is one of the best introductions to this subject and to the entire subject of mathematics. Dividing her book into three parts — the Sorcerer's Apprentice, the Creative Role of Form, and the Self-Critique of Pure Reason, Rózsa Péter develops her material in twenty-two chapters that sound almost too appealing to be true: playing with fingers, coloring grey number series, we catch infinity again, the line is filled up, some workshop secrets, the building rocks, and so on.' 
Amazon
  back

Piketty (2022), Thomas, A Brief History of Equality, Harvard UP 2022 ' The world's leading economist of inequality presents a short but sweeping and surprisingly optimistic history of human progress toward equality despite crises, disasters, and backsliding. A perfect introduction to the ideas developed in his monumental earlier books.It's easy to be pessimistic about inequality. We know it has increased dramatically in many parts of the world over the past two generations.' 
Amazon
  back

Polanyi (1966, 2009), Michael, and Amartya Sen (foreword), The Tacit Dimension, University Of Chicago Press 1966, 2009 ' “I shall reconsider human knowledge by starting from the fact that we can know more than we can tell,” writes Michael Polanyi, whose work paved the way for the likes of Thomas Kuhn and Karl Popper. The Tacit Dimension argues that tacit knowledge—tradition, inherited practices, implied values, and prejudgments—is a crucial part of scientific knowledge. Back in print for a new generation of students and scholars, this volume challenges the assumption that skepticism, rather than established belief, lies at the heart of scientific discovery.' 
Amazon
  back

Porter (2003), Roy, and Simon Schama (foreword), Flesh in the Age of Reason, W W Norton and Company 2003 Jacket: 'How did we come to a modern understanding of our bodies and souls? What were the breakthroughs that allowed human beings to see themselves in a new light? Starting with the grim Britain of the Civil War era, with its punishing sense of the body as a corrupt vessel for the soul, Roy Porter charts how, through figures as diverse as Locke, Swift, Johnson and Gibbon, ideas about medicine, politics, and religion fundamentally changed notions of self. He shows how the Enlightenment ... provided a lens through which we can best see the profound shift from the theocentric otherworldly Dark Ages to the modern, earthly, body-centered world we live in today.' 
Amazon
  back

Streater (2000), Raymond F, and Arthur S Wightman, PCT, Spin, Statistics and All That, Princeton University Press 2000 Amazon product description: 'PCT, Spin and Statistics, and All That is the classic summary of and introduction to the achievements of Axiomatic Quantum Field Theory. This theory gives precise mathematical responses to questions like: What is a quantized field? What are the physically indispensable attributes of a quantized field? Furthermore, Axiomatic Field Theory shows that a number of physically important predictions of quantum field theory are mathematical consequences of the axioms. Here Raymond Streater and Arthur Wightman treat only results that can be rigorously proved, and these are presented in an elegant style that makes them available to a broad range of physics and theoretical mathematics.' 
Amazon
  back

Tanenbaum (1996), Andrew S, Computer Networks, Prentice Hall International 1996 Preface: 'The key to designing a computer network was first enunciated by Julius Caesar: Divide and Conquer. The idea is to design a network as a sequence of layers, or abstract machines, each one based upon the previous one. . . . This book uses a model in which networks are divided into seven layers. The structure of the book follows the structure of the model to a considerable extent.'  
Amazon
  back

Links

1 John 4:7-21, God is love, ' 8. He that loveth not knoweth not God; for God is love.' back

Alan Turing - Wikipedia, Alan Turing - Wikipedia, the free encyclopedia, 'Alan Mathison Turing OBE FRS (23 June 1912 – 7 June 1954) was an English mathematician, computer scientist, logician, cryptanalyst, philosopher, and theoretical biologist.[6] Turing was highly influential in the development of theoretical computer science, providing a formalisation of the concepts of algorithm and computation with the Turing machine, which can be considered a model of a general-purpose computer. He is widely considered to be the father of theoretical computer science and artificial intelligence.' back

Alan Turing (1936), On Computable Numbers, with an application to the Entscheidungsproblem, 'The "computable" numbers may be described briefly as the real numbers whose expressions as a decimal are calculable by some finite means. Although the subject of this paper is ostensibly the computable numbers, it is almost equally easy to define and investigate computable functions of an integral variable of a real or computable variable, computable predicates and so forth. . . . ' back

Aquinas, Summa I, 14, 8, Is the knowledge of God the cause of things?, 'I answer that, The knowledge of God is the cause of things. For the knowledge of God is to all creatures what the knowledge of the artificer is to things made by his art. . . . Now it is manifest that God causes things by His intellect, since His being is His act of understanding; and hence His knowledge must be the cause of things, in so far as His will is joined to it. Hence the knowledge of God as the cause of things is usually called the "knowledge of approbation." ' back

Aquinas, Summa I, 25, 3, Is God omnipotent?, '. . . God is called omnipotent because He can do all things that are possible absolutely; which is the second way of saying a thing is possible. For a thing is said to be possible or impossible absolutely, according to the relation in which the very terms stand to one another, possible if the predicate is not incompatible with the subject, as that Socrates sits; and absolutely impossible when the predicate is altogether incompatible with the subject, as, for instance, that a man is a donkey.' back

Aquinas, Summa I, 50, 4, Is every angel a different species?, ' . . . such things as agree in species but differ in number, agree in form, but are distinguished materially. If, therefore, the angels be not composed of matter and form, as was said above (Article 2), it follows that it is impossible for two angels to be of one species; just as it would be impossible for there to be several whitenesses apart, or several humanities, since whitenesses are not several, except in so far as they are in several substances.' back

Aquinas, Summa, I, 1, 2, Is sacred doctrine a science?, 'I answer that, Sacred doctrine is a science. We must bear in mind that there are two kinds of sciences. There are some which proceed from a principle known by the natural light of intelligence, such as arithmetic and geometry and the like. There are some which proceed from principles known by the light of a higher science: thus the science of perspective proceeds from principles established by geometry, and music from principles established by arithmetic. So it is that sacred doctrine is a science because it proceeds from principles established by the light of a higher science, namely, the science of God and the blessed.' back

Aquinas, Summa, I, 20, 2, Does God love all things?, ' I answer that, God loves all existing things. For all existing things, in so far as they exist, are good, since the existence of a thing is itself a good; and likewise, whatever perfection it possesses. Now it has been shown above (I:19:4) that God's will is the cause of all things. It must needs be, therefore, that a thing has existence, or any kind of good, only inasmuch as it is willed by God. To every existing thing, then, God wills some good. Hence, since to love anything is nothing else than to will good to that thing, it is manifest that God loves everything that exists.' back

Aquinas, Summa, I, 22, 3, Does God have immediate providence over everything?, ' I answer that, Two things belong to providence—namely, the type of the order of things foreordained towards an end; and the execution of this order, which is called government. As regards the first of these, God has immediate providence over everything, because He has in His intellect the types of everything, even the smallest; and whatsoever causes He assigns to certain effects, He gives them the power to produce those effects. Whence it must be that He has beforehand the type of those effects in His mind. As to the second, there are certain intermediaries of God's providence; for He governs things inferior by superior, not on account of any defect in His power, but by reason of the abundance of His goodness; so that the dignity of causality is imparted even to creatures.' back

Aquinas, Summa, I, 3, 7, Is God altogether simple?, 'I answer that, The absolute simplicity of God may be shown in many ways. First, from the previous articles of this question. For there is neither composition of quantitative parts in God, since He is not a body; nor composition of matter and form; nor does His nature differ from His "suppositum"; nor His essence from His existence; neither is there in Him composition of genus and difference, nor of subject and accident. Therefore, it is clear that God is nowise composite, but is altogether simple. . . . ' back

Aquinas, Summa: I, 14, 1, Is there knowledge in God?, ' I answer that, In God there exists the most perfect knowledge. . . . it is clear that the immateriality of a thing is the reason why it is cognitive; and according to the mode of immateriality is the mode of knowledge. Hence it is said in De Anima ii that plants do not know, because they are wholly material. But sense is cognitive because it can receive images free from matter, and the intellect is still further cognitive, because it is more separated from matter and unmixed, as said in De Anima iii. Since therefore God is in the highest degree of immateriality as stated above (Question 7, Article 1), it follows that He occupies the highest place in knowledge.' back

Aquinas, Summa: I, 2, 3, Does God exist?, 'I answer that, The existence of God can be proved in five ways. The first and more manifest way is the argument from motion. . . . ' back

Aristotle (continuity), Physics V, iii, 'A thing that is in succession and touches is 'contiguous'. The 'continuous' is a subdivision of the contiguous: things are called continuous when the touching limits of each become one and the same and are, as the word implies, contained in each other: continuity is impossible if these extremities are two. This definition makes it plain that continuity belongs to things that naturally in virtue of their mutual contact form a unity. And in whatever way that which holds them together is one, so too will the whole be one, e.g. by a rivet or glue or contact or organic union. ' 227a10 sqq. back

Artificial Intelligence - Wikipedia, Artificial Intelligence - Wikipedia, the free encyclopedia, 'Artificial intelligence (AI), sometimes called machine intelligence, is intelligence demonstrated by machines, in contrast to the natural intelligence displayed by humans and other animals. In computer science AI research is defined as the study of "intelligent agents": any device that perceives its environment and takes actions that maximize its chance of successfully achieving its goals.' back

Born rule - Wikipedia, Born rule - Wikipedia, the free encyclopedia, ' The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of the Copenhagen interpretation of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results. . . . The Born rule states that if an observable corresponding to a Hermitian operator A with discrete spectrum is measured in a system with normalized wave function (see bra-ket notation), then the measured result will be one of the eigenvalues λ of A, and the probability of measuring a given eigenvalue λi will equal <ψ|Pi|ψ>, where Pi is the projection onto the eigenspace of A corresponding to λi.' back

Boson - Wikipedia, Boson - Wikipedia, the free encyclopedia, 'In particle physics, bosons are particles with an integer spin, as opposed to fermions which have half-integer spin. From a behaviour point of view, fermions are particles that obey the Fermi-Dirac statistics while bosons are particles that obey the Bose-Einstein statistics. They may be either elementary, like the photon, or composite, as mesons. All force carrier particles are bosons. They are named after Satyendra Nath Bose. In contrast to fermions, several bosons can occupy the same quantum state. Thus, bosons with the same energy can occupy the same place in space.' back

Brain - Wikipedia, Brain - Wikipedia, the free encyclopedia, ' The brain is an organ that serves as the center of the nervous system in all vertebrate and most invertebrate animals. In vertebrates, a small part of the brain called the hypothalamus is the neural control center for all endocrine systems. The brain is the largest cluster of neurons in the body and is typically located in the head, usually near organs for special senses such as vision, hearing and olfaction. It is the most energy-consuming organ of the body, and the most specialized, responsible for endocrine regulation, sensory perception, motor control, and the development of intelligence. ' back

Brouwer fixed point theorem - Wikipedia, Brouwer fixed point theorem - Wikipedia, the free encyclopedia, 'Among hundreds of fixed-point theorems, Brouwer's is particularly well known, due in part to its use across numerous fields of mathematics. In its original field, this result is one of the key theorems characterizing the topology of Euclidean spaces, along with the Jordan curve theorem, the hairy ball theorem, the invariance of dimension and the Borsuk–Ulam theorem. This gives it a place among the fundamental theorems of topology.' back

Christopher Shields (Stanford Encyclopedia of Philosophy b), The Active Mind of De Anima III 5 , ' After characterizing the mind (nous) and its activities in De Anima iii 4, Aristotle takes a surprising turn. In De Anima iii 5, he introduces an obscure and hotly disputed subject: the active mind or active intellect (nous poiêtikos). Controversy surrounds almost every aspect of De Anima iii 5, not least because in it Aristotle characterizes the active mind—a topic mentioned nowhere else in his entire corpus—as ‘separate and unaffected and unmixed, being in its essence actuality’ (chôristos kai apathês kai amigês, tê ousia energeia; DA iii 5, 430a17–18) and then also as ‘deathless and everlasting’ (athanaton kai aidion; DA iii 5, 430a23). This comes as no small surprise to readers of De Anima, because Aristotle had earlier in the same work treated the mind (nous) as but one faculty (dunamis) of the soul (psuchê), and he had contended that the soul as a whole is not separable from the body (DA ii 1, 413a3–5).' back

Claude E Shannon (1948), A Mathematical Theory of Communication, ' The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point. Frequently the messages have meaning; that is they refer to or are correlated according to some system with certain physical or conceptual entities. These semantic aspects of communication are irrelevant to the engineering problem. The significant aspect is that the actual message is one selected from a set of possible messages.' back

Claude Shannon (1949), Communication in the Presence of Noise, 'A method is developed for representing any communication system geometrically. Messages and the corresponding signals are points in two “function spaces,” and the modulation process is a mapping of one space into the other. Using this representation, a number of results in communication theory are deduced concerning expansion and compression of bandwidth and the threshold effect. Formulas are found for the maximum rate of transmission of binary digits over a system when the signal is perturbed by various types of noise. Some of the properties of “ideal” systems which transmit at this maximum rate are discussed. The equivalent number of binary digits per second for certain information sources is calculated.' [C. E. Shannon , “Communication in the presence of noise,” Proc. IRE, vol. 37, pp. 10–21, Jan. 1949.] back

Eigenvalue algorithm - Wikipedia, Eigenvalue algorithm - Wikipedia, the free encyclopedia, ' In numerical analysis, one of the most important problems is designing efficient and stable algorithms for finding the eigenvalues of a matrix. These eigenvalue algorithms may also find eigenvectors.' back

Eigenvalues and eigenvectors - Wikipedia, Eigenvalues and eigenvectors - Wikipedia, the free encyclopedia, ' In linear algebra, an eigenvector or characteristic vector of a linear transformation is a nonzero vector that changes at most by a scalar factor when that linear transformation is applied to it. The corresponding eigenvalue, often denoted by λ, is the factor by which the eigenvector is scaled. Geometrically, an eigenvector, corresponding to a real nonzero eigenvalue, points in a direction in which it is stretched by the transformation and the eigenvalue is the factor by which it is stretched. If the eigenvalue is negative, the direction is reversed. Loosely speaking, in a multidimensional vector space, the eigenvector is not rotated.' back

Einstein coefficients - Wikipedia, Einstein coefficients - Wikipedia, the free encyclopedia, ' In atomic, molecular, and optical physics, the Einstein coefficients are quantities describing the probability of absorption or emission of a photon by an atom or molecule. The Einstein A coefficients are related to the rate of spontaneous emission of light, and the Einstein B coefficients are related to the absorption and stimulated emission of light. In 1916, Albert Einstein proposed that there are three processes occurring in the formation of an atomic spectral line. The three processes are referred to as spontaneous emission, stimulated emission, and absorption. With each is associated an Einstein coefficient which is a measure of the probability of that particular process occurring. Einstein considered the case of isotropic radiation of frequency ν, and spectral energy density ρ (ν).' back

Electron - Wikipedia, Electron - Wikipedia, the free encyclopedia, ' In 1838, British natural philosopher Richard Laming first hypothesized the concept of an indivisible quantity of electric charge to explain the chemical properties of atoms. Irish physicist George Johnstone Stoney named this charge "electron" in 1891, and J. J. Thomson and his team of British physicists identified it as a particle in 1897 during the cathode-ray tube experiment. Electrons participate in nuclear reactions, such as nucleosynthesis in stars, where they are known as beta particles. Electrons can be created through beta decay of radioactive isotopes and in high-energy collisions, for instance, when cosmic rays enter the atmosphere. The antiparticle of the electron is called the positron; it is identical to the electron, except that it carries electrical charge of the opposite sign. When an electron collides with a positron, both particles can be annihilated, producing gamma ray photons.' back

Elementary particle - Wikipedia, Elementary particle - Wikipedia, the free encyclopedia, ' In particle physics, an elementary particle or fundamental particle is a subatomic particle that is not composed of other particles. Particles currently thought to be elementary include the fundamental fermions (quarks, leptons, antiquarks, and antileptons), which generally are "matter particles" and "antimatter particles", as well as the fundamental bosons (gauge bosons and the Higgs boson), which generally are "force particles" that mediate interactions among fermions. A particle containing two or more elementary particles is a composite particle.' back

Entscheidungsproblem - Wikipedia, Entscheidungsproblem - Wikipedia, the free encyclopedia, 'In mathematics, the Entscheidungsproblem (. . . German for 'decision problem') is a challenge posed by David Hilbert in 1928. The Entscheidungsproblem asks for an algorithm that will take as input a description of a formal language and a mathematical statement in the language and produce as output either "True" or "False" according to whether the statement is true or false. . . . In 1936 and 1937 Alonzo Church and Alan Turing respectively, published independent papers showing that it is impossible to decide algorithmically whether statements in arithmetic are true or false, and thus a general solution to the Entscheidungsproblem is impossible. This result is now known as Church's Theorem or the Church–Turing Theorem (not to be confused with the Church–Turing thesis).' back

Fermion - Wikipedia, Fermion - Wikipedia, the free encyclopedia, 'In particle physics, fermions are particles with a half-integer spin, such as protons and electrons. They obey the Fermi-Dirac statistics and are named after Enrico Fermi. In the Standard Model there are two types of elementary fermions: quarks and leptons. . . . In contrast to bosons, only one fermion can occupy a quantum state at a given time (they obey the Pauli Exclusion Principle). Thus, if more than one fermion occupies the same place in space, the properties of each fermion (e.g. its spin) must be different from the rest. Therefore fermions are usually related with matter while bosons are related with radiation, though the separation between the two is not clear in quantum physics.' back

Gemma Conroy, This is the largest map of the human brain ever made, ' Researchers have created the largest atlas of human brain cells so far, revealing more than 3,000 cell types — many of which are new to science. The work, published in a package of 21 papers today in Science, Science Advances and Science Translational Medicine, will aid the study of diseases, cognition and what makes us human, among other things, say the authors. The enormous cell atlas offers a detailed snapshot of the most complex known organ. “It’s highly significant,” says Anthony Hannan, a neuroscientist at the Florey Institute of Neuroscience and Mental Health in Melbourne, Australia. Researchers have previously mapped the human brain using techniques such as magnetic resonance imaging, but this is the first atlas of the whole human brain at the single-cell level, showing its intricate molecular interactions, adds Hannan. “These types of atlases really are laying the groundwork for a much better understanding of the human brain." . . . Kimberly Siletti, a neuroscientist now at the University Medical Center Utrecht in the Netherlands, and her team laid the cornerstone for the atlas by sequencing the RNA of more than 3 million individual cells from 106 locations covering the entire human brain, using tissue samples from three deceased male donors. They also included one motor cortex dissection from a female donor that had been used in previous studies. Their analysis documented 461 broad categories of brain cell that included more than 3,000 subtypes. “I was surprised at how many different cell types there were,” says Siletti.' back

Gnosticism - Wikipedia, Gnosticism - Wikipedia, the free encyclopedia, ' Gnosticism (from Ancient Greek: γνωστικός gnōstikós, "having knowledge") is a collection of ancient religious ideas and systems which originated in the first century AD among early Christian and Jewish sects. These various groups emphasised personal spiritual knowledge (gnosis) over orthodox teachings, traditions, and ecclesiastical authority. Gnostic cosmogony generally presents a distinction between a supreme, hidden God and a blind, malevolent demiurge responsible for creating the material universe. Viewing this material existence as flawed or evil, Gnostics considered the principal element of salvation to be direct knowledge of the supreme divinity in the form of mystical or esoteric insight.' back

Gödel's incompleteness theorems - Wikipedia, Gödel's incompleteness theorems - Wikipedia, the free encyclopedia, ' Gödel's incompleteness theorems are two theorems of mathematical logic that establish inherent limitations of all but the most trivial axiomatic systems capable of doing arithmetic. The theorems, proven by Kurt Gödel in 1931, are important both in mathematical logic and in the philosophy of mathematics. The two results are widely, but not universally, interpreted as showing that Hilbert's program to find a complete and consistent set of axioms for all mathematics is impossible, giving a negative answer to Hilbert's second problem. The first incompleteness theorem states that no consistent system of axioms whose theorems can be listed by an "effective procedure" (i.e., any sort of algorithm) is capable of proving all truths about the relations of the natural numbers (arithmetic). For any such system, there will always be statements about the natural numbers that are true, but that are unprovable within the system. The second incompleteness theorem, an extension of the first, shows that such a system cannot demonstrate its own consistency.' back

Gregory J. Chaitin (1982), Gödel's Theorem and Information, 'Abstract: Gödel's theorem may be demonstrated using arguments having an information-theoretic flavor. In such an approach it is possible to argue that if a theorem contains more information than a given set of axioms, then it is impossible for the theorem to be derived from the axioms. In contrast with the traditional proof based on the paradox of the liar, this new viewpoint suggests that the incompleteness phenomenon discovered by Gödel is natural and widespread rather than pathological and unusual.'
International Journal of Theoretical Physics 21 (1982), pp. 941-954 back

Hilbert's program - Wikipedia, Hilbert's program - Wikipedia, the free encyclopedia, ' In mathematics, Hilbert's program, formulated by German mathematician David Hilbert, was a proposed solution to the foundational crisis of mathematics, when early attempts to clarify the foundations of mathematics were found to suffer from paradoxes and inconsistencies. As a solution, Hilbert proposed to ground all existing theories to a finite, complete set of axioms, and provide a proof that these axioms were consistent. Hilbert proposed that the consistency of more complicated systems, such as real analysis, could be proven in terms of simpler systems. Ultimately, the consistency of all of mathematics could be reduced to basic arithmetic.' back

History of computing hardware - Wikipedia, History of computing hardware - Wikipedia, the free encyclopedia, 'The history of computing hardware covers the developments from early simple devices to aid calculation to modern day computers. Before the 20th century, most calculations were done by humans. Early mechanical tools to help humans with digital calculations, such as the abacus, were called "calculating machines", by proprietary names, or even as they are now, calculators. The machine operator was called the computer.' back

History of quantum mechanics - Wikipedia, History of quantum mechanics - Wikipedia, the free encyclopedia, 'The history of quantum mechanics is a fundamental part of the history of modern physics. The major chapters of this history begin with the emergence of quantum ideas to explain individual phenomena—blackbody radiation, the photoelectric effect, solar emission spectra—an era called the Old or Older quantum theories. Building on the technology developed in classical mechanics, the invention of wave mechanics by Erwin Schrödinger and expansion by many others triggers the "modern" era beginning around 1925. Paul Dirac's relativistic quantum theory work led him to explore quantum theories of radiation, culminating in quantum electrodynamics, the first quantum field theory. The history of quantum mechanics continues in the history of quantum field theory.' back

History of the Internet - Wikipedia, History of the Internet - Wikipedia, the free encyclopedia, ' The history of the Internet begins with the development of electronic computers in the 1950s. Initial concepts of packet networking originated in several computer science laboratories in the United States, United Kingdom, and France. The US Department of Defense awarded contracts as early as the 1960s for packet network systems, including the development of the ARPANET (which would become the first network to use the Internet Protocol).' back

Ian Duck and E.C.G. Sudarshan, Toward an understanding of the spin-statistics theorem, 'We respond to a request from Neuenschwander for an elementary proof of the Spin-Statistics Theorem. First . . . Then we discuss an argument suggested by Sudarshan, which proves the theorem with a minimal set of requirements. . . . Motivated by our particular use of Lorentz invariance, if we are permitted to elevate the conclusion of flavour symmetry (which we explain in the text) to the status of a postulate, one could recast the proof without any relativistic assumptions, and this makes it applicable even in the nonrelativistic context. . . . Finally, an argument starting with ordinary number-valued (commuting), and with Grassmann-valued (anticommuting) operators, shows in a natural way that these relativistically embed into Klein-Gordon spin-0 and Dirac spin-½ fields, respectively. In this way the Spin-Statistics theorem is understood at the expense of admitting the existence of the simplest Grassmann-valued field.' back

Internet - Wikipedia, Internet - Wikipedia, the free encyclopedia, 'The Internet is a global system of interconnected computer networks that use the standard Internet protocol suite (TCP/IP) to link several billion devices worldwide. It is an international network of networks that consists of millions of private, public, academic, business, and government packet switched networks, linked by a broad array of electronic, wireless, and optical networking technologies.' back

Jeffrey Nicholls (1967), How universal is the universe?, ' 61 The future is beyond our comprehension, but we can get an idea of it and speed its coming by studying what we already have. Contemplating the size and wonder of the universe as it stands in the light of its openness to the future must surely be a powerful incentive to men to love God. We have come a long way since the little world of St Thomas. Ours is open to all things, even participating in god. This is what I mean by universal. ' back

Jeffrey Nicholls (1987), A theory of Peace, ' The argument: I began to think about peace in a very practical way during the Viet Nam war. I was the right age to be called up. I was exempted because I was a clergyman, but despite the terrors that war held for me, I think I might have gone. It was my first whiff of the force of patriotism. To my amazement, it was strong enough to make even me face death.
In the Church, I became embroiled in a deeper war. Not a war between goodies and baddies, but the war between good and evil that lies at the heart of all human consciousness. Existence is a struggle. We need all the help we can get. Religion is part of that help and theology is the scientific foundation of religion.' back

Jeffrey Nicholls (2019a), Entropy and Metaethics, ' If we reject divine revelation as the foundation for morality we must find our grounds within the universe, which is tantamount to the naturalization of morality. So the question becomes 'how does nature embrace spirit as the sovereign ground for moral value?' . . . I propose an answer in terms of what Einstein considered to be the most fundamental and irrefutable law of nature, the second law of thermodynamics, which expresses the fact that entropy almost never decreases. In a more morally relevant frame, this law expresses the fact that the universe is inherently creative. Human spirituality, whatever it may be, has emerged from the natural world.' back

John von Neumann (2014), Mathematical Foundations of Quantum Mechanics, ' Mathematical Foundations of Quantum Mechanics by John von Neumann translated from the German by Robert T. Beyer (New Edition) edited by Nicholas A. Wheeler. Princeton University Press, Princeton & Oxford. Preface: ' This book is the realization of my long-held intention to someday use the resources of TEX to produce a more easily read version of Robert T. Beyer’s authorized English translation (Princeton University Press, 1955) of John von Neumann’s classic Mathematische Grundlagen der Quantenmechanik (Springer, 1932).' back

Large Hadron Collider - Wikipedia, Large Hadron Collider - Wikipedia, the free encyclopedia, ' The Large Hadron Collider (LHC) is the world's largest and highest-energy particle collider. It was built by the European Organization for Nuclear Research (CERN) between 1998 and 2008 in collaboration with over 10,000 scientists and hundreds of universities and laboratories, as well as more than 100 countries. It lies in a tunnel 27 kilometres (17 mi) in circumference and as deep as 175 metres (574 ft) beneath the France–Switzerland border near Geneva. The first collisions were achieved in 2010 at an energy of 3.5 teraelectronvolts (TeV) per beam, about four times the previous world record. After upgrades it reached 6.5 TeV per beam (13 TeV total collision energy, the present world record). At the end of 2018, it was shut down for three years for further upgrades.' back

Louis de Broglie - Wikipedia, Louis de Broglie - Wikipedia, the free encyclopedia, ' Louis-Victor-Pierre-Raymond, 7th duc de Broglie . . . 15 August 1892 – 19 March 1987) was a French physicist who made groundbreaking contributions to quantum theory. In his 1924 PhD thesis he postulated the wave nature of electrons and suggested that all matter has wave properties. This concept is known as the de Broglie hypothesis, an example of wave-particle duality, and forms a central part of the theory of quantum mechanics.' back

Lov Grover (1996), A fast quantum mechanical algorithm for database search, LANL e-print quant-ph/9605043, 1996. back

Malthusianism - Wikipedia, Malthusianism - Wikipedia, the free encyclopedia, ' Malthusianism is the idea that population growth is potentially exponential while the growth of the food supply or other resources is linear, which eventually reduces living standards to the point of triggering a population die off. This event, called a Malthusian catastrophe (also known as a Malthusian trap, population trap, Malthusian check, Malthusian crisis, Malthusian spectre, or Malthusian crunch) occurs when population growth outpaces agricultural production, causing famine or war, resulting in poverty and depopulation.' back

Meinard Kuhlmann (Stanford Encyclopedia of Philosophy), Quantum Field Theory, ' Quantum Field Theory (QFT) is the mathematical and conceptual framework for contemporary elementary particle physics. In a rather informal sense QFT is the extension of quantum mechanics (QM), dealing with particles, over to fields, i.e. systems with an infinite number of degrees of freedom. (See the entry on quantum mechanics.) In the last few years QFT has become a more widely discussed topic in philosophy of science, with questions ranging from methodology and semantics to ontology. QFT taken seriously in its metaphysical implications seems to give a picture of the world which is at variance with central classical conceptions of particles and fields, and even with some features of QM.' back

Michael Bordt (2011), Why Aristotle's God is not the Unmoved Mover, ' The aim of this essay is to show that the view—popular among certain philosophers and theologians—that Aristotle’s God is the unmoved mover is incorrect, or at least leads to serious misunderstanding. In a nutshell: among other things, the project of the twelfth book of the Metaphysics is to determine what the first ousia is. This first ousia is not identified with God in so far as it is an unmoved mover, but in so far as it is the actual activity (energeia) of thinking. To put matters differently, the actual activity of the first ousia does not consist in moving anything. Its activity rather consists in the exercise of reason, in thinking. Since, however, thinking is without qualification the best activity, and since God is that being who just does engage in the best activity, the first ousia, in so far as it is the same as the activity of thinking, must be God. Thus we perhaps expect that, at the summit of ontology, God himself will be the object of this first philosophy. Metaphysics Λ meets such an expectation only in a very limited way. The limitation is the following: that which, so to speak, stands at the summit of metaphysics is not God, but the activity of reason. While this activity is identified with God, it is not so identified directly or immediately, but only as mediated by way of the conception of the best possible life. The twelfth book of the Metaphysics thus provides to an even lesser extent than is usually assumed the outlines of a theology. By way of recompense, however, Aristotle offers us a truly breathtaking metaphysics.' back

Minkowski space - Wikipedia, Minkowski space - Wikipedia, the free encyclopedia, ' By 1908 Minkowski realized that the special theory of relativity, introduced by his former student Albert Einstein in 1905 and based on the previous work of Lorentz and Poincaré, could best be understood in a four-dimensional space, since known as the "Minkowski spacetime", in which time and space are not separated entities but intermingled in a four-dimensional space–time, and in which the Lorentz geometry of special relativity can be effectively represented using the invariant interval x² + y² + z² − c²t².' back

Nondeterministic algorithm - Wikipedia, Nondeterministic algorithm - Wikipedia, the free encyclopedia, ' In computational complexity theory, nondeterminism is often modeled using an explicit mechanism for making a nondeterministic choice, such as in a nondeterministic Turing machine. For these models, a nondeterministic algorithm is considered to perform correctly when, for each input, there exists a run that produces the desired result, even when other runs produce incorrect results. This existential power makes nondeterministic algorithms of this sort more efficient than known deterministic algorithms for many problems. The P versus NP problem encapsulates this conjectured greater efficiency available to nondeterministic algorithms. Algorithms of this sort are used to define complexity classes based on nondeterministic time and nondeterministic space complexity. They may be simulated using nondeterministic programming, a method for specifying nondeterministic algorithms and searching for the choices that lead to a correct run, often using a backtracking search.' back

P versus NP problem - Wikipedia, P versus NP problem - Wikipedia, the free encyclopedia, ' The P versus NP problem is a major unsolved problem in computer science. It asks whether every problem whose solution can be quickly verified (technically, verified in polynomial time) can also be solved quickly (again, in polynomial time). The underlying issues were first discussed in the 1950s, in letters from John Forbes Nash Jr. to the National Security Agency, and from Kurt Gödel to John von Neumann. The precise statement of the P versus NP problem was introduced in 1971 by Stephen Cook in his seminal paper "The complexity of theorem proving procedures" and is considered by many to be the most important open problem in the field.' back

Papal Encyclicals Online, First Vatican Council: Dogmatic Constitution on the Catholic Faith, ' Dogmatic constitution on the Catholic faith Pius, bishop, servant of the servants of God, with the approval of the Sacred Council, for an everlasting record. 1. The Son of God, redeemer of the human race, our lord Jesus Christ, promised, when about to return to his heavenly Father, that he would be with this Church militant upon earth all days even to the end of the world. Hence never at any time has he ceased to stand by his beloved bride, assisting her when she teaches, blessing her in her labors and bringing her help when she is in danger.' back

Paul Of Tarsus, Galatians, 5:16-24, '16 But I say, walk by the Spirit, and you will not gratify the desires of the flesh. 17 For the desires of the flesh are against the Spirit, and the desires of the Spirit are against the flesh, for these are opposed to each other, to keep you from doing the things you want to do. 18 But if you are led by the Spirit, you are not under the law. 19 Now the works of the flesh are evident: sexual immorality, impurity, sensuality, 20 idolatry, sorcery, enmity, strife, jealousy, fits of anger, rivalries, dissensions, divisions, 21 envy, drunkenness, orgies, and things like these. I warn you, as I warned you before, that those who do such things will not inherit the kingdom of God. 22 But the fruit of the Spirit is love, joy, peace, patience, kindness, goodness, faithfulness, 23 gentleness, self-control; against such things there is no law. 24 And those who belong to Christ Jesus have crucified the flesh with its passions and desires.' back

Photon - Wikipedia, Photon - Wikipedia, the free encyclopedia, ' A photon (from Ancient Greek φῶς, φωτός (phôs, phōtós) 'light') is an elementary particle that is a quantum of the electromagnetic field, including electromagnetic radiation such as light and radio waves, and the force carrier for the electromagnetic force. Photons are massless, so they always move at the speed of light in vacuum, 299792458 m/s. . . . The photon belongs to the class of bosons.' back

Quantum entanglement - Wikipedia, Quantum entanglement - Wikipedia, the free encyclopedia, 'Quantum entanglement is a physical phenomenon which occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the other(s), even when the particles are separated by a large distance—instead, a quantum state must be described for the system as a whole. . . . Entanglement is considered fundamental to quantum mechanics, even though it wasn't recognized in the beginning. Quantum entanglement has been demonstrated experimentally with photons, neutrinos, electrons, molecules as large as buckyballs, and even small diamonds. The utilization of entanglement in communication and computation is a very active area of research.' back

Richard P. Feynman (1985), Quantum mechanical computers, ' We are here considering ideal machines; the effects of imperfections will be considered later. This study is one of principle; our aim is to exhibit some Hamiltonian for a system which could serve as a computer. We are not concerned with whether we have the most efficient system, nor how we could best implement it.' back

Robyn J. Whitaker (2024_09_06), For God and country: why the choice of next US president is a religious choice, too, ' As religion declines in the West, there is a danger that voters are less well equipped to evaluate how religion is used in political life and what constitutes a poor or dishonest use of Christian theology. While the interpretation of almost any biblical passage can be debated (and is), at the heart of Christianity is the term gospel, euanggelion, which literally means “good news”. When Jesus spoke about “good news”, he spoke about it in terms of release from captivity, the liberation of the oppressed, and favour for the poor (Luke 4). It is not a bad measure by which to assess political discourse that claims to be Christian. It can be tempting in the face of the diversity of Christian opinion to want to keep religion out of politics entirely. But the answer to bad theology is not an absence of theology. It is good theology. When Christianity is invoked in politics, people of all faiths – and none – should examine how the way it is being used sits alongside the central Christian message that God is love, and that the people at the very bottom of society always come first in Jesus’ teaching.' back

Router (computing) - Wikipedia, Router (computing) - Wikipedia, the free encyclopedia, 'A router is a networking device that forwards data packets between computer networks. Routers perform the traffic directing functions on the Internet. Data sent through the internet, such as a web page or email, is in the form of data packets. A packet is typically forwarded from one router to another router through the networks that constitute an internetwork (e.g. the Internet) until it reaches its destination node. . . . The most familiar type of IP routers are home and small office routers that simply forward IP packets between the home computers and the Internet. More sophisticated routers, such as enterprise routers, connect large business or ISP networks up to the powerful core routers that forward data at high speed along the optical fiber lines of the Internet backbone.' back

Second Vatican Council, Dogmatic Constitution on Divine Revelation 'Dei Verbum', Solemnly promulgated by His Holiness Pope Paul VI on November 18, 1965. ' Preface: 1. Hearing the word of God with reverence and proclaiming it with faith, the sacred synod takes its direction from these words of St. John: "We announce to you the eternal life which dwelt with the Father and was made visible to us. What we have seen and heard we announce to you, so that you may have fellowship with us and our common fellowship be with the Father and His Son Jesus Christ" (1 John 1:2-3). Therefore, following in the footsteps of the Council of Trent and of the First Vatican Council, this present council wishes to set forth authentic doctrine on divine revelation and how it is handed on, so that by hearing the message of salvation the whole world may believe, by believing it may hope, and by hoping it may love.' back

Second Vatican Council - Wikipedia, Second Vatican Council - Wikipedia, the free encyclopedia, ' The Second Vatican Ecumenical Council addressed relations between the Catholic Church and the modern world. It was the twenty-first ecumenical council of the Catholic Church and the second to be held at St. Peter's Basilica in the Vatican. The council, through the Holy See, formally opened under the pontificate of Pope John XXIII on 11 October 1962 and closed under Pope Paul VI on the Feast of the Immaculate Conception on 8 December 1965. . . . Vatican II was record-breaking. "[Its features] are so extraordinary [...] that they set the council apart from its predecessors almost as a different kind of entity": its massive proportions, its international breadth, the scope and variety of issues it addressed, its style, and the presence of the media. Its impact on the Church was huge.' back

Shield of the Trinity - Wikipedia, Shield of the Trinity - Wikipedia, the free encyclopedia, 'The Shield of the Trinity or Scutum Fidei is a traditional Christian visual symbol which expresses many aspects of the doctrine of the Trinity, summarizing the first part of the Athanasian Creed in a compact diagram. In late medieval England and France, this emblem was considered to be the heraldic arms of God (and of the Trinity).' back

SIPRI: Stockholm International Peace Research Institute, Role of nuclear weapons grows as geopolitical relations deteriorate—new SIPRI Yearbook out now, ' Nuclear arsenals being strengthened around the world The nine nuclear-armed states—the United States, Russia, the United Kingdom, France, China, India, Pakistan, the Democratic People’s Republic of Korea (North Korea) and Israel—continued to modernize their nuclear arsenals and several deployed new nuclear-armed or nuclear-capable weapon systems in 2023. Of the total global inventory of an estimated 12 121 warheads in January 2024, about 9585 were in military stockpiles for potential use (see the table below). An estimated 3904 of those warheads were deployed with missiles and aircraft—60 more than in January 2023—and the rest were in central storage. Around 2100 of the deployed warheads were kept in a state of high operational alert on ballistic missiles. Nearly all of these warheads belonged to Russia or the USA, but for the first time China is believed to have some warheads on high operational alert. ‘While the global total of nuclear warheads continues to fall as cold war-era weapons are gradually dismantled, regrettably we continue to see year-on-year increases in the number of operational nuclear warheads,’ said SIPRI Director Dan Smith. ‘This trend seems likely to continue and probably accelerate in the coming years and is extremely concerning.’ ' back

Spin (physics) - Wikipedia, Spin (physics) - Wikipedia, the free encyclopedia, 'In quantum mechanics and particle physics, spin is an intrinsic form of angular momentum carried by elementary particles, composite particles (hadrons), and atomic nuclei.
Spin is one of two types of angular momentum in quantum mechanics, the other being orbital angular momentum. Orbital angular momentum is the quantum-mechanical counterpart to the classical notion of angular momentum: it arises when a particle executes a rotating or twisting trajectory (such as when an electron orbits a nucleus). The existence of spin angular momentum is inferred from experiments, such as the Stern–Gerlach experiment, in which particles are observed to possess angular momentum that cannot be accounted for by orbital angular momentum alone.' back

Spin-statistics theorem - Wikipedia, Spin-statistics theorem - Wikipedia, the free encyclopedia, 'In quantum mechanics, the spin–statistics theorem relates the spin of a particle to the particle statistics it obeys. The spin of a particle is its intrinsic angular momentum (that is, the contribution to the total angular momentum that is not due to the orbital motion of the particle). All particles have either integer spin or half-integer spin (in units of the reduced Planck constant ħ). The theorem states that: The wave function of a system of identical integer-spin particles has the same value when the positions of any two particles are swapped. Particles with wave functions symmetric under exchange are called bosons. The wave function of a system of identical half-integer spin particles changes sign when two particles are swapped. Particles with wave functions antisymmetric under exchange are called fermions.' back

Theology - Wikipedia, Theology - Wikipedia, the free encyclopedia, 'Theology is the systematic and rational study of concepts of God and of the nature of religious truths, or the learned profession acquired by completing specialized training in religious studies, usually at a university, seminary or school of divinity. . . . During the High Middle Ages, theology was therefore the ultimate subject at universities, being named "The Queen of the Sciences" and serving as the capstone to the Trivium and Quadrivium that young men were expected to study. This meant that the other subjects (including Philosophy) existed primarily to help with theological thought.' back

Thomas Aquinas, Summa Theologiae, I, 1, 2, Is sacred doctrine a science?, 'I answer that, Sacred doctrine is a science. We must bear in mind that there are two kinds of sciences. There are some which proceed from a principle known by the natural light of intelligence, such as arithmetic and geometry and the like. There are some which proceed from principles known by the light of a higher science: thus the science of perspective proceeds from principles established by geometry, and music from principles established by arithmetic. So it is that sacred doctrine is a science because it proceeds from principles established by the light of a higher science, namely, the science of God and the blessed.' back

Turing machine - Wikipedia, Turing machine - Wikipedia, the free encyclopedia, ' A Turing machine is a hypothetical device that manipulates symbols on a strip of tape according to a table of rules. Despite its simplicity, a Turing machine can be adapted to simulate the logic of any computer algorithm, and is particularly useful in explaining the functions of a CPU inside a computer. The "machine" was invented in 1936 by Alan Turing, who called it an "a-machine" (automatic machine). The Turing machine is not intended as practical computing technology, but rather as a hypothetical device representing a computing machine. Turing machines help computer scientists understand the limits of mechanical computation.' back

Universe - Wikipedia, Universe - Wikipedia, the free encyclopedia, 'The Universe is all of spacetime and everything that exists therein, including all planets, stars, galaxies, the contents of intergalactic space, the smallest subatomic particles, and all matter and energy. Similar terms include the cosmos, the world, reality, and nature. The observable universe is about 46 billion light years in radius.' back

Wayne Enright, The Eigenvalue Problem, ' Wayne Enright's research is concerned with the development of reliable and robust numerical methods for the approximate solution of different classes of ordinary differential equations (ODEs) and integro-differential equations (IDEs). These problems often arise when modelling important applications in the physical and life sciences as well as in medicine and engineering.' back

Whitehead and Russell (1910), Principia Mathematica, Jacket: 'Principia Mathematica was first published in 1910-1913; this is the fifth impression of the second edition of 1925-7. The Principia has long been recognized as one of the intellectual landmarks of the century. It was the first book to show clearly the close relationship between mathematics and formal logic. Starting with a minimal number of axioms, Whitehead and Russell display the structure of both kinds of thought. No other book has had such an influence on the subsequent history of mathematical philosophy.' back

William Bialek & Allan Schweitzer (1985), Quantum Noise and the Threshold of Hearing, ' We argue that the sensitivity of the ear reaches a limit imposed by the uncertainty principle. This is possible only if the receptor holds the detector elements in a special nonequilibrium state which has the same noise characteristics as a ground (T = 0 K) state. To accomplish this "active cooling" the molecular dynamics of the system must maintain quantum mechanical coherence over the time scale of the measurement.' back

Xavier Waintal (2023_12_29), The Quantum House Of Cards, ' Quantum computers have been proposed to solve a number of important problems such as discovering new drugs, new catalysts for fertilizer production, breaking encryption protocols, optimizing financial portfolios, or implementing new artificial intelligence applications. Yet, to date, a simple task such as multiplying 3 by 5 is beyond existing quantum hardware. This article examines the difficulties that would need to be solved for quantum computers to live up to their promises. I discuss the whole stack of technologies that has been envisioned to build a quantum computer from the top layers (the actual algorithms and associated applications) down to the very bottom ones (the quantum hardware, its control electronics, cryogeny, etc.) while not forgetting the crucial intermediate layer of quantum error correction.' back

Zero-sum game - Wikipedia, Zero-sum game - Wikipedia, the free encyclopedia, ' Zero-sum game is a mathematical representation in game theory and economic theory of a situation which involves two sides, where the result is an advantage for one side and an equivalent loss for the other. In other words, player one's gain is equivalent to player two's loss, so the net improvement in benefit of the game is zero. If the total gains of the participants are added up, and the total losses are subtracted, they will sum to zero. Thus, cutting a cake, where taking a more significant piece reduces the amount of cake available for others as much as it increases the amount available for that taker, is a zero-sum game if all participants value each unit of cake equally.' back

 
 

https://www.cognitivecosmology.com is maintained by The Theology Company Proprietary Limited, ACN 097 887 075, ABN 74 097 887 075. Copyright 2000-2024 © Jeffrey Nicholls