There Are Not Enough Probabilistic Resources Available In The Universe To Reject God:
In other words, it would be utter foolishness to bet that God doesn’t exist, because the statistical and probabilistic evidence for His existence is overwhelming!
Today more than ever, science is learning that this universe has a beginning; that all the dimensions of this universe were created at some instant; that time and space themselves were created, and that therefore before their creation there was no time or physical space.
Science has also seen overwhelming evidence that the universe that was created is incredibly designed. All our experience and evidence teaches us that only intelligence is capable of designing complex interacting systems such as the cosmos and, at the microscopic level, the complex systems found in biological life, such as protein machines and propulsion motors.
In all the years of studying the Universe, in all the various domains from cosmology to biological systems to neuroscience, no example of complex and functional design has ever been found where we can even suspect that the design had no intelligence involved in its creation.
In fact, the exact opposite is the case. Wherever we find such design and are able to determine the ‘designer’, we do find an intelligent source. Indeed, many well-accepted, uncontroversial scientific disciplines are utterly dependent on detecting design, and on inferring the past actions of an intelligent agent by examining present evidence. Consider the following areas of science:
- Forensic Sciences, where a death is investigated to determine whether the person died by accident (i.e., chance/necessity) or by intent (i.e., murder).
- Cryptanalysis, where code breakers examine patterns of characters to determine whether they convey a message or are simply random and meaningless noise.
- Archaeology, where artifacts are examined to determine whether they were fashioned by man or by nature. Is the rock just a stone, or a tool?
- Arson investigation, where one attempts to discern from charred remains whether the fire was set intentionally (by design) or resulted from a frayed wire (chance/necessity).
- Copyright infringement and plagiarism, where scientists examine writings to determine whether they were accidentally or intentionally similar to the work of others.
- The Search for Extraterrestrial Intelligence (SETI), where researchers examine signals from space to determine whether they carry the hallmarks of an intelligent sender or are merely natural noise.
All these disciplines make the same inferences that are made from the study of cosmology and astronomy, and from the study of biological information systems.
For example, it is well accepted that the motor in the ATP synthase enzyme is the most efficient motor ever seen, with an efficiency close to 100%. It is remarkable to note the similarity between the structure and general operation of ATP synthase and a man-made rotary motor. This similarity extends even to the Brownian motor located within the ATP synthase rotary motor, a molecular-scale machine that drives ATP production.
Another paper (von Ballmoos et al., 2009) states:
“The rotational mechanism of the ATP synthase demands ingeniously designed interfaces between rotor and stator subunits, particularly between the rotating c ring and the laterally abutted subunit a, because rotation speeds up to 500 Hz must be tolerated in the absence of a stabilizing rotor axis. This proteinous interface also acts as the critical scaffold for torque generation and ion translocation across the membrane. To prohibit charge translocation without rotation, ion leakage at the interface must be efficiently prevented.”
Another good example is detailed in this paper titled: ‘Sequence-Specific Peptide Synthesis by an Artificial Small-Molecule Machine’ Science, Vol. 339 no. 6116 pp. 189-193 (11 January 2013):
“Here, we report on the design, synthesis, and operation of a rotaxane-based small-molecule machine in which a functionalized macrocycle operates on a thread containing building blocks in a predetermined order to achieve sequence-specific peptide synthesis. The design of the artificial molecular machine is based on several elements that have analogs in either ribosomal or nonribosomal protein synthesis: Reactive building blocks (the role played by tRNA-bound amino acids) are delivered in a sequence determined by a molecular strand (the role played by mRNA). A macrocycle ensures processivity during the machine’s operation (reminiscent of the way that subunits of the ribosome clamp the mRNA strand) and bears a catalyst–a tethered thiol group–that detaches the amino acid building blocks from the strand and passes them on to another site at which the resulting peptide oligomer is elongated in a single specific sequence, through chemistry related to nonribosomal peptide synthesis.”
They write that their machine “is a primitive analog of the ribosome.” An analog, in this case, being a copy: a copy of a far more sophisticated design.
To create such complex, even if primitive, molecular machines requires these scientists to generate the complex and specified information of their designs, which is then used in making the machine. Information that reliably indicates design exhibits high levels of this ‘complex and specified information’ (or ‘specified complexity’).
Dr Stephen Meyer points out that “We have repeated experience of rational and conscious agents — in particular ourselves — generating or causing increases in complex specified information, both in the form of sequence-specific lines of code and in the form of hierarchically arranged systems of parts. … Our experience-based knowledge of information-flow confirms that systems with large amounts of specified complexity (especially codes and languages) invariably originate from an intelligent source from a mind or personal agent.” – ‘The origin of biological information and the higher taxonomic categories’, Proceedings of the Biological Society of Washington, Vol. 117(2):213-239 (2004).
Science and our own personal experiences have shown us that only intelligence is capable of creating such ‘prescriptive’ information.
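The two components of ‘complex specified information’ can be illustrated with a toy sketch. This is not Dembski’s formal measure, and the pattern convention and sequences below are hypothetical: a specific 100-residue protein sequence drawn uniformly from the 20 amino acids has a chance probability of roughly 1 in 10^130 (complexity), and a sequence is ‘specified’ when it matches an independently given pattern.

```python
import math

# Toy illustration of 'complex' and 'specified' information.
# (A sketch only -- not Dembski's formal measure.)

# Complexity: the chance probability of one specific 100-residue
# protein sequence, drawn uniformly from the 20 amino acids.
alphabet_size = 20
length = 100
log10_prob = -length * math.log10(alphabet_size)
print(f"Chance probability of one specific sequence: ~10^{log10_prob:.0f}")
# prints: Chance probability of one specific sequence: ~10^-130

# Specification: whether a sequence matches an independently given
# pattern. (Using 'x' as a wildcard is a hypothetical convention here.)
def matches_pattern(sequence: str, pattern: str) -> bool:
    return len(sequence) == len(pattern) and all(
        p == "x" or p == s for s, p in zip(sequence, pattern)
    )

print(matches_pattern("MKT", "MxT"))  # True: fits the pattern
print(matches_pattern("QQQ", "MxT"))  # False: no match
```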
One of the intriguing ways that recent science has further demonstrated the truth of these findings is in the use of reverse engineering.
From the very large aspects of the universe (i.e., big bang cosmology and galactic and stellar evolution) to the very small (i.e., the fitness of the chemical elements and the coding of DNA for life), the cosmos is so readily and profitably reverse engineered by its human inhabitants as to suggest that it was all engineered from the beginning.
“The linking of extraordinarily complex, but stable and functional structures with the production of value provides the strong impression of a calculating intentionality, which is apparently able to operate in a transcendent (overriding, overarching) fashion” – D. Halsmer, J. Asper, N. Roman, T. Todd, “The Coherence of an Engineered World,” International Journal of Design & Nature and Ecodynamics, Vol. 4(1):47-65 (2009)
The most coherent view of the universe is that of a system of interdependent subsystems that efficiently interact to prepare for, develop, and support advanced life, subject to various physical constraints.
Similarly, human-engineered systems are characterized by stability, predictability, reliability, transparency, controllability, efficiency, and (ideally) optimality.
These features are also prevalent throughout the natural systems that make up the cosmos. However, the level of engineering appears to be far above and beyond, or transcendent of, current human capabilities, as well as having been in place long before human beings developed any such sophisticated systems.
Along with the overwhelming evidence that the production of ‘complex specified information’ requires an intelligent source, and the growing appreciation that the very best of human-designed systems prove to be only primitive analogs of existing biological and astronomical systems, there is the apparent, but unexpected, match between the comprehensibility of the universe and the ability of mankind to comprehend it.
This unexplained matching is actually a prerequisite for any kind of reverse engineering activity to be even remotely successful. That is, we can’t effectively design a copy of a machine or system if we don’t understand how it works. Such understanding, while sufficient to build ‘primitive’ copies, has to date normally proven inadequate to come close to matching the efficiency and effectiveness of the machines or systems being ‘copied’ via such reverse engineering.
Even more intriguing, it seems, is that we appear to be progressing in a step-by-step fashion, both in our ability to reverse engineer and in our subsequent ability to design and produce our ‘copies’. That is, when we reflect on this step-by-step progress, it appears as if we have been led forward in our understanding and wisdom in a tutorial-like manner as the puzzles of nature slowly unravel before us.
For example, science was able to progress to Einstein’s theory of relativity in part because there existed an elegant mathematical description of gravity that approximated reality, namely the one Newton formulated. In a similar manner, much of today’s progress in technology can be traced back to the successes of Einstein and others at the turn of the last century.
The universe has proven so readily and profitably reverse engineered as to make a compelling argument that it was engineered in the first place, and further that this engineered design was built with humanity in mind.
It may help to appreciate that ‘reverse engineering’ is similar to the historical sciences, which essentially proceed by inferring history from its results; that is they reason from clues back to causes.
Further than this, they investigate various hypotheses to see which hypothesis, if true, would best explain the known data.
This may sound simple, but where there are a number of possibly adequate competing hypotheses, it can prove very difficult. Also, to establish a causal claim (that is, a valid and logically consistent link between the ‘probable’ events of the past and our current understanding or interpretation), this scientific approach requires the identification of three things:
- Evidence that the cause proposed was present;
- Evidence that on other occasions it has demonstrated the capacity to produce the effect under study, and
- That there is an absence of evidence, despite a thorough search, of any other possible causes.
In investigating the apparent design in nature with this approach, we may struggle to explicitly establish that the ‘cause’ was present (namely, the Creator or Intelligent Designer), but we are on strong ground for point 2, in that the evidence of human design does demonstrate such capacity. In terms of point 3, we have not been able to find any other possible or plausible causes. Therefore, the analogous nature of our evidence from point 2 certainly gives strong circumstantial evidence for the existence of the Cause of point 1, the Intelligent Designer.
This is also the case with examples like the ATP synthase motor and the bacterial flagellum motor. While their incredibly sophisticated and superior designs do not prove an Intelligent Designer, there are no other known or even hypothetical causes behind such designs.
The biological systems that these motors are part of are in fact, superior analogs of today’s computer systems:
“In each cell, there are multiple Operating Systems, multiple programming languages, encoding/ decoding hardware and software, specialized communications systems, error detection and correction mechanisms, specialized input/output channels for organelle control and feedback, and a variety of specialized “devices” to accomplish the tasks of life.”
– ‘Programming of Life’ by Dr. Donald E Johnson
Science has also learned that all living beings contain a blueprint, a code that determines their design, their structure, their function, etc. We now know that code is in the DNA and RNA of the cells of living organisms.
While there is still a great deal we don’t know about the design of biological systems and the ‘coding’ used in them (and, for example, about the complexities of neuroscience), the evidence continues to grow that such complex specified information and functional design is the result of an intelligent designer.
With respect to our lack of knowledge, Nobel prize-winner David Hubel of Harvard University (Medicine 1981; research on information-processing in the visual system) wrote in 1995: “… This abiding tendency for attributes such as form, colour and movement to be handled by separate structures in the brain immediately raises the question how all the information is finally assembled, say, for perceiving a bouncing red ball. These obviously must be assembled—but where and how, we have no idea.”
Since he wrote this, very little progress has been made towards answering the question he posed (certainly none of his further papers answer it). The point being that there is still much to learn. However, what we do learn only confirms the incredible design involved, which far exceeds our capabilities even today. We also continue to find no other causal agent, even in principle, that can adequately explain such design.
The conclusion that an object has been engineered is a result of the success of reverse engineering and the consequential success of human-designed analogs (almost all of which are only pale comparisons). Whether it is our cameras that mimic the human eye, our memory storage techniques that are still a trillion times less dense than DNA storage, or our various but much less efficient motors, all these designs are still not comparable in functionality and sophistication. The biological objects and systems that we are making analogs of clearly have purpose, in the same way that our ‘copies’ are designed with a purpose or ‘goal in mind’.
To repeat, when we look for evidence of plausible alternatives for the existence of such engineered systems, we cannot find any. The great weight of evidence for any complex machine (like a car) is that the machine was designed. When the design is far better than the very best that humans can so far achieve, the inference is even stronger.
The Anthropic Principle:
The anthropic principle (first proposed in the early 1970s) states that the universe appears “designed” for the sake of human life. More than a century of astronomy and physics research (most especially new evidence found since 1998) yields this unexpected observation:
– the emergence of humans and human civilization requires physical constants, laws, and properties that fall within certain narrow ranges
– and this truth applies not only to the cosmos as a whole but also to the galaxy, planetary system, and planet humans occupy.
To state the principle more dramatically, a preponderance of physical evidence points to humanity as the central theme of the cosmos.
While this is an inference from the best evidence (meaning it could conceivably be shown to be false), to date the evidence from the study of the universe continues, on a daily basis, to confirm the reasonableness of this inference. When all of the factors that are at least somewhat understood are considered together, the prospects of a universe evolving that is suitable for human life turn out to be astronomically small.
Oxford physicist Roger Penrose said one parameter, the ‘original phase-space volume’, required fine-tuning to an accuracy of one part in ten billion multiplied by itself one hundred and twenty-three times.
Penrose remarked that it would be impossible even to write down that number in full, since it would require more zeroes than there are elementary particles in the entire universe! This showed, he said, ‘the precision needed to set the universe on its course.’
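A minimal arithmetic sketch shows why the number cannot be written out. Penrose’s figure is usually rendered as 1 in 10^(10^123), so the decimal expansion needs about 10^123 digits, while the commonly cited estimate for the observable universe is only about 10^80 elementary particles:

```python
# Penrose's fine-tuning figure is usually written as 1 part in 10^(10^123).
# Writing that number out in decimal requires about 10^123 digits.
# Commonly cited estimate: ~10^80 elementary particles in the
# observable universe -- far fewer than the digits required.

digits_required_exp = 123   # digits needed ~ 10^123
particles_exp = 80          # particles available ~ 10^80

# Even if every particle could hold one digit, we would fall short
# by a factor of 10^(123 - 80) = 10^43.
shortfall_exp = digits_required_exp - particles_exp
print(f"Digits needed: ~10^{digits_required_exp}")
print(f"Particles in observable universe: ~10^{particles_exp}")
print(f"Shortfall factor: ~10^{shortfall_exp}")
```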
Support for the anthropic principle comes from an unwavering and unmistakable trend line within the data: the more astronomers learn about the universe and the requirements of human existence, the more severe the limitations they find governing the structure and development of the universe to accommodate those requirements. In other words, additional discoveries are leading to more indicators of large-scale and small-scale fine-tuning.
In 1961, astronomers acknowledged just two characteristics of the universe as “fine-tuned” to make physical life possible. The more obvious one was the ratio of the gravitational force constant to the electromagnetic force constant.
It cannot differ from its value by any more than one part in 10^40 (one part in ten thousand trillion trillion trillion) without eliminating the possibility for life. Today, the number of known cosmic characteristics recognized as fine-tuned for life (any conceivable kind of physical life) stands at around 38.
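One common way to see the scale of this force-constant ratio is to compute, from standard physical constants, the electrostatic versus gravitational attraction between a proton and an electron; the result lands near 2 × 10^39. This is a sketch of that often-cited figure, not of the precise fine-tuning parameter itself:

```python
import math

# Physical constants (SI units, rounded CODATA-style values)
G   = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
k_e = 8.988e9      # Coulomb constant, N m^2 C^-2
e   = 1.602e-19    # elementary charge, C
m_p = 1.673e-27    # proton mass, kg
m_e = 9.109e-31    # electron mass, kg

# Both forces fall off as 1/r^2, so their ratio is distance-independent.
ratio = (k_e * e**2) / (G * m_p * m_e)
print(f"Electromagnetic/gravitational force ratio: {ratio:.2e}")
print(f"Order of magnitude: 10^{round(math.log10(ratio))}")
```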
Of these, the most sensitive is the ‘space energy density’ (the self-stretching property of the universe). Its value cannot vary by more than one part in 10^120 and still allow for the kinds of stars and planets physical life requires.
Evidence of specific preparation for human existence shows up in the characteristics of the solar system, as well. In the early 1960s astronomers could identify just a few solar system characteristics that required fine-tuning for human life to be possible. By the end of 2001, astronomers had identified more than 150 finely-tuned characteristics. In the 1960s the odds that any given planet in the universe would possess the necessary conditions to support intelligent physical life were shown to be less than one in ten thousand.
By 2001 those odds had shrunk to less than one in 10^173, a number so large it might as well be infinity.
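To illustrate how such odds compound (with purely hypothetical numbers, not the astronomers’ actual per-characteristic estimates): if roughly 150 independent conditions each had even a 7% chance of being met, the joint probability would already be about 1 in 10^173.

```python
import math

# Illustrative sketch of how ~150 independent requirements compound.
# Assumption: each condition is independent, with the same hypothetical
# probability p of being satisfied by chance.
n_conditions = 150
p_each = 0.07   # hypothetical per-condition probability (~7%)

# Joint probability = p^n; work in log10 to avoid floating-point underflow.
log10_joint = n_conditions * math.log10(p_each)
print(f"Joint probability ~ 10^{log10_joint:.0f}")
```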
As Sir Fred Hoyle commented, “A commonsense interpretation of the facts suggests that a super-intellect has monkeyed with physics, as well as chemistry and biology, and that there are no blind forces worth speaking about in nature.”
In the opinion of physicist Paul Davies, “The impression of design is overwhelming.”
Physics today accepts that some model of ‘Big Bang’ cosmology is the correct model for the creation of the universe from nothing, and that this event was not chaotic or disorderly. Instead, it appears to have been fine-tuned for the existence of intelligent life with a complexity and precision that literally defies human comprehension.
In other words, the universe we see today, and our very existence, depend upon a set of highly special initial conditions. This phenomenon is strong evidence that the ‘Big Bang’ was not an accident, but that it was designed.
The Big Bang model is the standard paradigm of contemporary cosmology; its broad framework is very securely established as a scientific fact. Stephen Hawking has said, ‘Almost everyone now believes that the universe, and time itself, had a beginning.’
What some renowned physicists say:
Tony Rothman, (a theoretical physicist):
The medieval theologian who gazed at the night sky through the eyes of Aristotle and saw angels moving the spheres in harmony has become the modern cosmologist who gazes at the same sky through the eyes of Einstein and sees the hand of God not in His angels but in the constants of nature. . . . When confronted with the order and beauty of the universe and the strange coincidences of nature, it’s very tempting to take the leap of faith from science into religion. I am sure many physicists want to. I only wish they would admit it.
Bernard Carr (cosmologist):
One would have to conclude either that the features of the universe invoked in support of the Anthropic principle are only coincidences or that the universe was indeed tailor-made for life. I will leave it to the theologians to ascertain the identity of the tailor!
Stephen Hawking has similarly remarked: “It would be very difficult to explain why the universe should have begun in just this way, except as the act of a God who intended to create beings like us.”
Allan Sandage, winner of the Crafoord prize in astronomy (equivalent to the Nobel prize), remarked,
“I find it quite improbable that such order came out of chaos. There has to be some organizing principle. God to me is a mystery but is the explanation for the miracle of existence, why there is something instead of nothing.”
Robert Griffiths, who won the Heinemann prize in mathematical physics, observed,
“If we need an atheist for a debate, I go to the philosophy department. The physics department isn’t much use.”
Astrophysicist Robert Jastrow, a self-proclaimed agnostic:
“For the scientist who has lived by his faith in the power of reason, the story ends like a bad dream. He has scaled the mountains of ignorance; he is about to conquer the highest peak; as he pulls himself over the final rock, he is greeted by a band of theologians who have been sitting there for centuries.”
The evidence for an Intelligent Designer may not be conclusive in the sense that mathematics tells us two plus two equals four, but it is a cumulative argument. The extraordinary fine-tuning of the laws and constants of nature, their beauty, their discoverability, their intelligibility, all combine to make the Intelligent Designer hypothesis the most reasonable choice we have. All other theories fall short.
“Rather than being one planet among billions, Earth now appears to be the uncommon Earth. The data imply that Earth may be the only planet ‘in the right place at the right time’.”
– ‘Chance Or Dance: An Evaluation of Design’ By Jimmy H. Davis, Harry L. Poe.
It appears that the evidence for a Designer and Creator of the Universe grows daily and exponentially. So accepting that there is a Creator, a God or perhaps Gods, behind it all, the next valid question may be, is he interested in us?
Recognizing that humanity is the pinnacle of creation, and that the human brain and the human mind are the most complex and most intelligent creations in the universe, we immediately start to sense that this Universe was created with mankind in mind.
There is much cosmological evidence to support this contention; from the unique placement of our Solar System and of Planet Earth, as the book ‘The Privileged Planet’ explains, to the unique time in the evolution of the cosmos that allows us to be in the perfect epoch of time to investigate it.
“The remarkable cosmic coincidence that we happen to live at the only time in the history of the universe when the magnitude of dark energy and dark matter densities are comparable has been a source of great current speculation, leading to a resurgence of interest in possible anthropic arguments limiting the value of the vacuum energy. But this coincidence endows our current epoch with another special feature, namely that we can actually infer both the existence of the cosmological expansion, and the existence of dark energy.
Thus, we live in a very special time in the evolution of the universe: the time at which we can observationally verify that we live in a very special time in the evolution of the universe!
Observers when the universe was an order of magnitude younger would not have been able to discern any effects of dark energy on the expansion, and observers when the universe is more than an order of magnitude older will be hard pressed to know that they live in an expanding universe at all, or that the expansion is dominated by dark energy. By the time the longest lived main sequence stars are nearing the end of their lives, for all intents and purposes, the universe will appear static, and all evidence that now forms the basis of our current understanding of cosmology will have disappeared.”
– ‘The Return of a Static Universe and the End of Cosmology’
by Lawrence M. Krauss and Robert J. Scherrer (June 27, 2007)
“The idea that the natural world was designed especially for mankind is the very bedrock of the Greek, as well as of the Judeo-Christian world view. Western philosophers of the post-Roman era went so far as to formalize a discipline called teleology —the study of the evidence for overall design and purpose in nature. Teleology attracted such luminaries as Augustine, Maimonides, Aquinas, Newton and Paley, all of whom gave it much of their life’s work.” – ‘Design and the Anthropic Principle’ by Hugh Ross
The scientists who place their faith in materialism are, despite the great funding and resources that they enjoy, making limited progress in their research.
Most of the ground-breaking research especially in biological systems and neuroscience is coming from those who do assume design.
A good example is the research of brain surgeon Dr Michael Egnor. In trying to understand cerebral blood flow, and how the brain is buffered from the force of blood pumped by the heart, he looked to human-engineered pumps that do the same thing. Once he understood how they worked, he was able to find and explain how the brain does a similar thing (another use of the principle of reverse engineering).
For more details on the scientific evidence for God; for the Creator and Intelligent Designer of this Universe I recommend my series of Lessons on Intelligent Design – see the Intelligent Design tab at www.circumcisedheart.info
 Something is complex if it is unlikely, and it is specified if it matches a pre-existing pattern.
 Two excellent books that go into detail on these issues are William Dembski’s ‘The Design Revolution’ and Stephen C Meyer’s ‘Signature in the Cell’
 Scientists now have a good understanding of what the basic requirements are for human life – essentially carbon, water, oxygen, & energy but within an extremely narrow range of values.
 William Dembski, in his ‘The Design Revolution’, shows most convincingly that any probability of less than 1 in 10^150 is as good as impossible. That is, the event cannot possibly have happened by chance, even given the resources of the full history of time and space of the Universe. That is, if something exists and its likelihood of existing is less than 1 in 10^150, then it must only exist because it was created.
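Dembski’s 1-in-10^150 figure, the ‘universal probability bound’, is usually derived as a simple product of three commonly cited estimates, sketched below:

```python
# Dembski's universal probability bound, as usually derived:
# the maximum number of elementary events possible in cosmic history.
particles_exp = 80   # ~10^80 elementary particles in the observable universe
rate_exp = 45        # ~10^45 state transitions per second (inverse Planck time)
time_exp = 25        # ~10^25 seconds, a generous upper bound on cosmic history

# Multiplying powers of ten means adding their exponents.
bound_exp = particles_exp + rate_exp + time_exp
print(f"Maximum elementary events: ~10^{bound_exp}")
print(f"Universal probability bound: 1 in 10^{bound_exp}")
```

Any event less probable than 1 in 10^150 therefore exhausts every opportunity the universe could ever have offered for it to occur by chance.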