There are a whole lot of people out there (probably the majority) who believe, unequivocally, that scientists are capable of dating rocks, fossils, and the earth with a reasonable amount of certainty. Why wouldn’t we believe that? It’s what we’re taught in school, and the evidence presented to us seems unarguable. So, when we hear of alternate views- such as young-earth creationism, in which the earth is somewhere in the neighborhood of 7,000 years old based on Biblical chronologies- it sounds completely ridiculous to us. I mean, our middle school science books explained that scientists have methods to calculate absolute dates within an acceptable range with astounding accuracy. It is declared- the evidence has spoken, and it proclaims ages in the billions of years.
As a Bible-believing Christian, this leaves you with one of two options: either the creation account in the Bible cannot be taken literally, or these scientific dating methods are erroneous. Since many people believe the science is concrete (I’ve even heard belief in a young earth compared to belief in a flat earth- ouch!), we get really creative (and frankly complicated) with our reading of the creation account to make it jibe with the findings of modern-day science, which has resulted in many different hypotheses to add in the extra time. For a look at the theories we Christians have come up with, you can check out my article What in the World Happened Between Genesis 1:1 and 1:2?
Now I’m not saying that the Bible is meant to be a science book. Clearly, it isn’t. But what are the ramifications when we start discounting certain elements of the Bible that don’t corroborate our current scientific understanding? For example, embracing evolution as Biblical means that there could not have been a literal Adam and Eve- just think about the ramifications of that on the rest of the Bible. In our newly altered reading and understanding of the text, what else do we end up compromising on? Again, I’m not saying we should toss out any science that doesn’t seem to go along with our understanding of the Bible. On the contrary, I believe that both the Bible (a proper understanding of it) and real science (not blind acceptance of fallible theory) go together hand in hand- even if we haven’t figured out how just yet.
So, what about that second possibility? Could the problem actually be with our scientific dating methods? If you do a little research into the subject, you’ll find that this possibility is not as crazy as you might think.
Quoting from an article in the July/August 2016 edition of Discover Magazine, “When it comes to determining the age of stuff scientists dig out of the ground, whether fossil or artifact, ‘there are good dates and bad dates and ugly dates,’ says paleoanthropologist John Shea of Stony Brook University. The good dates are confirmed using at least two different methods, ideally involving multiple independent labs for each method to cross-check results. Sometimes only one method is possible, reducing the confidence researchers have in the results. And ugly dates? ‘They’re based on “it’s that old because I say so,” a popular approach by some of my older colleagues,’ says Shea, laughing, ‘though I find I like it myself as I get more gray hair.’”
I don’t recall such a candid admission being recorded in my science textbooks. Even though Shea is obviously speaking tongue in cheek, I think there is an important truth here that shouldn’t be overlooked: our dating methods are not exact enough to be used alone. In order to be considered “good,” more than one method must be used, and if more than one method isn’t possible, then the confidence in the results is greatly reduced. What I do not see, however, is anyone treating these dates as anything less than gospel truth. We (the public) probably place so much faith in these methods because their limitations are never properly disclosed to us.
The article goes on to explain that there are two categories of dating: relative and absolute. Before scientists developed our absolute methods of dating, relative dating methods were used. As the article points out, “Think of it as ordering rather than dating. Basically, scientists could only determine where an item belonged within a particular sequence (think layers of time represented by different sediments or rocks).”
Scientists could, for example, say that one particular item is older or younger than another based on these relative methods, but they could not assign it a numerical age. These relative methods are still used today, but now scientists are able to attach a numerical age range to them using the absolute dating methods at our disposal.
Do you see the weakness here? If the absolute dating method that dates the area in which the fossil or artifact is found is incorrect- so is the age range applied to the artifact. You see, fossils and artifacts themselves are not usually directly dated because the process of absolute dating actually destroys part of the specimen. So, scientists use “index fossils”, which are defined as forms of life that existed during limited periods of geologic time and are thus used as guides to the age of the rocks in which they are preserved.
It’s really pretty disturbing when you think about it. An age range is assigned to a particular layer based on our absolute dating methods; then the artifacts discovered within this layer are assigned an age range; and anything else compared to those artifacts is assigned a range based on the one they were given. This is the very definition of circular reasoning. Far from being exact or reliable, it is open to major error. If one mistake is made in dating, it translates into multiple mistakes on down the line. This is why scientists try to use more than one method (when possible)- to avoid this rabbit hole. In fact, when the multiple dating methods don’t agree with each other (which happens a lot), the scientists “calibrate” their results. It’s all a big “guesstimate.”
But how reliable are our absolute dating methods? After all, an awful lot of assumptions hang on them.
Today, radiometric dating is considered absolute dating. Several different methods fall under the umbrella of radiometric dating, but here is the basic concept: Each chemical element is made up of atoms. (Think all the way back to that periodic table in Chemistry class.) Some variants of these elements (isotopes) are unstable, and over time they decay and turn into a different element (a process called radioactive decay). The original element is called the “parent” element and the resulting element is called the “daughter” element. To find the age of a particular rock, scientists first measure the amount of parent and daughter elements it contains, then apply the rate of decay for the particular element being measured (characterized by its “half-life”). Entering this information into a formula yields the age of the rock.
Think about it this way. When you were first introduced to algebra, you realized, to your horror, that a math problem could contain letters. An extremely simple example would be: 2a + 3 = 7. In order to find the value of a, you are given the number it is multiplied by (2), the number added to it (3), and the number they equal (7). These given numbers are the constants, and you have to know their values in order to solve for a (here, a = 2).
Although the scientific equations that calculate the age of rocks are obviously much more complicated, the same principle applies. In order to solve for the correct age of the rock, you must “plug in” certain constants that are known to be true. And herein lies the rub…
Radiometric dating techniques depend on three unreasonable assumptions, which are plugged into the equations as constants. I’ll go ahead and state the obvious here: your equation will never yield a correct answer if the constants you are entering into it are incorrect. It doesn’t matter how accurate your equation is- wrong data = wrong answer.
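To make the algebra analogy concrete, here is a minimal sketch of the textbook decay-age calculation. (The function name and the potassium-40 half-life figure are illustrative choices of mine; the formula itself is the standard one, and notice that it bakes in exactly the assumptions at issue: a known, constant decay rate, a closed system, and no daughter isotope at formation.)

```python
import math

def radiometric_age(parent, daughter, half_life):
    """Textbook decay-age formula: t = (half_life / ln 2) * ln(1 + D/P).

    Only valid under the three assumptions discussed in this article:
    a constant decay rate, a closed system, and zero daughter isotope
    present when the rock formed.
    """
    decay_constant = math.log(2) / half_life  # lambda = ln(2) / t_half
    return math.log(1 + daughter / parent) / decay_constant

# Equal parts parent and daughter means exactly one half-life has elapsed.
# With the potassium-40 half-life of about 1.25 billion years:
print(radiometric_age(parent=1.0, daughter=1.0, half_life=1.25e9))  # 1.25e9 years
```

Change any one of those built-in assumptions (say, a faster decay rate in the past, or some daughter isotope present at formation) and the very same measurement produces a very different age.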
So what are these assumptions?
The first one is a biggie: The rate of radioactive decay is known and has been constant since the rock formed. While it is true that radioisotope decay rates seem to be constant today, to make the assumption that radioisotope decay rates have always been constant throughout history (according to their theory- constant for billions of years) is unreasonable.
As a matter of fact, we now have evidence that at some point (or points) in the past we have experienced accelerated rates of decay. The RATE group (Radioisotopes and the Age of The Earth) discovered this when examining zircon crystals. Dr. Jeff Miller explains their findings in an article for Apologetics Press, “The RATE team had several zircon crystals dated by expert evolutionists using the uranium-lead evolutionary dating technique and found them to be 1.5 billion years old, assuming a constant decay rate. A by-product of the breakdown of uranium into lead is helium. Content analysis of the crystals revealed that large amounts of helium were found to be present. However, if the crystals were as old as the dating techniques suggested, there should have been no trace of helium left, since helium atoms are known to be tiny, light, unreactive, and able to easily escape from the spaces within the crystal structure.” According to Roger Patterson writing for Answers in Genesis, “Helium escapes from crystals at a known, measurable rate. If those rocks were over a billion years old, as evolutionists claim, the helium should have leaked out of the rock.”
But this isn’t even the only evidence for fluctuating decay rates. New Scientist reported in 2009 that physicist David Alburger found that the nuclear decay rate of silicon-32 actually changed with the seasons (article cited here: http://www.icr.org/article/5656/259). Similarly, a Purdue physicist found that nuclear decay rates speed up during the winter. The decay rates appear to be altered by the sun, but researchers are unsure exactly how- possibly through an unknown particle that the sun emits (link to the Purdue findings: http://www.purdue.edu/newsroom/research/2010/100830FischbachJenkinsDec.html).
The second assumption is that the amounts of parent and daughter isotopes contained in a rock have not been altered (none gained or lost) by anything other than radioactive decay. This means that the amounts of the elements in the rock sample have never been affected by outside sources. In science lingo, this is called a “closed system.” So, in order to arrive at a correct date, this assumption requires that the elements in the rock sample have never- in the course of billions of years (as proposed by scientists)- been affected by weathering due to groundwater, diffusion of gases, lava flows, floods, mudslides, meteorite activity, or anything else.
Dr. Miller notes, “To suggest a closed system for a specimen that is believed to be very old is a reckless, unreasonable assumption, (1) when there is clear evidence that a closed system cannot be guaranteed, and (2) when, in fact, there is compelling evidence that ancient Earth was rocked by global catastrophe that most certainly would have violated the ‘closed system’ assumption and created an extremely ‘complex geological history.’”
The third assumption is that the original amounts of parent and daughter isotopes that were present when the rock was formed are known. More specifically, that the rock initially contained only the parent isotope and none of the daughter isotope.
I’ll quote Dr. Miller again, “But how could one possibly substantiate an assumption about the initial conditions of a specimen’s decay process, especially when the commencement of its decay was hundreds or thousands (or according to evolutionists, millions or billions) of years ago? Is it not possible, and even likely, that a specimen might have been initially composed of more than one element that blended together during a geologic phenomenon before that rock’s decay processes began?” This assumption cannot be substantiated since no one was present when these rocks formed and frankly such an assumption is illogical, especially when extrapolating billions of years in the past.
These three assumptions are just the tip of the iceberg. They represent a fundamentally flawed, but universally accepted geological assumption called “uniformitarianism” which is the basis for most of the evidence for an extremely old earth and universe.
Where did the theory of uniformitarianism come from? After all, prior to the 1800s, catastrophism ruled the “science” of the day. Charles Kimball gives a short history in his paper The Genesis Chronicles, which I’ll summarize here. In 1795, James Hutton published his book Theory of the Earth, in which he described the concept of uniformitarianism. In his work, he ignored the evidence for a catastrophic beginning of the solar system being put forth by the astronomers of his day, such as the moon’s craters, the rings of Saturn, and comets. His theory was actually not very well received at the time, because just after Hutton died, the first asteroid was discovered. (More evidence of a catastrophe.) Hutton’s theory had, however, attracted the attention of a man named Charles Lyell. Lyell was heavily influenced by William Smith, the geologist who first suggested that rocks could be dated according to their position (younger rocks will always be on top of older ones, and rocks that contain similar fossils are probably the same age). Lyell believed that it took millions of years for any geologic process to occur. Lyell also created the geologic time scale that appears (largely unchanged) in our textbooks today.
Not that Lyell’s theory isn’t logical. It is- to a certain extent. However, even from the beginning, Lyell altered evidence to corroborate his theory instead of letting the evidence drive his theory. Here is one example: Lyell theorized that an ice age had occurred around 1 million BC. Niagara Falls and the Great Lakes were actually created by advancing glaciers, and Niagara Falls erodes at a measurable rate. In order to corroborate his theory, Lyell decided to measure the distance the Falls had receded from their original starting position at the entrance to Lake Ontario. Residents of the area informed Lyell that the Falls receded at a rate of about 3 feet per year, but this created a big problem for Lyell’s ice age theory, because at that rate only about 12,000 years were needed for the Falls to reach the position they occupied. Whoops! So, did Lyell revisit his theory? No. Instead, Lyell decided the residents were mistaken in their observations and concluded that the Falls actually receded at a rate of 1 foot per year, which allowed him to date the end of the ice age at 35,000 years ago- in accordance with his theory.
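The arithmetic behind that discrepancy is simple enough to check. A quick sketch (the ~35,000-foot gorge length is my illustrative figure, chosen to be consistent with the numbers above, not a surveyed value):

```python
# Distance the Falls have receded from their starting point at the
# entrance to Lake Ontario, in feet (roughly 7 miles; an assumption
# consistent with the figures quoted above).
gorge_length_ft = 35_000

observed_rate_ft_per_yr = 3.0  # recession rate reported by residents
lyell_rate_ft_per_yr = 1.0     # Lyell's revised figure

# Elapsed time = distance / rate.
print(gorge_length_ft / observed_rate_ft_per_yr)  # ~11,667 years: roughly 12,000
print(gorge_length_ft / lyell_rate_ft_per_yr)     # 35,000 years: matches his date
```

Same gorge, same evidence- divide by a different rate and you get whichever age your theory needs. That is the point of the example.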
Surely, I don’t have to point out that this is not the way science is supposed to work. Evidence should drive theory, not the other way around.
Indeed, Darwin latched onto Lyell’s uniformitarian theory, applied it to biology, and the theory of evolution was born. Incidentally, there is even less scientific evidence to corroborate evolution, but since it is the only explanation for the existence of humanity outside of a Creator, secular science has latched onto it like a life preserver. The common denominator between both of these theories is the requirement of billions of years to make them feasible- which is why any scientific evidence that corroborates a solar system younger than billions of years will be promptly discarded, and any scientist who acknowledges this evidence runs the risk of being delegitimized by the majority of scientists in his field.
The problem with uniformitarianism is that we have historically witnessed, over and over again, catastrophic geologic processes shaping the earth- not primarily slow, constant, uniform processes across the board. We have multiple examples in geology today bearing witness to the fact that we cannot make across-the-board assumptions regarding geologic processes. Dr. Miller cites this example: “Consider, as one example of the effect of catastrophic events on geologic phenomena, recent scientific discoveries considering rapid petrification. For years it had been assumed that the process of petrification is a uniformitarian process that takes millions of years to complete. However, in 2004, five Japanese scientists published research in the journal Sedimentary Geology which casts doubt on that assumption. The team studied mineral rich, acidic water from the explosion crater of the Tateyama volcano in central Japan- water which runs over the edge of the volcano as a waterfall. Wood had fallen in the path of the water. The surprising discovery was that the wood had become petrified with silica after only 36 years as the water flowed over the wood.”
I’m not saying that all scientists subscribe 100% to complete uniformitarianism. Obviously, scientists acknowledge that some geologic features are attributable to catastrophism. However, most secular scientists attribute a disproportionately large amount of the geologic shaping of the earth to uniformitarian forces, while relegating catastrophic forces to the fringe. Conversely, creation scientists draw different conclusions from the very same evidence, and are able to provide equally compelling- yet not equally reported- evidence.
What about when you put radiometric dating to the test? After all, we have rock that we do know the age of- rock whose formation we witnessed. How do these dating methods hold up when checking their accuracy against rocks of known dates? Well…
Of course, scientists attribute these dating inconsistencies to various things. One of the more common arguments I have read “debunking” the erroneous dates involves the presence of excess argon, which causes errors in dating young rocks in particular (these errors would supposedly resolve themselves and end up being inconsequential- not drastically affecting approximate ages- as the age of the rocks being tested increases). But if we can’t accurately date these young rocks, how can we ever have confidence in the ages that we come up with in the millions and billions? And if the issue is that these rocks were not formed in a closed system, but were actually contaminated at formation by external argon, does that not call into question the likelihood that other rocks, supposedly formed millions or billions of years ago, were formed in a closed system? How would you ever know? Is it not logical to allow that these older rocks may also have been subject to contamination from outside processes, as we have evidenced in the formation of our younger rocks?
The Grand Canyon is another perfect example of the inconsistencies in our dating methods. The layers of rock that make up the Grand Canyon are sedimentary. Sedimentary rocks cannot be dated by radiometric dating. So the dates that we get from the Grand Canyon come from the relative dating process I mentioned above, which takes into consideration the geologic layers (per Lyell) and index fossils. Now experts tout the Grand Canyon as an excellent example of the time scale, but if they consider this example excellent, I’d hate to see a poor one. Yes, it does have Pre-Cambrian rocks on the bottom and Permian rocks on the top, but as Kimball points out in his article, “…there are some layers (as much as 20 million years worth) missing from the middle, with no evidence to explain where they went. Why are the youngest rocks from the Permian period? What happened to the quarter of a billion years worth of rocks that supposedly should have been laid down on top of the Permian? As for those older rocks on top of the younger ones, if they cannot be ignored it is explained that some mighty geologic force flipped them over. If that is so, where is the gravel or breccia that is normally produced when two large rocks scrape against one another? And wouldn’t allowing the moving of those rocks by a titanic force be an admission that the theory of catastrophism might be valid after all?”
Notice all the areas of “disconformity”? This means they don’t conform to the geologic scale for one reason or another. Most explanations of these areas of disconformity are attributed to catastrophic events. But don’t let your textbooks fool you- geologists don’t and can’t explain the causes of all of these disconformities. (Of course some of the areas can be explained, but nowhere near all.) The point is, geologic processes are highly complicated, and science by no means “has it all figured out.”
Another example of disconformity that is not easily explained…
What about when we radiometrically date the layers of basalt and igneous rock below and above the Grand Canyon, respectively? QCCSA.org notes this inconsistency, “The Cardenas Basalt bottom layer (below the Cambrian explosion) is usually dated with Rubidium-Strontium and is calculated to be about 1 billion years old. Much after the Grand Canyon was already formed, igneous rocks were formed from a volcano on top of the canyon, that Indians saw erupt, only about 1,000 years ago. (The volcano lava flows have Indian artifacts in them, and go over the canyon walls.) These rocks were dated using the same method in the lab and were assigned an age of 1.3 billion years old. How can the very top, volcanic rock be older than the very bottom layer basalt rock? Even evolutionists admit that those Indian artifacts are not 1.3 billion years old!”
Obviously, this has been a simplified and nowhere near exhaustive discussion of just a few of the problematic issues with scientific dating. I didn’t elaborate on the different methods used for different types of rocks. I didn’t even begin to discuss radiocarbon dating, which deserves an article all to itself. My intent was not to be exhaustive, but rather to draw your attention to the fact that these dating methods- whose dates we are in turn taught as indisputable fact- are actually the subject of quite a lot of well-deserved dispute.
The field of creation science, and creation scientists themselves, are far from “science-denying.” Instead, they seek scientific explanations that more accurately describe the evidence we see around us today. No matter which side of the aisle you fall on, theories should always be driven by the evidence. There is no truth to be found in manipulating evidence to support theory.