Last week I read this in an Israeli newspaper: “A massive tower that defended [Jerusalem’s] main water source… was thought to have been built in the Middle Bronze Age, nearly 4,000 years ago. [Carbon dating results] have shown the structure likely dates back only to the ninth century B.C.E.”
Like it or not, you need to understand radiocarbon dating. Nearly every field of science relies on it. Archaeologists in particular count on carbon dating to help them determine the age of many of the artifacts they dig up.
What is it? How does it work?
At this moment, powerful cosmic particles from somewhere out in the Milky Way are striking Earth’s upper atmosphere. The collisions knock loose neutrons, which strike nitrogen atoms and convert them into unstable Carbon-14 atoms – unstable in the sense that the C-14 atoms slowly decay back to nitrogen.
The C-14 atoms, along with the far more plentiful stable C-12 atoms, combine with oxygen to form carbon dioxide. The ratio between the two types of carbon dioxide is currently one trillion to one: a trillion molecules built on C-12 for every one built on C-14.
Both types of carbon dioxide are breathed in by living plants. Animals, of course, breathe in oxygen and breathe out carbon dioxide; however, animals do collect C-12 and C-14 from the plants they eat. So animals and humans, like plants, are assumed to have the same trillion-to-one ratio of C-12 to C-14. Obviously, rocks cannot be measured by a carbon clock, since they don't breathe.
(The archaeologists in Jerusalem must have dated some piece of wood they found, not the stones the tower is built from. It was the piece of wood – or, technically, the death of the tree the wood came from – that dated to David’s day, not the stone tower.)
When a plant or animal dies, it stops taking in carbon dioxide. It is assumed that the C-12 remains stable for the rest of eternity, while the no-longer-breathing plant's C-14, no longer replenished, continues to decay away.
It is assumed that all plants take in C-12 and C-14 in the same amounts and with the same ratio. It is further assumed that the ratio has remained constant over time; that is, that a plant living, say, 6,000 years ago took up one C-14 atom for every trillion C-12 atoms, just as plants do today.
The rate of decay – that is, the rate at which C-14 leaches away – is currently measured at one half gone every 5,700 years. And it is assumed that it has always been the same. So if you were to analyze a sample of 100 trillion carbon atoms from a modern plant, 100 of them would be C-14 atoms. If you looked at 100 trillion atoms of a 5,700-year-old plant you would count only fifty C-14 atoms – half the original amount. 100 trillion atoms from a plant 11,400 years old would hold only twenty-five C-14 atoms, and so on.
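That arithmetic can be sketched in a few lines of Python. This is only an illustration of the decay curve described above – the function names are my own, and the round 5,700-year half-life is the article's figure:

```python
import math

HALF_LIFE_YEARS = 5_700  # the article's round figure for the C-14 half-life

def c14_atoms_remaining(initial_atoms: float, age_years: float) -> float:
    """Count of C-14 atoms left after age_years of radioactive decay."""
    return initial_atoms * 0.5 ** (age_years / HALF_LIFE_YEARS)

def age_from_c14_count(initial_atoms: float, remaining_atoms: float) -> float:
    """Invert the decay curve: years elapsed, given the measured count."""
    return HALF_LIFE_YEARS * math.log2(initial_atoms / remaining_atoms)

# A modern 100-trillion-atom sample holds about 100 C-14 atoms (1 per trillion).
print(c14_atoms_remaining(100, 5_700))   # 50.0
print(c14_atoms_remaining(100, 11_400))  # 25.0
print(age_from_c14_count(100, 25))       # 11400.0
```

The second function is just the first one run backward, which is all a radiocarbon date is: count the surviving atoms, assume the starting ratio, and solve for time.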
Simple, right? The lower the amount of C-14, the
older the sample. Given all that, any sample more than a few thousand years old
will have a microscopically small amount of C-14... one might even say an "immeasurably small amount." And, of course, when you are measuring
things in atoms, the samples (and the machinery) are subject to contamination.
According to radiocarbon.com,
scientists use oxalic acid made from sugar beets known to have been harvested
in 1955 to calibrate their tests. They also use wood from a tree known to have
been cut in 1890, “unaffected by fossil fuel effects.”
Wait: What?
Yup. Turns out that changes to the atmosphere mess with the carbon levels. Scientists assume that, prior to the fossil-fuel age, there were no significant changes to the atmosphere. That explains the reference to the ‘1955 sugar beet oxalic acid,’ as well: atmospheric testing of hydrogen bombs in the 1950s significantly altered the level of C-14 in the atmosphere.
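For what it's worth, here is a rough sketch of how a lab turns a measurement against that 1950-era standard into a reported age. The function name is mine; the 8,033-year constant is the conventional 5,568-year "Libby half-life" divided by the natural log of 2, the fixed convention labs use when reporting ages "BP" (before 1950):

```python
import math

LIBBY_MEAN_LIFE = 8_033  # years: the 5,568-year Libby half-life / ln 2

def conventional_age_bp(sample_over_standard_activity: float) -> float:
    """Conventional radiocarbon age in years BP ("before present," i.e.
    before 1950), from the ratio of the sample's C-14 activity to the
    activity of the 1950-referenced oxalic-acid standard."""
    return -LIBBY_MEAN_LIFE * math.log(sample_over_standard_activity)

# A sample with half the standard's activity reports as roughly 5,568 BP.
print(round(conventional_age_bp(0.5)))  # 5568
```

Note that this conventional age deliberately ignores all the atmospheric variation discussed below; labs report it anyway, by convention, and leave calibration as a separate step.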
Notice all the assumptions involved:
For the carbon clock to be reliable, the amount of mysterious cosmic rays striking the atmosphere from their unknown source in space would have to remain constant over tens of thousands of years. Has it? Who knows? The scientists admit they know little about them.
In addition, cosmic rays are greatly affected by
magnetic fields – both that of the earth and that of the sun. The magnetic
fields, in fact, are the reason scientists can only say the cosmic rays come from
'somewhere in the galaxy' – because
each magnetic field they pass on their way to Earth changes their direction and their
intensity.
Earth's magnetic field fluctuates dramatically. The
sun’s does as well.
Let’s take it a step further. For much of Earth's geologic history a dense shroud of water, dust or other debris covered the planet. Would this have affected cosmic rays
striking the atmosphere? Absolutely. Would that have altered the relative
amounts of nitrogen, oxygen, and carbon dioxide in the atmosphere? Again, yes.
Atmospheric oxygen is believed to have been as low as
15%, and as high as 35%, at various points in geologic history. The nitrogen
level went down when oxygen went up, and vice versa. Carbon dioxide levels changed
nearly every time a volcano erupted. When humans began cooking and heating with fire (6,000 years
ago according to the Bible or 350,000 years ago according to science) carbon dioxide climbed. When we began using fossil fuels, CO2 really jumped. And atmospheric
testing of nuclear weapons in the 1950s hugely affected levels of C-14 in
the atmosphere.
So how can anyone say that the ratio of
C-14 to ordinary carbon in a plant living today is the same as the ratio in a
plant that lived thousands of years ago?
The scientists themselves, who lean so heavily on the
radiocarbon clock, need to read the work of other scientists:
- “A large and sudden increase in radiocarbon around AD 773 is documented in coral skeletons from the South China Sea…forming a spike of 45% in late spring, followed by two smaller spikes. The carbon anomalies coincide with an historic comet collision with the Earth's atmosphere on 17 January AD 773.” – Nature.com
- “We find [in annual rings of Japanese cedar trees] a rapid increase of about 12% in the C-14 content from a.d. 774 to 775, which is about 20 times larger than the change attributed to ordinary solar modulation.” – Nature, June 2012
- “The rate of carbon 14 radioactive decay may have been different in the past. The amount of carbon dioxide in the atmosphere may have been different in the past. The assumption of a constant ratio of C-14 to C-12 is invalid; equilibrium would require about 30,000 years, (or 50,000 years according to this mathematician) and the C-14/C-12 ratio appears to be increasing still.” – Tufts.edu
That last part there, about “equilibrium,” is important. If all the assumptions were true, then 30,000 to 50,000 years after the C-14 process began – whenever that was – the atmosphere should have reached equilibrium: a point where C-14 decays away at the same rate at which it is generated. Otherwise, by now we’d be swimming in C-14.
But it is still increasing. Which can ONLY mean:
- The C-14 process – cosmic rays reaching the atmosphere, the atmosphere containing the present-day levels of nitrogen, oxygen, and CO2, the C-14 being absorbed by plants then decaying out, etc. – began less than 30,000 years ago… or
- The theory on which Carbon-14 dating is based is just wrong…
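The equilibrium claim in that Tufts quote can be sketched with the standard build-up formula for an isotope produced at a constant rate while it decays. This is a toy model, using the article's round 5,700-year half-life:

```python
import math

HALF_LIFE_YEARS = 5_700
DECAY_CONSTANT = math.log(2) / HALF_LIFE_YEARS  # per year

def fraction_of_equilibrium(years_since_start: float) -> float:
    """If C-14 production began `years_since_start` ago at a constant
    rate P, the inventory N(t) = (P / lam) * (1 - e**(-lam * t)) has
    climbed to this fraction of its steady-state ceiling P / lam."""
    return 1 - math.exp(-DECAY_CONSTANT * years_since_start)

for t in (5_700, 30_000, 50_000):
    print(t, round(fraction_of_equilibrium(t), 3))  # 0.5, 0.974, 0.998
```

In other words, under the standard assumptions the atmosphere should sit within a few percent of equilibrium after 30,000 years – which is why a still-rising ratio forces the fork in the road described above.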
Bill K. Underwood is the author of several novels and one non-fiction
self-help book, all available
at Amazon.com. You can help support this page by purchasing a book.