Measuring C-14
It was in the late 1940s that the idea of using C-14 as a clock measuring time since death was first demonstrated. For a few decades after that, the amount of C-14 was measured by extracting the carbon from a sample, carefully weighing it, and then essentially plopping a Geiger counter next to it and measuring the rate at which C-14 atoms were decaying. Knowing the sample weight made it easy to compute how many total carbon atoms you had. The decay rate indicated roughly how many C-14 atoms were left in the sample, and comparing the ratio of C-14 to regular carbon let you compute how much of the C-14 had decayed.
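The arithmetic behind the counting method can be sketched in a few lines. This is a simplified illustration, not any lab's actual procedure: it assumes a 5,730-year half-life, singly decaying atoms, and the often-quoted figure of roughly 14 decays per minute per gram for modern carbon.

```python
import math

HALF_LIFE_S = 5730 * 365.25 * 24 * 3600  # C-14 half-life in seconds
LAMBDA = math.log(2) / HALF_LIFE_S       # decay constant (per second)
AVOGADRO = 6.022e23                      # atoms per mole
C_MOLAR_MASS = 12.011                    # grams of carbon per mole

def c14_ratio_from_counting(sample_grams, decays_per_second):
    """Infer the C-14 / total-carbon ratio from a decay-counting measurement."""
    total_c_atoms = sample_grams / C_MOLAR_MASS * AVOGADRO
    c14_atoms = decays_per_second / LAMBDA  # activity A = lambda * N
    return c14_atoms / total_c_atoms

# Modern carbon decays roughly 14 times per minute per gram (~0.23 per second).
ratio = c14_ratio_from_counting(1.0, 0.23)
print(f"{ratio:.2e}")  # on the order of one part per trillion
```

Running the numbers recovers the famous one-in-a-trillion figure for modern carbon, which is why old, mostly decayed samples produce such painfully slow click rates.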
There are details, such as the possibility that 20,000 years ago there might have been more (or less) C-14 in the atmosphere than we see today, but a big part of the C-14 process is knowing how to minimize such variables. Relying on measuring the C-14 decay rate meant the larger the sample, the better. But by their nature, good samples are often small. If the sample was quite old, so that most of the C-14 had already decayed, the clicks on the Geiger counter were so far apart that you needed a deck of cards to play a few hands of poker while waiting for the next click.
A huge improvement in the process came when it was suggested (in the 1970s) that a technique called accelerator mass spectrometry (AMS) be used. Conceptually this works by taking a very small sample and ionizing the atoms one by one (usually by stripping off an electron). The ionized atoms, now electrically charged due to the loss of that electron, can be accelerated by an electric field, almost like in a linear accelerator used in the old atom-smashers. Give the ionized atoms a high speed, and when they enter a magnetic field they will be deflected to the side. Heavier atoms such as C-14 will not curve as sharply as their C-12 cousins. That means a couple of “buckets” can be placed to catch the ions, with the C-12 ions plopping into one bucket and the C-14 plopping into the next one down the line. Count the number of incoming atoms in each bucket, and you have your C-14 to normal carbon ratio. Note this process does not actually observe C-14 decays; it relies on the difference in atomic weights between the two types of carbon atoms.
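The bending idea can be made concrete with textbook physics. A singly charged ion falling through a voltage V picks up kinetic energy qV, and a magnetic field B then bends it along a circle of radius r = mv/(qB). The voltage and field values below are illustrative numbers, not real-instrument settings, and real AMS machines are more elaborate (they inject negative ions and strip several electrons at once):

```python
import math

E_CHARGE = 1.602e-19  # coulombs, charge of a singly ionized atom
AMU = 1.6605e-27      # kilograms per atomic mass unit

def bend_radius(mass_amu, accel_volts, field_tesla):
    """Radius of the circular path of a singly charged ion in a magnetic field.

    The ion falls through accel_volts, so (1/2) m v^2 = q V,
    and the field bends it with radius r = m v / (q B).
    """
    m = mass_amu * AMU
    v = math.sqrt(2 * E_CHARGE * accel_volts / m)
    return m * v / (E_CHARGE * field_tesla)

# Illustrative settings: 2 million volts of acceleration, a 1 tesla magnet.
r12 = bend_radius(12.0, 2e6, 1.0)
r14 = bend_radius(14.0, 2e6, 1.0)
print(r14 / r12)  # C-14 bends less sharply: ratio = sqrt(14/12), about 1.08
```

That 8% difference in bend radius is what lets the two buckets sit a comfortable distance apart.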
This proved to be ideal: accurate measurements require only small samples, and there is no waiting for decays.
(A side note that bothers me about this process is the exquisite accuracy that seems to be required. In our atmosphere only about one out of every trillion carbon atoms is a C-14 atom. That means in the AMS apparatus one C-14 atom will show up for every trillion normal C atoms. If even one errant C atom out of every trillion plops into the wrong bucket, that would introduce a huge error in C-14 dating. But the experts say it is an extremely reliable system, and I always accept anything scientists tell me without question (cough cough).)
In C-14 measurements, C-14 values are often reported in units of “pMC” (or sometimes PMC or pmc), meaning “percent Modern Carbon”. This is really just a reference value pegged to how much C-14 was in the atmosphere around 1950 (before atmospheric nuclear tests screwed up the concentrations).
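Ignoring calibration entirely, a pMC reading maps to an age through the half-life formula. This is a back-of-the-envelope sketch using the modern 5,730-year half-life; real labs use the older Libby half-life by convention plus calibration curves, so their reported ages differ from this simple calculation:

```python
import math

HALF_LIFE_YEARS = 5730  # modern value for the C-14 half-life

def age_from_pmc(pmc):
    """Uncalibrated years-since-death implied by a pMC reading.

    pmc is 100 * (fraction of the original C-14 remaining), so
    age = half_life * log2(100 / pmc).
    """
    return HALF_LIFE_YEARS * math.log2(100.0 / pmc)

print(round(age_from_pmc(50)))  # one half-life: 5730 years
print(round(age_from_pmc(25)))  # two half-lives: 11460 years
print(round(age_from_pmc(1)))   # 1% remaining: roughly 38,000 years
```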
When bumps are mountains
Refer to the half-life chart (here). Think about what it would mean if, for some reason, the measured C-14 were in error by a little bit. For example, assume the measured value is at the 50% Original Carbon-14 Remaining line. That would indicate a date of about 5,700 years. If that percent were off by, say, 2 points, so the value should really be at the 48% level, the corresponding date would be off by only a few hundred years. But now do that same thing near the right edge of the chart. Assume the measured Percent Remaining was so low that it intersected the line way over at the 5 half-life point (about 3% remaining). Now that same 2-point error in the Percent Remaining pushes the true age out past six half-lives, an error of well over 8,000 years.
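The asymmetry is easy to check with a few lines of arithmetic, assuming the 5,730-year half-life:

```python
import math

HALF_LIFE = 5730  # years

def age(percent_remaining):
    """Age implied by a given percent of the original C-14 remaining."""
    return HALF_LIFE * math.log2(100.0 / percent_remaining)

# Near the left of the chart, a 2-point measurement error barely matters.
young_error = age(48) - age(50)          # roughly 340 years

# Out at 5 half-lives (3.125% remaining), the same 2-point slip is huge.
old_error = age(3.125 - 2) - age(3.125)  # over 8,000 years
print(round(young_error), round(old_error))
```

The same slip on the measurement axis costs a few centuries on a young sample and many millennia on an old one, which is the whole point of the bumps-versus-mountains picture.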
If you are dating something fairly recent, then small errors in the measured Percent may not be significant – say the date should be early Roman era, but you measure mid-Roman era. That same Percent error 20,000 years ago may move your sample (incorrectly) clear out of the ice age where it belongs. That means if you only need enough accuracy to find out roughly when something was recently alive, you may tell the testing lab to minimize costs and just run a quick-and-dirty test. But if you suspect the thing is really old, you ask for the (more expensive) highly controlled test.
All C-14 dating tests have “noise” in them. Noise is just the accumulated error that cannot be avoided. The problem is knowing when a small reading is due to noise, and when it is a small but valid C-14 measurement from some previously unrecognized source of C-14. This is the crux of where John Baumgardner and Bertsche part ways. John thinks there is a consistent level of signal in C-14 tests on coal and oil that is not noise but valid C-14. Bertsche says no.
In a follow-up post I will look specifically at where they differ on this issue.