Since the 1950s, geologists have used radioactive elements as natural "clocks" for determining the numerical ages of certain types of rocks; these ages record the moment a rock forms. Here "forms" means the moment an igneous rock solidifies from magma, a sedimentary layer is deposited, or a rock heated by metamorphism cools off.
It is this resetting process that allows us to date rocks that formed at different times in Earth history.
Research has been ongoing since the 1960s to determine what the proportion of carbon-14 in the atmosphere has been over the past fifty thousand years.
The resulting data, in the form of a calibration curve, is now used to convert a given measurement of radiocarbon in a sample into an estimate of the sample's calendar age.
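In practice, a calibration curve is a table of paired ages, and converting a measurement means looking up where it falls on that curve. The sketch below illustrates the idea with simple linear interpolation over a handful of made-up calibration points; real work uses the published IntCal curves, and the numbers here are illustrative only, not actual calibration data.

```python
import bisect

# Hypothetical calibration points: (radiocarbon years BP, calendar years BP).
# These values are invented for illustration; real analyses use IntCal data.
CALIBRATION = [(0, 0), (1000, 930), (2000, 1950), (3000, 3200), (4000, 4480)]

def calibrate(radiocarbon_age):
    """Convert a radiocarbon age to a calendar age by linear interpolation."""
    xs = [p[0] for p in CALIBRATION]
    i = bisect.bisect_left(xs, radiocarbon_age)
    if i == 0:
        return float(CALIBRATION[0][1])
    if i == len(xs):
        return float(CALIBRATION[-1][1])
    (x0, y0), (x1, y1) = CALIBRATION[i - 1], CALIBRATION[i]
    t = (radiocarbon_age - x0) / (x1 - x0)
    return y0 + t * (y1 - y0)

print(calibrate(1500))  # halfway between 930 and 1950 -> 1440.0
```

A real calibration also propagates the measurement's uncertainty through the curve, which is why calibrated ages are usually reported as ranges rather than single numbers.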
Radioactive decay occurs at a constant rate, specific to each radioactive isotope.
Radiocarbon dating determines the age of ancient objects by measuring the amount of carbon-14 remaining in a sample. Willard F. Libby pioneered the method at the University of Chicago in the late 1940s, and it is now the most widely used method of age estimation in archaeology.
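Inverting the decay law turns a measured carbon-14 fraction into an age. The sketch below uses the modern half-life value of about 5,730 years; note that laboratories conventionally report "radiocarbon ages" computed with Libby's original half-life of 5,568 years, so this is the underlying idea rather than the reporting convention.

```python
import math

HALF_LIFE_C14 = 5730.0  # modern half-life estimate, in years

def radiocarbon_age(fraction_remaining):
    """Age implied by the fraction of the original carbon-14 still present,
    from inverting N/N0 = (1/2)^(t / half-life)."""
    return -HALF_LIFE_C14 * math.log(fraction_remaining) / math.log(2)

print(round(radiocarbon_age(0.5)))   # one half-life gone  -> 5730
print(round(radiocarbon_age(0.25)))  # two half-lives gone -> 11460
```

Because so little carbon-14 survives after many half-lives, the method is practical only back to roughly fifty thousand years, consistent with the calibration range mentioned above.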
The igneous activity that produced such intrusions...
...calculation was based on the assumption that the substance of the Earth is inert and thus incapable of producing new heat.