
# caries

[kair-eez, -ee-eez]
or tooth decay

Localized disease that causes decay and cavities in teeth. It begins at the tooth's surface and may penetrate the dentin and the pulp cavity. Microorganisms in the mouth are believed to consume sugars and produce acids that eat away at tooth enamel. The dentin's protein structure is then destroyed by enzymes. Diet, general health, structural tooth defects, and heredity affect the risk of having caries. Prevention involves avoiding excessive sweets, brushing and flossing the teeth, and having regular dental care. Treatment includes restoration of teeth with cavities. Fluoridation of water can reduce the occurrence of caries by as much as 65%.

Property exhibited by certain types of matter of emitting radiation spontaneously. The phenomenon was first reported in 1896 by Henri Becquerel for a uranium salt, and it was soon found that all uranium compounds are radioactive, the activity arising from the uranium itself. In 1898 Marie Curie and her husband discovered two other naturally occurring, strongly radioactive elements, radium and polonium. The radiation is emitted by unstable atomic nuclei (see nucleus) as they attempt to become more stable. The main processes of radioactivity are alpha decay, beta decay, and gamma decay. In 1934 it was discovered that radioactivity could be induced in ordinary matter by artificial transmutation.

Type of radioactivity in the most common form of which an unstable atomic nucleus dissipates energy by gamma emission, producing gamma rays. Gamma decay also includes two other processes, internal conversion and internal pair production. In internal conversion, excess energy in a nucleus is transferred to one of its own orbiting electrons and the electron is ejected from the atom. In internal pair production, excess energy is converted into an electron and a positron, which are emitted together. Typical half-lives (see half-life) for gamma emission range from about $10^{-9}$ to $10^{-14}$ second.

Any of three processes of radioactive disintegration in which a beta particle is spontaneously emitted by an unstable atomic nucleus in order to dissipate excess energy. Beta particles are either electrons or positrons. The three beta-decay processes are electron emission, positron emission, and electron capture. The process of beta decay increases or decreases the positive charge of the original nucleus by one unit without changing the mass number. Though beta decay is in general a slower process than gamma or alpha decay, beta particles can penetrate hundreds of times farther than alpha particles. Beta decay half-lives are a few milliseconds or more. See also radioactivity.

Type of radioactive disintegration (see radioactivity) in which some unstable atomic nuclei dissipate excess energy by spontaneously ejecting an alpha particle. Alpha particles have two positive charges and a mass of four atomic mass units; they are identical to helium nuclei. Though they are emitted at speeds about one-tenth that of light, they are not very penetrating and have ranges in air of about 1–4 in. (2.5–10 cm). Alpha decay commonly occurs in elements with atomic numbers greater than 83 (bismuth), but can occur in some rare-earth elements in the atomic-number range of 60 (neodymium) to 71 (lutetium). Alpha decay half-lives range from about a microsecond ($10^{-6}$ second) to billions of years ($10^{17}$ seconds).
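The half-life figures quoted for gamma, beta, and alpha decay all refer to the same exponential decay law, $N(t)/N_0 = 2^{-t/T_{1/2}}$. A minimal sketch (the function name is illustrative, not from the source):

```python
def remaining_fraction(t, half_life):
    """Fraction of unstable nuclei remaining after time t.

    Standard exponential decay law: N(t)/N0 = 2^(-t / half_life).
    Both arguments must be in the same time unit.
    """
    return 2.0 ** (-t / half_life)

# After one half-life, half the nuclei remain; after two, a quarter.
print(remaining_fraction(1.0, 1.0))  # 0.5
print(remaining_fraction(2.0, 1.0))  # 0.25
```

The same formula applies whether the half-life is a microsecond or billions of years; only the `half_life` argument changes.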

c-decay refers to a Young Earth Creationist proposal that the speed of light was over a million times faster within the last 10,000 years, immediately following the creation of the universe, and has been slowing down since then in a logarithmic decay. By carefully selecting the decay rate, one can construct a universe that is billions of light years across, yet objects at these long distances are still visible even though the universe would be only a few thousand years old. It is an alternative to the Omphalos argument, which argues that the universe was created with the light already "in-flight", to the same end.
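The arithmetic behind "carefully selecting the decay rate" can be sketched by integrating an assumed decaying $c(t)$ over a young-universe timescale. The parameters below (initial boost, time constant, age) are hypothetical illustrations chosen for this sketch, not Setterfield's actual figures:

```python
import math

C_NOW = 1.0      # today's speed of light, in light-years per year
BOOST = 1.0e6    # assumed initial multiple of today's speed
TAU = 5000.0     # assumed exponential decay time constant, in years
AGE = 6000.0     # assumed age of the universe, in years

def distance_travelled(t):
    """Light-travel distance (light-years) after t years, assuming
    c(t) = C_NOW * (1 + BOOST * exp(-t / TAU)).

    This is the closed-form integral of c(t) from 0 to t.
    """
    return C_NOW * (t + BOOST * TAU * (1.0 - math.exp(-t / TAU)))

# With these assumed parameters, light covers billions of light-years
# in only a few thousand years.
print(f"{distance_travelled(AGE):.3e} light-years")
```

Tuning `BOOST` and `TAU` lets one reach essentially any desired distance, which is precisely the flexibility (and the ad-hoc character) of the proposal.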

There is no support for c-decay in the mainstream scientific community and, in fact, little support for it in the creationist community, including the Institute for Creation Research (ICR). Answers in Genesis (AiG), a leading creationist organization, says that this proposal has a number of problems that have not been satisfactorily answered. AiG currently prefers Dr. Russell Humphreys’ explanation for distant starlight.

## History

The concept of c-decay was first proposed by Barry Setterfield in 1981 in an article for the Australian creationist magazine Ex Nihilo. He selected a number of historical measurements of $c$, starting with the original measurement by Ole Rømer in 1676 and proceeding through a series of more recent experiments, culminating in "modern" measures in the 1960s. These showed a decreasing speed over time, which Setterfield claimed was in fact an exponential decay series implying an infinite speed in the not-too-distant past. The claim was later expanded to cover an apparent similar decay of several other physical constants. Similar charts have since been displayed in a number of creationist works.

Setterfield argues that this resolves the so-called "starlight problem". As Setterfield's original suggestion in Ex Nihilo notes, "If you propose that the universe and all in it is the product of an act of creation only 6-7000 years ago, many people ask - 'How is it that objects millions of light years away can be seen? Surely such light would take millions of years to reach us.'" If $c$ is a constant, as is widely accepted, then this implies the universe is billions of years old, because we can see objects billions of light years away. However, if the speed was significantly faster in the past, as Setterfield argues, then the light would have traveled most of this distance in a short time. Setterfield proposes this as an alternative to mainstream physical cosmology and, as such, c-decay represents a unique creationist cosmology.

## Criticism

There are any number of problems with this claim, from the obvious to the subtle. One of the more obvious ones simply invokes Einstein's famous formula, $E=mc^2$: if Setterfield is correct and the value of $c$ was much larger in the past, the energy released in nuclear reactions and radioactive decay would have been enormously higher during this early epoch. When confronted with this argument, Setterfield claimed that the value of the Planck constant was increasing to offset this effect; this would have equally noticeable effects on the universe, which are likewise unseen. In a more general sense, $c$ is so ingrained in existing physics models that any macroscopic change would likely mean either that the universe could not exist in its current form at all, or that our currently accepted models are incorrect. No convincing argument covering these issues has been proposed.
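The scale of the $E=mc^2$ objection is easy to quantify: because the rest energy scales with $c^2$, a million-fold increase in $c$ multiplies every rest energy by a factor of $10^{12}$. A minimal numeric check:

```python
C_NOW = 2.99792458e8  # speed of light today, m/s

def rest_energy(mass_kg, c=C_NOW):
    """E = m * c**2: rest energy in joules."""
    return mass_kg * c * c

# Ratio of rest energies if c were 1e6 times its present value.
ratio = rest_energy(1.0, c=1.0e6 * C_NOW) / rest_energy(1.0)
print(ratio)  # ~1e12
```

A trillion-fold change in all mass-energy relations is the kind of "macroscopic change" the paragraph above refers to, and it is not observed.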

Further, all modern measurements agree on a value that precludes the decay. In Setterfield's report, he introduced a "cutoff date beyond which there is a zero rate of change", apparently to address this issue, making the theory unfalsifiable by new observations of $c$. He further claims that the speed was also fixed for some time in this early epoch, apparently to avoid an infinite speed, but offers no strong argument for why this would be so. So the claim is that the speed was fixed at the beginning of time, is fixed again today, but was decreasing measurably in an arbitrarily selected period between the two.

Just as worrying at a fundamental level is the apparent "cherry picking" of the data to fit the original curve. Many experiments measuring the speed of light, some of them famous, were left out of his analysis; when these are included the graph becomes much flatter. Even among the quoted experiments, Setterfield omitted three measurements when attempting to illustrate the statistical accuracy of his claim, and when these three points are added back into the set, the decay disappears. More recent versions of Setterfield's paper include these figures, using adjusted mathematics to rebuild the curve. These mathematics have been the object of ridicule, though such ridicule has often relied on out-of-date materials, and Setterfield has made efforts to steer his critics toward more recent research that, he maintains, supports his theories.

Moreover, Setterfield's argument depends most heavily on Rømer's original measurement, which is the outlier that defines the curve. His figure was copied from an issue of Sky and Telescope, which he said gave the speed of light as "301,300 plus or minus 200 km/sec", about 0.5% above the current value. The article was actually an excerpt from The Astronomical Journal, which disagrees completely, stating quite clearly that "The best fit occurs at zero where the light travel time is identical to the currently accepted value." In other words, Setterfield's own set of experiments directly contradicts his claims.
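The "about 0.5%" figure can be checked directly against the modern defined value of $c$:

```python
C_MODERN = 299_792.458     # km/s, the modern defined value of c
ROEMER_QUOTED = 301_300.0  # km/s, the figure Setterfield attributed to the article

# Percentage by which the quoted Rømer-derived figure exceeds the modern value.
deviation = (ROEMER_QUOTED - C_MODERN) / C_MODERN * 100.0
print(f"{deviation:.2f}% above the modern value")  # 0.50% above the modern value
```

Even taken at face value, a half-percent excess in a single 17th-century-derived data point is the entire basis of the claimed decay curve.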