
theory, in music, discipline involving the construction of cognitive systems to be used as a tool for comprehending musical compositions. The discipline is subdivided into what can be called speculative and analytic theory. Speculative theory engages in reconciling with music certain philosophical observations of man and nature. It can be prescriptive when it imposes these extramusical contentions to establish an aesthetic norm; music theory tended toward this aspect until the 20th cent. An example is the attempt to assert the superiority of tonal music over other systems by reference to the relationship of the triad to the natural overtone series. Analytic theory, on the other hand, undertakes detailed study of individual pieces. Analyses of compositions of a particular genre are synthesized into a general system, or reference, against which the individuality of these pieces can be perceived. In more general usage the term *theory* also includes the study of acoustics, harmony, and ear training.

In ancient Greece music theory was mainly concerned with describing different scales (modes) and their emotional character. This theory was transmitted, largely erroneously, to medieval Europe by the Roman philosopher Boethius in his *De musica* (6th cent. A.D.). Medieval European theory dealt with notation, modal and rhythmic systems, and the relation of music to Christianity. Gioseffo Zarlino (1517–90) was the first to consider the triad as a compositional reference. In the 18th cent. Jean-Philippe Rameau sought to show how the major-minor system of tonal harmony derives from the inherent acoustical properties of sound itself, and to establish the laws of harmonic progression. The writings of Heinrich Schenker are among the most important in the sphere of tonal theory. Major 20th-century theorists include Paul Hindemith, who propounded the idea of non-triadic pitch centrality, and Milton Babbitt, who published revealing explications of twelve-tone music.

The Columbia Electronic Encyclopedia Copyright © 2004.

Licensed from Columbia University Press

**unified field theory**

Attempt to describe all fundamental interactions between elementary particles in terms of a single theoretical framework (a “theory of everything”) based on quantum field theory. So far, the weak force and the electromagnetic force have been successfully united in electroweak theory, and the strong force is described by a similar quantum field theory called quantum chromodynamics. However, attempts to unite the strong and electroweak theories in a grand unified theory have failed, as have attempts at a self-consistent quantum field theory of gravitation.


Encyclopedia Britannica, 2008. Encyclopedia Britannica Online.

**types, theory of**

In logic, a theory introduced by Bertrand Russell and Alfred North Whitehead in their *Principia Mathematica* (1910–13) to deal with logical paradoxes arising from the unrestricted use of propositional functions as variables. The type of a propositional function is determined by the number and type of its arguments (the distinct variables it contains). By not allowing propositional functions to be applied to arguments of equal or higher type, contradictions within the system are avoided.


**axiology** or **value theory**

Philosophical theory of value. Axiology is the study of value, or goodness, in its widest sense. The distinction is commonly made between intrinsic and extrinsic value—i.e., between that which is valuable for its own sake and that which is valuable only as a means to something else, which itself may be extrinsically or intrinsically valuable. Many different answers have been given to the question “What is intrinsically valuable?” For hedonists, it is pleasure; for pragmatists, it is satisfaction, growth, or adjustment; for Kantians, it is a good will. Pluralists such as G.E. Moore and William David Ross assert that there are any number of intrinsically valuable things. According to subjective theories of value, things are valuable only insofar as they are desired; objective theories hold that there are at least some things that are valuable independently of people's interest in or desire for them. Cognitive theories of value assert that ascriptions of value function logically as statements of fact, whereas noncognitive theories assert that they are merely expressions of feeling (*see* emotivism) or prescriptions or commendations (*see* prescriptivism). According to naturalists, expressions such as “intrinsically good” can be analyzed as referring to natural, or non-ethical, properties, such as being pleasant. Moore famously denied this, holding that “good” refers to a simple (unanalyzable) non-natural property. *See also* fact-value distinction; naturalistic fallacy.

**epistemology**

Study of the origin, nature, and limits of human knowledge. Nearly every great philosopher has contributed to the epistemological literature. Some historically important issues in epistemology are: (1) whether knowledge of any kind is possible, and if so what kind; (2) whether some human knowledge is innate (i.e., present, in some sense, at birth) or whether instead all significant knowledge is acquired through experience (*see* empiricism; rationalism); (3) whether knowledge is inherently a mental state (*see* behaviourism); (4) whether certainty is a form of knowledge; and (5) whether the primary task of epistemology is to provide justifications for broad categories of knowledge claim or merely to describe what kinds of things are known and how that knowledge is acquired. Issues related to (1) arise in the consideration of skepticism, radical versions of which challenge the possibility of knowledge of matters of fact, knowledge of an external world, and knowledge of the existence and natures of other minds.

**switching theory**

Theory of circuits made up of ideal digital devices, including their structure, behaviour, and design. It incorporates Boolean logic (*see* Boolean algebra), a basic component of modern digital switching systems. Switching is essential to telephone, telegraph, data processing, and other technologies in which it is necessary to make rapid decisions about routing information. *See also* queuing theory.
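The Boolean analysis the entry describes can be sketched in a few lines. This is a minimal illustration (the multiplexer and the function names are hypothetical examples, not from the entry): a 2-to-1 multiplexer, a basic routing element, is tabulated over every input combination, which is how switching theory characterizes a circuit's behaviour.

```python
from itertools import product

def truth_table(f, n):
    """Evaluate an n-input Boolean function on every combination of 0/1 inputs."""
    return {bits: f(*bits) for bits in product((0, 1), repeat=n)}

# A 2-to-1 multiplexer: route input a when sel is 0, input b when sel is 1.
mux = lambda sel, a, b: (a & ~sel | b & sel) & 1

table = truth_table(mux, 3)
print(table[(0, 1, 0)])  # sel=0 routes a -> 1
print(table[(1, 1, 0)])  # sel=1 routes b -> 0
```

Exhaustive truth tables like this are feasible only for small circuits, which is one reason switching theory leans on algebraic simplification via Boolean algebra.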

**string theory**

Any of a number of theories in particle physics that treat elementary particles (*see* subatomic particle) as infinitesimal one-dimensional “stringlike” objects rather than dimensionless points in space-time. Different vibrations of the strings correspond to different particles. Introduced in the early 1970s in attempts to describe the strong force, string theories became popular in the 1980s when it was shown that they might provide a fully self-consistent quantum field theory that could describe gravitation as well as the weak, strong, and electromagnetic forces. The development of a unified quantum field theory is a major goal in theoretical particle physics, but inclusion of gravity usually leads to difficult problems with infinite quantities in the calculations. The most self-consistent string theories propose 11 dimensions; 4 correspond to the 3 ordinary spatial dimensions and time, while the rest are curled up and not perceptible.

**steady-state theory**

Concept of an expanding universe whose average density remains constant, matter being continuously created throughout it to form new stars and galaxies at the same rate that old ones recede from sight. A steady-state universe has no beginning or end, and its average density and arrangement of galaxies are the same as seen from every point. Galaxies of all ages are intermingled. The theory was first put forward by William Macmillan (1861–1948) in the 1920s and modified by Fred Hoyle to deal with problems that had arisen in connection with the big-bang model. Much evidence obtained since the 1950s contradicts the steady-state theory and supports the big-bang model.

**decision theory**

In statistics and related subfields of philosophy, the theory and method of formulating and solving general decision problems. Such a problem is specified by a set of possible states of the environment or possible initial conditions; a set of available experiments and a set of possible outcomes for each experiment, giving information about the state of affairs preparatory to making a decision; a set of available acts depending on the experiments made and their consequences; and a set of possible consequences of the acts, in which each possible act assigns to each possible initial state some particular consequence. The problem is dealt with by assessing probabilities of consequences conditional on different choices of experiments and acts and by assigning a utility function to the set of consequences according to some scheme of value or preference of the decision maker. An optimal solution consists of an optimal decision function, which assigns to each possible experiment an optimal act that maximizes the utility, or value, and a choice of an optimal experiment. See also cost-benefit analysis, game theory.
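The ingredients above (states, probabilities, consequences, a utility function) fit in a small sketch. The scenario and numbers here are hypothetical, chosen only to show the expected-utility calculation that picks the optimal act.

```python
# Hypothetical decision problem: two acts, two states of the environment.
probs = {"rain": 0.3, "sun": 0.7}              # assessed state probabilities
utility = {                                     # utility of each (act, state) consequence
    ("take umbrella", "rain"): 5, ("take umbrella", "sun"): 3,
    ("leave it",      "rain"): 0, ("leave it",      "sun"): 6,
}

def expected_utility(act):
    """Probability-weighted utility of an act over all possible states."""
    return sum(p * utility[(act, s)] for s, p in probs.items())

best = max({a for a, _ in utility}, key=expected_utility)
print(best, expected_utility(best))  # "leave it" wins: 4.2 vs 3.6
```

An optimal decision function in the entry's sense would repeat this maximization for each possible experiment's outcome.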

**set theory**

Branch of mathematics that deals with the properties of sets. It is most valuable as applied to other areas of mathematics, which borrow from and adapt its terminology and concepts. These include the operations of union (∪), and intersection (∩). The union of two sets is a set containing all the elements of both sets, each listed once. The intersection is the set of all elements common to both original sets. Set theory is useful in analyzing difficult concepts in mathematics and logic. It was placed on a firm theoretical footing by Georg Cantor, who discovered the value of clearly formulated sets in the analysis of problems in symbolic logic and number theory.
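The two operations named in the entry map directly onto built-in set operations in most languages; a minimal Python sketch:

```python
A = {1, 2, 3, 4}
B = {3, 4, 5}

print(A | B)  # union: every element of either set, listed once -> {1, 2, 3, 4, 5}
print(A & B)  # intersection: elements common to both -> {3, 4}
```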

**queuing theory**

Study of the behaviour of queues (waiting lines) and their elements. Queuing theory is a tool for studying several performance parameters of computer systems and is particularly useful in locating the reasons for “bottlenecks,” compromised computer performance caused by too much data waiting to be acted on at a particular phase. Queue size and waiting time can be looked at, or items within queues can be studied and manipulated according to factors such as priority, size, or time of arrival.
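The performance parameters mentioned above have closed forms for the simplest standard model, the single-server M/M/1 queue (random arrivals, random service times); the entry does not name this model, so it is offered here only as a conventional illustration. The arrival and service rates are made-up numbers.

```python
def mm1_stats(arrival_rate, service_rate):
    """Steady-state measures for a single-server M/M/1 queue."""
    rho = arrival_rate / service_rate              # server utilization; must be < 1
    n_in_system = rho / (1 - rho)                  # mean number of items in the system
    time_in_system = n_in_system / arrival_rate    # mean time in system (Little's law)
    return rho, n_in_system, time_in_system

rho, n, w = mm1_stats(arrival_rate=8.0, service_rate=10.0)
print(rho, n, w)  # utilization 0.8, about 4 items in system, about 0.5 time units each
```

Note how sharply queues grow near saturation: raising the arrival rate from 8 to 9 (utilization 0.9) doubles the mean queue, the "bottleneck" behaviour the entry describes.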

**quantum mechanics**

Branch of mathematical physics that deals with atomic and subatomic systems. It is concerned with phenomena that are so small-scale that they cannot be described in classical terms, and it is formulated entirely in terms of statistical probabilities. Considered one of the great ideas of the 20th century, quantum mechanics was developed mainly by Niels Bohr, Erwin Schrödinger, Werner Heisenberg, and Max Born and led to a drastic reappraisal of the concept of objective reality. It explained the structure of atoms, atomic nuclei (*see* nucleus), and molecules; the behaviour of subatomic particles; the nature of chemical bonds (*see* bonding); the properties of crystalline solids (*see* crystal); nuclear energy; and the forces that stabilize collapsed stars. It also led directly to the development of the laser, the electron microscope, and the transistor.

**quantum field theory**

Theory that brings quantum mechanics and special relativity together to account for subatomic phenomena. In particular, the interactions of subatomic particles are described in terms of their interactions with fields, such as the electromagnetic field. However, the fields are quantized and represented by particles, such as photons for the electromagnetic field. Quantum electrodynamics is the quantum field theory that describes the interaction of electrically charged particles via electromagnetic fields. Quantum chromodynamics describes the action of the strong force. The electroweak theory, a unified theory of electromagnetic and weak forces, has considerable experimental support, and can likely be extended to include the strong force. Theories that include the gravitational force (*see* gravitation) are more speculative. *See also* grand unified theory, unified field theory.

**quantum electrodynamics (QED)**

Quantum theory of the interactions of charged particles with the electromagnetic field. It describes the interactions of light with matter as well as those of charged particles with each other. Its foundations were laid by P. A. M. Dirac when he discovered an equation describing the motion and spin of electrons that incorporated both quantum mechanics and the theory of special relativity. The theory, as refined and developed in the late 1940s, rests on the idea that charged particles interact by emitting and absorbing photons. It has become a model for other quantum field theories.

**quantum computing**

Experimental method of computing that makes use of quantum-mechanical phenomena. It incorporates quantum theory and the uncertainty principle. Quantum computers would allow a quantum bit, or qubit, to store a value of 0 and 1 simultaneously. They could pursue multiple lines of inquiry simultaneously, with the final output dependent on the interference pattern generated by the various calculations. *See also* DNA computing, quantum mechanics.
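Superposition and interference can be illustrated with a classical toy simulation of a single qubit (this sketch is not from the entry; the standard Hadamard gate is assumed). Applying the gate once puts the qubit in an equal superposition; applying it again makes the two amplitudes interfere and restores the original state.

```python
import math

s = 1 / math.sqrt(2)
H = [[s, s],
     [s, -s]]   # Hadamard gate as a 2x2 matrix

def apply(gate, state):
    """Multiply a 2x2 gate matrix by a 2-component amplitude vector."""
    return [sum(gate[i][j] * state[j] for j in range(2)) for i in range(2)]

qubit = [1.0, 0.0]            # the state |0>
qubit = apply(H, qubit)       # equal superposition of |0> and |1>
print([round(a * a, 3) for a in qubit])  # measurement probabilities [0.5, 0.5]
qubit = apply(H, qubit)       # amplitudes interfere; back to |0>
print([round(a * a, 3) for a in qubit])  # [1.0, 0.0]
```

Simulating n qubits this way needs 2^n amplitudes, which is exactly why a real quantum computer is expected to outrun a classical one.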

**quantum chromodynamics (QCD)**

Theory that describes the action of the strong force. The strong force acts only on certain particles, principally quarks that are bound together in the protons and neutrons of the atomic nucleus, as well as in less stable, more exotic forms of matter. Quantum chromodynamics has been built on the concept that quarks interact via the strong force because they carry a form of “strong charge,” which has been given the name “colour.” The three types of charge are called red, green, and blue, in analogy to the primary colours of light, though there is no connection with the usual sense of colour.

**quantum**

In physics, a discrete natural unit, or packet, of energy, charge, angular momentum, or other physical property. Light, for example, which appears in some respects as a continuous electromagnetic wave, on the submicroscopic level is emitted and absorbed in discrete amounts, or quanta; for light of a given wavelength, the magnitude of all the quanta emitted or absorbed is the same in both energy and momentum. These particlelike packets of light are called photons, a term also applicable to quanta of other forms of electromagnetic energy such as X rays and gamma rays. Submicroscopic mechanical vibrations in the layers of atoms comprising crystals also give up or take on energy and momentum in quanta called phonons. *See also* quantum mechanics.
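The size of a light quantum follows from the standard relation E = hc/λ (the relation itself is not spelled out in the entry; the constants below are rounded values).

```python
h = 6.626e-34   # Planck's constant, J*s (rounded)
c = 2.998e8     # speed of light in vacuum, m/s (rounded)

def photon_energy(wavelength_m):
    """Energy of a single photon at the given wavelength: E = h*c / wavelength."""
    return h * c / wavelength_m

E = photon_energy(500e-9)   # green light, 500 nm
print(E)  # on the order of 4e-19 joules per photon
```

The tiny result explains why the graininess of light is invisible at everyday scales: an ordinary lamp emits around 10^19 such quanta every second.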

**probability theory**

Branch of mathematics that deals with analysis of random events. Probability is the numerical assessment of likelihood on a scale from 0 (impossibility) to 1 (absolute certainty). Probability is usually expressed as the ratio between the number of ways an event can happen and the total number of things that can happen (e.g., there are 13 ways of picking a diamond from a deck of 52 cards, so the probability of picking a diamond is 13/52, or 1/4).
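The card example is easy to check by direct counting, a sketch of the "favourable outcomes over total outcomes" ratio:

```python
from fractions import Fraction

suits = ["clubs", "diamonds", "hearts", "spades"]
deck = [(suit, rank) for suit in suits for rank in range(1, 14)]  # 52 cards

favourable = sum(1 for suit, _ in deck if suit == "diamonds")
p = Fraction(favourable, len(deck))   # exact ratio, automatically reduced
print(p)  # 13/52 reduces to 1/4
```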

**number theory**

Branch of mathematics concerned with properties of and relations among integers. It is a popular subject among amateur mathematicians and students because of the wealth of seemingly simple problems that can be posed; answers are much harder to come by. It has been said that any unsolved mathematical problem of any interest more than a century old belongs to number theory. One of the best examples, finally proved in the 1990s, is Fermat's last theorem.

**money, quantity theory of**

Economic theory relating changes in the price level to changes in the quantity of money. It has often been used to analyze the factors underlying inflation and deflation. The quantity theory was developed in the 17th and 18th centuries by philosophers such as John Locke and David Hume and was intended as a weapon against mercantilism. Drawing a distinction between money and wealth, advocates of the quantity theory argued that if the accumulation of money by a nation merely raised prices, the mercantilist emphasis on a favourable balance of trade would only increase the supply of money without increasing wealth. The theory contributed to the ascendancy of free trade over protectionism. In the 19th–20th centuries it played a part in the analysis of business cycles and in the theory of rates of foreign exchange.
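The theory's core claim is usually formalized as the equation of exchange, M×V = P×Q (money stock times velocity equals price level times output); that formula is a later formalization, not stated in the entry, and the numbers below are made up for illustration.

```python
def price_level(money, velocity, output):
    """Equation of exchange, M*V = P*Q, solved for the price level P."""
    return money * velocity / output

p1 = price_level(money=100, velocity=4, output=200)
p2 = price_level(money=200, velocity=4, output=200)
print(p1, p2)  # with velocity and output fixed, doubling M doubles P: 2.0 -> 4.0
```

This is exactly the quantity theorists' point against mercantilism: if V and Q are unchanged, accumulating money only raises P.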

**measure theory**

In mathematics, a generalization of the concepts of length and area (*see* length, area, and volume) to arbitrary sets of points not composed of line segments or rectangles. A measure is any rule for associating a number with a set. The result must be nonnegative and also additive, meaning that the measure of two nonoverlapping sets equals the sum of their individual measures. This is simple enough for sets consisting of line segments or rectangles, but the measure of sets such as curved regions or intervals with missing points requires more abstract methods, including limits and upper and lower bounds.
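For the simple case the entry calls "simple enough" (sets built from line segments), additivity can be shown directly; the function name is a hypothetical example.

```python
def interval_measure(intervals):
    """Total length of a collection of pairwise disjoint intervals (a, b)."""
    return sum(b - a for a, b in intervals)

# Additivity: the measure of a disjoint union equals the sum of the measures.
m1 = interval_measure([(0.0, 1.0)])
m2 = interval_measure([(2.0, 2.5)])
both = interval_measure([(0.0, 1.0), (2.0, 2.5)])
print(m1 + m2 == both)  # True: 1.0 + 0.5 == 1.5
```

The hard part of measure theory is extending exactly this rule, consistently, to sets far stranger than finite unions of intervals.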

**matrix**

Set of numbers arranged in rows and columns to form a rectangular array. Matrix elements may also be differential operators, vectors, or functions. Matrices have wide applications in engineering, physics, economics, and statistics, as well as in various branches of mathematics. They are usually first encountered in the study of systems of equations represented by matrix equations of the form *Ax* = *B*, which may be solved by finding the inverse of matrix *A* or by using an algebraic method based on its determinant.
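The determinant-based method the entry mentions can be sketched for the 2×2 case (this uses Cramer's rule; the function name is hypothetical):

```python
def solve_2x2(A, b):
    """Solve the matrix equation A x = b for a 2x2 matrix A, via Cramer's rule."""
    det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # determinant of A
    if det == 0:
        raise ValueError("matrix is singular; no unique solution")
    x0 = (b[0] * A[1][1] - A[0][1] * b[1]) / det
    x1 = (A[0][0] * b[1] - b[0] * A[1][0]) / det
    return [x0, x1]

# System: 2x + y = 5 and x + 3y = 10, i.e. A = [[2,1],[1,3]], b = [5,10].
print(solve_2x2([[2, 1], [1, 3]], [5, 10]))  # [1.0, 3.0]
```

A zero determinant signals exactly the case where the inverse of *A* does not exist.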

**marginal productivity theory**

In economics, the theory that firms will pay a productive agent only what he or she adds to the financial earnings of the firm. Developed by writers such as John Bates Clark and Philip Henry Wicksteed at the end of the 19th century, marginal productivity theory holds that it is unprofitable to buy, for example, a man-hour of labour if it costs more than it contributes to its buyer's income. The amount in excess of costs that a productive input yields is the value of its marginal product; the theory posits that every type of input should be paid the value of its marginal product.
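The hiring rule the entry states (buy labour only while it contributes more than it costs) reduces to a simple comparison; the revenue figures below are made up for illustration.

```python
# Hypothetical firm: value of the marginal product of the 1st..5th worker,
# i.e. the extra revenue each additional worker would bring in.
marginal_revenue = [50, 40, 30, 20, 10]
wage = 25

hired = 0
for extra in marginal_revenue:
    if extra < wage:      # the next worker would add less than he or she costs
        break
    hired += 1
print(hired)  # 3: the 4th worker would add 20 but cost 25
```

At the stopping point the wage roughly equals the value of the marginal product, which is the theory's claim about what each input is paid.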

**knot theory**

Mathematical theory of closed curves in three-dimensional space. The number of times and the manner in which a curve crosses itself distinguish different knots. The fewest possible crossings is three, for the overhand (trefoil) knot, which occurs in two mirror versions according to the directions in which the curve crosses itself. Knot theory has been used to understand both atomic and molecular structures (protein folding).

**kinetic theory of gases**

Theory based on a simple description of a gas, from which many properties of gases can be derived. Established primarily by James Clerk Maxwell and Ludwig Boltzmann, the theory is one of the most important concepts in modern science. The simplest kinetic model is based on the assumptions that (1) a gas is composed of a large number of identical molecules moving in random directions, separated by distances that are large compared to their size; (2) the molecules undergo perfectly elastic (no energy loss) collisions with each other and with the walls of the container; and (3) the transfer of kinetic energy between molecules is heat. This model describes a perfect gas but is a reasonable approximation to a real gas. Using the kinetic theory, scientists can relate the independent motion of molecules of gases to their pressure, volume, temperature, viscosity, and heat conductivity.
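One concrete result of relating molecular motion to temperature is the root-mean-square molecular speed, v = √(3kT/m); the formula is standard kinetic theory though not written out in the entry, and the constants are rounded.

```python
import math

k_B = 1.381e-23   # Boltzmann constant, J/K (rounded)

def rms_speed(molecular_mass_kg, temperature_k):
    """Root-mean-square molecular speed from kinetic theory: sqrt(3*k*T/m)."""
    return math.sqrt(3 * k_B * temperature_k / molecular_mass_kg)

m_n2 = 4.65e-26   # approximate mass of one nitrogen molecule, kg
print(rms_speed(m_n2, 300))  # roughly 500 m/s at room temperature
```

Speeds of hundreds of metres per second for air molecules at room temperature are the kind of non-obvious prediction that made the theory compelling.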

**just war theory**

Set of conditions under which a resort to war is morally legitimate (*jus ad bellum*); also, rules for the moral conduct of war (*jus in bello*). Among the proposed conditions for the just resort to war are that the cause be just (e.g., self-defense against an attack or the threat of imminent attack), that the authority undertaking the war be competent, that all peaceful alternatives be exhausted, and that there be a reasonable hope of success. Two of the most important conditions for the just conduct of war are that the force used be “proportional” to the just cause the war is supposed to serve (in the sense that the evil created by the war must not outweigh the good represented by the just cause) and that military personnel be discriminated from innocents (noncombatant civilians), who may not be killed. The concept of just war was developed in the early Christian church; it was discussed by St. Augustine in the 4th century and was still accepted by Hugo Grotius in the 17th century. Interest in the concept thereafter declined, though it was revived in the 20th century in connection with the development of nuclear weapons (the use of which, according to some, would violate the conditions of proportionality and discrimination) and the advent of “humanitarian intervention” to put an end to acts of genocide and other crimes committed within the borders of a single state.

**information theory**

Field of mathematics that studies the problems of signal transmission, reception, and processing. It stems from Claude E. Shannon's mathematical methods for measuring the degree of order (nonrandomness) in a signal, which drew largely on probability theory and stochastic processes and led to techniques for determining a source's rate of information production, a channel's capacity to handle information, and the average amount of information in a given type of message. Crucial to the design of communications systems, these techniques have important applications in linguistics, psychology, and even literary theory.
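Shannon's measure of the average amount of information in a message is the entropy, H = −Σ p·log₂(p); a minimal sketch:

```python
import math

def entropy_bits(probabilities):
    """Shannon entropy of a discrete distribution: average information per symbol, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.5, 0.5]))   # a fair coin toss carries exactly 1.0 bit
print(entropy_bits([0.9, 0.1]))   # a predictable (biased) source carries less
```

The more predictable (ordered) a source is, the lower its entropy, which is the sense in which Shannon's methods measure nonrandomness.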

**identity theory**

In the philosophy of mind, the doctrine that mental events are identical to physico-chemical events in the brain. So-called “type” identity theory asserts that each type of mental event, such as pain, is identical to some type of event in the brain, such as the firing of c-fibres. In response to objections based on the assumed “multiple realizability” of mental states, “token” identity theory makes the weaker claim that each token of a mental event, such as a particular pain, is identical to some token of a brain event of some type. *See also* mind-body problem.

**graph theory**

Mathematical theory of networks. A graph consists of vertices (also called points or nodes) and edges (lines) connecting certain pairs of vertices. An edge that connects a node to itself is called a loop. In 1735 Leonhard Euler published an analysis of an old puzzle concerning the possibility of crossing every one of seven bridges (no bridge twice) that span a forked river flowing past an island. Euler's proof that no such path exists and his generalization of the problem to all possible networks are now recognized as the origin of both graph theory and topology. Since the mid-20th century, graph theory has become a standard tool for analyzing and designing communications networks, power transmission systems, transportation networks, and computer architectures.
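Euler's bridge argument can be checked mechanically: a connected graph has a walk crossing every edge exactly once only if zero or two vertices have odd degree. Below, the seven Königsberg bridges are modelled as edges between the four land masses (labelled A–D here for illustration).

```python
# Each edge is one bridge between two land masses.
bridges = [("A", "B"), ("A", "B"), ("A", "C"), ("A", "C"),
           ("A", "D"), ("B", "D"), ("C", "D")]

degree = {}
for u, v in bridges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1

odd = [v for v, d in degree.items() if d % 2 == 1]
print(degree)               # A touches 5 bridges; B, C, D touch 3 each
print(len(odd) in (0, 2))   # False: all four degrees are odd, so no such walk exists
```

All four vertices have odd degree, so the desired walk is impossible, which is Euler's 1735 conclusion.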

**germ-plasm theory**

Concept of the physical basis of heredity expressed by the biologist August Weismann (1834–1914). It claimed that germ plasm, which Weismann believed to be independent from all other cells of the body, was the essential element of germ cells (eggs and sperm) and was the hereditary material passed from generation to generation. First proposed in 1883, his view contradicted Lamarck's then-prevalent theory of acquired characteristics. Though its details have been altered, its idea of the stability of hereditary material is the basis of the modern understanding of physical inheritance.

**germ theory**

Theory that certain diseases are caused by invasion of the body by microorganisms. Louis Pasteur, Joseph Lister, and Robert Koch are given much of the credit for its acceptance in the later 19th century. Pasteur showed that organisms in the air cause fermentation and spoil food; Lister was first to use an antiseptic to exclude germs in the air to prevent infection; and Koch first linked a specific organism with a disease (anthrax). The full implications of germ theory for medical practice were not immediately apparent after it was proven; surgeons operated without masks or head coverings as late as the 1890s.

**game theory**

Branch of applied mathematics devised to analyze certain situations in which there is an interplay between parties that may have similar, opposed, or mixed interests. Game theory was originally developed by John von Neumann and Oskar Morgenstern in their book *The Theory of Games and Economic Behavior* (1944). In a typical game, or competition with fixed rules, “players” try to outsmart one another by anticipating the others' decisions, or moves. A solution to a game prescribes the optimal strategy or strategies for each player and predicts the average, or expected, outcome. Until a highly contrived counterexample was devised in 1967, it was thought that every contest had at least one solution. *See also* decision theory; prisoner's dilemma.
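For a two-player zero-sum game, a solution in the sense above can sometimes be read straight off the payoff matrix by a minimax calculation; the payoffs below are a made-up example with a saddle point.

```python
# Payoffs to the row player; the column player receives the negative.
payoff = [[3, 1],
          [4, 2]]

# Row player: maximize the worst case. Column player: minimize the best case.
row_value = max(min(row) for row in payoff)
col_value = min(max(payoff[r][c] for r in range(2)) for c in range(2))
print(row_value, col_value)  # 2 2: the guarantees meet, so this is a saddle point
```

When the two values coincide, as here, neither player can gain by deviating; when they do not, the solution requires randomized (mixed) strategies.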

**field theory**

In mathematics, the study of the structure of a set of objects (e.g., numbers) with two combining operations (e.g., addition and multiplication). Such a system, known as a field, must satisfy certain properties: associative law, commutative law, distributive law, an additive identity (“zero”), a multiplicative identity (“one”), additive inverses (*see* inverse function), and multiplicative inverses for nonzero elements. The sets of rational numbers, real numbers, and complex numbers are fields under ordinary addition and multiplication. The investigation of polynomial equations and their solutions led to the development of field theory.
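The hardest axiom to satisfy is usually multiplicative inverses. A standard example beyond those listed in the entry is the integers modulo a prime, which form a finite field; the check below finds every inverse in the field of integers mod 7.

```python
# Integers mod a prime p form a field: every nonzero element has a
# multiplicative inverse. (Mod a composite number, some elements do not.)
p = 7
inverses = {a: next(b for b in range(1, p) if (a * b) % p == 1)
            for a in range(1, p)}
print(inverses)  # e.g. 3 * 5 = 15 = 2*7 + 1, so 3 and 5 are inverses mod 7
```

Running the same search with p = 6 fails for a = 2: no b gives (2·b) mod 6 = 1, which is why primality is essential.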

**electroweak theory**

Theory that describes both the electromagnetic force and the weak force. Though the forces appear to be different, they are actually different facets of a more fundamental force. This theory, formulated in the 1960s by Sheldon Glashow (born 1932), Steven Weinberg (born 1933), and Abdus Salam (1926–96), represents a 20th-century scientific landmark and won its authors the 1979 Nobel Prize for Physics. It was validated in the 1980s with the discovery of the W particle and Z particle, which it had predicted. *See also* fundamental interaction, unified field theory.


**domino theory** or **domino effect**

Doctrine of U.S. foreign policy during the Cold War, according to which the fall of a noncommunist state to communism would precipitate the fall of other neighbouring noncommunist states. The theory was first enunciated by Pres. Harry Truman, who used it to justify sending U.S. military aid to Greece and Turkey in the late 1940s. Dwight D. Eisenhower, John F. Kennedy, and Lyndon B. Johnson invoked it to justify U.S. military involvement in Southeast Asia, especially the prosecution of the Vietnam War.

**control theory**

Field of applied mathematics relevant to the control of certain physical processes and systems. It became a field in its own right in the late 1950s and early '60s. After World War II, problems arising in engineering and economics were recognized as variants of problems in differential equations and in the calculus of variations, though they were not covered by existing theories. Special modifications of classical techniques and theories were devised to solve individual problems, until it was recognized that these seemingly diverse problems all had the same mathematical structure, and control theory emerged. *See also* control system.

**chaotic behaviour**

Behaviour in a complex system that appears irregular or unpredictable but is actually determinate. The apparently random or unpredictable behaviour in systems governed by complicated (nonlinear) deterministic laws is the result of high sensitivity to initial conditions. For example, Edward Lorenz discovered that a simple model of heat convection exhibits chaotic behaviour. In a now-classic example of such sensitivity to initial conditions, he suggested that the mere flapping of a butterfly's wings could eventually result in large-scale changes in the weather (the “butterfly effect”).
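Sensitivity to initial conditions is easy to demonstrate numerically. The sketch below uses the logistic map, a standard chaotic toy model (not the convection model Lorenz studied): two starting points differing by one part in a billion are pushed far apart by a perfectly deterministic rule.

```python
def logistic(x, r=4.0):
    """One step of the logistic map, a simple deterministic nonlinear rule."""
    return r * x * (1 - x)

a, b = 0.2, 0.2 + 1e-9   # two initial conditions, almost identical
max_gap = 0.0
for _ in range(60):
    a, b = logistic(a), logistic(b)
    max_gap = max(max_gap, abs(a - b))

print(max_gap)  # the tiny initial difference has grown by many orders of magnitude
```

This is the butterfly effect in miniature: the system is fully determinate, yet long-range prediction fails because no initial condition is known exactly.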

Branch of mathematics (considered a branch of geometry) that explores how gradual changes to a system produce sudden, drastic results (though usually not as dire as the name suggests). A simple example is how a plastic coffee stirrer subjected to gradually increasing pressure from both ends will suddenly buckle in one direction or another. Other “catastrophes” include optical phenomena such as reflection or refraction of light through moving water. More speculatively, ideas from catastrophe theory have been applied by social scientists to such situations as the sudden eruption of mob violence.
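The buckling stirrer corresponds to the "cusp" catastrophe, which can be demonstrated numerically. In this sketch (an illustrative textbook construction, not from the article), a state tracks the local minimum of the potential V(x) = x⁴/4 + a·x²/2 + b·x while the control parameter b is raised gradually; at a critical value of b the occupied minimum vanishes and the state jumps discontinuously.

```python
def settle(a, b, x0, lr=0.01, steps=20000):
    """Gradient descent on V(x) = x**4/4 + a*x**2/2 + b*x, starting at x0."""
    x = x0
    for _ in range(steps):
        x -= lr * (x ** 3 + a * x + b)   # dV/dx = x^3 + a*x + b
    return x

a = -1.0                     # with a < 0 the potential has two competing wells
x = settle(a, 0.0, x0=1.0)   # start in the right-hand well
positions = []
for i in range(51):
    b = 0.01 * i             # raise b slowly from 0.0 to 0.5
    x = settle(a, b, x)      # track the occupied minimum adiabatically
    positions.append(x)

# Largest change between consecutive, slowly varying control steps:
jump = max(abs(p - q) for p, q in zip(positions, positions[1:]))
```

Each sweep step changes b by only 0.01, yet one pair of consecutive positions differs by more than 1: a gradual cause produces a sudden effect, exactly as with the buckling stirrer.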

In chemistry and physics, a theoretical model describing the states of electrons in solid materials, which can have energy values only within certain specific ranges, called bands. Ranges of energy between two allowed bands are called forbidden bands. Just as electrons in an atom move from one energy level to another, electrons in a solid can move from an energy level in one band to another level in the same band or in a different band. The band theory accounts for many of the electrical and thermal properties of solids and forms the basis of the technology of devices such as semiconductors, heating elements, and capacitors (*see* capacitance).
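How allowed and forbidden bands arise can be illustrated with the standard one-dimensional tight-binding model (a textbook toy model, not specific to this article): a chain of atoms with alternating hopping amplitudes t1 and t2 has electron energies E(k) = ±|t1 + t2·e^(ik)|, which fill two allowed bands separated by a forbidden gap of width 2·|t1 − t2|.

```python
import cmath
import math

def band_energies(t1, t2, nk=401):
    """Upper and lower energy bands of a 1-D chain with alternating hoppings."""
    lower, upper = [], []
    for j in range(nk):
        k = -math.pi + 2 * math.pi * j / (nk - 1)   # crystal momentum in [-pi, pi]
        e = abs(t1 + t2 * cmath.exp(1j * k))        # |t1 + t2 e^{ik}|
        lower.append(-e)
        upper.append(e)
    return lower, upper

lower, upper = band_energies(t1=1.0, t2=0.4)
gap = min(upper) - max(lower)   # the forbidden band: no allowed energies here
```

With t1 = 1.0 and t2 = 0.4 the allowed energies fill [−1.4, −0.6] and [0.6, 1.4], and the forbidden gap between them has width 1.2 = 2·|t1 − t2|.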

Body of physical and logical principles underlying the operation of any electromechanical device (an automaton) that converts information input in one form into another, or into some action, according to an algorithm. Norbert Wiener and Alan M. Turing are regarded as pioneers in the field. In computer science, automata theory is concerned with the construction of robots (*see* robotics) from basic building blocks of automata. The best example of a general automaton is an electronic digital computer. Networks of automata may be designed to mimic human behaviour. *See also* artificial intelligence; Turing machine.
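The simplest automaton studied in the theory is the deterministic finite automaton (DFA), which converts an input string into an accept/reject decision by algorithmic table lookup. This sketch is an illustrative example (the machine and its state names are invented here): it accepts exactly the binary strings containing an even number of 1s.

```python
def make_dfa(transition, start, accepting):
    """Build a DFA runner from a transition table."""
    def accepts(word):
        state = start
        for symbol in word:
            state = transition[(state, symbol)]   # deterministic step
        return state in accepting
    return accepts

# The two states track the parity of the number of 1s read so far.
even_ones = make_dfa(
    transition={
        ("even", "0"): "even", ("even", "1"): "odd",
        ("odd", "0"): "odd",   ("odd", "1"): "even",
    },
    start="even",
    accepting={"even"},
)
```

Everything the machine will ever do is fixed by its transition table, which is what makes such automata amenable to exact mathematical analysis.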

Theory that holds that a film's director is its "author" (French, *auteur*). It originated in France in the 1950s and was promoted by François Truffaut and the journal *Cahiers du Cinéma*. The director oversees and "writes" the film's audio and visual scenario and therefore is considered more responsible for its content than the screenwriter. Supporters maintain that the most successful films bear the distinctive imprint of their director.

Comprehensive theory that explains the behaviour of superconducting materials. It was developed in 1957 by John Bardeen, Leon Cooper, and J. Robert Schrieffer (b. 1931), whose surname initials provide its name. Cooper discovered that electrons in a superconductor are grouped in pairs (Cooper pairs) and that the motions of all the pairs within a single superconductor constitute a system that functions as a single entity. An electric voltage applied to the superconductor causes all Cooper pairs to move, forming an electric current. When the voltage is removed, the current continues to flow because the pairs encounter no opposition. *See also* superconductivity.

In theoretical physics, M-theory is a limit of string theory in which 11 dimensions of spacetime may be identified. Because this exceeds the 10 dimensions of the five superstring theories, it was originally believed that the 11-dimensional theory was more fundamental, unifying all string theories and superseding them. In a more modern understanding, however, it is a sixth possible description of the physics of the full theory, which is still called "string theory." Though a full description of the theory is not yet known, the low-energy dynamics are known to be supergravity interacting with 2- and 5-dimensional membranes.

In the standard string theories, strings are assumed to be the single fundamental constituent of the universe. M-theory adds another fundamental constituent - membranes. Like the tenth spatial dimension, the approximate equations in the original five superstring models proved too weak to reveal membranes.

This theory is the unique supersymmetric theory in eleven dimensions, with its low-energy matter content and interactions fully determined, and can be obtained as the strong coupling limit of type IIA string theory because a new dimension of space emerges as the coupling constant increases.

## History and Development

Drawing on the work of a number of string theorists (including Ashoke Sen, Chris Hull, Paul Townsend, Michael Duff and John Schwarz), Edward Witten of the Institute for Advanced Study suggested its existence at a conference at USC in 1995, and used M-theory to explain a number of previously observed dualities, sparking a flurry of new research in string theory called the second superstring revolution.

According to Witten and others, the M in M-theory could stand for master, mathematical, mother, mystery, membrane, magic, or matrix. Witten reluctantly admits that the M can also stand for murky, because the present understanding of the theory is so primitive. Originally the letter was taken from "membrane," but since Witten was more skeptical of membranes than his colleagues were, he kept only the "M," later leaving its meaning a matter of taste for the user of the word "M-theory."

In the early 1990s, it was shown that the various superstring theories were related by dualities, which allow physicists to relate the description of an object in one superstring theory to the description of a different object in another superstring theory. These relationships imply that each of the superstring theories is a different aspect of a single underlying theory, proposed by Witten and named "M-theory".

M-theory is not yet complete; however, it can be applied in many situations (usually by exploiting string-theoretic dualities). The theory of electromagnetism was in a similar state in the mid-19th century: there were separate theories for electricity and magnetism, and, although they were known to be related, the exact relationship was not clear until James Clerk Maxwell published his equations in his 1864 paper *A Dynamical Theory of the Electromagnetic Field*. Witten has suggested that a general formulation of M-theory will probably require the development of new mathematical language. However, some scientists have questioned the tangible successes of M-theory, given its current incompleteness and limited predictive power even after many years of intense research.

In late 2007, Bagger, Lambert and Gustavsson set off renewed interest in M-theory with the discovery of a candidate Lagrangian description of coincident M2-branes, based on a non-associative generalization of a Lie algebra (a Nambu 3-algebra, or Filippov 3-algebra). Practitioners hope the Bagger-Lambert-Gustavsson action (BLG action) will provide the long-sought microscopic description of M-theory.

### Prior to May 1995

Prior to 1995 there were five known consistent superstring theories (hereafter referred to as string theories), which were given the names Type I string theory, Type IIA string theory, Type IIB string theory, heterotic SO(32) (the HO string) theory, and heterotic E_{8}×E_{8} (the HE string) theory. The five theories share the essential features that give string theory its name: each is fundamentally a theory of vibrating, one-dimensional strings of roughly the Planck length. Calculations have also shown that each theory requires more than the usual four spacetime dimensions (although all the extra dimensions are spatial). However, when the theories are analyzed in detail, significant differences appear.

### Type I string theory and others

The Type I string theory has vibrating strings like the rest of the string theories. These strings vibrate both in closed loops, so that the strings have no ends, and as open strings with two loose ends. Strings with loose, unattached ends are what separate the Type I string theory from the other four. (The Type IIA and Type IIB string theories also contain open strings, but those strings are bound to D-branes.)

### String vibrational patterns

Furthermore, calculations show that the list of string vibrational patterns, and the way each pattern interacts with and influences others, varies from one theory to another. These and other differences hindered the development of string theory as the theory uniting quantum mechanics and general relativity. Attempts by the physics community to eliminate four of the theories, leaving only one string theory, have not been successful.

### M-theory

M-theory attempts to unify the five string theories by examining certain identifications and dualities. Thus each of the five string theories becomes a special case of M-theory.

### Type IIA and Type IIB

As the names suggest, some of these string theories were thought to be related to each other. In the early 1990s, string theorists discovered that some relations were so strong that they could be thought of as an identification.

The Type IIA string theory and the Type IIB string theory were known to be connected by T-duality; this essentially meant that the IIA string theory description of a circle of radius R is exactly the same as the IIB description of a circle of radius 1/R, where distances are measured in units of the Planck length.
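The content of T-duality is visible in the standard closed-string mass formula on a circle, quoted here for orientation (a textbook result; α′ is the square of the string length scale, n the momentum number, w the winding number):

```latex
M^2 \;=\; \frac{n^2}{R^2} \;+\; \frac{w^2 R^2}{\alpha'^2} \;+\; \text{(oscillator contributions)}
```

The spectrum is unchanged under R → α′/R provided the momentum and winding numbers n and w are exchanged, which is why a circle of radius R and one of radius 1/R (in string units) describe the same physics.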

This was a profound result. First, this was an intrinsically quantum mechanical result; the identification did not hold in the realm of classical physics. Second, because it is possible to build up any space by gluing circles together in various ways, it would seem that any space described by the IIA string theory can also be seen as a different space described by the IIB theory. This implies that the IIA string theory can identify with the IIB string theory: any object which can be described with the IIA theory has an equivalent, although seemingly different, description in terms of the IIB theory. This suggests that the IIA string theory and the IIB string theory are really aspects of the same underlying theory.

### Other dualities

There are other dualities between the other string theories. The heterotic SO(32) and the heterotic E_{8}×E_{8} theories are also related by T-duality; the heterotic SO(32) description of a circle of radius R is exactly the same as the heterotic E_{8}×E_{8} description of a circle of radius 1/R. This implies that there are really only three superstring theories, which might be called (for discussion) the Type I theory, the Type II theory, and the heterotic theory.

There are still more dualities, however. The Type I string theory is related to the heterotic SO(32) theory by S-duality; this means that the Type I description of weakly interacting particles can also be seen as the heterotic SO(32) description of very strongly interacting particles. This identification is somewhat more subtle, in that it identifies only extreme limits of the respective theories. String theorists have found strong evidence that the two theories are really the same, even away from the extremely strong and extremely weak limits, but they do not yet have a proof strong enough to satisfy mathematicians. However, it has become clear that the two theories are related in some fashion; they appear as different limits of a single underlying theory.

### Only two string theories

Given the above commonalities there appear to be only two string theories: the heterotic string theory (which is also the Type I string theory) and the Type II theory. There are relations between these two theories as well, and these relations are in fact strong enough to allow them to be identified.

### Last step

This last step is best explained first in a certain limit. In order to describe our world, strings must be extremely tiny objects. So when one studies string theory at low energies, it becomes difficult to see that strings are extended objects — they become effectively zero-dimensional (pointlike). Consequently, the quantum theory describing the low-energy limit is a theory that describes the dynamics of these points moving in spacetime, rather than strings. Such theories are called quantum field theories. However, since string theory also describes gravitational interactions, one expects the low-energy theory to describe particles moving in gravitational backgrounds. Finally, since superstring theories are supersymmetric, one expects to see supersymmetry appearing in the low-energy approximation. These three facts imply that the low-energy approximation to a superstring theory is a supergravity theory.

### Supergravity theories

The possible supergravity theories were classified by Werner Nahm in the 1970s. In 10 dimensions, there are only two supergravity theories with maximal supersymmetry, denoted Type IIA and Type IIB. This shared naming is not a coincidence: the Type IIA string theory has the Type IIA supergravity theory as its low-energy limit, and the Type IIB string theory gives rise to Type IIB supergravity. The heterotic SO(32) and heterotic E_{8}×E_{8} string theories instead reduce, in the low-energy limit, to ten-dimensional supergravity coupled to supersymmetric Yang-Mills theory. This suggests that there may indeed be a relation between the heterotic/Type I theories and the Type II theories.

In 1995, Edward Witten outlined the following relationship: Type IIA supergravity (the low-energy limit of the Type IIA string theory) can be obtained by dimensional reduction from the single unique eleven-dimensional supergravity theory. This means that if one studies supergravity on an eleven-dimensional spacetime that looks like the product of a ten-dimensional spacetime with a very small one-dimensional manifold, one gets the Type IIA supergravity theory. (The Type IIB supergravity theory can then be obtained by using T-duality.) However, eleven-dimensional supergravity is not consistent on its own — it does not make sense at extremely high energy, and likely requires some form of completion. It seems plausible, then, that there is some quantum theory — which Witten dubbed M-theory — in eleven dimensions which gives rise at low energies to eleven-dimensional supergravity, and is related to ten-dimensional string theory by dimensional reduction. Dimensional reduction on a circle yields the Type IIA string theory, and dimensional reduction on a line segment yields the heterotic E_{8}×E_{8} string theory.
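The dimensional reduction on a circle can be written explicitly. In one common convention (quoted as a standard result; φ is the ten-dimensional dilaton and A_μ the Ramond-Ramond one-form of Type IIA), the eleven-dimensional metric decomposes as:

```latex
ds_{11}^2 \;=\; e^{-2\phi/3}\, g_{\mu\nu}\, dx^{\mu} dx^{\nu} \;+\; e^{4\phi/3}\left( dx^{11} + A_{\mu}\, dx^{\mu} \right)^2
```

The radius of the eleventh dimension grows with the string coupling g_s = e^φ, which is why the extra dimension is invisible at weak coupling and only emerges as the coupling becomes strong.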

### Same underlying theory

M-theory would implement the notion that all of the different string theories are different special cases and/or different presentations of the same underlying theory (M-theory). Thus the concept of string theory is expanded. Unfortunately little is known about M-theory, but there is a great deal of interest in the concept from the theoretical physics community. Computations in M-theory and string theory in general are extremely complex, so concrete results are very difficult to produce. It may be some time before the full implications of these theories are known.

The promise of M-theory is that all of the different string theories would become different limits of a single underlying theory.

## Nomenclature

There are two issues to be dealt with here:

- When Witten named M-theory, he did not specify what the "M" stood for, presumably because he did not feel he had the right to name a theory which he had not been able to fully describe. According to Witten himself, "M" stands for "magic," "mystery," or "matrix," according to taste. According to the BBC/TLC documentary *Parallel Universes*, the M stands for "membrane." Other suggestions by people such as Michio Kaku, Michael Duff and Neil Turok in that documentary are "mother" (as in "mother of all theories") and "master" theory.

Cynics have noted that the M might be an upside-down "W", standing for Witten. Others have suggested that for now, the "M" in M-theory should stand for Missing or Murky. The various speculations as to what "M" stands for are explored in the PBS documentary based on Brian Greene's book *The Elegant Universe*.

- The name M-theory is slightly ambiguous. It can refer either to the particular eleven-dimensional theory which Witten first proposed, or to a kind of theory which looks in various limits like the various string theories. Ashoke Sen has suggested that the more general theory could go by the name U-theory, which might stand for Ur, Uber, Ultimate, Underlying, or perhaps Unified. (It might also stand for U-duality, which is both a reference to Sen's own work and a kind of particle-physics pun.)

M-theory in the following descriptions refers to the more general theory, and will be specified when used in its more limited sense.

## M-theory and membranes

### P-branes

A membrane, or brane, is a multidimensional object, usually called a p-brane, with p referring to the number of spatial dimensions in which it extends. The value of p can range from zero to nine, giving branes dimensions from zero (a 0-brane is a point particle) to nine (five more than the four dimensions, three spatial and one of time, that we are accustomed to inhabiting). The inclusion of p-branes does not invalidate earlier work in string theory that took no note of them: p-branes are much more massive ("heavier") than strings, and when all higher-dimensional p-branes are much more massive than strings, they can be ignored, as researchers had done unknowingly in the 1970s.

### Strings with "loose ends"

Shortly after Witten's breakthrough in 1995, Joseph Polchinski of the University of California, Santa Barbara discovered a fairly obscure feature of string theory. He found that in certain situations the endpoints of strings (strings with "loose ends") would not be able to move with complete freedom, as they were attached, or stuck, within certain regions of space. Polchinski then reasoned that if the endpoints of open strings are restricted to move within some p-dimensional region of space, then that region of space must be occupied by a p-brane. These "sticky" branes are called Dirichlet p-branes, or D-p-branes. His calculations showed that the newly discovered D-p-branes had exactly the right properties to be the objects that exert a tight grip on the open-string endpoints, holding down these strings within the p-dimensional region of space they fill.

### Strings with closed loops

Not all strings are confined to p-branes. Strings in closed loops, like the graviton, are completely free to move from membrane to membrane. Of the four force-carrier particles, the graviton is unique in this way: the other force carriers are strings with endpoints that confine them to their p-branes. Researchers speculate that this is the reason why investigation of the weak force, the strong force, and the electromagnetic force has not hinted at the possibility of extra dimensions. Experiments with gravity will be needed to show whether extra spatial dimensions indeed exist.

## Membrane interactions

One of the reasons M-theory is so difficult to formulate is that the number of different types of membranes in the various dimensions increases exponentially. For example, once you get to three-dimensional surfaces, you have to deal with solid objects with knot-shaped holes, and you need the whole of knot theory just to classify them. Since M-theory is thought to operate in 11 dimensions, this classification problem becomes very difficult. But just as in string theory, in order to satisfy causality the theory must be local, and so any topology change must occur at a single point. The basic orientable 2-brane interactions are easy to show: orientable 2-branes are tori with multiple holes cut out of them.

## Matrix theory

The original formulation of M-theory was in terms of a (relatively) low-energy effective field theory, called 11-dimensional supergravity. Though this formulation provided a key link to the low-energy limits of string theories, it was recognized that a full high-energy formulation (or "UV-completion") of M-theory was needed.

For an analogy, the Supergravity description is like treating water as a continuous, incompressible fluid. This is effective for describing long-distance effects such as waves and currents, but inadequate to understand short-distance/high-energy phenomena such as evaporation, for which a description of the underlying molecules is needed. What, then, are the underlying degrees of freedom of M-theory?

Banks, Fischler, Shenker and Susskind (BFSS) conjectured that Matrix theory could provide the answer. They demonstrated that a theory of 9 very large matrices, evolving in time, could reproduce the Supergravity description at low energy, but take over for it as it breaks down at high energy. While the Supergravity description assumes a continuous space-time, Matrix theory predicts that, at short distances, noncommutative geometry takes over, somewhat similar to the way the continuum of water breaks down at short distances in favour of the graininess of molecules.
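Schematically (up to conventions, and with the fermionic superpartners suppressed), the BFSS model is the quantum mechanics of nine N×N Hermitian matrices X^i with Lagrangian:

```latex
L \;=\; \frac{1}{2g}\,\mathrm{Tr}\!\left( \dot{X}^{i}\dot{X}^{i} \;+\; \frac{1}{2}\,[X^{i},X^{j}][X^{i},X^{j}] \right), \qquad i,j = 1,\dots,9
```

Since each commutator [X^i, X^j] is anti-Hermitian, the trace of its square is non-positive, so the second term is a potential that vanishes exactly when all the matrices commute; such commuting configurations are interpreted as ordinary positions of D0-branes in nine spatial dimensions.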

## References

- Banks, T., W. Fischler, S. H. Shenker, and L. Susskind (1996). "M Theory as a Matrix Model: A Conjecture."
- de Wit, B., J. Hoppe, and H. Nicolai. "On the Quantum Mechanics of Supermembranes." *Nucl. Phys.* B305:545 (1988).
- Duff, Michael J. "M-Theory (the Theory Formerly Known as Strings)." *International Journal of Modern Physics A* 11 (1996): 5623-5642; online at Cornell University's arXiv ePrint server.
- Duff, Michael J. "The Theory Formerly Known As Strings." *Scientific American*, February 1998, pages 64-69.
- Greene, Brian. *The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory*. W. W. Norton & Company, February 1999. ISBN 0-393-04688-5.
- Gribbin, John. *The Search for Superstrings, Symmetry, and the Theory of Everything*. Little, Brown & Company, August 2000, specifically pages 177-180. ISBN 0-316-32975-4.
- Kaku, Michio (December 2004). *Parallel Worlds: A Journey Through Creation, Higher Dimensions, and the Future of the Cosmos*. Doubleday. ISBN 0-385-50986-3.
- Smolin, Lee. *The Trouble with Physics*. Houghton Mifflin, Mariner, 2007. ISBN 0-618-91868-X.
- Taubes, Gary. "String Theorists Find a Rosetta Stone." *Science* 285 (July 23, 1999): 512-515, 517.
- Witten, Edward. "Magic, Mystery and Matrix." *Notices of the AMS*, October 1998, 1124-1129.

## Books

- Greene, Brian. *The Elegant Universe* (1999), ISBN 0-375-70811-1, and *The Fabric of the Cosmos* (2004), ISBN 0-375-41288-3, explain string theory and M-theory for the layperson.
- Kaku, Michio. *Strings, Conformal Fields, and M-Theory*. New York: Springer. A more advanced introduction.

## External links

- The Elegant Universe - a three-hour miniseries with Brian Greene by NOVA (original PBS broadcast dates: October 28, 8-10 p.m. and November 4, 8-9 p.m., 2003). Various images, texts, videos and animations explaining string theory and M-theory.
- Superstringtheory.com - the "Official String Theory Web Site", created by Patricia Schwarz. References on string theory and M-theory for the layperson and expert.
- Basics of M-Theory by A. Miemiec and I. Schnakenburg, a lecture note on M-theory published in *Fortsch. Phys.* 54:5-72, 2006.
- M-Theory-Cambridge
- M-Theory-Caltech

Wikipedia, the free encyclopedia © 2001-2006 Wikipedia contributors (Disclaimer)

This article is licensed under the GNU Free Documentation License.

Last updated on Friday October 10, 2008 at 11:54:23 PDT (GMT -0700)

View this article at Wikipedia.org - Edit this article at Wikipedia.org - Donate to the Wikimedia Foundation

Copyright © 2014 Dictionary.com, LLC. All rights reserved.