Science and technology in the United States

The United States came into being around the Age of Enlightenment (circa 1680 to 1800), a period in which writers and thinkers rejected the superstitions of the past. Instead, they emphasized the powers of reason and unbiased inquiry, especially inquiry into the workings of the natural world. Enlightenment philosophers envisioned a "republic of science," where ideas would be exchanged freely and useful knowledge would improve the lot of all citizens.

From its emergence as an independent nation, the United States has encouraged science and invention. It has done this by promoting a free flow of ideas, by encouraging the growth of "useful knowledge," and by welcoming creative people from all over the world. The bulk of research and development (R&D) funding (64%) comes from the private sector rather than from taxes.

The United States Constitution itself reflects the desire to encourage scientific creativity. It gives the United States Congress the power "to promote the progress of science and useful arts, by securing for limited times to authors and inventors the exclusive right to their respective writings and discoveries." This clause formed the basis for the U.S. patent and copyright systems, which ensured that inventions and other creative works could not be copied or used without the creator's receiving some kind of compensation.

Early North American science

In the early decades of its history, the United States was relatively isolated from Europe and also rather poor. At this stage America's scientific infrastructure was still quite primitive compared to the long-established societies, institutes, and universities in Europe.

Two of America's founding fathers were scientists of some repute. Benjamin Franklin conducted a series of experiments that deepened human understanding of electricity. Among other things, he proved what had been suspected but never before shown: that lightning is a form of electricity. Franklin also invented such conveniences as bifocal eyeglasses. Contrary to popular belief, however, he did not design the stove that bears his name; the so-called Franklin stove is a much simpler version of his original "Pennsylvania Fireplace."

Thomas Jefferson was a student of agriculture who introduced various types of rice, olive trees, and grasses into the New World. He stressed the scientific aspect of the Lewis and Clark expedition (1804-06), which explored the Pacific Northwest, and detailed, systematic information on the region's plants and animals was one of that expedition's legacies.

Like Franklin and Jefferson, most American scientists of the late 18th century were involved in the struggle to win American independence and forge a new nation. These scientists included the astronomer David Rittenhouse, the medical scientist Benjamin Rush, and the natural historian Charles Willson Peale.

During the American Revolution, Rittenhouse helped design the defenses of Philadelphia and built telescopes and navigation instruments for the United States' military services. After the war, Rittenhouse designed road and canal systems for the state of Pennsylvania. He later returned to studying the stars and planets and gained a worldwide reputation in that field.

As Surgeon General of the Continental Army, Benjamin Rush saved countless soldiers' lives during the Revolutionary War by promoting hygiene and public health practices. By introducing new medical treatments, he made the Pennsylvania Hospital in Philadelphia an example of medical enlightenment, and after his military service Rush established the first free clinic in the United States.

Charles Willson Peale is best remembered as an artist, but he also was a natural historian, inventor, educator, and politician. He created the first major museum in the United States, the Peale Museum in Philadelphia, which housed the young nation's only collection of North American natural history specimens. Peale excavated the bones of an ancient mastodon near West Point, New York; he spent three months assembling the skeleton, and then displayed it in his museum. The Peale Museum started an American tradition of making the knowledge of science interesting and available to the general public.

Science immigration

American political leaders' enthusiasm for knowledge also helped ensure a warm welcome for scientists from other countries. A notable early immigrant was the British chemist Joseph Priestley, who was driven from his homeland because of his dissenting politics. Priestley, who went to the United States in 1794, was the first of thousands of talented scientists who emigrated in search of a free, creative environment.

Other scientists came to the United States to take part in the nation's rapid growth. Alexander Graham Bell, who arrived from Scotland by way of Canada in 1872, developed and patented the telephone and related inventions. Charles Steinmetz, who came from Germany in 1889, developed new alternating-current electrical systems at the General Electric Company, and Vladimir Zworykin, who left Russia in 1919, later invented a television camera. The Serb Nikola Tesla went to the United States in 1884, where he invented the brushless electrical motor based on rotating magnetic fields.

Into the early 1900s Europe remained the center of scientific research, notably in England and Germany. With the rise of the Nazi Party in Germany, however, a huge number of scientists, many of them of Jewish descent, left the country for the United States. One of the first to do so was Albert Einstein in 1933. At his urging, and often with his support, a good percentage of Germany's theoretical physics community, previously the best in the world, followed. Enrico Fermi came from Italy in 1938 and led the work that produced the world's first self-sustaining nuclear chain reaction.

In the post-war era the US was left in a position of unchallenged scientific leadership, being one of the few industrial countries not ravaged by war. Additionally, science and technology were seen to have greatly added to the Allied war victory, and were seen as absolutely crucial in the Cold War era. As a result, the US government became, for the first time, the largest single supporter of basic and applied scientific research. By the mid-1950s the research facilities in the US were second to none, and scientists were drawn to the US for this reason alone. The changing pattern can be seen in the winners of the Nobel Prizes in physics and chemistry. During the first half-century of Nobel Prizes – from 1901 to 1950 – American winners were in a distinct minority in the science categories. Since 1950, Americans have won approximately half of the Nobel Prizes awarded in the sciences.

American applied science

During the 19th century, Britain, France, and Germany were at the forefront of new ideas in science and mathematics. But if the United States lagged behind in the formulation of theory, it excelled in using theory to solve problems: applied science. This tradition had been born of necessity. Because Americans lived so far from the well-springs of Western science and manufacturing, they often had to figure out their own ways of doing things. When Americans combined theoretical knowledge with "Yankee ingenuity," the result was a flow of important inventions. The great American inventors include Robert Fulton (the steamboat); Samuel Morse (the telegraph); Eli Whitney (the cotton gin); Cyrus McCormick (the reaper); and Thomas Alva Edison, the most fertile of them all, with more than a thousand inventions credited to his name.

Edison was not always the first to devise a scientific application, but he was frequently the one to bring an idea to a practical finish. For example, the British engineer Joseph Swan built an incandescent electric lamp in 1860, almost 20 years before Edison. But Edison's light bulbs lasted much longer than Swan's, and they could be turned on and off individually, while Swan's bulbs could be used only in a system where several lights were turned on or off at the same time. Edison followed up his improvement of the light bulb with the development of electrical generating systems. Within 30 years, his inventions had introduced electric lighting into millions of homes.

Another landmark application of scientific ideas to practical uses was the innovation of the brothers Wilbur and Orville Wright. In the 1890s they became fascinated with accounts of German glider experiments and began their own investigation into the principles of flight. Combining scientific knowledge and mechanical skills, the Wright brothers built and flew several gliders. Then, on December 17, 1903, they successfully flew the first heavier-than-air, mechanically propelled airplane.

An American invention that was barely noticed in 1947 went on to usher in the Information Age. In that year John Bardeen, William Shockley, and Walter Brattain of Bell Laboratories drew upon highly sophisticated principles of quantum physics to invent the transistor, a small substitute for the bulky vacuum tube. This, and a device invented 10 years later, the integrated circuit, made it possible to package enormous amounts of electronics into tiny containers. As a result, book-sized computers of today can outperform room-sized computers of the 1960s, and there has been a revolution in the way people live – in how they work, study, conduct business, and engage in research.

The Atomic Age and "Big Science"

One of the most spectacular – and controversial – accomplishments of US technology has been the harnessing of nuclear energy. The concepts that led to the splitting of the atom were developed by the scientists of many countries, but the conversion of these ideas into the reality of nuclear fission was accomplished in the United States in the early 1940s, by many Americans aided tremendously by the influx of European intellectuals fleeing the growing conflagration sparked by Adolf Hitler and Benito Mussolini in Europe.

During these crucial years, a number of the most prominent European scientists, especially physicists, immigrated to the United States, where they would do much of their most important work: including Hans Bethe, Albert Einstein, Enrico Fermi, Leó Szilárd, Edward Teller, Felix Bloch, Emilio Segrè, and Eugene Wigner, among many, many others. American academics worked hard to find positions at laboratories and universities for their European colleagues.

After German physicists split a uranium nucleus in 1938, a number of scientists concluded that a nuclear chain reaction was feasible. A letter to President Franklin Roosevelt, written by Leó Szilárd and signed by Albert Einstein, warned that this breakthrough would permit the construction of "extremely powerful bombs." The warning prompted an executive order directing the investigation of uranium as a weapon, an effort superseded during World War II by the Manhattan Project, the full Allied effort to be the first to build an atomic bomb. The project bore fruit when the first such bomb was exploded in New Mexico on July 16, 1945.

The development of the bomb and its use against Japan in August 1945 initiated the Atomic Age, a time of anxiety over weapons of mass destruction that has lasted through the Cold War and down to the anti-proliferation efforts of today. But the Atomic Age has also been characterized by peaceful uses of atomic energy, as in nuclear power and nuclear medicine.

Along with the production of the atomic bomb, World War II also ushered in an era known as "Big Science," marked by increased government patronage of scientific research. The advantage of a scientifically and technologically sophisticated country became all too apparent during wartime, and in the ideological Cold War that followed, the importance of scientific strength, even in peacetime applications, became too great for the government to leave to philanthropy and private industry alone. This increased expenditure on scientific research and education propelled the United States to the forefront of the international scientific community -- an amazing feat for a country which only a few decades before had still had to send its most promising students to Europe for extensive scientific education.

The first US commercial nuclear power plant started operation in Illinois in 1956. At the time, the future for nuclear energy in the United States looked bright. But opponents criticized the safety of power plants and questioned whether safe disposal of nuclear waste could be assured. A 1979 accident at Three Mile Island in Pennsylvania turned many Americans against nuclear power. The cost of building a nuclear power plant escalated, and other, more economical sources of power began to look more appealing. During the 1970s and 1980s, plans for several nuclear plants were cancelled, and the future of nuclear power remains in a state of uncertainty in the United States.

Meanwhile, American scientists have been experimenting with other renewable energy sources, including solar power. Although solar power generation is still not economical in much of the United States, recent developments might make it more affordable.

Telecom and technology

For the past 80 years, the United States has been integral to fundamental advances in telecommunications and technology. For example, AT&T's Bell Laboratories spearheaded the American technological revolution with a series of inventions including the light-emitting diode (LED), the transistor, the C programming language, and the UNIX operating system. SRI International and Xerox PARC in Silicon Valley helped give birth to the personal computer industry, while ARPA and NASA funded the development of the ARPANET and the Internet.

The "Space Age"

Running almost in tandem with the Atomic Age has been the Space Age. American Robert Goddard was one of the first scientists to experiment with rocket propulsion systems. In his small laboratory in Worcester, Massachusetts, Goddard worked with liquid oxygen and gasoline to propel rockets into the atmosphere, and in 1926 successfully fired the world's first liquid-fuel rocket which reached a height of 12.5 meters. Over the next 10 years, Goddard's rockets achieved modest altitudes of nearly two kilometers, and interest in rocketry increased in the United States, Britain, Germany, and the Soviet Union.

As Allied forces advanced at the end of World War II, both the American and the Soviet forces searched for top German scientists who could be claimed as "spoils" for their country. The American effort to bring home German rocket technology in Operation Paperclip, which brought rocket scientist Wernher von Braun (who would later direct NASA's Marshall Space Flight Center) to the United States, stands out in particular.

Expendable rockets provided the means for launching artificial satellites, as well as manned spacecraft. In 1957 the Soviet Union launched the first satellite, Sputnik I, and the United States followed with Explorer I in 1958. The first manned space flights were made in early 1961, first by Soviet cosmonaut Yuri Gagarin and then by American astronaut Alan Shepard.

From those first tentative steps, to the 1969 Apollo program landing on the Moon, to today's reusable Space Shuttle, the American space program has brought forth a breathtaking display of applied science. Communications satellites transmit computer data, telephone calls, and radio and television broadcasts. Weather satellites furnish the data necessary to provide early warnings of severe storms.

Medicine and health care

As in physics and chemistry, Americans have dominated the Nobel Prize for physiology or medicine since World War II. The private sector has been the focal point for biomedical research in the United States, and has played a key role in this achievement. As of 2000, for-profit industry funded 57%, non-profit private organizations such as the Howard Hughes Medical Institute funded 7%, and the tax-funded National Institutes of Health funded 36% of medical research in the U.S. However, by 2003, the NIH funded only 28% of medical research funding; funding by private industry increased 102% from 1994 to 2003.

The National Institutes of Health consists of 24 separate institutes and occupies 75 buildings on more than 1.2 km² in Bethesda, Maryland. The goal of NIH research is knowledge that helps prevent, detect, diagnose, and treat disease and disability -- everything from the rarest genetic disorder to the common cold. At any given time, grants from the NIH support the research of about 35,000 principal investigators, working in every US state and several foreign countries. Among these grantees have been 91 Nobel Prize winners; five of them made their prize-winning discoveries in NIH laboratories.

NIH research has helped make possible numerous medical achievements. For example, mortality from heart disease, the number-one killer in the United States, dropped 41 percent between 1971 and 1991. The death rate for strokes decreased by 59 percent during the same period. Between 1991 and 1995, the cancer death rate fell by nearly 3 percent, the first sustained decline since national record-keeping began in the 1930s. And today more than 70 percent of children who get cancer are cured.

With the help of the NIH, molecular genetics and genomics research have revolutionized biomedical science. In the 1980s and 1990s, researchers performed the first trial of gene therapy in humans and are now able to locate, identify, and describe the function of many genes in the human genome. Scientists predict that this new knowledge will lead to genetic tests for susceptibility to diseases such as colon, breast, and other cancers and to the eventual development of preventive drug treatments for persons in families known to be at risk.

Research conducted by universities, hospitals, and corporations also contributes to improvement in diagnosis and treatment of disease. NIH funded the basic research on Acquired Immune Deficiency Syndrome (AIDS), for example, but many of the drugs used to treat the disease have emerged from the laboratories of the American pharmaceutical industry; those drugs are being tested in research centers across the country.

Emphasis on prevention

While the American medical community has been making strides in the diagnosis and treatment of disease, the American public also has become more aware of the relationship between disease and personal behavior. Since the US surgeon general first warned Americans about the dangers of smoking in 1964, the percentage of Americans who smoke has declined from almost 50 percent to approximately 25 percent. Smoking is no longer permitted in most public buildings or on trains, buses, and airplanes traveling within the United States, and most American restaurants are divided into areas where smoking is permitted and those where it is not. Studies have linked a significant drop in the rate of lung cancer to a nationwide decline in cigarette smoking.

The federal government also encourages Americans to exercise regularly and to eat a healthy diet that includes large quantities of fruits and vegetables. More than 40 percent of Americans today exercise or play a sport as part of their regular routine. The per capita consumption of fruits and vegetables has increased by about 20 percent since 1970.

Donna Shalala, Secretary of Health and Human Services in the Clinton administration, frequently speaks out in support of scientific research and preventive medicine. Addressing a conference of medical and public health professionals in 1996 she said:

We must continue to unlock the incremental mysteries in basic science that culminate in blockbuster discoveries over time. But, we must cast our net wider than that. It must encompass behavioral research, occupational research, health services and outcomes research, and environmental research -- all of which hold the potential to prevent disease – and help Americans live healthier lives.
