[Figure: Organs of the human reproductive system. In a male, the scrotum, a pouch of skin, is divided into …]
Permanent change of residence by an individual or group, excluding such movements as nomadism and migrant labour. Migrations may be classed as internal or international and as voluntary or forced. Voluntary migration is usually undertaken in search of a better life; forced migrations include expulsions during war and the transportation of slaves or prisoners. The earliest humans migrated from Africa to all the continents except Antarctica within about 50,000 years. Other mass migrations include the forced migration of 20 million people as slaves from Africa to the Americas in the 16th–19th centuries and the Great Atlantic Migration of 37 million people from Europe to North America between 1820 and 1980. War-related forced migrations and refugee flows continue to be very large, as are voluntary migrations from developing nations to industrialized ones. Internal migrations have tended to be from rural areas to urban centers.
Profession of designing machines, tools, and work environments to best accommodate human performance and behaviour. It aims to improve the practicality, efficiency, and safety of a person working with a single machine or device (e.g., using a telephone, driving a car, or operating a computer terminal). Taking the user into consideration has probably always been a part of tool design; for example, the scythe, one of the oldest and most efficient human implements, shows a remarkable degree of ergonomic engineering. Examples of common devices that are poorly designed ergonomically include the snow shovel and the computer or typewriter keyboard.
Malignant tumour of the skin, including some of the most common human cancers. Though recognizable at an early stage, it has a significant death rate. Light-skinned people have the highest risk but can reduce it by limiting exposure to sunlight and to ionizing radiation. The most common types arise in the epidermis (outer skin layer) and have become more frequent with the thinning of the atmosphere's ozone layer. The most serious form is melanoma, which is frequently fatal if not treated early with surgery. Cancers arising from the dermis are rare; the best-known is Kaposi sarcoma.
[Figure: A section through the skin. The tough, dead cells of the outer epidermal surface (corneal layer) …]
Offering of the life of a human being to a god. In some ancient cultures, the killing of a human being, or the substitution of an animal for a person, was an attempt to commune with the god and to participate in the divine life. It also sometimes served as an attempt to placate the god and expiate the sins of the people. It was especially common among agricultural people (e.g., in the ancient Near East), who sought to guarantee the fertility of the soil. The Aztecs sacrificed thousands of victims (often slaves or prisoners of war) annually to the sun, and the Incas made human sacrifices on the accession of a ruler. In ancient Egypt and elsewhere in Africa, human sacrifice was connected with ancestor worship, and slaves and servants were killed or buried alive along with dead kings in order to provide service in the afterlife. A similar tradition existed in China. The Celts and Germanic peoples are among the European peoples who practiced human sacrifice.
Rights that belong to an individual as a consequence of being human. The term came into wide use after World War II, replacing the earlier phrase “natural rights,” which had been associated with the Greco-Roman concept of natural law since the end of the Middle Ages. As understood today, human rights refer to a wide variety of values and capabilities reflecting the diversity of human circumstances and history. They are conceived of as universal, applying to all human beings everywhere, and as fundamental, referring to essential or basic human needs. Human rights have been classified historically in terms of the notion of three “generations” of human rights. The first generation of civil and political rights, associated with the Enlightenment and the English, American, and French revolutions, includes the rights to life and liberty and the rights to freedom of speech and worship. The second generation of economic, social, and cultural rights, associated with revolts against the predations of unregulated capitalism from the mid-19th century, includes the right to work and the right to an education. Finally, the third generation of solidarity rights, associated with the political and economic aspirations of developing and newly decolonized countries after World War II, includes the collective rights to political self-determination and economic development. Since the adoption of the Universal Declaration of Human Rights in 1948, many treaties and agreements for the protection of human rights have been concluded through the auspices of the United Nations, and several regional systems of human rights law have been established. In the late 20th century ad hoc international criminal tribunals were convened to prosecute serious human rights violations and other crimes in the former Yugoslavia and Rwanda. The International Criminal Court, which came into existence in 2002, is empowered to prosecute crimes against humanity, crimes of genocide, and war crimes.
[Figure: As air enters the nasal cavity through the nostrils, it is warmed and moistened by mucous membranes …]
Any of a group of viruses that cause warts and other harmless tumours in humans. More than 100 distinct types are known. Different types are responsible for warts of the hands, plantar warts (of the feet), and throat warts. Genital warts are caused by other types, which are spread by sexual intercourse. Some types of papillomaviruses that cause genital infections have been linked with various cancerous tumours, especially cervical cancers; their presence can be detected through a Pap smear.
In government and military operations, evaluated information concerning the strength, activities, and probable courses of action of international actors that are usually, though not always, enemies or opponents. The term also refers to the collection, analysis, and distribution of such information and to the secret intervention in the political or economic affairs of other countries, an activity commonly known as “covert action.” Intelligence is an important component of national power and a fundamental element in decision making regarding national security, defense, and foreign policies. It is conducted on three levels: strategic, tactical, and counterintelligence. Despite the public image of intelligence operatives as cloak-and-dagger secret agents, much intelligence work involves an undramatic search of “open” sources, such as radio broadcasts and various publications. Among covert sources of intelligence are imagery intelligence, which includes aerial and space reconnaissance; signals intelligence, which includes electronic eavesdropping and code breaking; and human intelligence, which involves the secret agent working at the classic spy trade. Leading national intelligence organizations are the Central Intelligence Agency (CIA) in the U.S.; the Federal Security Service in Russia; MI5 and MI6 in Britain; and the Mossad in Israel.
Ability of a machine to perform tasks thought to require human intelligence. Typical applications include game playing, language translation, expert systems, and robotics. Although pseudo-intelligent machinery dates back to antiquity, the first glimmerings of true intelligence awaited the development of digital computers in the 1940s. AI, or at least the semblance of intelligence, has developed in parallel with computer processing power, which appears to be the main limiting factor. Early AI projects, such as playing chess and solving mathematical problems, are now seen as trivial compared to visual pattern recognition, complex decision making, and the use of natural language. See also Turing test.
Principal intelligence and counterintelligence agency of the U.S., established in 1947 as a successor to the World War II-era Office of Strategic Services. The law limits its activities to foreign countries; it is prohibited from gathering intelligence on U.S. soil, which is a responsibility of the Federal Bureau of Investigation. An independent agency of the U.S. government, it is responsible for preparing analyses for the National Security Council. Its budget is kept secret. Though intelligence gathering is its chief occupation, the CIA has also been involved in many covert operations, including the overthrow of Mohammad Mosaddeq in Iran (1953), the attempted Bay of Pigs invasion of Cuba (1961), and support of the Nicaraguan contras in the 1980s.
All the genetic content contained within an organism. An organism's genome is made up of molecules of deoxyribonucleic acid (DNA) that form long strands that are tightly wound into chromosomes, which are found in the nucleus of eukaryotic organisms and in the cytoplasm of prokaryotic organisms. Chromosomes that are unique to certain organelles within a cell, such as mitochondria or chloroplasts, are also considered a part of an organism's genome. A genome includes all the coding regions (regions that are translated into molecules of protein) of DNA that form discrete genes, as well as all the noncoding stretches of DNA that are often found on the areas of chromosomes between genes. The sequence, structure, and chemical modifications of DNA not only provide the instructions needed to express the information held within the genome but also provide the genome with the capability to replicate, repair, package, and otherwise maintain itself. The human genome contains approximately 25,000 genes within its 3,000,000,000 base pairs of DNA, which form the 46 chromosomes found in a human cell. In contrast, Nanoarchaeum equitans, a parasitic prokaryote in the domain Archaea, has one of the smallest known genomes, consisting of 552 genes and 490,885 base pairs of DNA. The study of the structure, function, and inheritance of genomes is called genomics. Genomics is useful for identifying genes, determining gene function, and understanding the evolution of organisms.
U.S. research effort initiated in 1990 by the U.S. Department of Energy and the National Institutes of Health to analyze the DNA of human beings. The project, intended to be completed in 15 years, proposed to identify the chromosomal location of every human gene, to determine each gene's precise chemical structure in order to show its function in health and disease, and to determine the precise sequence of nucleotides of the entire set of genes (the genome). A further goal was to address the ethical, legal, and social implications of the information obtained. The information gathered will be the basic reference for research in human biology and will provide fundamental insights into the genetic basis of human disease. The new technologies developed in the course of the project will be applicable in numerous biomedical fields. In 2000 the government and the private corporation Celera Genomics jointly announced that the project had been virtually completed, five years ahead of schedule.
Evolution of modern human beings from extinct nonhuman and humanlike forms. Genetic evidence points to an evolutionary divergence between the lineages of humans and the great apes on the African continent 8–5 million years ago (mya). The earliest fossils considered to be remains of hominins (members of the human lineage) date to at least 4 mya in Africa; they are classified as genus
[Figure: Major glands of the human endocrine system. The hypothalamus stimulates the pituitary gland and …]
Branch of psychology concerned with changes in cognitive, motivational, psychophysiological, and social functioning that occur throughout the human life span. In the late 19th and early 20th centuries, developmental psychologists were concerned primarily with child psychology. In the 1950s they became interested in the relationship between child rearing and adult personality, as well as in examining adolescence in its own right. By the late 20th century they had become interested in all aspects of psychological development and change over the entire life span.
In zoology, the eating of any animal by another member of the same species. Certain ants regularly consume injured immatures and, when food is scarce, eat healthy immatures; this practice allows the adults to survive the food shortage and live to breed again. Male lions taking over a pride may kill and eat the existing young. After losing her cubs the mother will become impregnated by the new dominant male, thereby ensuring his genetic contribution. Aquarium guppies sometimes regulate their population size by eating most of their young.
Human–computer interaction (HCI) is the study of interaction between people (users) and computers. It is often regarded as the intersection of computer science, the behavioral sciences, design, and several other fields of study. Interaction between users and computers occurs at the user interface (or simply interface), which includes both software and hardware, ranging from general-purpose computer peripherals to large-scale mechanical systems such as aircraft and power plants.
A long-term goal of HCI is to design systems that minimize the barrier between the user's cognitive model of what they want to accomplish and the computer's understanding of that task.
Professional practitioners in HCI are usually designers concerned with the practical application of design methodologies to real-world problems. Their work often revolves around designing graphical user interfaces and web interfaces.
Researchers in HCI are interested in developing new design methodologies, experimenting with new hardware devices, prototyping new software systems, exploring new paradigms for interaction, and developing models and theories of interaction.
HCI differs from human factors in that it focuses more on users working with computers than with other kinds of machines or designed artifacts, and it adds a focus on how to implement the software and hardware mechanisms that support human-computer interaction. HCI also differs from ergonomics in that it places less emphasis on repetitive work-oriented tasks and procedures, and much less emphasis on physical stress and the physical form or industrial design of the user interface, such as the shape of keyboards and mice.
Two areas of study have substantial overlap with HCI, though each shifts the focus of inquiry. In computer-supported cooperative work (CSCW), emphasis is placed on the use of computing systems to support the collaborative work of a group of people. In the study of personal information management (PIM), human interactions with the computer are placed in a larger informational context: people work with many forms of information, some computer-based and many not (e.g., whiteboards, notebooks, sticky notes, refrigerator magnets), in order to understand and effect desired changes in their world.
The iterative design process is repeated until a sensible, user-friendly interface is created.
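The iterative process can be sketched as a loop that alternates evaluation and refinement until a usability target is met. In this sketch the scoring function, threshold, and refinement step are hypothetical stand-ins for real user testing and redesign, not part of any standard methodology or API:

```python
def iterative_design(evaluate, refine, prototype, target=0.9, max_rounds=10):
    """Alternate evaluation and refinement until the design scores well enough.

    evaluate(design) -> usability score in [0, 1]; refine(design, score) -> design.
    Both are caller-supplied stand-ins for real usability testing and redesign.
    """
    design = prototype
    for _ in range(max_rounds):
        score = evaluate(design)
        if score >= target:
            break  # the design is judged sensible and user-friendly
        design = refine(design, score)
    return design
```

The `max_rounds` cap reflects a practical constraint: real projects stop iterating when time or budget runs out, even if the target is not fully met.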
THIRTEEN PRINCIPLES OF DISPLAY DESIGN
These principles of human perception and information processing can be utilized to create an effective display design. A reduction in errors, a reduction in required training time, an increase in efficiency, and an increase in user satisfaction are a few of the many potential benefits that can be achieved through utilization of these principles.
Certain principles may not be applicable to every display or situation. Some principles may conflict, and there is no simple rule stating that one principle is more important than another. The principles may be tailored to a specific design or situation; striking a functional balance among them is critical for an effective design.
1. Make displays legible (or audible)
A display’s legibility is critical and necessary for designing a usable display. If the characters or objects being displayed are not discernible, the operator cannot make effective use of them.
2. Avoid absolute judgment limits
Do not ask the user to judge the level of a variable on the basis of a single sensory dimension (e.g., color, size, loudness): people can reliably discriminate only a limited number of levels along any one such dimension.
3. Top-down processing
Signals are perceived and interpreted in accordance with what the user expects based on past experience. If a signal is presented contrary to the user’s expectation, more physical evidence of that signal may need to be presented to ensure that it is understood correctly.
4. Redundancy gain
If a signal is presented more than once, it is more likely that it will be understood correctly. This can be done by presenting the signal in alternative physical forms (e.g. color and shape, voice and print, etc.), as redundancy does not imply repetition. A traffic light is a good example of redundancy, as color and position are redundant.
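A minimal Python sketch of redundancy gain, encoding one alert level in two independent physical forms so that either channel alone identifies it. The color names, symbols, and level names are illustrative assumptions, not a standard vocabulary:

```python
# Hypothetical alert vocabulary: each level is encoded redundantly in a color
# word AND a symbol, so losing either channel still leaves the level readable.
ALERT_CODES = {
    "ok":      {"color": "green",  "symbol": "OK"},
    "caution": {"color": "yellow", "symbol": "/!\\"},
    "danger":  {"color": "red",    "symbol": "XX"},
}

def render_alert(level: str) -> str:
    """Return a text rendering that repeats the level in color, symbol, and label."""
    code = ALERT_CODES[level]
    return f"[{code['color'].upper()}] {code['symbol']} {level.upper()}"
```

Note that the forms are alternative, not repeated: as with a traffic light, each physical dimension carries the same message independently.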
5. Similarity causes confusion: Use discriminable elements
Signals that appear similar are likely to be confused: the ratio of shared features to distinct features determines how similar two signals appear. For example, A423B9 is more similar to A423B8 than 92 is to 93. Unnecessarily similar features should be removed and distinguishing features highlighted.
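The feature-ratio idea can be made concrete with a short sketch. The positionwise character comparison below is an illustrative simplification, not a standard similarity metric:

```python
def similarity_ratio(a: str, b: str) -> float:
    """Ratio of shared to differing character positions between two
    equal-length signals (an illustrative simplification of the principle)."""
    if len(a) != len(b):
        raise ValueError("signals must be equal length for this comparison")
    same = sum(x == y for x, y in zip(a, b))
    diff = len(a) - same
    return float("inf") if diff == 0 else same / diff
```

On the example from the text, "A423B9" vs. "A423B8" shares five positions against one difference (ratio 5), while "92" vs. "93" shares one against one (ratio 1), so the longer pair is far more confusable despite both differing by a single character.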
Mental Model Principles
6. Principle of pictorial realism
A display should look like the variable it represents (e.g., high temperature on a thermometer shown as a higher vertical level). If there are multiple elements, they can be configured to look as they would in the represented environment.
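Pictorial realism can be illustrated with a toy text rendering in which a higher temperature produces a taller column, as on a real thermometer. The scale, range, and glyphs here are arbitrary choices:

```python
def thermometer(temp_c: float, max_c: float = 40.0, height: int = 8) -> list[str]:
    """Render a temperature as a vertical bar: a higher value fills more of
    the column from the bottom up, mirroring a physical thermometer."""
    filled = round(height * max(0.0, min(temp_c, max_c)) / max_c)
    # Index 0 is the top of the display, so the fill grows upward.
    return ["#" if height - i <= filled else "." for i in range(height)]
```

Printed top to bottom, 20 °C fills the lower half of the column and 40 °C fills it entirely, so the display's shape tracks the variable it represents.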
7. Principle of the moving part
Moving elements should move in a pattern and direction compatible with the user’s mental model of how it actually moves in the system. For example, the moving element on an altimeter should move upward with increasing altitude.
Principles Based on Attention
8. Minimizing information access cost
When the user’s attention is diverted from one location to another to access necessary information, there is an associated cost in time or effort. A display design should minimize this cost by placing frequently accessed sources at the nearest possible positions. However, adequate legibility should not be sacrificed to reduce this cost.
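One way to reduce access cost is to assign the most frequently consulted sources to the nearest display positions. A minimal sketch, assuming the designer has already measured (hypothetical) access frequencies:

```python
def layout_by_frequency(access_counts):
    """Order information sources so the most frequently accessed come first,
    i.e., occupy the nearest display positions and cost the least to reach.
    access_counts maps source name -> measured access frequency."""
    return sorted(access_counts, key=access_counts.get, reverse=True)
```

For a cockpit-style display where speed is consulted far more often than fuel, this puts speed in the prime position and relegates fuel to the periphery.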
9. Proximity compatibility principle
Divided attention between two information sources may be necessary to complete a single task. Such sources must be mentally integrated and are said to have close mental proximity. Their information access costs should therefore be low, which can be achieved in many ways (e.g., physical proximity, or linkage by common colors, patterns, or shapes). However, close display proximity can be harmful if it causes too much clutter.
10. Principle of multiple resources
A user can more easily process information across different resources. For example, visual and auditory information can be presented simultaneously rather than presenting all visual or all auditory information.
11. Replace memory with visual information: knowledge in the world
A user should not need to retain important information solely in working memory or retrieve it from long-term memory. A menu, checklist, or other display can aid the user by reducing reliance on memory. However, relying on memory sometimes benefits the user more than reference to knowledge in the world (e.g., an expert computer operator would rather issue commands from memory than consult a manual). Knowledge in the user’s head and knowledge in the world must be balanced for an effective design.
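A checklist display, sketched below, externalizes state so the user reads progress off the world rather than recalling it from memory. The step names are hypothetical:

```python
def checklist(steps, done):
    """Render every step with its completion state so the user can read
    progress off the display instead of holding it in working memory."""
    return [f"[{'x' if step in done else ' '}] {step}" for step in steps]
```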
12. Principle of predictive aiding
Proactive actions are usually more effective than reactive actions. A display should attempt to eliminate resource-demanding cognitive tasks and replace them with simpler perceptual tasks to reduce the use of the user’s mental resources. This will allow the user to not only focus on current conditions, but also think about possible future conditions. An example of a predictive aid is a road sign displaying the distance from a certain destination.
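A predictive aid converts raw state into the future-oriented quantity the user actually needs. The sketch below turns distance and current speed into an estimated travel time, as a road sign or navigation display might; the output format is an arbitrary choice:

```python
def predictive_sign(distance_km: float, speed_kmh: float) -> str:
    """Turn raw state (distance, current speed) into the future-oriented
    quantity the driver actually needs: an estimated time to destination."""
    minutes = round(60 * distance_km / speed_kmh)
    return f"{distance_km:g} km, about {minutes} min"
```

The driver no longer has to perform the division mentally; a resource-demanding cognitive task has been replaced by a simple perceptual one.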
13. Principle of consistency
Old habits from other displays transfer readily to the processing of new displays if the designs are consistent. A user’s long-term memory will trigger actions that are expected to be appropriate. A design must accommodate this fact and maintain consistency among different displays.
The future for HCI is expected to include the following characteristics:
Ubiquitous communication: Computers will communicate through high-speed local networks, nationally over wide-area networks, and portably via infrared, ultrasonic, cellular, and other technologies. Data and computational services will be portably accessible from many if not most locations to which a user travels.
High-functionality systems: Systems will have large numbers of functions associated with them. There will be so many systems that most users, technical or non-technical, will not have time to learn them in the traditional way (e.g., through thick manuals).
Mass availability of computer graphics: Computer graphics capabilities such as image processing, graphics transformations, rendering, and interactive animation will become widespread as inexpensive chips become available for inclusion in general workstations.
Mixed media: Systems will handle images, voice, sounds, video, text, and formatted data, exchangeable over communication links among users. The separate worlds of consumer electronics (e.g., stereo sets, VCRs, televisions) and computers will partially merge. The computer and print worlds will continue to assimilate each other.
High-bandwidth interaction: The rate at which humans and machines interact will increase substantially owing to changes in speed, computer graphics, new media, and new input/output devices. This will lead to some qualitatively different interfaces, such as virtual reality or computational video.
Large and thin displays: New display technologies will finally mature, enabling very large displays as well as displays that are thin, lightweight, and low in power consumption. This will have large effects on portability and will enable the development of paper-like, pen-based computer interaction systems very different in feel from the desktop workstations of the present.
Embedded computation: Computation will pass beyond desktop computers into every object for which uses can be found. The environment will be alive with little computations, from computerized cooking appliances to lighting and plumbing fixtures to window blinds to automobile braking systems to greeting cards. To some extent, this development is already taking place. The difference in the future is the addition of networked communications that will allow many of these embedded computations to coordinate with each other and with the user. Human interfaces to these embedded devices will in many cases be very different from those appropriate to workstations.
Augmented reality: A common staple of science fiction, augmented reality refers to layering relevant information onto our view of the world. Existing projects show real-time statistics to users performing difficult tasks, such as manufacturing. Future work might include augmenting our social interactions by providing additional information about those with whom we converse.
Group interfaces: Interfaces that allow groups of people to coordinate will be common (e.g., for meetings, engineering projects, and the joint authoring of documents). These will have major impacts on the nature of organizations and on the division of labor. Models of the group design process will be embedded in systems and will cause increased rationalization of design.
User tailorability: Ordinary users will routinely tailor applications to their own use and will use this power to invent new applications based on their understanding of their own domains. Users, with their deeper domain knowledge, will increasingly become important sources of new applications, at the expense of generic systems programmers (who have systems expertise but low domain expertise).
Information utilities: Public information utilities (such as home banking and shopping) and specialized industry services (e.g., weather for pilots) will continue to proliferate. The rate of proliferation will accelerate with the introduction of high-bandwidth interaction and improvements in interface quality.
Dozens of smaller regional or specialized HCI-related conferences are also held around the world each year.