Racism in the United States dates to the late 1400s, when Europeans began colonizing the Americas. In their quest to "civilize" other peoples, European settlers took slaves and treated those who looked different from them as inferior.
The first racist acts were committed against Native Americans. After landing in America, European settlers conquered them through wars and massacres. Those who survived endured forced treaties, food restrictions and land seizures. The settlers also tried to impose Christianity on the Native Americans, often punishing those who did not conform.
African Americans have been targets of racism since the first English colonists settled Virginia. Taken as slaves, they were often beaten, starved and treated like animals.