What does "colonialism" mean?


Quick Answer

According to Oxford Dictionaries, "colonialism" means "the policy of acquiring full or partial political control over another country, occupying it with settlers, and exploiting it economically." Many countries and empires, from the Greeks to the Americans, have engaged in colonialism.


Full Answer

The Stanford Encyclopedia of Philosophy notes that colonialism is closely related to imperialism but draws a distinction between the two: imperialism usually involves only domination, while colonialism requires the dominating power to send settlers who change the character of the dominated nation. For example, the United States engaged in colonialism when it divested American Indians of their land and settled white families in their place, but it engaged in imperialism when it took the Spanish colonies of Puerto Rico and the Philippines. Colonialism is not a recent innovation; ancient Greece colonized much of the Mediterranean world, planting colonies in Sicily, Asia Minor and along the North African coast.

