Q:

What do liberals believe?

A:

Quick Answer

Classical liberalism focuses on individual freedom, while contemporary liberalism places more emphasis on social infrastructure and an expanded definition of human rights. Liberalism is considered the standard left-wing position in the United States, though in some other countries it is regarded as a more right-wing stance.


Full Answer

Before the 20th century, liberalism resembled modern libertarianism. There was a strong emphasis on industry, which created many of the jobs of the era, and on human rights. However, liberals were not as opposed to regulation as libertarians are, nor did they oppose a relatively large government.

Over the years, liberalism took on some aspects of socialism. In addition, liberals spoke out on a wider range of issues that they considered matters of human rights. Liberal activism helped women win the right to vote, and liberals campaigned for stronger labor unions to improve workplace safety. This newer liberalism placed less emphasis on industry.

In the middle of the 20th century, liberalism changed again as many embraced what some call "new liberalism." The civil rights movement in the United States and the radical politics of the 1960s encouraged a new brand of thinking: environmental policy became a top issue, and many liberals made opposition to war part of their political philosophy.
