
Top Definition
A view held in the era of Imperialism that the more developed and "civilized" peoples of Europe should, through their leadership and domination, bring civilization, modernity, wealth, and advancement to the lesser peoples of the world.

This mostly extended to non-white peoples, but white ethnic groups subjected to this ideology included various Slavic groups in the east, Arabs, Turks, Finns, and Boers.

The ideology created a sense of moral justification for Imperialism over many developed and long-existing cultures, including wealthy, long-standing civilizations like China, Persia, Ottoman Turkey, and India.
Europeans took over Africa in the name of the White Man's Burden, among other things.
#imperialism #racism #europe #victorian era #history
by historia February 18, 2012