The belief that some races are inherently superior (physically, intellectually, or culturally) to others and therefore have a right to dominate them. In the United States, racism, particularly by whites against blacks, has created profound racial tension and conflict in virtually all aspects of American society. Until the breakthroughs achieved by the civil rights movement in the 1950s and 1960s, white domination over blacks was institutionalized and supported in all branches and levels of government, by denying blacks their civil rights and opportunities to participate in political, economic, and social communities.
"The horror of class stratification, racism, and prejudice is that some people begin to believe that the security of their families and communities depends on the oppression of others, that for some to have good lives there must be others whose lives are truncated and brutal." - Dorothy Allison
by hurdrenee January 22, 2011
