1 definition by TheUrbainDict

The word given to black people as a way of making them feel less American than everyone else, making them feel as if they all belong to or grew up in Africa, when most have families that trace back to America in the 18th-19th centuries (the 1700s-1800s, for those who don't know).

- A word given to black Americans only by other Americans; it is not used in any other country to refer to black Americans.
Son: Why are brown people called African-Americans?

Dad: Because they're all African!
Son: Then why aren't we called European-American?
Dad: Because white people are true Americans.
by TheUrbainDict July 18, 2015