
The west has fallen

An older catchphrase of the alt-right and adjacent groups. It expresses the idea that the social decay and fall from grace these groups perceive in Western countries is akin to the fall of Rome; that the Americas and Europe are beginning to collapse or lose their former glory.
Paul: The Western countries are declining: crime is rising, whites are getting replaced, values are lost, and cultures are getting eradicated.

Thomas: The west has fallen!
by Amba_Singh June 11, 2023