Economike's definitions

left coast

A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
Arnie must feel very alone on the left coast.
by Economike August 9, 2004