Economike's definitions
left coast
A slightly derogatory word for America's west coast, used by Republicans to refer to the primarily Democratic California, Oregon, and Washington.

Arnie must feel very alone on the left coast.
by Economike, August 9, 2004