A slightly derogatory term for America's West Coast, used by Republicans to refer to the primarily Democratic states of California, Oregon, and Washington.
Arnie must feel very alone on the left coast.