left coast
A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
by Economike August 9, 2004