left coast
A slightly derogatory term for America's West Coast, used by Republicans to refer to the primarily Democratic states of California, Oregon, and Washington.
by Economike August 9, 2004