A slightly derogatory term for America's West Coast, used by Republicans to refer to the predominantly Democratic states of California, Oregon, and Washington.
by Economike August 09, 2004
Used to describe the left-hand side of the map by those who live on the opposite coast. Otherwise known as the West Coast.
by Kamikazileo March 04, 2009