
The states that border the Pacific Ocean. However, when most people or celebrities say "West Coast", they usually mean California.

The West Coast is the place you want to be.

Classy.
: "The West Coast is the place to be."

Nuff said.
by OrbitalCat August 7, 2009