
Top Definition
The states that border the Pacific Ocean. However, when most people or celebrities say "West Coast", they usually mean California.

The West Coast is the place you want to be.

Classy: "The West Coast is the place to be."

Nuff said.
by OrbitalCat August 07, 2009
