The states that border the Pacific Ocean. However, when most people and celebrities say "West Coast", they usually mean California.
The West Coast is the place you want to be.
: "The West Coast is the place to be."