Californication refers to the influence of Western culture on another state (or country). It isn't limited to California; the term really covers all WESTERN states. Besides, any Californian with sense knows this state isn't just a beach or one city (Hollywood).
Does anyone posting here even live in California? Jeez, live here, then insult us.