I know you've seen the ads: your travel agent or that university recruiter has filled your head with images of beaches, palm trees, forests, and a nightlife that's second to none. You need to ask yourself one question: why would they put so much effort into creating that image? Can you name any other state that gets marketed this heavily as a place you should go?
The culture here is defined by belligerence, ignorance, narcissism, and apathy. If you ever leave the big cities, be prepared to be surrounded by inbred, racist/sexist, psycho-Christian hillbillies. You heard me right: there are massive numbers of trailer-trash hicks everywhere you go; they've just gotten good at hiding it.
And while you're stuck on the 10, the 101, the 215, the 91, or the 60 with a jacked-up bro-truck tailgating you all the way to work, surrounded by industry, warehouses, smog, boarded-up buildings, graffiti, and strip malls, you'll try to replace what you're actually seeing with what you *thought* you'd be seeing.
California is a LIE. It's a dreamworld that doesn't exist. Oh, and finding a job here is next to impossible: right now we have the highest unemployment rate in the country, and the same goes for teen pregnancy.
Yeah, I'm some asshole on the internet, but I'd give anything to save anyone out there reading this the time, toil, trouble, and inevitable misery that comes from believing in the "California Dream."
Palm trees and beaches? My apartment got broken into three times, a pot-bellied, white, middle-aged douchebag broke a window on my car because I asked him to quiet down in the theater, and now I have lung cancer from the smog.