Any type of conduct or policy that improves a company's profits, whether through reputation or simply through finances. Can in some cases be beneficial if people keep a watchful eye on said company, but more often than not, people don't. Thus big companies, with loads of power, influence, and money lying around, use them in not-so-nice ways to maximize their revenue. Simply put: if your first order of 'business' is to make money, and you have the means to influence the government, the media, and even your own 'studies', would you do so? Of course you would. Well, at least most would. The major corporations of the US and the world hold a monopoly on credentials and misinformation, allowing them to 'purify' their ranks into an increasingly self-serving elite facing no danger from outsiders. Those with opposing agendas are systematically crushed through manipulation of financial necessities, or likely in far darker ways when the need becomes dire.
It's not a conspiracy theory, it's just good business!
by Shadow Creator December 06, 2007
Used to describe something excellent and/or desirable. Best spoken at the workplace.
See that new secretary in those super tight pants? Yes, that's good business.
by sean_sean_sean December 05, 2006