1 definition by HarvardBisReviewC@s3Studies

The social sciences are the fields of academic scholarship that explore aspects of human society. Some social scientists draw upon empirical methods and attempt to emulate the standards of conventional scientific practice. By contrast, others employ critical analysis or hermeneutic methods, studying objects of enquiry they regard as ill-suited to the scientific approach.

Social science is commonly used as an umbrella term to refer to a plurality of fields outside of the physical sciences and the arts. These include: anthropology, archaeology, communication studies, cultural studies, demography, economics, history, human geography, international development, international relations, linguistics, media studies, philology, political science, social psychology, and social work.

Max Weber's most famous work regarding social science is his essay in economic sociology, The Protestant Ethic and the Spirit of Capitalism, which also began his work in the sociology of religion. In this text, Weber argued that religion was one of the non-exclusive reasons for the different ways the cultures of the Occident and the Orient have developed, and stressed that particular characteristics of ascetic Protestantism influenced the development of capitalism, bureaucracy, and the rational-legal state in the West. The essay examines the effects Protestantism had upon the beginnings of capitalism, arguing that capitalism is not purely materialist in Karl Marx's sense, but rather originates in religious ideals and ideas that cannot be explained by ownership relations, technology, and advances in learning alone. (From Wikipedia.)
by HarvardBisReviewC@s3Studies January 03, 2010
