Christianity is a religion that, according to its holy book, the Bible, began over 2,000 years ago.
It has long been the most widely followed religion in the world, concentrated mainly in the West and parts of Africa.
Recently, however, its following has declined, mostly in the West, partly because of anti-Christian material in the media: for example, controversies surrounding homosexuality and, in the US, the Westboro Baptist Church.
The truth is, even as a non-Christian, I know that, in a nutshell, the Bible teaches little more than to love one another and to live a fulfilling life, contrary to what many people claim about it being nothing but made-up gibberish. True, most of the stories in the Bible are invented, but they serve mainly to illustrate widely shared Christian beliefs.
Sadly, many modern Christians face prejudice because of how the media portrays them. Most Christians simply want to live a fulfilling life and have no wish to impose their beliefs on others.
I don't follow Christianity, but at least it isn't as bad as Islam.