
Type of Christianity that places authority in the Bible as the infallible word of God. This does not necessarily mean a literal interpretation of everything in the Bible (cf. creationism). Not to be confused with evangelism, which means something entirely different.
No-one knows what evangelical means.
by jedi_llama April 13, 2006
