Derogatory term for Christians: all who follow Jesus Christ and hold the Bible as their holy book, including Catholics, Protestants, Orthodox, Mormons, Evangelicals, etc.
Christians are cannibals because Jesus Christ told them to eat of his flesh and drink of his blood in the Bible.
by Suspectthirteen13 April 30, 2022
Dude: Have you seen the homeless guy down the street?
Other Dude: Yeah, I heard rumors that he's a cannibal
by MeatGrease May 05, 2021
A more scientifically accurate word would be Homo sapiens. Scientists spent many years studying Ludwig Feuerbach's phrase "you are what you eat" and concluded that the species Homo sapiens refers to cannibals.
by Shfsjdbrf March 16, 2022