Definitions by Youknowwhatsup101
Walking the Dog
Walking the Dog is a slang term for when a female is with a male and "walks his dog," i.e., gives him a hand job, or really anything involving a girl handling a dick.
Walking the Dog by Youknowwhatsup101 September 25, 2011