Walking the Dog is a term for when a female is with a male and "walks his dog," meaning she gives him a hand job or otherwise handles his dick.
by Youknowwhatsup101 September 25, 2011