
Walking the Dog is a term for when a female is with a male and "walks his dog," meaning she gives him a hand job or otherwise handles his dick.
Becca: Hey, where's Tori?

Ali: Tori and Pablo are 'walking the dog.' You know what that means.....
by Youknowwhatsup101 September 23, 2011