
Nazi Germany is the state that Germany was in under Adolf Hitler's rule from 1933 to 1945. Nazi Germany first took over Czechoslovakia, then invaded Poland, starting WWII. Nazi Germany went on to take over Norway, Denmark, the Netherlands, and France. Finally, after a long war, Nazi Germany was defeated due to several critical mistakes; if those mistakes had not been made, we would all be speaking German right now.
History Teacher: And that was how Nazi Germany came close to taking over the world.
by UrbanDictionaryIsCrazy June 25, 2019