- Jan 24, 2015
> Hitler declared war on America

Brain-amputated people think so. Hitler did not simply declare war on America; he declared war on the USA only after Pearl Harbor. And the USA did follow this idea of the nationalistic extremist idiot Hitler: it sold a big part of Europe to Stalin, and more than 50% of the world to the Soviets, before the USA awoke in the war in Korea.
By the way: why do all political discussions in the USA end with the universal excuse for every bullshit, "We won against Hitler, so we are always right"? Do you sit together with Hitler in his bunker?