Did the US ever declare war on Germany after Pearl Harbor?

ginscpy

Senior Member
Sep 10, 2010
Think Hitler beat them to the punch - declared war on the US. (bad move)

Not sure if the US ever declared war on Germany
 
Yes, Hitler declared war on the USA after Pearl Harbor. He had no choice because of Germany's Axis treaty with Japan. I am pretty sure that we declared war on Germany at the same time as Japan for that same reason; they were allies. Japan never told Hitler they were gonna attack Hawaii. In fact, the agreement was that Japan would attack Russia from the East. ~BH
 
The way I understand it, Germany declared war on us, then we declared war on them as a formality?

Sorry, I don't have time to read MM's link. Time to go.
 
[Image: WW2 declaration of war on Germany]
 
