Wilson's military campaigns into Cuba, Haiti, and Mexico? The American eugenics movement in the 1920s? The 'Right to Rape' and other crimes committed by Northern armies during the War for Southern Independence? Banana Republics and America's anti-freedom, neo-colonial past? The hypocrisy of espousing self-determination while fighting to keep Vietnam subjugated? American support for the White forces in the Russian Civil War? The fact that Lincoln made it clear he did not believe in black equality? The annexation/takeover of Hawaii?

We learn all about the evils of German and Japanese imperialism, but never talk about our own. We denounce both tyrants and communists, yet do not speak of America's role in drawing out the Russian Civil War - something Russia remembers well, and which helped fuel the suspicions that led to the 'Cold War'. We learn that tyrants are bad, but fail to mention that America has no problem with oppressive regimes if it benefits our economic and industrial interests. Is it really any wonder the world sees America as a bunch of hypocrites?