Here's a rather interesting essay:
"The United States has been an empire since at least 1803, when Thomas Jefferson purchased the Louisiana Territory. Throughout the 19th century, what Jefferson called the "empire of liberty" expanded across the continent. When U.S. power stretched from "sea to shining sea," the American empire moved abroad, acquiring colonies ranging from Puerto Rico and the Philippines to Hawaii and Alaska.
While the formal empire mostly disappeared after World War II, the United States set out on another bout of imperialism in Germany and Japan. Oh, sorry — that wasn't imperialism; it was "occupation." But when Americans are running foreign governments, it's a distinction without a difference. Likewise, recent "nation-building" experiments in Somalia, Haiti, Bosnia, Kosovo and Afghanistan are imperialism under another name.
Mind you, this is not meant as a condemnation. The history of American imperialism is hardly one of unadorned good doing; there have been plenty of shameful episodes, such as the mistreatment of the Indians. But, on the whole, U.S. imperialism has been the greatest force for good in the world during the past century. It has defeated the monstrous evils of communism and Nazism and lesser evils such as the Taliban and Serbian ethnic cleansing. Along the way, it has helped spread liberal institutions to countries as diverse as South Korea and Panama.
Yet, while generally successful as imperialists, Americans have been loath to confirm that's what they were doing. That's OK. Given the historical baggage that "imperialism" carries, there's no need for the U.S. government to embrace the term. But it should definitely embrace the practice."
Source: USATODAY.com, "American imperialism? No need to run away from label"
IMO, the Bush doctrine has been imperialist.