Is it anti-American to wish to see the role and influence of the United States in world affairs diminished? I believe it is anti-American. I know you libs will say you don't want to diminish U.S. influence, you just don't believe Bush is growing our influence "in the right way." It's the same nonsensical way you deprive Bush of any credit for the elections in Iraq and the spread of democracy in the Mideast by saying "that wasn't the reason given." Yet you also believe Bush planned the invasion from his first moment in office. So which is it?