You either:
A) Need to stop watching so much Faux News and listening to so much conservative talk radio; the propaganda is conditioning your mind. Obama is a center-left politician. Compared to European politicians? He is a fascist.
or
B) Take an elementary class in political science at the university level. Americans have not shifted to the left in any substantial way, nor have their politics. Sure, the press makes it seem that way. Socially they may have, but that is more a product of Hollywood than of the Beltway. The global elites would like to push their agenda, but as their gun agenda made clear, they have obviously overplayed their hand. Even half of the so-called "liberal" Democrats realize you can't enact gun control without amending the Constitution.
Lol you must never have seen any of my other posts
I don't have to. If you believe the body politic isn't already "in the center," you need to do some reading. Wherever the combined average sits, that is, by definition, "the center."
As it happens, even with the election of Obama, Americans are significantly to the right of nearly every other Western nation on the globe. You would really have to stretch things to call Obama's policies anything other than centrist.
Why did the Chief Justice of the Supreme Court rule with the liberal Justices that the national health care law was constitutional? Why? Because it wasn't socialism; it was an example of fascism. What a compromise! How could an arch-conservative not jump at the chance to make fascism constitutional? Men fought and died in a world war to defeat fascism, and now, in one stroke, a supposedly progressive president passes legislation that would make Hitler and Mussolini proud. So a coalition of social democrats and a business-friendly conservative judge on the highest court in the land rules it constitutional?
No sir, the nation is far from veering to the left. Since the days of Carter, we have continually lost more and more of our civil liberties and civil rights to government and corporate power. Regardless of your other posts, I base my judgment on this single post.
The fact is, left and right are losing their meaning in America, and more and more people are beginning to awaken to what is really going on. I can't discuss it with you, because you aren't one of them. Just the fact that you still think in terms of left/right is an indicator. Both the left and the right still believe the funny-money lie. Those who run the nation make laws that everyone MUST go to school, where that lie must be taught; they own all the media, so it is constantly drilled into our brains. But don't you believe it. There are other ways. Any time a nation on Earth dares to try one? It gets invaded by us. But we use some other pretext and lie about it.
So what is the point? Really? We don't even speak the same political language. We're both nice people, I'm sure, with two radically different epistemological systems of thought. Mine is my own; yours is the one they gave you.