Or at least, that's what I always hear at parties, dinners, events, at the office, around the in-laws' place, etc. Now, I'm not a die-hard Christianity defender, though I do defend it as a healthy part of traditional America. And I can't stand some of its (cough!) biggest enemies. But I have a simple question for the people who proclaim that "The Christian Right Controls America." What evidence is there of this "control"?

* If they did, abortion would be illegal. But it's not.
* If they did, you wouldn't see near-porn on TV. But you do.
* If they did, traditional roles for women would be encouraged - and practiced. But they aren't.
* If they did, teen pregnancy wouldn't be a big problem. But it is.
* If they did, you wouldn't find prayer in schools banned and Christian symbols and displays made illegal. But they are.

Just a few examples here, but you get the point: if the evil Christian right "controls America," as the left likes to say, they have a dang funny way of exercising their control. Personally I think Jewish power is a lot more potent than Christian power, though I realize many folks (especially, weirdly enough, Christians) get all hot 'n bothered if you say that. Still. As far as Christians "controlling" America, ask the next lefty punk who whines that whine what his evidence is.