You know, I said this partially seriously in another thread, but then I realized I think it's true. I cannot think of a single thing the GOP has done effectively or done correctly in the last decade. Aside from pushing religion, war, and rape, what exactly have these people done for Americans?

You also have to look at the colossal failures of the Tea Party and John Boehner. Sarah Palin running away from her elected office like a silly twat for money and fame. The hilarious trainwreck that was the Romney campaign. And creating a North Korea-like mindset where everything is the strawman liberals' fault.

Please enlighten me: what has the GOP done that has been positive for this nation? And please keep in mind, I'm not attacking conservatives or conservative ideology in this thread. I'm genuinely curious what the Republican Party has done that's been good.