What is liberalism in America?

Liberalism has always meant smaller government.

It means diplomacy over sanctions.
It means bailing out families before the banks.
It means building a level playing field, not selling out to lobbyists.
It means that gays can marry and serve.
It means that America becomes a beacon, not the world's police.
It means strengthening Social Security, not making cuts to it.
It means ending the war on drugs.
It means ending the Bush-era policies of spying on Americans.

It means having a humble, small government that leads by example. That is liberalism.

But what happened?



 