DOTR
Gold Member
The United States was always a Christian nation. The government was not.
Excellent point! And this is why I keep saying our problems aren't political. The liberals (and their Libertarian allies) will always frame everything in terms of the government. Since it is the source of their power in crushing American culture, they can't think in any milieu other than government.
But the US has always been a Christian nation. In fact, the Democrats produced a report in 2012 crowing about their success in dechristianizing America, which they hoped would lead to more Democrats being elected (in addition to bringing in third-world immigrants to outvote Americans, the second prong of their attack on America). Christianity and Democrats simply don't mix. You can't have both, and they know this.
Your country and your nation are two different things.