DustyInfinity
Platinum Member
With religion declining in the United States, and largely faded in Europe, what does this do to the concept of inalienable rights? How does Western culture define basic human rights without a concept of God? Does this mean the government becomes the sole arbiter of human dignity and sovereignty? What is to stop the government from deciding that political enemies can be used for firewood? If there are no God-given rights, does that mean there are no rights at all?