Many people agree that we are in a period of moral decline. When and why do you think it started? The Enlightenment, rationalism, science, the Industrial Revolution, the rise of nationalism, the Jazz Age, the decline of the nuclear family, the counterculture, the Clinton presidency, etc. What started the undermining of religion in Western society, or has it not started yet? Or has our morality actually been increasing all along, as milestones like women's suffrage would suggest?

I was personally taught that the decline started in the late 18th to early 19th centuries. People had started to look to science for answers they couldn't find in the Bible, leaving the Bible as more of a cultural identity than a practical guidebook. Then the rise of nationalism shifted the ingroup/outgroup boundary of cultural identity from religion to the nation, and the social problems of the Industrial Revolution led many people to lose faith in a loving God.