Are you one of those people who think white people are born evil and that everyone is literally Hitler? Western society has a deep history; should it simply self-destruct? Why not let other people join instead of destroying what came before? Why get rid of marriage, two working parents, and the importance of religion? Today, the left views getting married as outdated, religion is despised, and instead of promoting talent, pride, and hard work, we see government assistance treated as a lifestyle choice. I was born into a military family. We had two rules: work your butt off, and believe in God. The left doesn't value either of those things.