As I sit here watching the groups of immigrants at the border wall trying to get into the US, I ask myself: is the US really that much better than the rest of the world?
If so, why?
I don't think the US is better. I'm somewhat biased, since I'm European, but my wife is American and she agrees with me. What does that tell you? In my opinion, America has one big draw that most other countries don't have. It's called Hollywood, and it sells an idea of the US that is sadly as fake as the movies it produces. America is a great country... if you're rich. Most people, though, aren't rich, and when you aren't rich, the rest of the Western world scores better on most of the criteria one would use to assess how good a country is.
I think it is indisputably better than most of the world. Compared to other Western nations, it's harder to say. The U.S. probably has among the worst education systems in the Western world, sadly, and it starts to look not-so-great on a number of other metrics too.
But compared to Haiti, Russia, or the Democratic Republic of Congo? It's an amazing place.