Christianity appeared in the Middle East and spread profoundly throughout Europe.
Taking into consideration the harsh years of the Inquisition, I would suppose that Europe should be far more "Christian" than the USA...
Yet Americans tend to follow the traditions and beliefs of Christianity far more than the rest of Western civilization. How can this be?
I would suggest that one possible reason is that in America religion has never been directly sanctioned by the state. That isn't to suggest that values based on religion have not been 'promoted' by those in office, but that there has never been a law directly imposing one religious view.
One of the reasons that right-wing people like myself want government involved in as few topics as possible is that once something is politicized, people oppose it for no other reason than that they oppose whoever is in power.
For generations, Europe had religion and government intertwined; thus religion was politicized.
Not so much in America. While there are religious groups that support one viewpoint or another, no one generally identifies a denomination directly with a party. No one says "All Mormons are Republican" or "All Baptists are Democrats".
That said, I would question the claim that America is still Christian; I find it dubious, and nothing in my experience suggests it is still true. No doubt we have remained Christian longer than Europe, but I'm convinced that period is at an end.