The political debate in the US seems so bitter these days... can the US ever be a united country again? By united I don't mean that everyone would agree on everything, but that people would accept the result of a democratic election, and see themselves as Americans first and members of a party second. The reason this interests me is that in Europe I don't often sense the same kind of division. It does happen - in Britain under Thatcher, perhaps France under Sarkozy, or at the end of the Tony Blair era in the UK - but by and large people consider themselves Finns or French or British first, and are proud of that. What party people voted for just isn't much of an issue in discussions around the water cooler. The kind of loathing for liberalism and conservatism we see here seems almost uniquely American - why is that?