How much disgrace and damage has Donald Trump brought to the US Presidency? Will Americans and foreigners ever trust a US President to be honest and truthful again? How will the Presidency ever again become a symbol of integrity, or at least dignity? Sure, some may argue that other bad Presidents gave America black eyes and bruised its integrity and dignity before, but never the way Trump has. Trump may have destroyed any chance of redemption.
So, can America ever recover from the Donald Trump era?