It's not just colonization; don't forget that after WWII, European whites redrew the world map how they saw fit in many "brown" countries and installed oppressive dictators in many of them too. And they've continued to meddle in their economies and politics to this day. So it's coming back to bite them in the ass? Oh well, let's think of it as a "teaching moment."
No, after WWII the colonies were freed, mostly peacefully. In other cases the Europeans simply withdrew because of revolutions, and those countries, whether out of spite or because they couldn't, failed to build proper governments. You're just using Europe as a scapegoat. No dictators were installed, and only America is meddling in those countries, for different reasons.