Ask the Philippines about that. Ask Hawaiʻi. And Puerto Rico. Usually we're more subtle though.
We didn't "colonize" in those places. The Brits did.
The history of Hawaii includes both natural and human history. After the creation of the islands by volcanic forces, the islands began developing their flora and fauna. Sometime around 1 AD, the earliest Polynesian settlers began to populate the islands. Around 1200 AD, Tahitian explorers found and began settling the area as well. This marked the rise of Hawaiian civilization, which would remain separated from the rest of the world for another 500 years until the arrival of the British.
Ooooh yes we did.
>> The Reciprocity Treaty of 1875 between the Kingdom of Hawaii (explicitly acknowledged as a sovereign nation) and the United States allowed for duty-free importation of Hawaiian sugar into the United States beginning in 1876. This further promoted plantation agriculture, which was in the hands of foreign Whites. Hawai'i ceded Pearl Harbor, including Ford Island (Hawaiian: Mokuʻumeʻume), together with its shoreline and four to five miles of land adjacent to the shore, free of cost to the U.S.[7] The U.S. demanded this area based on an 1873 report commissioned by the U.S. Secretary of War. Native Hawaiians protested the treaty on the streets until the revolt was suffocated by U.S. marines.[2] .... In the late 19th century the dominant White minority overthrew the Hawaiian Kingdom and founded a brief Republic that was finally annexed by the United States. << -- History of Hawaiʻi
Engineering coups d'état has been a major US export. I'm running short on time, but check out the origin of Panama, which until we got there was part of Colombia. And of course the Shah in Iran. Matter of fact, just follow the Dulles Brothers around the globe. And Woodrow Wilson.
Fun fact: what's the only country in the world besides the US that has a pledge of allegiance?
A -- the Philippines. Why do you think that is?
(/somewhat offtopic)
Pogo, the Europeans are responsible for most "colonization" efforts on earth. There is no denying that little factoid.
That in no way means "we have never done that". Because we sure as hell have.
We just took over from the British. They are the ultimate "colonists." If you are going to lay blame, then lay it on them.
What exactly does the sentence "we have never done that" mean to you?
Maybe I'm just, I dunno, reading the words on the screen or sump'm.