Lastamender
Diamond Member
- Dec 28, 2011
Quote: "Of course white boys stole land. That's their racial pastime."
Quote: "they didn't steal land"

IMHO, perhaps Ms. Bergdorf does have a point, at least in regard to the United States.
I do NOT wish to use the meaningless word "racism," however.
*****
NEVERTHELESS, when I read the thread title, I immediately thought to myself the following:
1. The United States used slaves to grow cotton.
2. The United States stole land from the Native Americans and killed quite a few. (When the Supreme Court told President Jackson not to take their land, he in effect told the justices to go to Hell.)
3. The United States stole the Southwest from Mexico. (Congressman Abraham Lincoln opposed that war.)
4. After the Chinese were "imported" to build the railroads, they were told to go back to their own country.
5. The United States has intervened many times in Central America in order to protect American commercial interests.
6. All Japanese Americans (including citizens) in California were rounded up during World War II and sent to relocation camps.
7. The United States helped free the Philippines from Spanish rule and then decided to keep the islands for itself for another 50 years or so.
8. American settlers in Hawaii took over the islands, and the American government reluctantly agreed to accept this aggression.
*****
As one can see, those actions involved African Americans, Native Americans, Mexicans, Chinese, Central Americans, Japanese, Filipinos, and Hawaiians.
I won't use the word "racism," but as a fair-minded moderate, I acknowledge that "race/ethnicity" has certainly played a prominent role in America's unique history.
...the NAs displaced/decimated/tortured/warred on/murdered other NAs long before the whites came
...the NAs didn't own land/the land
...this was the Era of Exploration/etc ...the whites conquered just as the Hawaiians/Chinese/NAs conquered