Weren't we taught world history at some point during our education?
We were all taught world history, but curiously we were never taught that. Plus, if we are going to learn about OUR HISTORY, then perhaps it shouldn't be taught, or even insinuated as it currently is, that blacks were kidnapped from Africa by Europeans.
How about teaching the truth about the Trans-Atlantic slave trade and how it was an extension of the 700-year-long Trans-Saharan slave trade industry? That is how and where it happened. Nothing to do with our history? Yeah, actually, it is. The Trans-Atlantic trade began in the 1400s between Portugal and the northern Ottoman Empire and other Muslim-conquered territories.
The slaves brought from Africa were, in fact, already slaves. Also, during American history, why not teach the truth about the Native American tribes? I mean, forget about the South American indigenous tribes, since that doesn't have anything to do with North America. But they could teach about the many wars between tribes before pale face showed up, and not imply they were nothing but peaceable peoples simply growing maize or smoking peyote under harmonious trees.
They could also teach about the numerous tribes that fought for the Confederacy and owned slaves.
[Attached photo: leaders of the Cherokee]
They can start by presenting that photo of the leaders of the Cherokee, among others.
Why do you think those facts are kept out of our "public education system"? Take some guesses.