There was a time when we taught American History in school too.
Not complete history. Certain parts were always excluded. The Tulsa race massacre wasn't taught. The Tuskegee study, where black men with syphilis were deliberately left untreated so researchers could observe the disease, wasn't taught. Our part in the overthrow of democratically elected foreign governments wasn't taught. The United Daughters of the Confederacy whitewashed the history of the Civil War through a textbook censorship movement.
I get that many of these events were embarrassing to the US, but removing them from our history doesn't make us better.
I learned that shit in public school back in the '70s. Not sure what country you're from. Maybe not the syphilis one, though.