AveryJarhman
Gold Member
Racism, particularly during the period when Africans were enslaved, taught white men that it was all right to rape black women, and it also exacerbated the devaluation of white women.
During a period of human history when our ignorance resulted in people being oppressed and enslaved on the North American continent...
...the word RACISM, and the concept it names, did not exist!
Racism - Wikipedia
Peace.