TheProgressivePatriot
Platinum Member
"You learn about it many times over and how it was wrong, but let me ask you one simple question: why do you believe we have never been taught any of this?"

Your question is not completely coherent, but if I am reading it correctly, I will take a shot. We have not been taught the truth about slavery and racism because the teaching of history has been controlled by, and is still controlled by, the white establishment, whose goal is to sugarcoat the awful reality of America's past and present.