...for problems in the black community? Starting from American history, have whites ever done anything that could be said to have harmed blacks culturally, economically, socially, or psychologically? There is plenty of history showing white society doing bad things to blacks; my question is whether those bad things ever actually impacted blacks negatively. It seems like common sense that oppressing people would create negative consequences, but I am not getting that impression from dissenters, who treat all that history as nothing more than "excuses" for black failure. Was there ever a time when white racism was not an excuse for black failure, but rather a big reason blacks were failing more than whites in this society?