This is a tricky question. Obviously, no matter what you are, you need to be able to deal with living in a white-dominated society. That includes learning the English language and the norms of the society you live in. I might add that learning English is imperative to being able to decode the laws in this country and understand the beliefs of whites. Once you get past that, however, how productive is it to be brainwashed by an educational system that tries to convince you that whites are superior?
If your ultimate goal is a community separate and apart from the dominant culture, like the Amish, then you need your own educational system. It depends on what you want the outcome to be. If you want full assimilation for African Americans as the full Americans they are, then education should not be separate or different, because keeping yourselves separate will not achieve your ends. Do you really think the educational system itself is still teaching that American blacks are inferior? I don't mean the media; I mean the educational system. What makes you think that?