I've been seriously thinking about the Civil War and the circumstances that led to the Confederacy breaking from the Union. I grew up thinking "South bad, North good," but my mind is changing. Where are states' rights anymore? Now I know how they felt. Yes, slavery was wrong and I'm more than glad it was abolished, but as for states' rights, who was really in the right and who was really in the wrong? Tell me your opinion.