While the states do retain their power, more is being done at the national level. Which level of government is better at addressing our social and economic problems - the federal government or the state governments? And why do you think so?
Can we have both unfettered economic rights and political equality in America? Can both principles stand together? Why or why not?
This is more of what I was getting at :)