So apparently life in America has come to be much more about retaining one's Rights than attaining one's Rights through responsibility. With healthcare treated as a right more than a privilege or a responsibility, it seems we've dug graves for bodies that perhaps should be recycled by and for science instead.

Obamacare is supposedly not favored by many, but are there good points to it? Other than the obvious monetary price of 'it all,' what edge is it supposed to bring to our nation? According to the above article, doctors and healthcare practitioners won't even be getting paid the amount that Medicare/Medicaid pays at this point.

KEY: More home healthcare aides? More individuals trained to care for people within their own chosen environment? More basic awareness among the mainstream about their *gulp* responsibility for their own health? Raising our children to treat a few bumps and bruises as not only acceptable but expected, considering the knocks that living life will potentially bring them? Grouping more seniors and individuals incapable of self-sufficiency together in homes or apartment complexes, so that they not only have better socialization outlets but also have one another's help in the ways the home/complex is equipped to support?

Wondering... Is this something an administration should be responsible for at all?