The debate about universal healthcare has been raging for several years now, and to be frank, I'm still conflicted about where I stand. My question to the members of this forum is this: Is healthcare a right? And by "right," I don't mean the unalienable kind endowed by the Creator, as articulated in the Declaration of Independence, but rather the kind defined by the laws of man. What do you think?