Question

Vel

Platinum Member
Oct 30, 2008
Tennessee
It seems a major component of all the healthcare proposals being floated is that employers should be responsible for providing health insurance to their employees and their families. My question is...

Why is it an employer's responsibility to provide anything other than Workman's Comp insurance to employees?
 
In a capitalist society it is not; in a socialist society it would be. Healthcare is not a right; it is something you work for and earn.
 
Actually, it is a leftover from WW2, when wages were capped. Since the market has to work somehow, employers figured out that offering this as a benefit could substitute for actual money. The gov't played along by allowing companies to deduct the premiums rather than taxing them as wages, which is what they ought to do.

The reason employers ought to be responsible is that employers already provide most of the health insurance in this country, and business owners are all rich, greedy guys who can easily afford all this anyway. Right? Right??
 
