Question

Discussion in 'Healthcare/Insurance/Govt Healthcare' started by Vel, Nov 9, 2009.

  1. Vel
    It seems a major component of all the healthcare proposals being floated is that employers should be responsible for providing health insurance to their employees and their families. My question is...

    Why is it an employer's responsibility to provide anything other than Workman's Comp insurance to employees?
     
  2. NO!bama08
    In a capitalist society it is not; in a socialist society it would be. Healthcare is not a right; it is something you work for and earn.
     
  3. strollingbones
  4. The Rabbi
    Actually it is a leftover from WW2, when wages were capped. Since the market has to work somehow, employers figured out that offering this as a benefit could substitute for actual money. The gov't played along by allowing companies to deduct the premiums rather than taxing them as wages, which is what they ought to do.

    The reason employers ought to be responsible is that employers already provide most of the health insurance in this country, and business owners are all rich greedy guys who can easily afford all this anyway. Right? Right??
     
