It seems a major component of all the healthcare proposals being floated is that employers should be responsible for providing health insurance to their employees and their families. My question is:
Why is it an employer's responsibility to provide anything other than Workman's Comp insurance to employees?