Businesses are NOT obligated to provide health care
Read that headline again.
It’s the literal truth. (For the moment anyway.)
Health insurance is a benefit some companies choose to offer employees in lieu of other compensation, just like retirement funds, vacation time, educational reimbursements, or company cars.
It’s a historical accident that companies offer health insurance at all.
Despite attempts like this to paint a company that doesn’t provide health insurance as evil, the only real evil is the insidious assumption that no one is responsible for taking care of themselves anymore.
June 22nd, 2007