Businesses are NOT obligated to provide health care

Read that headline again.

It’s the literal truth. (For the moment anyway.)

Health insurance is a benefit some companies choose to offer employees in lieu of other compensation, just like retirement funds, vacation time, educational reimbursements, or company cars.

It’s a historical accident that companies offer health insurance at all.

Despite attempts like this to paint a company that doesn’t offer health insurance as evil, the only real evil is the insidious assumption that no one is responsible for taking care of themselves anymore.

June 22nd, 2007
