According to the new “ABC News” poll on health care, Americans are eager to have the government force employers to provide health insurance: “Nearly eight in 10 favor a federal requirement that all employers offer insurance to their full-time workers.”
Why?! Do our employers pay for our food, clothing, or shelter? And if they did, why would that be good? Having my health care tied to my boss invites him to snoop into my private health issues, and if I change jobs, I lose coverage.
Employer-paid health insurance isn’t free. It just means we get insurance instead of higher salaries. I’d rather have the cash and buy my own insurance. Companies only provide it because of a World War II-era tax break that never went away.