Despite having spent over 25 years living in the United States, sometimes I just can't figure out the way my countrymen see things. Here's a case in point: city workers in NW Pennsylvania are getting their health insurance premiums paid by their employer. And apparently that's a bad thing.
The article takes issue with this practice largely in light of difficult financial times. I can see the point of the argument, but I'd posit that such benefits aren't as rare as the article makes them out to be. While it's increasingly unlikely that your employer will cover 100% of your health insurance costs, it does happen.
If an employer is unable or unwilling to pay out for cover, perhaps we should start asking why. As in: why is it increasingly hard for companies to provide basic health care to their employees?
I'm well aware (from my ivory tower on the other side of the Atlantic) that the US is about to embark on a national dialog concerning health care. While I hear pundits worrying about what will happen to the health insurance companies, I'd bet that isn't quite the concern of the millions of Americans who depend on their jobs not just for a wage, but also for any sort of health care coverage.
There's nothing wrong with asking hard questions, and that goes for small to mid-sized business owners, who should start pushing back. In a land where access to health care is tied to your ability to stay employed in troubled times, things are looking, from afar, more and more like the TV show Survivor.