For decades, millions of Americans and their families have gotten their health insurance through a job, but that may be becoming less true:
Jobs are the number-one source of health insurance coverage for Americans, but fewer employers are offering health benefits to their workers and more employees are becoming uninsured, a new survey shows.
In 2010, 67.5 percent of U.S. workers had jobs that included health benefits, down from 70.1 percent in 1997, the Employee Benefit Research Institute reports. The high cost of health insurance and other factors have also led a greater share of employees to turn down health benefits even when they are available: in 2010, 83.6 percent of workers who were offered benefits signed up for company health insurance, compared with 86 percent in 1997, the report says. (h/t The Hill.)
Is your employer going to be providing health insurance this fall? Tell us about it in our discussion forum!
- McKinsey Survey Predicts 30% of Employers Will Drop Health Insurance By 2014
- Open Enrollment: Is Your Employer Pushing You To High-Deductible Health Insurance?
- Fewer Employees in NY Get Employer-Based Health Insurance, Will the Rest of the Country Follow?
- Smokers Being Charged More Even For Employer-Based Health Insurance
- Will a Third of Employers Really Drop Health Insurance Benefits?