
Ask HR: Are Companies Required to Offer Health Insurance?

While health insurance is an important benefit, it is legal for an employer not to offer it. However, just because a company doesn't provide health insurance for this role doesn't mean you can't access coverage and care. SHRM President and CEO Johnny C. Taylor, Jr., SHRM-SCP, is answering HR questions as part of a series for USA Today. The questions are submitted by readers.