What types of companies come into a business and help improve the wellness of its employees?
I’m trying to research what types of firms come into businesses and help their employees get fitter, stop smoking, lose weight, etc. Is there a name for these types of organizations? I know it’s not always the insurance companies that do this sort of work. Is there an association for these types of companies?