How and why did healthcare and insurance become tied to employment?
I have lived in America and this always mystified me. How did it evolve that the usual way to get medical care is through employer-provided insurance? I am watching this Obamacare debate, as I do have family members there. I apologise if this question seems inflammatory; that is certainly not my intention, I am seeking information only. It seems to me a healthier population would make better workers, but if coverage is tied to employment only… now I'm a bit lost.