Why would a US company want to provide health insurance, or for that matter, any kind of insurance?
I mean, the vast majority of companies are not in the insurance business, so why would they want to be responsible for this? It is just an additional expense that requires hiring more people to administer. I cannot fathom why they would agree to do such a thing unless insurance was not otherwise available to the populace and it was viewed as a "perk" reserved for those deemed important enough to warrant it.
If you assume that health care is something that we, as a society, both want and need, why would you expect your employer and not your government to provide it?