What do you think about being required to buy insurance?
With all the talk about the health bill requiring people to purchase health insurance (unless they are unable to afford it, in which case the government will help pay for it), I've seen no mention of the fact that states already require car owners to purchase automobile insurance. Any comments?