Do any doctors on here feel like insurance companies keep you from practicing medicine the way you ideally want to?
Do you feel like patients don't get optimal care because of the constraints and rules dictated by insurance companies? I know two ex-Navy doctors who in many ways preferred being able to work without billing concerns, and who felt they could practice better medicine without insurance companies dictating things. Two of my favorite doctors didn't take any insurance at all. They spent time getting to know my entire situation, and I felt like they were partners in my health care rather than just dictating to me.