We need to stop letting health care be a business and start treating it as a necessity. Other countries, nearly all of them, see health care as a need and supply it accordingly; our country sees it as a business, a way to make money. If we treated health care as a need, there would be no reason for insurance companies to be involved. I once met a surgeon from Romania who made $60K a year. He said he was disgusted by how much money doctors make here. In his country, being a doctor is a calling, a valued ability, not a way to get rich. He was paid well, but he was not profiting off the misery of others.
What needs to change is not how we deliver health care, but how we view it.
Money and greed are the root of all evil. When will people learn that?