Canada, France and the United Kingdom provide quality healthcare for their citizens. Why is the United States the only country in the Western world without universal healthcare?
Are you or anyone you know not covered by health insurance? Do you currently pay out-of-pocket for your health insurance? Have you been denied coverage for any reason?
This is a problem!! Healthcare should be a right, not a privilege. How have we become a nation that supports a system that rewards people for NOT caring for the sick? This is immoral and disgraceful.
“If you can find money to kill people, you can find money to help people.”
--French doctor, Sicko
Please, please, please, I beg you... write your Representatives and your Senators, and sign petitions. Use this link as a starting point: