Is Health Care a Right? History Says No

As this presidential election year rolls along, we’re sure to hear lots of talk about the proper role of the federal government in health care. Liberals like President Obama, in arguing for a government takeover of the industry, make a claim that sounds logical as long as you don’t think too hard about it: that good health care is a “fundamental human right.”

As compelling as that idea sounds at first blush, it doesn’t stand up to scrutiny.