What Happened to the Right to Choose Your Healthcare?

From atop the lecterns at the Democratic presidential debates and the White House, a common theme is the push to dismantle and rejigger how health care is delivered in America. For the left, the emphasis is on expanding who can access government-backed health insurance programs while cutting off the role of the private sector. On the right, President Donald Trump is looking to import drugs and pharmaceutical price controls from abroad. Missing from both of these visions is the essential component that governs every other sector of the economy: the freedom to choose. Much like housing, transportation and education, it’s clear that the […]