A great debate of United States politics rages on into the 21st century. The moment Donald Trump entered the White House in January 2017 as the unlikeliest candidate in recent history, the Republican Party set one policy squarely in its crosshairs: the Affordable Care Act (ACA), colloquially “Obamacare,” the most widely touted achievement of Trump’s predecessor, Barack Obama.
The ACA itself was mired in over a year of debates, resulting in legislation that improved some sectors of the troubled US health care industry while also forgoing some of the bolder directives Obama initially suggested, such as the “public option.” Despite the ACA’s closeness to proposals floated by Republican officials like Newt Gingrich and Mitt Romney, Republicans in Congress made it their rallying cry that Obamacare was a failure, and that, when given power, the GOP would bring an end to the legislation.
Trump’s election, in the minds of GOP leaders like Senate Majority Leader Mitch McConnell and Speaker of the House Paul Ryan, signaled that it was Obamacare’s time to face the ax. Yet following a series of contentious and democratically dubious votes, rushed through in the very manner Republicans had accused Obama of employing for the ACA, the Republicans’ first attempts to crush Obamacare flopped spectacularly, with three GOP senators, John McCain (Arizona), Lisa Murkowski (Alaska), and Susan Collins (Maine), halting any further progress with their no votes.
For now, the debate over health care hasn’t stopped, but it has been stymied. News coverage of the attempt to replace the ACA with the American Health Care Act (AHCA) plasters the major cable news channels in the US. While tense debates over health care have long been understood as a common feature of US politics, most of the Western world looks on from the outside, perplexed. To understand why, consider this humorous comic by Christopher Keelty, which depicts what would have happened if the events of the AMC television series Breaking Bad had taken place in Canada:
Almost all of the US’s Western counterparts, and indeed much of the world, utilize government-run universal health care, in contrast to the private, market-driven model in the US. There does exist in the US a kind of universal health care: under the Emergency Medical Treatment and Active Labor Act of 1986, no US hospital can turn away a patient needing emergency care, irrespective of his or her ability to pay. While as a last-ditch resort this policy can save lives, emergency care is markedly more expensive than preventative care, a fact that contributes in part to the standing of US health care overall: more money spent, yet worse results. A Bloomberg study ranked the US health care system 50th out of 55 countries surveyed. Amidst the rampant dysfunction of the US Congress, it’s surprising that only a small contingent currently backs a “Medicare for All” quasi-universal health care bill when most of the US’s allies have long reaped the benefits of single-payer health care, with much better results to show for it.
Of course, the difficulties faced by US politicians in passing single-payer-style bills stem from issues unique to the country. Timothy Callaghan cites the individualistic character of the United States and the influence of moneyed lobbying groups, especially those in the insurance industry. But of the numerous arguments levied against single-payer health care in the United States, one stands out as particularly appealing to many Americans. Yes, the argument goes, the United States spends a lot on health care. But it does so because it is on the cutting edge of scientific research into health and medicine, a process in which money can’t seem to be spent fast enough. So while the rest of the world fares better in its health care outcomes, it does so in part because research done by US scientists and firms is shared globally. The US may spend a lot of money, but the benefit reaped is medical research that saves lives across the world. To abandon the US’s health care system, however flawed it is, risks forgoing the massive advances made possible by the free market.
This argument, advanced in the major newspapers of the US and elsewhere, holds appeal for several reasons. First and foremost, it’s at least partially true: the US spends a lot on research and development (R&D), and numerous breakthroughs in medical science have come from American doctors. Second, there’s an intuitive appeal to free market arguments like this one in the US, where individualism reigns and the word “freedom,” wherever it is deployed, immediately registers as “good.” Now, in the Trumpish present, when the health care debate has reached peak dysfunction, it’s prudent for all parties on the varied sides of these debates to re-examine longstanding arguments. Merely rehashing talking points, the default mode of politics for many members of Congress, is not good enough. When health care is the subject, lives are on the line. Treatment needs to be effective and the science needs to be superlative.
US Health Care Research by the Numbers
The figures for US medical research aren’t small. A report by Research America puts total US expenditures on health care research at $158.7 billion (USD). The National Institutes of Health (NIH), a publicly run research institution, operates with a budget of $32.3 billion. The gap between the relatively small federal contribution and overall research expenditures is filled by money from private donors and industry. The graph below maps out the financial breakdown of R&D funding in the US:
With a robust industry funneling millions of dollars into research and established government agencies whose purpose is to further health care research, surely American citizens can rest comfortably knowing that truckloads of money are being used to find the next cures for cancer and other diseases. Yet these budgets look enormous only in isolation; taken in the context of overall US health expenses, a startling disparity reveals itself.
In its report, Research America includes the following information below the previous pie chart:
In the space of a single year, research funding saw its growth rate nearly halved. Because the figures in question are in the billions of dollars, some might not take this slowed growth as a sign of doom. Yet consider this more striking graph, which shows the relationship between R&D funding and overall health care costs:
Research is a skinny slice of the pie. The proportion of R&D to overall expenses seriously calls into question the belief that Americans pay more for health care because of the quality of the country’s researchers. These scientists do undeniably important work, but that work is not so costly as to justify the steep prices Americans pay for health care overall. In fact, not only does research constitute just a small fraction of US health care costs, but trends show a decline in the money spent on it.
The Decline of American R&D Funding for Medical Research
The Journal of the American Medical Association (JAMA), the leading publication of the American Medical Association, in 2015 published a special issue on the topic “Scientific Discovery and the Future of Medicine.” One of the lead articles in that issue, “The Anatomy of Medical Research: US and International Comparisons,” authored by six doctors, scientists, and researchers, found that “New investment is required if the clinical value of past scientific discoveries and opportunities to improve care are to be fully realized. Sources could include repatriation of foreign capital, new innovation bonds, administrative savings, patent pools, and public-private risk sharing collaborations.” Another article in the issue, by Victor J. Dzau and Harvey V. Fineberg, reinforces the claims of “The Anatomy of Medical Research,” pointing out that the US’s share of global investment in medical research declined by 13 percent from 2004 to 2012. Famed scientist Francis S. Collins also contributed to the JAMA reflection on health care research from the perspective of the NIH, arguing that “unprecedented budget pressures” and inflation have cut the NIH’s purchasing power by 25 percent, which has had the effect of reducing the proportion of projects the NIH can fund from 1 in 3 to 1 in 6 — a 50 percent reduction. Despite the popularity of the “free markets mean more and better research” argument, health care researchers and professionals — those closest to the discipline — paint a very different picture.
In trying to explain the decline in research spending over the course of the 2010s, free market advocates might point to the passage of Obamacare as an illustration of the changing price dynamics in health care. Were the free market left to its own devices, the argument goes, competition could drive prices down enough to encourage greater consumer purchasing, which in turn could fund more health care research.
Although this argument initially makes some sense, it ignores what Obamacare actually does, as well as the reality of medical research under either free market or single-payer systems. First, while Obamacare does facilitate more government involvement with health care via subsidies and the exchanges, it is not, as was erroneously claimed when the bill was being debated, a “government takeover of health care.” If anything, the ACA operates as a kind of handout to insurance companies, for it directly incentivizes people to get coverage through the insurers participating in the exchanges, lest they face a penalty levied by the government. Insurers still hold the most bargaining power, meaning that the ACA doesn’t fundamentally disadvantage them such that their ability to sponsor research is substantially thwarted. Second, no matter the system in which it takes place, health care research is expensive. Good research requires teams of highly qualified scientists and doctors, patients willing to participate in drug trials, and, most importantly, proven results for particular drugs, which can often take decades to achieve. Moreover, diseases like cancer are expensive to research because of the difficult nature of the diseases themselves.
Ultimately, government and private sources alike will need to spend heaps of money to effectively promote medical research. That the former often faces hurdles in funding research has less to do, as Collins’ article attests, with some inherent inability to promote research and more to do with political pressures brought about by numerous external forces, such as the influence of austerity hawks in countries like the US and the United Kingdom.
Medical Innovation in America and the Rest of the World
The repeated insistence that the US’s free-market system is unique in the world is true only insofar as those who participate in it spend much more money than their global counterparts. While one might think that increased revenues for private medical companies and insurers would lead to greater innovation, the results increasingly show a decline, not a triumph, for US scientific innovation in this field. A 2011 study by the US firm PricewaterhouseCoopers (PwC) found that while the US continues to spend more on R&D in absolute terms than any other country in the world, fast-rising countries like China and Brazil (both of which use health care systems closer to or identical with single-payer models) are catching up to the US’s lead. PwC also observes that the US ranks fourth, behind China, India, and Brazil, in early-stage entrepreneurial activity. (India, it should be noted, has a privately driven health care market like the US.) Furthermore, while the US does spend a great deal on R&D per capita, that spending rate is declining, unlike in countries like Japan, where a universal health care system is in place. Take this graph, formed from the data collected by PwC:
The US undeniably spends more money than any other country on health care, yet countries like Japan and Israel manage to spend more on R&D as a percentage of GDP while spending far less in absolute terms. Israel and Japan, both of which have widely respected and efficient health care systems, use single-payer rather than free market systems, which has not stopped them from increasing investment in urgently needed medical research.
Even a cursory look at the data on US health care R&D and innovation makes a resoundingly simple picture clear: the US spends more money than anybody else, but that doesn’t mean it is better placed to innovate than other countries with developed health care systems and economies. The US has been able to innovate in the field of scientific research because of extremely wealthy private companies who fund the long periods of trial and error, which often result in drugs that don’t or can’t work. To take on the risk of such Herculean projects, large sums of money are needed. But as Ezra Klein writes at Vox, “If innovation is our goal, we can incentivize it in more targeted ways than paying any price for anything drug companies develop.” For the “free market equals better innovation” argument to work, the free market proponent must contend with these international comparisons. Certainly, free markets have led to innovation in health care and other sectors of the US economy, but that is no proof that no other system could achieve greater results. Considering that US citizens and the US government spend more money on health care than any other industrialized nation yet face worse outcomes for doing so, putting a generic idea of innovation on a pedestal isn’t good enough for American citizens.
Innovation figures as only one element in an overall health care scheme. Numerous other debates need to take place in the US to figure out what kind of health care system makes the most sense in terms of patient outcomes and cost efficiency. To speed those debates along, bad arguments, however longstanding or intuitively appealing they may be, need to be cast aside. Scientific and medical research gobbles up a lot of money, but based on the data, there’s no reason to believe that the US’s health care system leads to innovation in a way that mandates maintaining the status quo. Whatever change the members of the US Congress hope to enact must take that into account, lest any new legislation succumb to the same old pitfalls.