Responding to Critics of Medicaid Expansion Study
WILL’s Medicaid study showing a net negative impact on Wisconsin citizens upset a lot of people who are committed to the fiction of “free” Medicaid. Much of the reaction came from partisans, and that’s to be expected. But an article in the Milwaukee Journal Sentinel misleadingly implied that “economists and policy analysts” as a group had criticized the study. To be sure, the author of the article found a couple of academics to offer criticism. One even called the study “garbage,” partisan language that professionals rarely use. It is, in fact, the criticisms offered by these two academics that miss the mark.
We stand by our research and methodology, which is commonplace in econometrics. And while we don’t mind criticism from Guy Boulton or other academics, it is worth explaining why these critiques miss the mark:
1. The article argues that we cannot estimate the impact of Medicaid expansion in Wisconsin based on other states’ experiences because some states expanded Medicaid from different starting points. In other words, some states expanded from a very low level of benefits and others from more generous levels. We disagree that these differences undermine our findings. We were aware of these differences and used control variables to attempt to account for differences between states. Moreover, when we change the model to that suggested by critics, the results show that Medicaid expansion actually costs more! The author of the Journal Sentinel article knew that because we told him. Yet he failed to share that information with his readers.
Here are the details. There is indeed variation in where states started out prior to expansion, but a number of states did provide extensive benefits – meaning the average we calculated accounts for states in situations both different from and very similar to Wisconsin’s, like any good average. As noted in the Journal Sentinel piece, states like Arizona and Vermont offered extensive coverage prior to taking Medicaid dollars. The inclusion of these states in our model means that we are not assuming that the effects of expansion in Wisconsin are based on projections from states that started expansion at 0% of the Federal Poverty Level.
That said, we conducted an additional analysis based on this question that only used data from states that had extensive benefits prior to expansion (those more similar to Wisconsin). In this more conservative model, the results were even stronger! The estimated effect of expansion was a $299 increase in spending per person, compared to the $177 per person reported in our paper. This analysis has been uploaded to our website here. Although we can’t say for sure, there are a number of reasons that this could have happened. Perhaps the most likely is that states that offered more extensive benefits prior to expansion had less uncompensated care than states that expanded from a lower tier, and thus did not see the cost reductions that higher expanders could. But the bottom line: the model in our paper presents, if anything, conservative results.
While we have no desire to pick fights with a newspaper, we have to point out that the author of this piece, Guy Boulton, was fully aware of this reanalysis. We told him about it and we shared it with him. But he failed to mention it. Whether this was an honest mistake or whether the reanalysis was omitted because it would have undermined the paper’s narrative about our conclusion being “wrong,” we can’t say. But we can say that readers were left in the dark.
2. Our methodology is in line with similar studies by academics and scholars across the ideological spectrum. Our study’s findings have been endorsed by others in academia and healthcare policy.
The methodology that is so criticized is actually commonplace. The following groups have published studies that estimate the policy impact on one state using data from other states: the National Bureau of Economic Research, the Brookings Institution, the University of Chicago Journal of Legal Studies, and Stanford Law School. Not exactly far-right organizations.
Additionally, our Medicaid study was read by University of Wisconsin – La Crosse Professor Adam Hoffner, who remarked:
“I recently had the opportunity to review “The Impact of Medicaid Expansion” by Flanders and Williams. I find their methodology – a panel fixed effects regression model – to be a reasonably sound methodological approach. I believe that a deeper understanding of the role Medicaid expansion plays on private health insurance costs is crucial to inform policy in Wisconsin. I think the work by Flanders and Williams offers key insights for Wisconsin policymakers and voters to consider as we discuss reform to Medicaid.”
Our study has additionally received support from the Mackinac Center in Michigan. And of course, Noah Williams, Professor of Economics at the University of Wisconsin, is a co-author. The article’s suggestion that “economists and policy analysts” as a group criticized the study is false; only the ones the article chose to quote did so.
3. The article argues that the time frame of the study – ending in 2014 – does not allow sufficient time to observe the full effects of expansion. This is incorrect: several states took expansion early, and we have more than one year of data for several additional states.
First, several states used a waiver to take Medicaid expansion early. For these states, we have additional years of data on the relationship between expansion and costs. Moreover, states that expanded Medicaid prior to July of 2013 are included as having taken expansion for 2013. It is true that we do not have as many years post-expansion as pre-expansion – this is a result of data availability. However, negotiations between insurers and healthcare providers don’t happen in a vacuum. Particularly in left-leaning states, Medicaid expansion was known to be coming and contracts may well have been negotiated based on that expectation. We maintain this is a sound approach to study the question. While it is true that we had more data on early expanders, there is no reason to expect that later expanders have had a different experience. With a study, it is always possible to say that more time and data could yield different results. But we have enough data and time to conduct an analysis.
4. Kaiser’s error in describing certain data as per person rather than per enrollee affects the amount of increased costs for Wisconsin residents from a Medicaid expansion, but it does not affect our conclusion that the increases would happen and would be very large. Again, the author of the Journal Sentinel piece knew this but failed to tell his readers.
There is one critique brought up in this process that was valid, but the error was not ours. We used data from the Kaiser Family Foundation that was described as the per capita cost of private health insurance. It has been noted that this is mislabeled data from the Centers for Medicare & Medicaid Services, and should be described as “per enrollee.” If this is correct, our estimated statewide net effect, based on about two-thirds of Wisconsinites having private insurance, would be $400 million instead of $600 million. That is less than our initial estimate, but it is still large and still significant. That said, it remains unclear whether cost-shifting happens to those with other public insurance, such as Medicare, which could serve to further increase that figure.
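As a rough sketch of how the statewide figure scales, consider the back-of-the-envelope calculation below. The population and coverage numbers are approximate assumptions for illustration only, not the study’s exact inputs, so the total lands in the same ballpark as the published figures rather than reproducing them to the dollar.

```python
# Back-of-the-envelope check of the statewide cost-shift arithmetic.
# All inputs are approximate assumptions for illustration only.

def statewide_cost_shift(per_insured_increase: float, insured_population: float) -> float:
    """Total dollars shifted onto the privately insured."""
    return per_insured_increase * insured_population

# Roughly two-thirds of Wisconsin's ~5.8 million residents hold private insurance.
insured = 5_800_000 * (2 / 3)

# A $177-per-person increase yields a total on the order of the ~$600 million
# figure; a smaller per-enrollee increase would shrink it toward ~$400 million.
total = statewide_cost_shift(177, insured)
print(f"Approximate statewide cost shift: ${total / 1e6:.0f} million")
```

The point of the sketch is simply that the statewide total is the per-insured-person increase multiplied by the insured population, so correcting the per-person input rescales the total without changing its sign or order of magnitude.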
5. The article argues that our study implies an unreasonable cost for each new enrollee.
The article notes that if estimates from the Legislative Fiscal Bureau are correct and only 76,000 new Wisconsinites enroll in Medicaid, the cost shift we estimate would be about $15,065 per person. In other words, the argument is that these new enrollees would have to incur costs $15,000 above the Medicaid reimbursement to have the impact we estimated. But that’s not true. There are many estimates out there on how many people would enroll, some of which are far higher. For instance, the Urban Institute – hardly a right wing think tank – estimated last year that 176,000 people would enroll in the state. That more may enroll than expected is evidenced by Wisconsin’s 2008 expansion, where enrollment and costs far exceeded projections and led to the program being capped and eventually discontinued. If enrollment is more along the lines of the Urban Institute number, the cost shift would be approximately $6,250 per Medicaid enrollee, far more reasonable.
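The per-enrollee arithmetic above can be sanity-checked directly. The implied statewide total below is derived from the article’s own $15,065-per-enrollee figure rather than taken from the study, so treat it as an illustration of the division, not an exact estimate.

```python
# Sanity-check of the per-enrollee cost-shift arithmetic under the two
# enrollment scenarios discussed above. The implied statewide total is
# backed out of the $15,065-per-enrollee figure; illustration only.

LFB_ENROLLEES = 76_000       # Legislative Fiscal Bureau projection
URBAN_ENROLLEES = 176_000    # Urban Institute projection

implied_total = LFB_ENROLLEES * 15_065   # ~ $1.14 billion statewide

per_enrollee_lfb = implied_total / LFB_ENROLLEES
per_enrollee_urban = implied_total / URBAN_ENROLLEES

print(f"LFB scenario:   ${per_enrollee_lfb:,.0f} per enrollee")
print(f"Urban scenario: ${per_enrollee_urban:,.0f} per enrollee")
```

Spreading the same implied total over the Urban Institute’s larger enrollment projection brings the per-enrollee figure down into the neighborhood of the roughly $6,250 cited above.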
6. Our study’s critics take the extraordinary position that increased costs can’t raise prices.
The article cites a former Obama administration official who argues that providers could not raise their prices because they would already have raised them as far as the market would bear prior to Medicaid expansion. This is, frankly, economically illiterate. A cost shock that affects the entire market changes what the sellers (in this case, health care providers) are willing to offer their services for. When the price of something that all market participants use – say, gasoline in the trucking industry – goes up, then prices go up. Similarly, if providers across the state experience the increased cost of uncompensated care, then prices go up. In Econ 101 terms, the supply curve “shifts.”
7. At the end of the day, some folks are such strong believers in Medicaid expansion that they cannot wrap their minds around what conservatives have said for years – that a major expansion of government-run healthcare in Wisconsin is going to have a massive fiscal “shock” on the private sector.
We did a robust study using commonplace methodology in academia to help to inform the debate over Medicaid expansion. Too many people want to say that Medicaid expansion is “free” and completely ignore the costs to the private sector. We have sought to fill that void.
The article ignores the rigorous way we controlled for these and other factors in our study to reach a fair comparison – and to arrive at what is, to our knowledge, the only full analysis to date of the cost of Medicaid expansion to Wisconsinites.
The bottom line is that the Journal Sentinel story is, at its core, a debate between academics. There are valid critiques of our study, as there are of any academic paper. One can go to an academic conference and regularly see preeminent scholars assailed by grad students over a missing control variable in their model. We stand behind our work in this paper, and we hope it remains a useful piece of the debate over whether to expand Medicaid in Wisconsin.