Sam Watson’s journal round-up for 26th June 2017

from The Academic Health Economists’ Blo… at http://bit.ly/2u8fCVN on June 26, 2017 at 02:32PM

Every Monday our authors provide a round-up of some of the most recently published peer reviewed articles from the field. We don’t cover everything, or even what’s most important – just a few papers that have interested the author. Visit our Resources page for links to more journals or follow the HealthEconBot. If you’d like to write one of our weekly journal round-ups, get in touch.

Future and potential spending on health 2015–40: development assistance for health, and government, prepaid private, and out-of-pocket health spending in 184 countries. The Lancet. [PubMed] Published 20 May 2017.

The colossal research collaboration that is the Global Burden of Disease Study is well known for producing estimates of deaths and DALYs lost across the world due to a huge range of diseases. These figures have proven invaluable as a source of information to inform disease modelling studies and to help guide the development of public health programs. In this study, the collaboration turn their hands to modelling future health care expenditure. Predicting the future of any macroeconomic variable is tricky, to say the least. The approach taken here is to (1) model GDP to 2040 using an ensemble method, taking the ‘best performing’ models from the over 1,000 used (134 were included); (2) model all-sector government spending, out-of-pocket spending, and private health spending as a proportion of GDP in the same way, but with GDP as an input; and then (3) use a stochastic frontier approach to model maximum ‘potential’ spending. This latter step is an attempt to make the results potentially more useful by analysing different scenarios that might change overall health care expenditure by considering different frontiers. All of these steps would conceptually add a lot of uncertainty: the different weights of each model in the ensemble, the prediction uncertainty from each model, and uncertainty in inputs such as population size and demographic structure, all of which is propagated through the three-step process. And this is without taking into account that health care spending at a national level is the result of a complex political decision making process, which can affect national income and the prioritisation of health care in unforeseen ways (Brexit, anyone?). Despite this, the predictions seem quite certain: health spending per capita is predicted to rise from $1,279 in 2014 to $2,872 in 2040, with a 95% confidence interval (or do they mean prediction interval?) of $2,426 to $3,522.
It may well be a good model for average spending, but I suspect uncertainty (at least of a Bayesian kind) should be higher for a predictive model looking 25 years into the future based on 20 years of data. The non-standard use of stochastic frontier analysis, which is typically a way of estimating technical efficiency, is also tricky to follow. The frontier is argued in this paper to be the maximum amount that countries at a similar level of development spend on health care. This would suggest it is assumed spending cannot go higher than that of a country’s highest-spending peer, a potentially strong assumption. Needless to say, these are the best predictions we currently have for future health care expenditure.
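The three-step pipeline, and why its uncertainty should compound, can be caricatured in a few lines. This is a toy Monte Carlo sketch with entirely made-up numbers, not the GBD collaboration’s actual model: the ‘ensemble’ is just three trend models with hypothetical weights, and the ‘frontier’ is taken as a fixed maximum spending share among imagined peer countries.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sims, years_ahead = 10_000, 25

# Step 1: ensemble GDP forecast. Each simulation picks one trend model
# (by its ensemble weight) and draws a growth rate around that model's mean.
growth_means = np.array([0.015, 0.020, 0.025])   # assumed annual growth rates
weights = np.array([0.2, 0.5, 0.3])              # hypothetical ensemble weights
model = rng.choice(3, size=n_sims, p=weights)
growth = rng.normal(growth_means[model], 0.005)
gdp_2040 = (1 + growth) ** years_ahead           # GDP relative to 2015

# Step 2: health spending as a share of GDP, itself uncertain.
share = rng.normal(0.09, 0.01, size=n_sims)
spend_2040 = share * gdp_2040

# Step 3: 'potential' spending frontier - here simply the maximum spending
# share among hypothetical peer countries, applied to forecast GDP.
frontier_2040 = 0.12 * gdp_2040

lo, mid, hi = np.percentile(spend_2040, [2.5, 50, 97.5])
print(f"median spending (relative to 2015 GDP): {mid:.3f}")
print(f"95% interval: [{lo:.3f}, {hi:.3f}]")
print(f"median 'potential' spending: {np.median(frontier_2040):.3f}")
```

Even in this cartoon version, small uncertainties in growth rates and spending shares compound over 25 years into a wide interval, which is why the narrowness of the published intervals raises an eyebrow.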

Discovering effect modification in an observational study of surgical mortality at hospitals with superior nursing. Journal of the Royal Statistical Society: Series A. [arXiv] Published June 2017.

An applied econometrician can find endogeneity everywhere. Such is the complexity of the social, political, and economic world. Everything is connected in some way. It’s one of the reasons I’ve argued before against null hypothesis significance testing: no effect is going to be exactly zero. Our job is one of measurement of the size of an effect and, crucially for this paper, what might affect the magnitude of these effects. This might start with a graphical or statistical exploratory analysis before proceeding to a confirmatory analysis. This paper proposes a method of exploratory analysis for treatment effect modifiers and examines the effect of superior nursing on treatment outcomes, which I think is a sensible scientific approach. But how does it propose to do it? Null hypothesis significance testing! Oh no! Essentially, the method involves a novel procedure for testing whether treatment effects differ by group, allowing for potential unobserved confounding, where the groups themselves are also formed in a novel way. For example, the authors ask how much bias would need to be present for their conclusions to change. In terms of the effects of superior nurse staffing, the authors estimate that the beneficial treatment effect is least sensitive to bias in the group of patients with the most serious conditions.
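The “how much bias would overturn the conclusion” idea can be illustrated with a textbook Rosenbaum-style sensitivity bound for a matched-pairs sign test. To be clear, this is a generic sketch with hypothetical numbers, not the paper’s actual procedure: gamma is the factor by which an unobserved confounder could shift the odds of being at the better-staffed hospital within a pair.

```python
from math import comb

def worst_case_p(n_pairs, n_treated_better, gamma):
    """One-sided sign-test p-value for matched pairs, maximised over an
    unobserved confounder that can multiply the odds of treatment by gamma.
    (A textbook Rosenbaum-style bound, not the paper's exact method.)"""
    p_plus = gamma / (1 + gamma)  # worst-case chance a pair favours treatment
    return sum(comb(n_pairs, k) * p_plus**k * (1 - p_plus)**(n_pairs - k)
               for k in range(n_treated_better, n_pairs + 1))

# Hypothetical data: 100 matched pairs, 70 favouring the better-staffed hospital.
for gamma in (1.0, 1.5, 2.0):
    print(f"gamma = {gamma}: worst-case p = {worst_case_p(100, 70, gamma):.4f}")
```

At gamma = 1 (no hidden bias) the effect looks highly significant, but as gamma grows the worst-case p-value climbs past 0.05; the gamma at which that happens is the headline measure of how robust a subgroup’s effect is, and the paper’s finding is that this threshold is largest for the sickest patients.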

Incorporation of a health economic modelling tool into public health commissioning: Evidence use in a politicised context. Social Science & Medicine. [PubMed] Published June 2017.

Last up, a qualitative research paper (on an economics blog! I know…). Many health economists are involved in trying to encourage the incorporation of research findings into health care decision making and commissioning. The political decision making process often ends in inefficient or inequitable outcomes despite good evidence on what makes good policy. This paper explored how commissioners in an English local authority viewed a health economics decision tool for planning diabetes services. This is a key bit of research if we are to make headway in designing tools that actually improve commissioning decisions. Two key groups of stakeholders were involved: public health managers and politicians. The latter prioritised intelligence, local opinion, and social care agendas over scientific evidence from research, which was preferred by the former group. The push and pull between the different approaches meant the health economics tool was used as a way of supporting the agendas of different stakeholders rather than as a means of addressing complex decisions. For a tool to be successful, it would seem to need to speak to or about the local population to which it is going to be applied. Well, that’s my interpretation. I’ll leave you with this quote from an interview with a manager in the study:

Public health, what they bring is a, I call it a kind of education scholarly kind of approach to things … whereas ‘social care’ sometimes are not so evidence-based-led. It’s a bit ‘well I thought that’ or, it’s a bit more fly by the seat of pants in social care.
