from The Academic Health Economists’ Blo… at https://bit.ly/2XcvZSR on April 6, 2020 at 12:03PM
Every Monday our authors provide a round-up of some of the most recently published peer reviewed articles from the field. We don’t cover everything, or even what’s most important – just a few papers that have interested the author. Visit our Resources page for links to more journals or follow the HealthEconBot. If you’d like to write one of our weekly journal round-ups, get in touch.
Value of information analytical methods: report 2 of the ISPOR value of information analysis emerging good practices task force. Value in Health [PubMed] Published 1st March 2020
I’ve been waiting for the VoI Task Force reports since I heard about the Task Force’s launch, and they didn’t disappoint! VoI (value of information) is a methodology to quantify the consequences of uncertainty in a decision, and to guide how we should invest in new research to reduce that uncertainty. This ISPOR Task Force developed two reports: the first on the concepts and role of VoI, and the second, featured here, on the methods.
These two reports explain what VoI is, why we should do it in cost-effectiveness analysis, and, most importantly, how to do it well. Make sure you don’t miss the supplementary appendix online [PDF], with an excellent diagram on how to choose between methods. I found the algorithms for computing quantities such as EVSI (expected value of sample information) particularly useful to understand how I could actually do this in practice. It wouldn’t be easy, but thanks to this report it may just be achievable for VoI non-experts.
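To make the idea concrete, here is a minimal sketch (my own, not taken from the report) of the simplest VoI quantity, the expected value of perfect information (EVPI), computed by Monte Carlo from hypothetical probabilistic sensitivity analysis output. It's in Python rather than the specialist tools the report discusses, and all numbers are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sim = 100_000

# Hypothetical probabilistic sensitivity analysis output: simulated net
# monetary benefit of two strategies under parameter uncertainty
# (illustrative distributions, not from any real analysis).
nb_standard = rng.normal(loc=10_000, scale=2_000, size=n_sim)
nb_new = rng.normal(loc=10_500, scale=3_000, size=n_sim)
nb = np.column_stack([nb_standard, nb_new])  # shape: (n_sim, 2 strategies)

# Value under current information: choose the strategy with the best
# expected net benefit, and accept that expectation.
value_current = nb.mean(axis=0).max()

# Value under perfect information: for each simulated parameter set we
# could pick the best strategy, so average the per-simulation maximum.
value_perfect = nb.max(axis=1).mean()

evpi = value_perfect - value_current  # per-patient EVPI
print(f"Per-patient EVPI: {evpi:.0f}")
```

EVSI calculations follow the same logic but require simulating hypothetical study data and updating the model's parameters with it, which is where the report's algorithms come in.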
In sum, the reports are pretty comprehensive, and a must read for all analysts!
The effects of communicating uncertainty on public trust in facts and numbers. Proceedings of the National Academy of Sciences [PubMed] Published 23rd March 2020
Keeping with the uncertainty topic, I recommend this remarkable study by Anne Marthe van der Bles and colleagues on communicating uncertainty. One might think that being explicit about the uncertainty in numbers would reduce the public’s trust in those numbers and their trust in the institution producing them. To know if this is actually the case, the authors conducted a comprehensive set of studies: four online experiments, one meta-analysis (of the online experiments), and one field experiment on the BBC News website. It’s absolutely a bumper study!
In the online and field experiments, participants were asked to read a text about a numerical fact, such as the rise in the number of unemployed people in the UK (“an estimated 116,000”). The text varied in how uncertainty about the fact was presented: no uncertainty, uncertainty communicated as a numerical range, or uncertainty communicated as a verbal statement.
Spoiler alert on the results! They found that people perceived uncertainty when it was communicated; that it did not affect their emotional response to the numbers; and that it did not affect their trust in the institution generating the numbers. Interestingly, communicating uncertainty did reduce people’s trust in the numbers themselves, but the effect was greater when uncertainty was communicated verbally rather than numerically.
This is quite a reassuring study for cost-effectiveness analysis. It means that we can and should express the uncertainty in our findings, even if we’re presenting results to non-technical audiences.
R and Shiny for cost-effectiveness analyses: why and when? A hypothetical case study. PharmacoEconomics [PubMed] Published 31st March 2020
Excel vs R for cost-effectiveness analysis: the battle continues! On the R side, we’ve seen a few initiatives promoting it as the software of choice for cost-effectiveness analysis. Although not many people come to Excel’s defence, it continues to be used in many analyses.
This thoughtful paper aims to compare Excel to R for cost-effectiveness analysis in terms of their capability, data safety, model building, usability for technical and non-technical users, and model adaptability. To do that, Rose Hart and colleagues built the same cohort model in Excel and in R. The model was informed by simulated patient level data, which was analysed in R too.
You may not be surprised to learn that R came out on top in most domains. If the analysis of patient-level data is done in R, building the model in R too removes the need to copy and paste outputs into the model. To keep patient-level data safe, they can be deleted from the R model once analysed, retaining only the summary results that inform the cost-effectiveness analysis. And when adapting the model to other decision problems, R has the advantage of avoiding the copy-pasting of new inputs.
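For readers used to Excel, here is a flavour of what a scripted cohort model looks like: a generic three-state Markov cohort sketch, written in Python for illustration rather than R, and not the authors’ model. All transition probabilities, costs and utilities are made-up numbers.

```python
import numpy as np

# Illustrative three-state Markov cohort model: Healthy, Sick, Dead.
P = np.array([
    [0.85, 0.10, 0.05],   # transitions from Healthy
    [0.00, 0.70, 0.30],   # transitions from Sick
    [0.00, 0.00, 1.00],   # Dead is absorbing
])
cost = np.array([100.0, 1_500.0, 0.0])   # annual cost per state
utility = np.array([0.90, 0.60, 0.0])    # annual QALY weight per state
discount = 0.035                         # annual discount rate

state = np.array([1.0, 0.0, 0.0])        # whole cohort starts Healthy
total_cost = 0.0
total_qalys = 0.0
for year in range(40):                   # 40-year time horizon
    df = 1.0 / (1.0 + discount) ** year
    total_cost += df * (state @ cost)
    total_qalys += df * (state @ utility)
    state = state @ P                    # advance the cohort one cycle

print(f"Discounted cost:  {total_cost:,.0f}")
print(f"Discounted QALYs: {total_qalys:.2f}")
```

The appeal the paper describes is visible even in a toy like this: every input and calculation sits in plain text, so swapping in new inputs or re-running with fresh patient-level analyses needs no copy-pasting between files.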
The downside of R was that the model took longer to build, although unfortunately the authors didn’t specify how much longer. And wrapping the R model in a Shiny app added “additional coding complexity”… so this is one area where Excel comes out on top. Furthermore, the authors acknowledge that usability depends on the user’s familiarity with R, unless the R model includes a Shiny interface.
I understand the push to move from Excel to R. R has lots of advantages! The issue for analysts who are not proficient in R, like myself, is that learning another language is a steep climb. If building a model in R takes longer than in Excel even for those who are proficient in R, how much longer would it take a beginner? It would mean needing much more time to build the model, and time is quite a precious commodity.
I enjoyed reading this paper on the comparison of R and Excel, and it did motivate me to do more to learn how to build a model in R. On the practicalities of moving to R, I agree with the authors that: “When choosing the software for an economic model, we advise consideration of the lifetime purpose, audience and technical requirement of the analysis”. And I’d add that, in some cases, an Excel model may strike the right balance between the needs of the decision problem and the resources available to address it.
How to create a quick Twitter Poster to share new research (includes templates). YouTube Published 24th March 2020
Lastly, not a peer-reviewed paper, but a YouTube video!
In these exceptional times of social distancing, it may be difficult to divert our attention from COVID-19 news to keep abreast of developments in health economics. This blog is, of course, a great resource (haha, if I say so myself)… But reading a long post on papers may not be the relaxing read that we might be craving.
How can we keep learning about new research in a fun and relaxed way? Mike Morrison had the great idea of presenting research posters as GIFs, which are simply animated images. By publishing your research poster as a GIF on Twitter, you may reach many more people than if you stick with the traditional outputs of papers and conference abstracts. And we will all learn more, in a relaxed way!
The GIF poster is surprisingly easy to create. Mike’s video has the full instructions and I can vouch that it does work. Essentially, it’s a set of PowerPoint slides exported as an animated GIF file and published on Twitter.
Health economics community: anyone up for sharing your research in this way? #NewHEOR