Empiric β‐blockers for the prophylaxis of variceal hemorrhage: Cost effective or clinically applicable?
Author(s) -
Rubenstein Joel H.,
Inadomi John M.
Publication year - 2003
Publication title -
Hepatology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 5.488
H-Index - 361
eISSN - 1527-3350
pISSN - 0270-9139
DOI - 10.1053/jhep.2003.50089
Subject(s) - medicine , intensive care medicine
Current guidelines advocate the use of screening endoscopy in patients with cirrhosis to identify those who could benefit from prophylactic therapy to decrease the incidence of initial variceal hemorrhage.1,2 The decision analysis by Spiegel et al. reported in this issue of HEPATOLOGY challenges this practice and illustrates the potential benefit of empiric β-blockers for all patients with compensated cirrhosis. Based on the available literature, one must now decide whether these data should immediately affect clinical practice. The purpose of this editorial is to evaluate this study from an evidence-based perspective, identify the key components that drive its results, and highlight unresolved issues that require further investigation.

Variceal bleeding is a major cause of mortality in patients with cirrhosis.3 At the time of initial presentation with cirrhosis, about 30% of patients with compensated disease and 60% of those with decompensated disease have esophageal varices.4 The annual risk of developing de novo varices after presentation appears to be around 8%.5 In patients with known varices who have never experienced variceal hemorrhage, the annual risk of bleeding is between 2% and 70%, depending on the size of the varices, the severity of liver disease, and endoscopic criteria.6 Historically, the 1-month mortality rate associated with variceal hemorrhage has been as high as 50%; even with the advent of endoscopic therapy and improvements in critical care, the mortality rate remains around 20%.7,8 Given the high prevalence of varices, the high risk of initial hemorrhage from esophageal varices, and the high mortality rate of hemorrhage, primary prophylaxis is an attractive strategy. Randomized controlled trials evaluating primary prophylaxis with nonselective β-blockers have been published.
Based on the results of these trials, 3 well-designed meta-analyses have been performed, all showing that β-blockers reduce the risk of a first variceal hemorrhage by about 50%; all also showed a trend toward improved survival.4,5,9 As a result, the American College of Gastroenterology and the American Association for the Study of Liver Diseases have recommended screening endoscopy every 2 years in patients with cirrhosis, followed by pharmacologic prophylaxis in patients with large varices.1,2

More recently, randomized controlled trials evaluating prophylactic endoscopic band ligation of esophageal varices have been performed. A meta-analysis found a relative risk of variceal hemorrhage with band ligation, compared with no therapy, of 0.36 (95% CI, 0.26-0.50), and a relative risk for all-cause mortality of 0.55 (0.43-0.71).10 In the same meta-analysis, 4 studies comparing band ligation with β-blockers yielded a pooled relative risk of bleeding with band ligation of 0.48 (0.24-0.96), but no difference in mortality. This raises the question of whether the decreased risk of bleeding is worth the cost of multiple endoscopies in the absence of a survival benefit. For such questions, decision analysis is an ideal study design. The field of quantitative analysis has at its disposal a variety of tools designed to assess competing strategies under conditions of uncertainty; cost-effectiveness analysis is a subset of decision analysis that compares the resources consumed and the benefits derived from different strategies when resources are limited.11-19

Quantitative analyses evaluating interventions to prevent initial variceal hemorrhage in patients with cirrhosis have previously been published. The first compared propranolol with sclerotherapy but did not consider variceal band ligation.20 That study assumed that 15% of participants would discontinue therapy because of adverse effects, but otherwise assumed perfect patient adherence.
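To make the cost-effectiveness framework concrete, the comparison reduces to an incremental cost-effectiveness ratio: the extra cost of one strategy over another divided by the extra health benefit gained. The sketch below uses entirely hypothetical costs and quality-adjusted life-year (QALY) figures, not numbers from Spiegel et al. or any published model; only the structure of the calculation follows standard methodology.

```python
# Illustrative incremental cost-effectiveness ratio (ICER) calculation.
# All cost and QALY figures are hypothetical placeholders.

def icer(cost_a, effect_a, cost_b, effect_b):
    """ICER of strategy B versus strategy A, in dollars per QALY gained."""
    return (cost_b - cost_a) / (effect_b - effect_a)

# Hypothetical 5-year discounted figures per patient:
beta_blocker = {"cost": 1_000.0, "qalys": 3.50}   # empiric beta-blockers
band_ligation = {"cost": 6_000.0, "qalys": 3.55}  # screening + band ligation

ratio = icer(beta_blocker["cost"], beta_blocker["qalys"],
             band_ligation["cost"], band_ligation["qalys"])
print(f"ICER of band ligation vs. beta-blockers: ${ratio:,.0f} per QALY")
```

With these placeholder inputs, the more effective but more expensive strategy costs $100,000 per QALY gained, which a decision maker would then weigh against a willingness-to-pay threshold.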
A second published study did not perform a formal cost-effectiveness analysis and considered only high rates of adherence to therapy.21 Furthermore, both studies assumed that varices had already been documented by endoscopy; thus, neither examined whether screening should be initiated. More recently, Arguedas et al. created a Markov process to compare strategies of observation, empiric β-blockers without screening endoscopy, screening followed by β-blockers for appropriate patients, and screening followed by band ligation over a 5-year time horizon.22 They assumed a 15% rate of medication intolerance but otherwise did not account for nonadherence to therapy. The study concluded that the preferred

From the VA Center for Practice Management and Outcomes Research and the Division of Gastroenterology, Department of Medicine, University of Michigan Medical Center, Ann Arbor, MI. Address reprint requests to: John M. Inadomi, M.D., VA Medical Center (111D), 2215 Fuller Rd., Ann Arbor, MI 48105. E-mail: jinadomi@umich.edu; fax: 734-761-7549. This is a US government work. There are no restrictions on its use.
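A Markov process of the kind Arguedas et al. used tracks a cohort through discrete health states, applying fixed annual transition probabilities over the model's time horizon. The sketch below is a minimal illustration of that technique, not a reconstruction of the published model: the 8% annual incidence of de novo varices and the roughly 20% mortality associated with a bleeding episode come from figures cited in this editorial, while every other state and transition probability is a hypothetical placeholder.

```python
import numpy as np

# Minimal Markov cohort sketch of primary-prophylaxis modeling.
# States and most probabilities are illustrative placeholders.

states = ["no varices", "varices", "post-bleed", "dead"]

# Annual transition matrix (each row sums to 1).
P = np.array([
    # no varices  varices  post-bleed  dead
    [0.90,        0.08,    0.00,       0.02],  # 8%/yr develop varices (cited above)
    [0.00,        0.83,    0.12,       0.05],  # 12%/yr first bleed (hypothetical)
    [0.00,        0.00,    0.80,       0.20],  # ~20% mortality after a bleed
    [0.00,        0.00,    0.00,       1.00],  # death is absorbing
])

cohort = np.array([1.0, 0.0, 0.0, 0.0])  # everyone starts without varices
for year in range(5):                     # 5-year horizon, as in Arguedas et al.
    cohort = cohort @ P                   # one annual Markov cycle

print({s: round(p, 3) for s, p in zip(states, cohort)})
```

In a full cost-effectiveness analysis, each state would also carry an annual cost and utility weight, and each prophylaxis strategy would get its own transition matrix (e.g., a lower bleeding probability under β-blockers), allowing costs and QALYs to be accumulated and compared across strategies.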