Assistant Professor, Management Science
Lauren Cipriano is an Assistant Professor in Management Science at the Ivey Business School. Lauren's research interests focus on the application of statistics, economics, operations research, and systems analysis to health policy problems.
Teaching
- Decision Making with Analytics, HBA core
Education
- B.Sc., Honors Biochemistry, Western University
- HBA, Ivey Business School, Western University
- M.S., Statistics, Stanford University
- Ph.D., Management Science, Stanford University
Recent Refereed Articles
Cerasuolo, J.O., Cipriano, L.E., Sposato, L.A.,
2017, "The complexity of atrial fibrillation newly diagnosed after ischemic stroke and transient ischemic attack: advances and uncertainties", Current Opinion in Neurology, February 30(1): 28 - 37.
Abstract: Purpose of review: Atrial fibrillation is being increasingly diagnosed after ischemic stroke and transient ischemic attack (TIA). Patient characteristics, frequency and duration of paroxysms, and the risk of recurrent ischemic stroke associated with atrial fibrillation detected after stroke and TIA (AFDAS) may differ from atrial fibrillation already known before stroke occurrence. We aim to summarize major recent advances in the field, in the context of prior evidence, and to identify areas of uncertainty to be addressed in future research.
Recent findings: Half of all atrial fibrillations in ischemic stroke and TIA patients are AFDAS, and most of them are asymptomatic. Over 50% of AFDAS paroxysms last less than 30 s. The rapid initiation of cardiac monitoring and its duration are crucial for its timely and effective detection. AFDAS comprises a heterogeneous mix of atrial fibrillation, possibly including cardiogenic and neurogenic types, and a mix of both. Over 25 single markers and at least 10 scores have been proposed as predictors of AFDAS. However, there are considerable inconsistencies across studies. The role of AFDAS burden and its associated risk of stroke recurrence have not yet been investigated.
Summary: AFDAS may differ from atrial fibrillation known before stroke in several clinical dimensions, which are important for optimal patient care strategies. Many questions remain unanswered. Neurogenic and cardiogenic AFDAS need to be characterized, as it may be possible to avoid some neurogenic cases by initiating timely preventive treatments. AFDAS burden may differ in ischemic stroke and TIA patients, with distinctive diagnostic and treatment implications. The prognosis of AFDAS and its risk of recurrent stroke are still unknown; therefore, it is uncertain whether AFDAS patients should be treated with oral anticoagulants.
Bahit, M.C., Coppola, M.L., Riccio, P.M., Cipriano, L.E., Roth, G.A., Lopes, R.D., Feigin, V.L., Borrego Guerrero, B., De Martino, M., Diaz, A., Ferrante, D., Funaro, F., Lavados, P., Lewin, M.L., Lopez, D.H., Macarrone, P., Marciello, R., Marino, D., Martens, C., Martinez, P., Odriozola, G., Rabinstein, A.A., Saposnik, G., Silva, D., Suasnabar, R., Truelsen, T., Uzcudun, A., Viviani, C.A., Sposato, L.A.,
2016, "First-Ever Stroke and Transient Ischemic Attack Incidence and 30-Day Case-Fatality Rates in a Population-Based Study in Argentina", Stroke, June 47(6): 1640 - 1642.
Abstract: BACKGROUND AND PURPOSE:
Epidemiological data about stroke are scarce in low- and middle-income Latin-American countries. We investigated annual incidence of first-ever stroke and transient ischemic attack (TIA) and 30-day case-fatality rates in a population-based setting in Tandil, Argentina.
We prospectively identified all first-ever stroke and TIA cases from overlapping sources between January 5, 2013, and April 30, 2015, in Tandil, Argentina. We calculated crude and standardized incidence rates. We estimated 30-day case-fatality rates.
We identified 334 first-ever strokes and 108 TIAs. Age-standardized incidence rate per 100 000 for Segi's World population was 76.5 (95% confidence interval [CI], 67.8-85.9) for first-ever stroke and 25.1 (95% CI, 20.2-30.7) for first-ever TIA, 56.1 (95% CI, 48.8-64.2) for ischemic stroke, 13.5 (95% CI, 9.9-17.9) for intracerebral hemorrhage, and 4.9 (95% CI, 2.7-8.1) for subarachnoid hemorrhage. Stroke incidence was slightly higher for men (87.8; 95% CI, 74.6-102.6) than for women (73.2; 95% CI, 61.7-86.1) when standardized for the Argentinean population. Thirty-day case-fatality rate was 14.7% (95% CI, 10.8-19.5) for ischemic stroke, 24.1% (95% CI, 14.2-36.6) for intracerebral hemorrhage, and 1.9% (95% CI, 0.4-5.8) for TIA.
This study provides the first prospective population-based stroke and TIA incidence and case-fatality estimate in Argentina. First-ever stroke incidence was lower than that reported in previous Latin-American studies, but first-ever TIA incidence was higher. Thirty-day case-fatality rates were similar to those of other population-based Latin-American studies.
Joundi, R.A., Cipriano, L.E., Sposato, L.A., Saposnik, G.,
2016, "Ischemic stroke risk in patients with atrial fibrillation and CHA2DS2-VASc score of 1: Systematic review and meta-analysis", Stroke, April 47(5): 1364 - 1367.
Abstract: Background and Purpose—The CHA2DS2-VASc score aims to improve risk stratification of ischemic stroke among patients with atrial fibrillation to identify those who can safely forego oral anticoagulation. Oral anticoagulation treatment guidelines remain uncertain for CHA2DS2-VASc score of 1. We conducted a systematic review and meta-analysis of the risk of ischemic stroke for patients with atrial fibrillation and CHA2DS2-VASc score of 0, 1, or 2 not treated with oral anticoagulation. Methods—We searched MEDLINE, Embase, PubMed, Cochrane Central Register of Controlled Trials, Cochrane Database of Systematic Reviews, and Web of Science from the start of the database up until April 15, 2015. We included studies that stratified the risk of ischemic stroke by CHA2DS2-VASc score for patients with nonvalvular atrial fibrillation. We estimated the summary annual rate of ischemic stroke using random effects meta-analyses and compared the estimated stroke rates with published net-benefit thresholds for initiating anticoagulants. Results—1162 abstracts were retrieved, of which 10 met all inclusion criteria for the study. There was substantial heterogeneity among studies. The summary estimate for the annual risk of ischemic stroke was 1.61% (95% confidence interval 0%–3.23%) for CHA2DS2-VASc score of 1, meeting the theoretical threshold for using novel oral anticoagulants (0.9%), but below the threshold for warfarin (1.7%). The summary incident risk of ischemic stroke was 0.68% (95% confidence interval 0.12%–1.23%) for CHA2DS2-VASc score of 0 and 2.49% (95% confidence interval 1.16%–3.83%) for CHA2DS2-VASc score of 2. Conclusions—Our meta-analysis of ischemic stroke risk in atrial fibrillation patients suggests that those with CHA2DS2-VASc score of 1 may be considered for a novel oral anticoagulant, but because of high heterogeneity, the decision should be based on individual patient characteristics.
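The pooling step this abstract describes — a random-effects summary of annual stroke rates across heterogeneous studies — can be sketched with one common estimator, DerSimonian-Laird. The estimator name is a standard choice, not stated in the paper, and the study counts below are hypothetical, not the meta-analysis data.

```python
import math

def random_effects_pool(events, person_years):
    """Pool annual event rates across studies with a DerSimonian-Laird
    random-effects model on the log-rate scale. Illustrative data only."""
    # Per-study log rates and approximate variances (1/events for a Poisson rate)
    y = [math.log(e / py) for e, py in zip(events, person_years)]
    v = [1.0 / e for e in events]

    # Fixed-effect weights and Cochran's Q heterogeneity statistic
    w = [1.0 / vi for vi in v]
    y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))

    # DerSimonian-Laird between-study variance tau^2 (truncated at zero)
    df = len(y) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)

    # Random-effects weights, pooled estimate, and 95% CI back on the rate scale
    w_re = [1.0 / (vi + tau2) for vi in v]
    y_re = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    lo, hi = math.exp(y_re - 1.96 * se), math.exp(y_re + 1.96 * se)
    return math.exp(y_re), (lo, hi), tau2

rate, ci, tau2 = random_effects_pool(
    events=[12, 30, 7, 18], person_years=[900, 1500, 400, 1100])
print(f"pooled annual rate: {rate:.4f} (95% CI {ci[0]:.4f}-{ci[1]:.4f})")
```

A nonzero tau2 widens the interval, which is how the "substantial heterogeneity" noted in the Results propagates into the wide summary confidence intervals.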
Sposato, L.A., Kapral, M.K., Fang, J., Gill, S.S., Hackam, D.G., Cipriano, L.E., Hachinski, V.,
2015, "Declining incidence of stroke and dementia: Coincidence or prevention opportunity?", JAMA Neurology, December 72(12): 1529 - 1531.
Abstract: Stroke and dementia pose significant threats to the adult brain and share the same treatable risk factors.1 Stroke incidence in high-income countries has been declining,2 coinciding with better risk-factor control. However, hitherto there have been encouraging trends, but no proof, of declining dementia incidence.3 To address this, we analyzed health care administrative data from the Canadian Institute for Health Information for the province of Ontario, Canada.
Sposato, L.A., Cipriano, L.E., Riccio, P.M., Hachinski, V., Saposnik, G.,
2015, "Very Short Paroxysms Account for More than Half of the Cases of Atrial Fibrillation detected after Stroke and TIA: A Systematic Review and Meta-analysis", International Journal of Stroke, August 10(6): 801 - 807.
Abstract: Background: Guidelines suggest that only poststroke atrial fibrillation episodes lasting 30 s or longer should be considered for anticoagulation. However, little evidence supports this recommendation. Aims: We performed a systematic review and meta-analysis to investigate the frequency of poststroke atrial fibrillation lasting less than 30 s in stroke and transient ischemic attack patients. Methods: We searched PubMed, Embase, and Scopus from 1980 to June 30, 2014 for studies reporting the detection of poststroke atrial fibrillation of less than 30 s and of 30 s or longer. The primary endpoint was the proportion of screened patients diagnosed with poststroke atrial fibrillation lasting less than 30 s. The secondary endpoint was the proportion of patients diagnosed with poststroke atrial fibrillation shorter than 30 s among the overall number of patients in whom a poststroke atrial fibrillation was detected after stroke or transient ischemic attack. Results: From 28 290 titles, we included nine studies in the random-effects meta-analysis. Among stroke and transient ischemic attack patients without a history of atrial fibrillation, 9·0% (95% confidence interval: 4·9–14·3) experienced episodes of poststroke atrial fibrillation shorter than 30 s. An additional 6·5% (95% confidence interval: 3·2–10·9) experienced episodes of poststroke atrial fibrillation longer than 30 s. Among all patients with poststroke atrial fibrillation, 56·3% (95% confidence interval: 37·7–74·0) had poststroke atrial fibrillation episodes shorter than 30 s during diagnostic evaluation. Conclusions: The clinical and prognostic significance of poststroke atrial fibrillation episodes shorter than 30 s is unknown. The high frequency of poststroke atrial fibrillation episodes shorter than 30 s justifies further investigation into the risk of stroke recurrence and the risk–benefit profile of anticoagulation for this patient population.
Sposato, L.A., Cipriano, L.E., Saposnik, G., Ruíz Vargas, E., Riccio, P.M., Hachinski, V.,
2015, "Diagnosis of atrial fibrillation after stroke and transient ischaemic attack: a systematic review and meta-analysis", Lancet Neurology, April 14(4): 377 - 387.
Abstract: Background: Among patients with atrial fibrillation, the risk of stroke is highest for those with a history of stroke; however, oral anticoagulants can lower the risk of recurrent stroke by two-thirds. No consensus has been reached about how atrial fibrillation should be investigated in patients with stroke, and its prevalence after a stroke remains uncertain. We did a systematic review and meta-analysis to estimate the proportion of patients newly diagnosed with atrial fibrillation after four sequential phases of cardiac monitoring after a stroke or transient ischaemic attack. Methods: We searched PubMed, Embase, and Scopus from 1980 to June 30, 2014. We included studies that provided the number of patients with ischaemic stroke or transient ischaemic attack who were newly diagnosed with atrial fibrillation. We stratified cardiac monitoring methods into four sequential phases of screening: phase 1 (emergency room) consisted of admission electrocardiogram (ECG); phase 2 (in hospital) comprised serial ECG, continuous inpatient ECG monitoring, continuous inpatient cardiac telemetry, and in-hospital Holter monitoring; phase 3 (first ambulatory period) consisted of ambulatory Holter; and phase 4 (second ambulatory period) consisted of mobile cardiac outpatient telemetry, external loop recording, and implantable loop recording. The primary endpoint was the proportion of patients newly diagnosed with atrial fibrillation for each method and each phase, and for the sequential combination of phases. For each method and each phase, we estimated the summary proportion of patients diagnosed with post-stroke atrial fibrillation using random-effects meta-analyses. Findings: Our systematic review returned 28 290 studies, of which 50 studies (comprising 11 658 patients) met the criteria for inclusion in the meta-analyses.
The summary proportion of patients diagnosed with post-stroke atrial fibrillation was 7·7% (95% CI 5·0–10·8) in phase 1, 5·1% (3·8–6·5) in phase 2, 10·7% (5·6–17·2) in phase 3, and 16·9% (13·0–21·2) in phase 4. The overall atrial fibrillation detection yield after all phases of sequential cardiac monitoring was 23·7% (95% CI 17·2–31·0). Interpretation: By sequentially combining cardiac monitoring methods, atrial fibrillation might be newly detected in nearly a quarter of patients with stroke or transient ischaemic attack. The overall proportion of patients with stroke who are known to have atrial fibrillation seems to be higher than previously estimated. Accordingly, more patients could be treated with oral anticoagulants and more stroke recurrences prevented.
Racosta, J.M., Sposato, L.A., Morrow, S.A., Cipriano, L.E., Kimpinski, K., Kremenchutzky, M.,
2015, "Cardiovascular Autonomic Dysfunction in Multiple Sclerosis: A Meta-Analysis.", Multiple Sclerosis and Related Disorders, March 4(2): 104 - 111.
Abstract: Background and objective: The definition of cardiovascular autonomic dysfunction in patients with multiple sclerosis is controversial. Thus, its true prevalence is unknown. We performed a systematic review and meta-analysis to compare the proportion of patients with multiple sclerosis that would be diagnosed with cardiovascular dysautonomia using a definition of at least one abnormal cardiac autonomic test vs. at least two abnormal studies. Methods: We searched PubMed, Embase, and Scopus from 1980 to December 2013 for publications reporting abnormal autonomic tests in patients with multiple sclerosis. We performed random-effects meta-analyses for calculating the proportion of patients diagnosed with autonomic dysfunction with both definitions. Results: We included 16 studies comprising 611 patients with multiple sclerosis, assessing ≥3 cardiovascular autonomic tests. The proportion of patients with autonomic dysfunction was two-fold higher (p=0.006) when using the definition of only one abnormal autonomic test (42.1%) compared to that using at least two abnormal results (18.8%). Conclusions: We found a wide variation in the proportion of patients with multiple sclerosis diagnosed with cardiovascular dysautonomia by using the two definitions. Consensus is needed to define autonomic dysfunction in patients with multiple sclerosis. In the meantime, we encourage investigators to report results using both thresholds.
Enns, E.A., Cipriano, L.E., Simons, C.T., Kong, C.Y.,
2015, "Identifying Best-Fitting Inputs in Health-Economic Model Calibration: A Pareto Frontier Approach", Medical Decision Making, February 35(2): 170 - 182.
Abstract: Background. To identify best-fitting input sets using model calibration, individual calibration target fits are often combined into a single goodness-of-fit (GOF) measure using a set of weights. Decisions in the calibration process, such as which weights to use, influence which sets of model inputs are identified as best-fitting, potentially leading to different health economic conclusions. We present an alternative approach to identifying best-fitting input sets based on the concept of Pareto-optimality. A set of model inputs is on the Pareto frontier if no other input set simultaneously fits all calibration targets as well or better. Methods. We demonstrate the Pareto frontier approach in the calibration of 2 models: a simple, illustrative Markov model and a previously published cost-effectiveness model of transcatheter aortic valve replacement (TAVR). For each model, we compare the input sets on the Pareto frontier to an equal number of best-fitting input sets according to 2 possible weighted-sum GOF scoring systems, and we compare the health economic conclusions arising from these different definitions of best-fitting. Results. For the simple model, outcomes evaluated over the best-fitting input sets according to the 2 weighted-sum GOF schemes were virtually nonoverlapping on the cost-effectiveness plane and resulted in very different incremental cost-effectiveness ratios ($79,300 [95% CI 72,500–87,600] v. $139,700 [95% CI 79,900–182,800] per quality-adjusted life-year [QALY] gained). Input sets on the Pareto frontier spanned both regions ($79,000 [95% CI 64,900–156,200] per QALY gained). The TAVR model yielded similar results. Conclusions. Choices in generating a summary GOF score may result in different health economic conclusions. The Pareto frontier approach eliminates the need to make these choices by using an intuitive and transparent notion of optimality as the basis for identifying best-fitting input sets.
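The Pareto-optimality criterion this abstract defines — an input set is on the frontier if no other set fits every calibration target at least as well and at least one target strictly better — reduces to a simple dominance check. The labels and per-target error values below are hypothetical, not the paper's calibration data.

```python
def pareto_frontier(input_sets):
    """Return the input sets that are not Pareto-dominated.
    `input_sets` maps a label to a tuple of per-target calibration
    errors, where lower is a better fit."""
    def dominates(a, b):
        # a dominates b: at least as good on every target, strictly better on one
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))

    return {
        name: errs for name, errs in input_sets.items()
        if not any(dominates(other, errs)
                   for other_name, other in input_sets.items()
                   if other_name != name)
    }

# Hypothetical per-target errors for four candidate input sets
candidates = {
    "A": (0.10, 0.30),
    "B": (0.20, 0.10),
    "C": (0.15, 0.35),  # dominated by A on both targets
    "D": (0.25, 0.12),  # dominated by B on both targets
}
print(sorted(pareto_frontier(candidates)))  # → ['A', 'B']
```

Note that A and B each beat the other on one target, so both stay on the frontier — which is how this approach avoids choosing weights to trade the targets off against each other.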
Simons, C.T., Cipriano, L.E., Shah, R.U., Garber, A.M., Owens, D.K., Hlatky, M.A.,
2013, "Transcatheter Aortic Valve Replacement in Non-Surgical Candidates with Severe, Symptomatic Aortic Stenosis: a Cost-Effectiveness Analysis", Circulation-Cardiovascular Quality and Outcomes, July 6(4): 419 - 428.
Abstract: Background—Transcatheter aortic valve replacement (TAVR) seems to improve the survival and quality of life of patients with aortic stenosis ineligible for surgical aortic valve replacement. Methods and Results—We used a decision analytic Markov model to estimate lifetime costs and benefits in a hypothetical cohort of patients with severe, symptomatic aortic stenosis who were ineligible for surgical aortic valve replacement. The model compared transfemoral TAVR with medical management and was calibrated to the Placement of Aortic Transcatheter Valves (PARTNER) trial. TAVR increased life expectancy from 2.08 to 2.93 years and quality-adjusted life expectancy from 1.19 to 1.93 years. TAVR also reduced subsequent hospitalizations by 1.40 but increased complications, particularly stroke (from 1% to 11% lifetime risk), and also increased lifetime costs from $83,600 to $169,100. The incremental cost-effectiveness of TAVR was $116,500 per quality-adjusted life-year gained ($99,900 per life-year gained). Results were robust to reasonable changes in individual variables but were sensitive to the level of annual healthcare costs caused by noncardiac diseases and to the projected life expectancy of medically treated patients. Conclusions—TAVR seems to be an effective but somewhat expensive alternative to medical management among patients with symptomatic aortic stenosis ineligible for surgery. TAVR is more cost-effective for patients with a lower burden of noncardiac disease.
Tramontano, A.C., Cipriano, L.E., Kong, C.Y., Shepard, J.O., Lanuti, M., Gazelle, G.S., McMahon, P.M.,
2013, "Microsimulation modeling predicts a survival benefit after radiofrequency ablation and stereotactic body radiotherapy when compared to radiotherapy in the treatment of medically inoperable stage I non small cell lung cancer", American Journal of Roentgenology, May 200(5): 1020 - 1027.
Abstract: OBJECTIVE. A subset of patients with stage IA and IB non–small cell lung cancer (NSCLC) is ineligible for surgical resection and undergoes radiation therapy. Radiofrequency ablation (RFA) and stereotactic body radiotherapy are newer potentially attractive alternative therapies. MATERIALS AND METHODS. We added RFA and stereotactic body radiotherapy treatment modules to a microsimulation model that simulates lung cancer's natural history, detection, and treatment. Natural history parameters were previously estimated via calibration against tumor registry data and cohort studies; the model was validated with screening study and cohort data. RFA model parameters were calibrated against 2-year survival from the Radiofrequency Ablation of Pulmonary Tumor Response Evaluation (RAPTURE) study, and stereotactic body radiotherapy model parameters were calibrated against 3-year survival from a phase 2 prospective trial. We simulated lifetime histories of identical patients with early-stage NSCLC who were ineligible for resection, who were treated with radiation therapy, RFA, or stereotactic body radiotherapy under a range of scenarios. From 5,000,000 simulated individuals, we selected a cohort of patients with stage I medically inoperable cancer for analysis (n = 2056 per treatment scenario). Main outcomes were life expectancy gains. RESULTS. RFA or stereotactic body radiotherapy treatment in patients with peripheral stage IA or IB NSCLC who were nonoperative candidates resulted in life expectancy gains of 1.71 and 1.46 life-years, respectively, compared with universal radiation therapy. A strategy where patients with central tumors underwent stereotactic body radiotherapy and those with peripheral tumors underwent RFA resulted in a gain of 2.02 life-years compared with universal radiation therapy. Findings were robust with respect to changes in model parameters. CONCLUSION. 
Microsimulation modeling results suggest that RFA and stereotactic body radiotherapy could provide life expectancy gains to patients with stage IA or IB NSCLC who are ineligible for resection.
Liu, S., Cipriano, L.E., Holodniy, M., Goldhaber-Fiebert, J.D.,
2013, "Cost-Effectiveness Analysis of Risk-Factor Guided and Birth-Cohort Screening for Chronic Hepatitis C Infection in the United States", PLoS One, March 8(3): 1 - 14.
Abstract: Background: No consensus exists on screening to detect the estimated 2 million Americans unaware of their chronic hepatitis C infections. Advisory groups differ, recommending birth-cohort screening for baby boomers, screening only high-risk individuals, or no screening. We assessed one-time risk assessment and screening to identify previously undiagnosed 40–74 year-olds given newly available hepatitis C treatments. Methods and Findings: A Markov model evaluated alternative risk-factor guided and birth-cohort screening and treatment strategies. Risk factors included drug use history, blood transfusion before 1992, and multiple sexual partners. Analyses of the National Health and Nutrition Examination Survey provided sex-, race-, age-, and risk-factor-specific hepatitis C prevalence and mortality rates. Nine strategies combined screening (no screening, risk-factor guided screening, or birth-cohort screening) and treatment (standard therapy–peginterferon alfa and ribavirin, Interleukin-28B-guided (IL28B) triple therapy–standard therapy plus a protease inhibitor, or universal triple therapy). Response-guided treatment depended on HCV genotype. Outcomes include discounted lifetime costs (2010 dollars) and quality-adjusted life-years (QALYs). Compared to no screening, risk-factor guided and birth-cohort screening for 50 year-olds gained 0.7 to 3.5 quality-adjusted life-days and cost $168 to $568 per person. Birth-cohort screening provided more benefit per dollar than risk-factor guided screening and cost $65,749 per QALY if followed by universal triple therapy compared to screening followed by IL28B-guided triple therapy. If only 10% of screen-detected, eligible patients initiate treatment at each opportunity, birth-cohort screening with universal triple therapy costs $241,100 per QALY. Assuming treatment with triple therapy, screening all individuals aged 40–64 years costs less than $100,000 per QALY.
Conclusions: The cost-effectiveness of one-time birth-cohort hepatitis C screening for 40–64 year olds is comparable to other screening programs, provided that the healthcare system has sufficient capacity to deliver prompt treatment and appropriate follow-on care to many newly screen-detected individuals.
Cipriano, L.E., Zaric, G.S., Owens, D.K., Brandeau, M.L.,
2012, "Cost Effectiveness of Screening for Acute HIV and HCV Infection in Injection Drug Users", PLoS One, September 7(9): 1 - 12.
Abstract: Objective: To estimate the cost, effectiveness, and cost effectiveness of HIV and HCV screening of injection drug users (IDUs) in opioid replacement therapy (ORT). Design: Dynamic compartmental model of HIV and HCV in a population of IDUs and non-IDUs for a representative U.S. urban center with 2.5 million adults (age 15-59). Methods: We considered strategies of screening individuals in ORT for HIV, HCV, or both infections by antibody or antibody and viral RNA testing. We evaluated one-time and repeat screening at intervals from annually to once every 3 months. We calculated the number of HIV and HCV infections, quality-adjusted life years (QALYs), costs, and incremental cost-effectiveness ratios (ICERs). Results: Adding HIV and HCV viral RNA testing to antibody testing averts 14.8-30.3 HIV and 3.7-7.7 HCV infections in a screened population of 26,100 IDUs entering ORT over 20 years, depending on screening frequency. Screening for HIV antibodies every 6 months costs $30,700/QALY gained. Screening for HIV antibodies and viral RNA every 6 months has an ICER of $65,900/QALY gained. Strategies including HCV testing have ICERs exceeding $100,000/QALY gained unless awareness of HCV-infection status results in a substantial reduction in needle-sharing behavior. Discussion: Although annual screening for antibodies to HIV and HCV is modestly cost effective compared to no screening, more frequent screening for HIV provides additional benefit at less cost. Screening individuals in ORT every 3-6 months for HIV infection using both antibody and viral RNA technologies and initiating ART for acute HIV infection appears cost effective.
McMahon, P.M., Kong, C.Y., Johnson, B.E., Weinstein, M., Tramontano, A.C., Cipriano, L.E., Bouzan, C., Gazelle, G.S.,
2012, "The MGH-HMS Lung Cancer Policy Model: Tobacco Control Versus Screening", Risk Analysis: An International Journal, August 32(S1): S117 - S124.
Abstract: The natural history model underlying the MGH Lung Cancer Policy Model (LCPM) does not include the two-stage clonal expansion model employed in other CISNET lung models. We used the LCPM to predict numbers of U.S. lung cancer deaths for ages 30–84 between 1975 and 2000 under four scenarios as part of the comparative modeling analysis described in this issue. The LCPM is a comprehensive microsimulation model of lung cancer development, progression, detection, treatment, and survival. Individual-level patient histories are aggregated to estimate cohort or population-level outcomes. Lung cancer states are defined according to underlying disease variables, test results, and clinical events. By simulating detailed clinical procedures, the LCPM can predict benefits and harms attributable to a variety of patient management practices, including annual screening programs. Under the scenario of observed smoking patterns, predicted numbers of deaths from the calibrated LCPM were within 2% of observed over all years (1975–2000). The LCPM estimated that historical tobacco control policies achieved 28.6% (25.2% in men, 30.5% in women) of the potential reduction in U.S. lung cancer deaths had smoking been eliminated entirely. The hypothetical adoption in 1975 of annual helical CT screening of all persons aged 55–74 with at least 30 pack-years of cigarette exposure, in addition to historical tobacco control, would have yielded a proportion realized of 39.0% (42.0% in men, 33.3% in women). The adoption of annual screening would have prevented less than half as many lung cancer deaths as the elimination of cigarette smoking.
Cipriano, L.E., Levesque, B.G., Zaric, G.S., Loftus, E.V., Sandborn, W.J.,
2012, "Cost-Effectiveness of Imaging Strategies to Reduce Radiation-Induced Cancer Risk in Crohn's Disease", Inflammatory Bowel Diseases, July 18(7): 1240 - 1248.
Abstract: Background: The aim was to examine the cost-effectiveness of magnetic resonance enterography (MRE) compared with computed tomography enterography (CTE) for routine imaging of small bowel Crohn's disease (CD) patients to reduce patients' life-time radiation-induced cancer risk.
Methods: We developed a Markov model to compare the lifetime costs, benefits (measured in quality-adjusted life-years [QALYs] of survival and cancers averted) and cost-effectiveness of using MRE rather than CTE for routine disease monitoring in hypothetical cohorts of 100,000 20-year-old patients with CD. We assumed each CT radiation exposure conferred an incremental annual risk of developing cancer using the linear, no-threshold model.
Results: In the base case of 16 mSv per CTE, we estimated that radiation from CTE resulted in 1,206 to 20,146 additional cancers depending on the frequency of patient monitoring. Compared to using CTE only, using MRE until age 30 and CTE thereafter resulted in incremental cost-effectiveness ratios (ICERs) between $37,538 and $41,031 per life-year (LY) gained and between $52,969 and $57,772 per quality-adjusted life-year (QALY) gained. Using MRE until age 50 resulted in ICERs between $58,022 and $62,648 per LY gained and between $84,250 and $90,982 per QALY gained. In a threshold analysis, any use of MRE had an ICER of greater than $100,000 per QALY gained when CT radiation doses are less than 6.0 mSv per CTE exam.
Conclusions: MRE is likely cost-effective compared to CTE in patients younger than age 50. Low-dose CTE may be an alternative cost-effective choice in the future.
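The linear, no-threshold assumption named in the Methods means radiation-induced cancer risk simply accumulates in proportion to cumulative dose, with no safe floor. A minimal sketch; the risk coefficient below is purely illustrative, not a value from the paper.

```python
def lifetime_cancer_risk(doses_msv, risk_per_msv=4.1e-5):
    """Cumulative radiation-induced cancer risk under a linear,
    no-threshold model: risk adds linearly with each exposure.
    risk_per_msv is a hypothetical illustrative coefficient."""
    return sum(dose * risk_per_msv for dose in doses_msv)

# e.g. one 16 mSv CTE exam per year over 10 years of disease monitoring
print(f"{lifetime_cancer_risk([16] * 10):.4%}")
```

Because risk is linear in dose, halving the per-exam dose (as with the low-dose CTE discussed in the Conclusions) halves the modeled lifetime risk, which is what drives the 6.0 mSv threshold result.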
Park, K.T., Tsai, R., Perez, F., Cipriano, L.E., Bass, D., Garber, A.M.,
2012, "Cost-effectiveness of early colectomy with ileal pouch-anal anastomosis versus standard medical therapy in severe ulcerative colitis", Annals of Surgery, July 256(1): 117 - 124.
Abstract: Background: Inflammatory bowel diseases are costly chronic gastrointestinal diseases. We aimed to determine whether immediate colectomy with ileal pouch-anal anastomosis (IPAA) after diagnosis of severe ulcerative colitis (UC) was cost-effective compared to the standard medical therapy. Methods: We created a Markov model simulating 2 cohorts of 21-year-old patients with severe UC, following them until 100 years of age or death, comparing early colectomy with IPAA strategy to the standard medical therapy strategy. Deterministic and probabilistic analyses were performed. Results: Standard medical care accrued a discounted lifetime cost of $236,370 per patient. In contrast, early colectomy with IPAA accrued a discounted lifetime cost of $147,763 per patient. Lifetime quality-adjusted life-years gained (QALY-gained) for standard medical therapy was 20.78, while QALY-gained for early colectomy with IPAA was 20.72. The resulting incremental cost-effectiveness ratio (Δcosts/ΔQALY) was approximately $1.5 million per QALY-gained. Results were robust to one-way sensitivity analyses for all variables in the model. Quality-of-life after colectomy with IPAA was the most sensitive variable impacting cost-effectiveness. A low utility value of less than 0.7 after colectomy with IPAA was necessary for the colectomy with IPAA strategy to be cost-ineffective. Conclusions: Under the appropriate clinical settings, early colectomy with IPAA after diagnosis of severe UC reduces health care expenditures and provides comparable quality of life compared to exhaustive standard medical therapy.
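The incremental cost-effectiveness ratio reported in this abstract is the cost difference divided by the QALY difference between the two strategies; plugging in the abstract's lifetime figures reproduces the roughly $1.5 million per QALY result.

```python
def icer(cost_a, qaly_a, cost_b, qaly_b):
    """Incremental cost-effectiveness ratio of strategy A relative to
    strategy B: difference in costs divided by difference in QALYs."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)

# Lifetime figures from the abstract: standard medical therapy ($236,370;
# 20.78 QALYs) vs. early colectomy with IPAA ($147,763; 20.72 QALYs)
ratio = icer(236_370, 20.78, 147_763, 20.72)
print(f"${ratio:,.0f} per QALY gained")  # roughly $1.5 million per QALY
```

Here the tiny 0.06-QALY denominator is what inflates an $88,607 cost difference into an ICER far above conventional willingness-to-pay thresholds.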
Honours & Awards
- "Decision making in dynamic systems with strategic information acquisition", Natural Sciences and Engineering Research Council of Canada (NSERC)
- Seth Bonder Foundation Research Award, 2012
- Lee B. Lusted Student Prize Award for outstanding presentations of research in Applied Health Economics, Annual Meeting of the Society for Medical Decision Making, 2012
- Course Assistant Award, Department of Management Science & Engineering, Stanford University, 2012
- Seth Bonder Scholarship for Applied Operations Research in Health Services, INFORMS, 2011
- Award for Outstanding Short Course, Annual Meeting of the Society for Medical Decision Making, 2010, 2011
- Centennial Teaching Assistant Award, Stanford University, 2011
Positions
- Visiting Researcher, Management of Technology and Entrepreneurship Institute (MTEI), École Polytechnique Fédérale de Lausanne (EPFL), Lausanne, Switzerland (Spring 2012)
- Research Scientist, Institute for Technology Assessment, Massachusetts General Hospital, Boston, MA (2006-2008)
- Queuing Project Manager, Ontario Joint Replacement Registry, London Health Sciences Centre, London, ON (2004-2006)