HEALTH AFFAIRS 33, NO. 7 (2014): 1148–1154 ©2014 Project HOPE—The People-to-People Health Foundation, Inc.
Ruben Amarasingham (ruben.firstname.lastname@example.org) is president and CEO of PCCI, a nonprofit research and development corporation, and an associate professor in the Departments of Internal Medicine and Clinical Sciences at the University of Texas Southwestern Medical Center, both in Dallas.
Rachel E. Patzer is an assistant professor in the Department of Surgery, School of Medicine, and the Department of Epidemiology, Rollins School of Public Health, both at Emory University, in Atlanta, Georgia.
Marco Huesch is an assistant professor in the Leonard D. Schaeffer Center for Health Policy and Economics, Sol Price School of Public Policy, University of Southern California, in Los Angeles.
Nam Q. Nguyen is a business operations associate at PCCI.
Bin Xie is a health services manager at PCCI.
By Ruben Amarasingham, Rachel E. Patzer, Marco Huesch, Nam Q. Nguyen, and Bin Xie
Implementing Electronic Health Care Predictive Analytics: Considerations And Challenges
ABSTRACT The use of predictive modeling for real-time clinical decision making is increasingly recognized as a way to achieve the Triple Aim of improving outcomes, enhancing patients’ experiences, and reducing health care costs. The development and validation of predictive models for clinical practice is only the initial step in the journey toward mainstream implementation of real-time point-of-care predictions. Integrating electronic health care predictive analytics (e-HPA) into the clinical work flow, testing e-HPA in a patient population, and subsequently disseminating e-HPA across US health care systems on a broad scale require thoughtful planning. Input is needed from policy makers, health care executives, researchers, and practitioners as the field evolves. This article describes some of the considerations and challenges of implementing e-HPA, including the need to ensure patients’ privacy, establish a health system monitoring team to oversee implementation, incorporate predictive analytics into medical education, and make sure that electronic systems do not replace or crowd out decision making by physicians and patients.
The US health care system faces significant challenges, including high costs, poor quality, and variable performance.1–3 Technology systems designed to predict and prevent poor clinical outcomes could help.
At the start of a discussion of these technologies, it is worthwhile to distinguish between risk-prediction models and the software systems that can execute and employ them. Traditionally, clinical risk-prediction models have been defined as models that “combine a number of characteristics (e.g. related to the patient, the disease, or treatment) to predict a diagnostic or prognostic outcome.”4 We define the technologies or software systems that can autonomously employ—and sometimes reengineer, modify, or update—these models as electronic health care predictive analytics (e-HPA).
These systems may have some or all of the following capabilities: data retrieval from electronic data repositories, such as electronic health records (EHRs), medical devices, or wearable technologies; data cleaning and harmonization; risk calculation; updating or resetting of risk-prediction models given new data; and the activation of clinical or other pathways, perhaps indirectly through the EHR or directly through alerts to patients’ or providers’ devices (see the online Appendix).5
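The capability list above can be sketched end to end in code. The following is a minimal illustration only, not any production e-HPA system: all field names, coefficients, the imputation defaults, and the alert threshold are invented for the example.

```python
import math

def retrieve(ehr_record):
    """Pull the model's input fields from an EHR extract (hypothetical names)."""
    return {k: ehr_record.get(k) for k in ("age", "prior_admissions", "sodium")}

def clean(inputs):
    """Harmonize the inputs: impute missing values with population defaults."""
    defaults = {"age": 65, "prior_admissions": 0, "sodium": 140}
    return {k: defaults[k] if v is None else v for k, v in inputs.items()}

def risk_score(x, coef, intercept):
    """Risk calculation with a logistic model: P = 1 / (1 + e^-(b0 + b.x))."""
    z = intercept + sum(coef[k] * x[k] for k in coef)
    return 1.0 / (1.0 + math.exp(-z))

def activate_pathway(patient_id, risk, threshold=0.4):
    """Pathway activation: return an alert when risk crosses the threshold."""
    if risk >= threshold:
        return f"ALERT: patient {patient_id} risk {risk:.2f}; activate pathway"
    return None

# Illustrative coefficients; a real model would be fit and validated locally.
COEF = {"age": 0.03, "prior_admissions": 0.6, "sodium": -0.02}
record = {"age": 78, "prior_admissions": 3, "sodium": None}
risk = risk_score(clean(retrieve(record)), COEF, intercept=-1.0)
print(activate_pathway("pt-001", risk))
```

The point of the sketch is the pipeline shape (retrieve, clean, score, act), which is the part the article's later sections subject to oversight, piloting, and monitoring.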
Risk-prediction models as decision-making tools have long played important roles in clinical practice. Well-known examples include the Framingham risk model for cardiovascular events6 and mortality7 and the Acute Physiology and Chronic Health Evaluation (APACHE) II score for intensive care unit (ICU) mortality.8
However, the implementation of e-HPA on a wide scale to aid in real-time, point-of-care decision making is still in its earliest stages. A 2011 JAMA review of twenty-six risk-prediction models for hospital readmission identified only three peer-reviewed models that were integrated into an EHR and designed to identify high-risk patients in real time.9
An important limitation of basic risk-prediction models is that they may not be customized to the local population or health care system; do not change or “learn” over time in response to underlying population changes; and may not be easily automated within an EHR, thus requiring significant staff time to manually curate the data to calculate and then compute the risk score.10 In contrast, e-HPA has the potential to apply models throughout a health care system with minimal additional staff time.11
The emergence of e-HPA has been well documented in mainstream media and press releases.12–14 However, a formal regulatory framework to guide the field in its earliest phases does not exist. Little is known about how best to incorporate e-HPA into the work flow of a health care system; how to evaluate success or protect against error; or how such systems should be scaled broadly across clinical organizations with varying administrative, scientific, and technical capabilities.15 During the next decade, organizations that implement e-HPA will undoubtedly encounter fresh questions across a range of domains within health care delivery.
The development and implementation of e-HPA can be broadly divided into four phases: acquiring data to build a risk-prediction model; building and validating the model; applying it in a real-world setting and testing it for the first time in practice through a technology system; and scaling the model for broader implementation across health care systems. In this article we focus on the latter two phases because substantive literature exists that addresses data acquisition and the construction and validation of models.16–22
To illuminate some of the pertinent challenges in the latter two phases of e-HPA, we occasionally describe some of the challenges in the context of predicting the risk of hospital readmission within thirty days. This may be one of the most straightforward and established uses of e-HPA.
Practical Challenges In Implementing e-HPA
The scale and scope of implementing e-HPA vary, depending on the nature of the models, the technologies used, the population, and the outcome of interest. Practical challenges in testing e-HPA in a real-world setting for the first time may include ensuring appropriate oversight, engaging key stakeholders to ensure successful and sustainable integration of e-HPA into clinical work flow, establishing appropriate patient privacy and consent policies and procedures, and ensuring data quality (Exhibit 1).
EXHIBIT 1
Key Challenges And Recommended Actions To Integrate, Test, And Disseminate Electronic Health Care Predictive Analytics (e-HPA)

Testing the model in a real-world setting under appropriate supervision
Appropriate approval and oversight of implementation: Establish a health care system operations team to oversee implementation
Stakeholder engagement: Work with key stakeholders to develop and implement relevant clinical protocols
Human subjects research protection: Ensure that the health care system operations team assesses the need for human subjects protection and IRB approval
Protection of patients’ privacy: Have the IRB determine the need for patient consent, if applicable
Data assurance: Conduct pilot implementation of e-HPA

Broad implementation of the model in a health care setting
Patient privacy protection, patient consent, approval and oversight, stakeholder engagement: Apply lessons learned from above
Interoperability of health systems: Follow standards to ensure interoperability within and across health systems
Transparency of HPA within health systems: Ensure open sharing of HPA methods to foster collaboration across health systems
Impact on doctor-patient relationships: Ensure that shared patient-provider decision making is not replaced by e-HPA
Medical education and training: Implement medical school education and clinical workforce training in e-HPA
Sustainability of e-HPA in health care systems: Align patient care quality and population health management goals; have stakeholders (including payers, vendors, and health systems) advocate for reimbursement incentives for HPA in care processes

SOURCE Authors’ analysis. NOTE IRB is institutional review board.
In the example of e-HPA for hospital readmission for patients with heart failure, a model predicting thirty-day readmission risk must first be developed and validated using electronic data from a hospital EHR on demographic, clinical, and socioeconomic risk factors. Then the model must be automated and integrated into the clinical work flow of the hospital. Several steps are necessary to ensure its effective implementation.
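Validation before automation can be illustrated with a discrimination check. The sketch below computes a C-statistic for a hypothetical held-out set; the predictions are invented for the example, and a real validation would also assess calibration and subgroup performance.

```python
def c_statistic(pairs):
    """Concordance (C-statistic): the probability that the model assigns a
    higher risk to a readmitted patient than to a non-readmitted one.
    pairs is a list of (predicted_risk, observed_outcome) tuples."""
    pos = [r for r, y in pairs if y == 1]
    neg = [r for r, y in pairs if y == 0]
    concordant = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return concordant / (len(pos) * len(neg))

# Hypothetical held-out predictions: (predicted risk, observed readmission).
validation = [(0.82, 1), (0.64, 1), (0.55, 0), (0.40, 1), (0.33, 0),
              (0.21, 0), (0.18, 0), (0.09, 0)]
print(f"C-statistic: {c_statistic(validation):.2f}")
```

A check like this, repeated on real-time rather than retrospective data, is one concrete form the validation step described above can take.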
Appropriate Oversight Of Implementation Health system oversight is recommended when implementing e-HPA for the first time in clinical practice to ensure broad stakeholder support, the smooth integration of e-HPA into the existing work flow, and the appropriate protection of patients’ privacy and autonomy. There are no widely accepted guidelines for ongoing oversight of e-HPA within and across health care institutions.
E-HPA implementation requires careful oversight of multiple factors. For example, the scope and complexity of the e-HPA algorithms, the number of simultaneous models being deployed in a given institution over time, and the anticipated modification rate of a model in response to changing conditions at the institution could all affect the scale of oversight. Other considerations might include the expertise required if artificial intelligence or machine learning approaches were used for model tuning, and the cost and personnel requirements needed to evaluate the complex integration of e-HPA into existing clinical work flows.
The cost of ownership associated with oversight may be a significant barrier to the proper and safe use of e-HPA for many institutions, particularly smaller freestanding organizations. Clinical institutions including hospital systems, academic institutions, and even individual stand-alone clinics will also need to consider the potentially deleterious results of implementing e-HPA without proper oversight. In extreme cases, these results could include enrolling patients in the wrong therapeutic or interventional pathway, misallocating clinical resources in the event of software failure, and decisional paralysis when e-HPA systems are down.
In the example of using e-HPA to prevent hospital readmission, a high-risk patient who warrants intensive case management may fail to receive such help if the underlying risk model fails to accurately capture his or her complete risk. To mitigate against any harms in such an example, a thoughtful oversight process is clearly desirable. On the other hand, the oversight process must be nuanced and flexible; oversight that requires approval from a committee on every aspect of model refinement and clinical work-flow reengineering is likely to be unsustainable and to impede the use of powerful, semi-autonomous predictive modeling systems—and reduce their benefits. A poorly structured review and approval process may slow down the implementation of a model and deter its widespread use.
One potential approach that institutions could use to ensure appropriate oversight of e-HPA implementation is to create a multidisciplinary clinical and operational committee to oversee the implementation. The committee’s primary role could be to conduct a rigorous assessment of whether e-HPA resulted in early success or failure. The composition, size, and degree of oversight of this committee would likely depend on the severity of the potential impact of e-HPA and the level of existing evidence to support e-HPA efficacy and safety.
For example, if e-HPA output affects only resource allocation, such as the intensity of care transition services given to patients with heart failure at high risk of hospital readmission, and there is evidence to support its efficacy and safety, then less rigorous review of the e-HPA may suffice. In contrast, implementation of e-HPA to predict which patients are at risk for a cardiopulmonary arrest event outside the ICU—which would require their immediate transfer to the ICU for potentially life-saving treatment—might require creation of a multidisciplinary oversight committee consisting of clinical experts, administrators, and e-HPA developers.
In addition, management of the day-to-day operation of e-HPA within a specific set of clinical work flows is needed. Institutions should have flexibility to define their own operations management structure. However, stakeholder engagement may be important to ensuring support of e-HPA operations within clinical settings.
In the case of preventing readmissions, advance planning of the clinical and operational activities that must occur when the model flags a patient as being at high risk for readmission is needed to maximize efficiency and minimize operational setbacks. Organizations will need to evaluate unforeseen but related consequences, including limited availability of resources for other types of patients or conditions, unexpected clinical adverse events, failure to adhere to the prescribed pathways, and situations in which the interventions are ineffective.
Stakeholder Engagement E-HPA implementation invariably affects many stakeholders, including hospital staff, clinicians, and patients and their families. Health care systems should work with key stakeholders as well as e-HPA developers to determine how e-HPA should be integrated into clinical work flow.
In some cases, e-HPA might identify patients whose risk of an event is so high or whose characteristics are so unmodifiable that no suitable interventional pathway exists to prevent the outcome; predictions that arrive too late may also be impossible to act upon. At that point, the goal may be the mitigation of harm or the provision of palliative therapy. Much of this decision making should be locally governed.
In the example of using e-HPA to predict the risk of readmission, stakeholder engagement may determine which staff member—for example, a nurse, physician, or social worker—is alerted when e-HPA flags a patient as being at high risk for readmission, when or at what intervals the staff member should be alerted, and what mode of communication (for example, an automatic page generated by the EHR, a secure text message, or the activation of a set of orders) should be used.
An institution could allow e-HPA to identify several groups of patients at high risk for readmission, but the interventions could be patient specific as a result of patients’ specific risk phenotypes. For example, some patients at high risk for readmission may require intensive case management with the use of remote monitoring devices for close follow-up. Others may need a cardiology consultation for a potential transplant, in which case a repeated readmission is justified. Still others may need palliative care. High-performance clinical engineering teams are critical to these efforts.
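Such phenotype-specific routing can be sketched as a simple mapping. The phenotype labels, threshold, and staff roles below are invented; as the text notes, any real mapping would be set by local stakeholders.

```python
# Hypothetical risk phenotypes and the locally governed pathway for each.
PATHWAYS = {
    "modifiable_high_risk": "intensive case management with remote monitoring",
    "advanced_disease": "cardiology consultation for transplant evaluation",
    "end_stage": "palliative care referral",
}

def route(patient):
    """Map a flagged patient to a phenotype-specific intervention and name
    the staff role to notify; unknown phenotypes fall back to manual review."""
    if patient["risk"] < 0.4:  # illustrative alerting threshold
        return None
    pathway = PATHWAYS.get(patient["phenotype"], "manual review by care team")
    return {"notify": patient["assigned_role"], "intervention": pathway}

alert = route({"risk": 0.71, "phenotype": "advanced_disease",
               "assigned_role": "cardiology nurse coordinator"})
print(alert)
```

The explicit fallback to manual review is one small example of the checks and balances the article recommends around automated triggering.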
Appropriate Patient Privacy And Consent Policies To make predictions, e-HPA relies on detailed data about individual patients, which are often obtained continually and in bulk. Procedures are needed to safeguard the data and obtain consent from patients for the use of their data without slowing down the implementation of e-HPA, making it less accurate, or reducing the lead time available to make a prediction.
We are not aware of any framework for patient consent that is specific to e-HPA. Until there is national consensus or established guidance or regulation, institutions will need to develop their own policies to address a number of questions. These include how a patient would be properly informed when a risk-prediction model did not recommend treatment; how an individual physician would know what resource allocations (for example, allocations of case management capacity or ICU capacity) are occurring at the system or enterprise level that might affect his or her patients; and whether there would be a mechanism for patients to dispute a model’s recommendation that no treatment be given.
These questions become particularly challenging in large population settings. For example, would an organization have to notify each individual in a population of 30,000 patients with chronic kidney disease who was not recommended to receive one of fifteen available preventive appointment slots with a nephrologist in the following seven days? The model might select the fifteen patients out of the 30,000 who would be most likely to benefit. Numerous ethical and legal issues emerge in this scenario, and frameworks need to be developed to guide institutions.
When e-HPA is being tested for the first time in a real-world setting, patients’ consent should be an absolute requirement if e-HPA implementation could expose patients to serious risks. This might be especially pertinent when the intervention being allocated is particularly consequential and can be received only through the determinations of the predictive model.
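Mechanically, the population-level selection in the example above (fifteen slots among 30,000 patients) is a top-k choice by predicted benefit. A sketch, with randomly generated benefit scores standing in for model output:

```python
import heapq
import random

random.seed(0)
# Hypothetical population: a model-predicted benefit score for each of
# 30,000 patients with chronic kidney disease.
population = [{"id": f"ckd-{i}", "predicted_benefit": random.random()}
              for i in range(30_000)]

SLOTS = 15  # available preventive nephrology appointments this week
selected = heapq.nlargest(SLOTS, population,
                          key=lambda p: p["predicted_benefit"])
cutoff = min(p["predicted_benefit"] for p in selected)
print(f"{len(selected)} patients selected; benefit cutoff {cutoff:.4f}")
```

The code makes the ethical question concrete: the other 29,985 patients are implicitly "not recommended," and nothing in the selection step itself notifies them or records why.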
In the example of e-HPA designed to reduce thirty-day readmission for heart failure, an institution might use the predictions to allocate case management capacity at the population level. If only the patients with the highest risk scores got an intervention—such as a consultation with a cardiac specialist—it is conceivable that if the model failed or made a misprediction, some deserving patients would fail to receive the consultation. A number of technological, procedural, and clinical checks and balances would need to be instituted to mitigate this risk.
We would not advocate using e-HPA as the only mechanism for triggering an intervention. However, if that were the case, an institution should disclose and inform patients that e-HPA is being used solely to allocate specific resources, such as obtaining one of a limited set of available follow-up appointments, and should provide a mechanism to inform patients when these resources are not recommended for them.
Data Quality Assurance Many times, models designed for real-time clinical use are first built and tested on retrospective, non-real-time data sets. Although it may be obvious, it is worth stating that such models need to be subsequently validated using real-time data, which often behave differently from retrospective data. Input data for predictive analytic models that support decision making may change over time. Changes in data quality or in the storage location of the input data might also degrade the model’s performance.
For example, if a risk-prediction model for readmission was built using health insurance payer data, what would happen if there were subtle changes to the data? Who would monitor those data? When and how often would the model be updated? Who would approve the update? These are crucial considerations when dynamic updating of a model occurs within e-HPA.
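One way to operationalize that monitoring is a simple drift check against the data the model was built on. The fields, tolerance, and numbers below are invented for illustration; a real process would also track variance, missingness, and category frequencies, and would route flags to the approval workflow the questions above call for.

```python
def drifted_fields(baseline_means, current_feed, tolerance=0.10):
    """Flag input fields whose current mean has moved more than `tolerance`
    (relative) away from the mean in the model-building data."""
    flags = {}
    for field, base in baseline_means.items():
        values = current_feed[field]
        current = sum(values) / len(values)
        drift = abs(current - base) / abs(base)
        if drift > tolerance:
            flags[field] = round(drift, 3)
    return flags

# Illustrative numbers: means from the payer data used to build the model
# versus a recent data feed.
baseline = {"age": 67.0, "prior_admissions": 1.2, "los_days": 4.5}
recent = {"age": [66, 70, 68, 64], "prior_admissions": [2, 3, 2, 1],
          "los_days": [4, 5, 4, 5]}
print("fields flagged for model review:", drifted_fields(baseline, recent))
```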
It is also important to determine whether sufficient high-quality outcome data are accessible within the EHR or electronic data framework to evaluate e-HPA over time. This may be particularly relevant for longer-term outcomes or events that occur outside the health system. For example, if a long-term outcome such as mortality is the outcome of interest in e-HPA, assessments might not be valid if mortality data on some patients are unavailable because they are lost to follow-up or die outside the health system.
Simulating or piloting e-HPA before activating the model and integrating it into the clinical work flow may be a reasonable way to address such data quality issues. In the readmission example, e-HPA could be simulated in real time without its predictions’ being acted on initially, then be piloted on one or two ward units. After that, e-HPA could be applied to the entire health care system.
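The staged rollout just described (simulate, then pilot on a ward or two, then apply system-wide) can be sketched as a gating rule. Stage names, ward names, and the threshold are invented for the example.

```python
PILOT_WARDS = {"4-East", "4-West"}  # hypothetical pilot units

def handle_prediction(stage, ward, risk, log, threshold=0.4):
    """Log every prediction, but only act on it when the rollout stage and
    ward allow: 'shadow' never acts, 'pilot' acts on pilot wards only,
    and 'live' acts system-wide."""
    log.append((stage, ward, risk))
    if stage == "shadow":
        return False
    if stage == "pilot":
        return ward in PILOT_WARDS and risk >= threshold
    return risk >= threshold  # "live"

log = []
handle_prediction("shadow", "6-North", 0.8, log)          # observed only
acted = handle_prediction("pilot", "6-North", 0.8, log)   # outside pilot wards
print(f"acted={acted}, predictions logged={len(log)}")
```

Because every prediction is logged even when no action is taken, the shadow stage yields exactly the real-time evaluation data the pilot oversight team needs.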
An operations management team, including clinical and administrative staff as well as the data analytics team or vendor, could oversee the pilot implementation. An ongoing process of failure modes and effects analysis—that is, the process of identifying potential failure points in the system, understanding their effects, and managing the work flow to mitigate those effects—could then be instituted to examine each element of software, modeling, and work-flow performance.
Challenges In The Broader Implementation Of e-HPA
Lessons learned from pilot analyses and e-HPA implementation within a single health care system can help inform broader implementation, or scalability, of e-HPA across health care systems. However, the strategies for oversight and stakeholder engagement may be different in this phase, as the focus shifts to a level of oversight that can be more easily replicated on a large scale.
We are unaware of any published reports on the widespread dissemination of e-HPA across the health care system. Evidence-based standards for all aspects of e-HPA are worth developing as the scaling up of the technology begins.15,23 Ideally, e-HPA vendors, institutions that use e-HPA, or independent researchers should perform and publish prospective evaluations of e-HPA through multiple impact and cost-effectiveness analyses across varied settings and populations. Appropriately constructed incentives to encourage adoption of best predictive models or EHRs may also help sustain e-HPA best practices. Studies have suggested that perverse incentives have sometimes resulted in overuse of expensive interventions, underuse of interventions that provide primary or secondary prevention, distorted fee structures, or an undersupply of generalist care.24,25
In contrast, the federal government’s meaningful-use incentives for EHR adoption and its readmission reduction program have accelerated the development and adoption of EHRs and even the use of predictive models for readmission.26,27 Similar incentive frameworks could assist with the dissemination of sound e-HPA practices. In addition, the government and payers, for example, could insist on acceptable standards or certifications—analogous to Underwriters Laboratories’ “UL” mark—when e-HPA methods, technologies, or even institutions comply with established best practices.
In addition to the need for adequate validation, impact analyses, and established best practices for e-HPA, other barriers stand in the way of widespread scalability. These revolve around issues of interoperability and transparency.
Interoperability Of Health Care Technology Platforms Interoperability, or the ability of different health care technology platforms to exchange information, is necessary to scale e-HPA across health care systems. Some authors suggest that national implementation of interoperability standards would result in savings of $77.8 billion annually and would have widespread benefits.28 However, the US health care system has been slow to adopt fully interoperable health information technology (IT) systems.
E-HPA software developers can help solve the problem by developing products that can function across different EHR platforms. Platforms that comply with open interoperability standards would enable e-HPA to be scaled more widely. Highly interoperable systems are sometimes perceived to be vulnerable to competition because they reduce customer “lock-in”: the customer may choose to buy different products from different vendors, sometimes described as a “best of breed” strategy by the customer.
Transparency Of Predictive Analytics Within Health Care Systems E-HPA transparency is required because clinical decisions ultimately need to be made by patients, clinicians, and the institutions that serve them. Whenever possible, clinicians, in particular, need to be able to “see into” a risk-prediction model and understand how it arrived at a certain prediction. E-HPA software and modeling approaches that encourage and facilitate collaboration and an exchange of ideas among diverse stakeholders could facilitate broader dissemination by ensuring sufficient support.
Transparency in e-HPA may include open access to the secure (and deidentified) data sources used to develop the model; reporting of model development methodology, such as the statistical methods and nonproprietary source code used; details on how the model is integrated within the EHR; and documentation of how clinical work flow is modified to incorporate e-HPA. Such transparency could ultimately allow for trust in automated predictive modeling systems—a recognized characteristic of successfully automated systems in other industries.29 It might also encourage scientific “crowdsourcing,” in which health care systems, independent researchers, or other institutions may improve upon or better evaluate a given model in different contexts and settings.
Long-Term Challenges Of e-HPA
We have focused our discussion on some of the near-term practical challenges of implementing and scaling e-HPA. There are, however, a number of additional challenges on the horizon as e-HPA inevitably expands in complexity and capability over the next few decades. These long-term challenges may include the influence of e-HPA on the relationship between the physician and patient, the need to include core e-HPA concepts in medical training, and the design of proper incentive mechanisms to ensure continual investments in e-HPA.
Physicians’ ability to practice effective medicine will be challenged by the growing deluge of medical information from numerous new data sources, such as wearable patient technologies, and by the exponential growth of scientific knowledge. In this setting, physicians might find themselves increasingly reliant on e-HPA that is capable of synthesizing and condensing information. For example, e-HPA may include artificial intelligence or decision-making capabilities that could supplement physicians’ judgment.
E-HPA could ultimately present decisions to providers and patients through data reduction strategies that use assumptions and probabilities that physicians and patients typically walk themselves through. How will these component decisions be presented, if at all? With the increasing power and potential autonomy of e-HPA, it will be important to develop a framework that clearly defines the roles and responsibilities of the physician, the patient, and e-HPA. Systems should be designed so that physicians and patients retain critical decision-making authority.
To help equip physicians for the e-HPA era, medical education and training may also need to be modified. The medical education system is designed to provide clinicians with the skills they will need to practice effectively in the future. It may be valuable to design curricula that give students an understanding of predictive analytics, covering both the underlying principles—failure points, assumptions, and engineering methods—and how best to use and evaluate e-HPA in clinical practice.
In this article we have discussed considerations and challenges in implementing e-HPA in the health care system. This discussion is not comprehensive but instead is intended to highlight specific challenges and to introduce the need for a thoughtful set of best practices to guide the rise of e-HPA. Our experiences and those of others suggest that e-HPA can help achieve the Triple Aim of improving outcomes, enhancing the patient experience, and reducing health care costs.30 However, attention must be paid to new challenges that are emerging with the ascendance of these technologies.
Funding for the preparation of this article was provided by the Gordon and Betty Moore Foundation Framework and Action Plan for Predictive Analytics (Grant No. 3861).
1 Kohn LT, Corrigan JM, Donaldson MS, editors. To err is human: building a safer health system. Washington (DC): National Academies Press; 2000.
2 Institute of Medicine. Crossing the quality chasm: a new health system for the 21st century. Washington (DC): National Academies Press; 2001.
3 Institute of Medicine. Best care at lower cost: the path to continuously learning health care in America. Washington (DC): National Academies Press; 2012.
4 Steyerberg EW. Clinical prediction models: a practical approach to development, validation, and updating. New York (NY): Springer; 2009. p. 2.
5 To access the Appendix, click on the Appendix link in the box to the right of the article online.
6 Wilson PW, D’Agostino RB, Levy D, Belanger AM, Silbershatz H, Kannel WB. Prediction of coronary heart disease using risk factor categories. Circulation. 1998;97(18):1837–47.
7 Wang TJ, Massaro JM, Levy D, Vasan RS, Wolf PA, D’Agostino RB, et al. A risk score for predicting stroke or death in individuals with new-onset atrial fibrillation in the community: the Framingham Heart Study. JAMA. 2003;290(8):1049–56.
8 Knaus WA, Draper EA, Wagner DP, Zimmerman JE. APACHE II: a severity of disease classification system. Crit Care Med. 1985;13(10):818–29.
9 Kansagara D, Englander H, Salanitro A, Kagen D, Theobald C, Freeman M, et al. Risk prediction models for hospital readmission: a systematic review. JAMA. 2011;306(15):1688–98.
10 Moons KG, Altman DG, Vergouwe Y, Royston P. Prognosis and prognostic research: application and impact of prognostic models in clinical practice. BMJ. 2009;338:b606.
11 Kawamoto K, Houlihan CA, Balas EA, Lobach DF. Improving clinical practice using clinical decision support systems: a systematic review of trials to identify features critical to success. BMJ. 2005;330(7494):765.
12 IBM [Internet]. Armonk (NY): IBM. Press release, WellPoint and IBM announce agreement to put Watson to work in health care; 2011 Sep 12 [cited 2014 May 29]. Available from: http://www-03.ibm.com/press/us/en/pressrelease/35402.wss
13 IBM Software Group. The University of Texas MD Anderson Cancer Center: IBM Watson helps accelerate translation of cancer-fighting knowledge to cutting edge medical practices [Internet]. Somers (NY): IBM Software Group; 2013 [cited 2014 May 29]. Available from: http://www-03.ibm.com/innovation/us/watson/pdf/MD_Anderson_Case_Study.pdf
14 IBM [Internet]. Armonk (NY): IBM. Press release, IBM research unveils two new Watson related projects from Cleveland Clinic collaboration; 2013 Oct 15 [cited 2014 May 29]. Available from: http://www-03.ibm
15 Reilly BM, Evans AT. Translating clinical research into clinical practice: impact of using prediction rules to make decisions. Ann Intern Med. 2006;144(3):201–9.
16 Royston P, Moons KG, Altman DG, Vergouwe Y. Prognosis and prognostic research: developing a prognostic model. BMJ. 2009;338:b604.
17 Altman DG, Vergouwe Y, Royston P, Moons KG. Prognosis and prognostic research: validating a prognostic model. BMJ. 2009;338:b605.
18 Moons KG, Royston P, Vergouwe Y, Grobbee DE, Altman DG. Prognosis and prognostic research: what, why, and how? BMJ. 2009;338:b375.
19 Steyerberg EW, Moons KG, van der Windt DA, Hayden JA, Perel P, Schroter S, et al. Prognosis Research Strategy (PROGRESS) 3: prognostic model research. PLoS Med. 2013;10(2):e1001381.
20 Riley RD, Hayden JA, Steyerberg EW, Moons KG, Abrams K, Kyzas PA, et al. Prognosis Research Strategy (PROGRESS) 2: prognostic factor research. PLoS Med. 2013;10(2):e1001380.
21 Hingorani AD, Windt DA, Riley RD, Abrams K, Moons KG, Steyerberg EW, et al. Prognosis research strategy (PROGRESS) 4: stratified medicine research. BMJ. 2013;346:e5793.
22 Hemingway H, Croft P, Perel P, Hayden JA, Abrams K, Timmis A, et al. Prognosis research strategy (PROGRESS) 1: a framework for researching clinical outcomes. BMJ. 2013;346:e5595.
23 McGinn TG, Guyatt GH, Wyer PC, Naylor CD, Stiell IG, Richardson WS. Users’ guides to the medical literature: XXII: how to use articles about clinical decision rules. Evidence-Based Medicine Working Group. JAMA. 2000;284(1):79–84.
24 Ellis RP, McGuire TG. Provider behavior under prospective reimbursement. Cost sharing and supply. J Health Econ. 1986;5(2):129–51.
25 Kantarevic J, Kralj B. Link between pay for performance incentives and physician payment mechanisms: evidence from the diabetes management incentive in Ontario. Health Econ. 2013;22(12):1417–39.
26 CMS.gov. EHR incentive programs [Internet]. Baltimore (MD): Centers for Medicare and Medicaid Services; [last modified 2014 Apr 10; cited 2014 Jun 4]. Available from: http://www.cms.gov/Regulations-and-Guidance/Legislation/EHRIncentive
27 CMS.gov. Readmissions reduction program [Internet]. Baltimore (MD): Centers for Medicare and Medicaid Services; [last modified 2014 Apr 30; cited 2014 Jun 4]. Available from: http://www.cms
28 Walker J, Pan E, Johnston D, Adler-Milstein J, Bates DW, Middleton B. The value of health care information exchange and interoperability. Health Aff (Millwood). 2005;24:w5-10–8. DOI: 10.1377/hlthaff.w5.10.
29 Lee JD, See KA. Trust in automation: designing for appropriate reliance. Hum Factors. 2004;46(1):50–80.
30 Berwick DM, Nolan TW, Whittington J. The Triple Aim: care, health, and cost. Health Aff (Millwood). 2008;27(3):759–69.