Using normalisation process theory to understand workflow implications of decision support implementation across diverse primary care settings
Abstract
Background Effective implementation of technologies into clinical workflow is hampered by lack of integration into daily activities. Normalisation process theory (NPT) can be used to describe the kinds of ‘work’ necessary to implement and embed complex new practices. We determined the suitability of NPT to assess the facilitators, barriers and ‘work’ of implementation of two clinical decision support (CDS) tools across diverse care settings.
Methods We conducted baseline and 6-month follow-up quantitative surveys of clinic leadership at two academic institutions’ primary care clinics randomised to the intervention arm of a larger study. The survey was adapted from the NPT toolkit, analysing four implementation domains: sense-making, participation, action, monitoring. Domains were summarised among completed responses (n=60) and examined by role, institution, and time.
Results The median score for each NPT domain was the same across roles and institutions at baseline and decreased by 6 months. At 6 months, the participation domain score had declined among clinic managers (p=0.003), and all domain scores had declined among medical directors (p<0.003). At 6 months, the action domain score had decreased among Utah respondents (p=0.03), and all domain scores had decreased among Wisconsin respondents (p≤0.008).
Conclusions This study employed NPT to longitudinally assess the implementation barriers of new CDS. The consistency of results across participant roles suggests similarities in the work each role took on during implementation. The decline in engagement over time suggests the need for more frequent contact to maintain momentum. Using NPT to evaluate this implementation provides insight into domains which can be addressed with participants to improve success of new electronic health record technologies.
Trial registration number NCT02534987.
Summary
What is already known?
Clinical decision support (CDS) can affect healthcare provider actions by improving preventive care, diagnosis and treatment.
Effective implementation of technologies into clinical workflow is hampered by lack of integration into daily activities.
Normalisation process theory (NPT) can be used to describe the kinds of ‘work’ necessary to implement complex new practices.
What does this paper add?
The NPT toolkit can be adapted to a quantitative survey, administered longitudinally.
NPT can be used over time to assess the ongoing implementation barriers of a new electronic health record workflow.
The insights from applying the NPT framework to a health IT intervention implementation can be used to improve the success of the implementation itself.
Background
Clinical decision support (CDS) can affect healthcare provider actions by improving preventive care, diagnosis and treatment.1–5 With the near-universal use of electronic health records (EHRs) in the USA,6 CDS has been embedded into the EHR to varying degrees, most notably bringing evidence-based recommendations to the point of care. This is one step towards improving the provision of recommended care, as patients in the USA receive only 8% of recommended preventive care7 and only 55% of recommended care overall.8 9 In order to extract the maximum benefit from CDS, however, it is critical to understand the implementation implications of new technologies and complex interventions in the EHR.
Understanding the facilitators, barriers and effort required to operationalise complex system changes and new health technologies is key to continuous improvement in implementation and to the success of the intervention. Normalisation process theory (NPT) is an action theory for describing the kinds of ‘work’ that are done to implement complex new practices and technologies in healthcare.10 NPT rests on three tenets. First, complex interventions become integrated with existing work through the individual and group efforts of those involved in implementation. Second, four domains—sense-making (also known as coherence), participation (also known as cognitive participation), action (also known as collective action) and monitoring (also known as reflexive monitoring)—describe the general kinds of work through which implementation is operationalised. Third, adoption and integration of complex interventions require enduring efforts by those involved.10–12
The four NPT domains characterise the work done to implement a system change.10 11 13 Sense-making, or coherence, represents understanding how practices differ from each other, individual roles in the new practice and the value of the new practice. Participation encompasses the work of key personnel and relationships to drive the change forward and the procedures needed to sustain the change. Action is the operational work necessary to support a new practice: interactions between group members, individual and group accountability, skill set recognition and building, and appropriate resource allocation. Monitoring refers to the assessment work that is done to explain how a new practice affects the individual and group.
NPT has been used to evaluate the process of randomised controlled trials (RCTs),14 and has been proposed as the theoretical basis of a metric designed to help determine in advance whether workflow issues of a research programme would prevent successful implementation.15 In this context, NPT helps identify feasibility of an intervention’s components and whether benefits of implementing the intervention are likely to justify the effort. NPT has also guided a post hoc analysis of decision support tools that helped the investigators understand the ‘workability’, user knowledge, user skills and organisational impact of these innovative tools.16 Their analysis found that NPT identified significant gaps in understanding of organisational impact and provided a framework to identify facilitators and barriers to decision-support implementation.
The integrated clinical prediction rules (iCPR) RCT (see Funding) was developed to test the feasibility and effectiveness of integrating clinical prediction rules for sore throat and cough into the EHRs in diverse primary care settings.17 iCPR is based in primary care clinics affiliated with the University of Wisconsin and the University of Utah. The trial tests the use of two clinical prediction rules, the Heckerling Rule for pneumonia and the Centor Criteria for group A streptococcal pharyngitis, integrated into the local EHR. Based on a patient’s presenting complaint, providers complete a risk calculator which drives an order set based on the patient’s age and risk score. The order set presents evidence-based diagnostic and treatment options, as well as patient education. Both the pneumonia and streptococcal pharyngitis prediction rules were implemented at all intervention sites in the RCT.
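For illustration only, the sketch below shows one plausible way such risk calculators could be scored; the point assignments follow the published McIsaac-modified Centor criteria and the Heckerling rule, but the cut-offs and the mapping from score to order set are assumptions, not the study's actual EHR build.

```python
# Illustrative sketch, not the iCPR implementation: plausible scoring logic for
# the two clinical prediction rules. Thresholds and tier mapping are assumptions.

def centor_score(exudates: bool, tender_nodes: bool, fever: bool, no_cough: bool,
                 age: int) -> int:
    """McIsaac-modified Centor score for group A streptococcal pharyngitis."""
    score = sum([exudates, tender_nodes, fever, no_cough])  # one point per finding
    if age < 15:
        score += 1   # younger patients carry higher risk
    elif age >= 45:
        score -= 1   # older patients carry lower risk
    return score

def heckerling_score(temp_gt_37_8: bool, pulse_gt_100: bool, rales: bool,
                     decreased_breath_sounds: bool, no_asthma: bool) -> int:
    """Heckerling rule for pneumonia: one point per finding, range 0-5."""
    return sum([temp_gt_37_8, pulse_gt_100, rales, decreased_breath_sounds, no_asthma])

def risk_tier(score: int, low_cutoff: int, high_cutoff: int) -> str:
    """Map a rule score to a hypothetical risk tier that could drive an order set."""
    if score <= low_cutoff:
        return "low"
    return "high" if score >= high_cutoff else "intermediate"
```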
We used the NPT framework to understand facilitators, barriers and the ‘work’ required to implement clinical prediction rules in the EHR within the context of the RCT investigating the utility of such iCPRs. To our knowledge, this is a novel use of the NPT toolkit to longitudinally assess a complex EHR CDS intervention study to identify and mitigate potential implementation barriers.
Methods
Setting and participants
We conducted a quantitative survey study at all of the University of Wisconsin and University of Utah primary care clinics that were randomised to the intervention group of the larger randomised controlled iCPR study. The clinics are diverse in geography and patient population, and they have varied clinical workflows in place related to patient triage, care and use of the EHR. Although they share an EHR vendor (Epic Systems, Verona, Wisconsin, USA), each institution's version is customised to its workflows. The survey participants included the clinic manager and medical director at each participating intervention primary care site, given that these roles were instrumental in the implementation process. Survey participants had been integral to implementation of the iCPR tool in their clinics, receiving training from study staff, before initiating the new workflow, on the purpose of the RCT and on how to use the iCPR EHR tool.
Survey
The survey was adapted from the previously studied NPT toolkit,18 which analyses the four domains of implementation of complex interventions: sense-making, participation, action and monitoring. Each domain is assessed through four questions in the survey, which are weighted equally in calculating the mean NPT score for the domain. Answers range from ‘not at all’ to ‘completely’ using a slider bar that begins in the centre position. The bar is moved to the left for a negative response and to the right for a positive response, with positions mapped to values from 0 (most negative) to 100 (most positive). The numerical values are not visible to the respondent. Our survey rewords each statement in the NPT toolkit as a question and customises it to pertain to our intervention (online supplementary appendix 1).
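As a minimal sketch of this scoring (the item column names below are hypothetical, not those of the supplementary appendix), each domain score is the equally weighted mean of its four 0–100 slider responses:

```python
# Sketch only: compute one equally weighted mean score (0-100) per NPT domain
# per respondent. Column names are hypothetical placeholders for the 16 items.
import pandas as pd

DOMAIN_ITEMS = {
    "sense_making":  ["sm1", "sm2", "sm3", "sm4"],
    "participation": ["pa1", "pa2", "pa3", "pa4"],
    "action":        ["ac1", "ac2", "ac3", "ac4"],
    "monitoring":    ["mo1", "mo2", "mo3", "mo4"],
}

def domain_scores(responses: pd.DataFrame) -> pd.DataFrame:
    """One row per respondent in, one 0-100 mean score per domain out."""
    return pd.DataFrame(
        {domain: responses[items].mean(axis=1) for domain, items in DOMAIN_ITEMS.items()}
    )
```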
The NPT toolkit provides radar plots of the NPT score for each domain, with positive responses further away from the centre and negative responses closer to the centre.18 A response closer to the centre for the sense-making domain may indicate that the participant cannot make sense of the intervention; a response further from the centre for participation may indicate that the participant is fully engaged in the intervention. However, these radar plots were thought to be less informative in aggregate than numerical graphical views of the data, so we developed bar graphs based on the numerical values (0–100, with more negative responses being lower numerically) assigned by the online NPT toolkit to the slider bar responses.
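A hedged sketch of such a bar-graph view is given below; the grouping and labels are assumptions, and the published figures may differ.

```python
# Sketch of a bar-graph view of median domain scores; layout is an assumption.
import matplotlib.pyplot as plt
import pandas as pd

def plot_median_domains(medians: pd.DataFrame, title: str) -> None:
    """medians: rows = groups (e.g. roles or institutions), columns = the four
    NPT domains, values = median scores on the 0-100 scale."""
    ax = medians.plot(kind="bar", ylim=(0, 100), rot=0, title=title)
    ax.set_ylabel("Median NPT domain score (0-100)")
    plt.tight_layout()
    plt.show()
```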
Survey administration
The survey was administered at the time of the intervention implementation (baseline, starting in October 2015) and at 6 months into implementation (follow-up). An email with the survey link was sent to each participant individually by the site study team, and up to three reminder emails or phone calls were made at 2-week intervals until the survey was completed or the reminders exhausted.
Data analysis
Each domain score was derived as the mean of the four survey questions corresponding to that NPT domain as defined by the theory, yielding four domain scores per survey response. Domain scores were then summarised among all completed survey responses (n=60) and examined for potential associations by role (clinic manager (n=31), medical director (n=29)), institution (Wisconsin (n=42), Utah (n=18)) and time (baseline (n=33), 6 months (n=27)). Summary measures are presented as the median domain score by group. Associations were investigated via Wilcoxon rank-sum and Kruskal-Wallis tests as appropriate within each time point (baseline and 6 months). Generalised estimating equations (GEE) were used for analyses of domain scores over time. All analyses were conducted using SAS V.9.4.
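The following sketch illustrates analogous analyses using Python equivalents of the SAS procedures; it is offered only to make the steps concrete (the study analyses were run in SAS V.9.4, and the column names here are hypothetical).

```python
# Sketch only: nonparametric group comparisons within a time point and a GEE
# model over time. Column/variable names are hypothetical placeholders.
import pandas as pd
from scipy.stats import ranksums, kruskal
import statsmodels.api as sm
import statsmodels.formula.api as smf

def compare_groups(scores: pd.Series, groups: pd.Series):
    """Two groups: Wilcoxon rank-sum; three or more groups: Kruskal-Wallis."""
    by_group = [scores[groups == g] for g in sorted(set(groups))]
    test = ranksums if len(by_group) == 2 else kruskal
    return test(*by_group)

def domain_over_time(df: pd.DataFrame):
    """GEE of domain score on time point, clustering repeated surveys by clinic."""
    model = smf.gee("score ~ time_point", groups="clinic_id", data=df,
                    cov_struct=sm.cov_struct.Exchangeable())
    return model.fit()
```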
Results
All clinical sites identified in the intervention arm of the larger RCT participated in this study. The University of Wisconsin had 12 intervention clinic sites at baseline and 11 clinic sites at the 6-month follow-up survey; during the time between the two surveys, two Wisconsin clinic sites merged into one unified clinic with a new medical director and clinic manager, and three other Wisconsin clinics had new clinic managers at the time of the follow-up survey. The University of Utah had six intervention clinic sites at both baseline and the 6-month follow-up survey; one Utah clinic had a new clinic manager at the time of the follow-up survey, with otherwise no changes to participating clinics or survey respondents between the two survey administrations. Table 1 shows the number of unique survey recipients (denominator) and respondents (numerator) for the baseline and 6-month follow-up surveys by clinical site and role. Participants who did not respond after the initial invitation received up to three reminders from the site study team, as specified in the study protocol. All clinical sites in the intervention arm of the larger RCT participated in both the baseline and follow-up surveys.
Table 1
Survey participants by role/university at baseline and 6-month follow-up time points (total response N/total recipient N)
At baseline, the median score for each NPT domain was the same across clinic managers and medical directors (p>0.1 for all domains by role), with the monitoring domain tending towards a lower value than the other three domains (figure 1). The median was higher for the University of Wisconsin than the University of Utah across all domains, with a significant difference in domain score between institutions identified only for the action domain and only at baseline (p=0.03, figure 2).
Figure 1 Median normalisation process theory (NPT) domain scores* by role†. *NPT domain score is the mean score of the four survey questions corresponding to that NPT domain; the median score for each domain across a given role is presented here. †Role: CM, clinic manager; MD, medical director. Respondent N is indicated by a number and line within each bar.
Figure 2 Median normalisation process theory (NPT) domain scores* by institution. *NPT domain score is the mean score of the four survey questions corresponding to that NPT domain; the median score for each domain across a given institution is presented here.
Median values were below 80 (out of a possible 100) for all NPT domains. The median value for each domain decreased from baseline to 6 months across both roles and both institutions, with the monitoring and participation domains tending to decrease more than the other two. Among clinic managers, only participation decreased significantly at 6 months compared with baseline (p=0.003). Among medical directors, all domains scored lower at 6 months than at baseline (p<0.003 for each domain). The action domain score decreased significantly (p=0.03) from baseline to 6 months among Utah respondents. All domains were lower at 6 months than at baseline among Wisconsin respondents (p≤0.008 for each domain).
Discussion
To the best of our knowledge, this study employed NPT in a novel way to assess the implementation of new clinical decision support in EHRs across diverse primary care settings at two institutions over time.19 20 We saw overall low median domain scores (range 50–75, out of 100). Had these data been available in a timely way to shape the intervention from the start, they might have pointed to a need for enhanced engagement from participating sites overall. At baseline, sense-making and participation were higher for medical directors and action was higher for clinic managers. This differential may point to the role that the medical directors had in shaping the clinical tool versus the role the clinic managers had in rolling out the intervention to clinicians. We saw higher median NPT domain scores across all four domains at one institution's sites versus the other, but relatively little variation between clinic managers and medical directors across practice sites. The difference between sites may be related to varying levels of engagement in the project, the complexity of implementation within the EHR itself or the complexity of clinical workflows at one institution versus the other; additional qualitative work may help elucidate this further. The similarities between roles, however, may suggest that with this intervention the work of implementation fell equally to the operational managers who helped disseminate information and the clinicians who helped design the EHR workflows and engaged with the tool to provide care. We also noted an overall decline in engagement from baseline to 6-month follow-up, suggesting the need for additional efforts to maintain interest and momentum for the ongoing implementation.
Using NPT to evaluate this implementation provides insight into work domains which can be addressed with participants to improve integration of the new CPRs and to sustain the efforts needed for successful implementation. The action and sense-making domains had the highest median values, and the monitoring domain had the lowest, across all roles, both institutions and both time points. This consistent difference may reflect monitoring being a later step in the implementation process that has not yet become fully salient; continuing to follow this over the remainder of the trial will help us better understand whether monitoring becomes more important later in the implementation process. The decline in engagement from baseline to follow-up may suggest the need for more frequent contact with intervention sites to maintain momentum, or may reflect the natural progression of a quality improvement project.21–23 This finding is congruent with other postimplementation evaluations using the NPT framework for analysis.24–26 As a result, we have included additional educational and feedback sessions for intervention sites to improve ongoing engagement and monitoring of the implementation.
The NPT framework provided quantitative information about the success of the implementation process, which complemented qualitative work done subsequently to more specifically identify opportunities for improvement. Drawing on the quantitative NPT data and the qualitative data from subsequent interviews conducted as part of the RCT (not included in this manuscript), and similar to another study of a pilot EHR intervention,27 we identified differences in workflow across institutions. These differences affected integration of the CPRs, and their identification led to changes to the EHR build, based on these workflows, to improve usability of the tool.
Understanding the work of implementation, and the impact on organisational workflow, of innovations in healthcare delivery is key to realising the efficacy of the intervention. A prior analysis of decision support tools helped investigators understand these domains after an implementation had been completed.27 NPT has also guided qualitative evaluation of a national programme to deliver health-promotion technology to the public.28 Using the NPT framework to quantitatively assess a healthcare IT implementation has helped us guide qualitative work to specifically identify areas for improvement. These complementary methods have been crucial in our ability to augment the original design of the EHR tool to improve utility for providers.
Our response rate was relatively high for a study involving busy clinic administrators and physician leaders. This may be due to the ease of completing the survey, the personal relationships between study team members and survey participants, or other unknown factors. Given the little variation in responses between clinic managers and physician leaders, it may not be necessary to survey both groups.
This study has a number of limitations. It was conducted in the setting of an RCT with support to evaluate implementation, address deficiencies and make adjustments to workflows and interventions, which may not be available during the course of a quality improvement project or other non-funded practice redesign work; as a result, generalisability may be limited. The study was conducted at only two institutions. Although we have relatively few intervention sites in the RCT, resulting in a small number of survey participants, this is similar to prior studies using the NPT framework.24–26 28 We also have data collection at only two time points to date. The low numbers of participants and time points limit our ability to analyse the data for statistical significance. We intend to continue collecting these survey data at 6-month intervals throughout the duration of the RCT. NPT has not previously been used in a longitudinal analysis within an RCT such as this. The absolute values of the scores have not yet been validated for research, and as such we focus on comparisons between sites, between participant types and over time. The utility of this type of ongoing analysis in implementation design for future projects requires further investigation. In fact, the group which developed NPT has since developed a new survey instrument aimed at end users of an intervention, which may allow for better-informed quantitative analysis of the four domains, particularly over time.29 30
Conclusions
NPT provides a framework for evaluation of technology and workflow innovation implementations in healthcare settings beyond traditional qualitative implementation evaluation methods. The current analysis provides insight into the four domains described by NPT during active implementation of clinical decision support tools within EHRs across diverse settings, with the goal of influencing ongoing implementation efforts to improve efficacy and integration of the EHR tools. Through continued administration of the NPT-based survey we will determine the value of this type of evaluation in longitudinal analysis. The successful integration of complex interventions in healthcare depends on understanding the implementation implications of those innovations. NPT provides a context for this evaluation, and we have shown here the influence this can have on ongoing health information technology implementations.
Presented at: This abstract was previously published as part of the 2017 Society of General Internal Medicine Annual Meeting Proceedings.
Funding: This study was funded by the National Institute of Allergy and Infectious Diseases, project 5R01AI108680-05.
Competing interests: None declared.
Patient consent for publication: Not required.
Ethics approval: This study was approved by the institutional review boards (IRBs) at the University of Wisconsin, University of Utah and Boston University. Survey participants’ response to the survey was considered sufficient consent to participate by these IRBs.
Provenance and peer review: Not commissioned; externally peer reviewed.
Data availability statement: All data relevant to the study are included in the article or uploaded as supplementary information.
Kaushal R, Shojania KG, Bates DW, et al. Effects of computerized physician order entry and clinical decision support systems on medication safety. Arch Intern Med 2003;163:1409–16. doi:10.1001/archinte.163.12.1409
Roshanov PS, Fernandes N, Wilczynski JM, et al. Features of effective computerised clinical decision support systems: meta-regression of 162 randomised trials. BMJ 2013;346. doi:10.1136/bmj.f657
Souza NM, Sebaldt RJ, Mackay JA, et al. Computerized clinical decision support systems for primary preventive care: a decision-maker-researcher partnership systematic review of effects on process of care and patient outcomes. Implementation Sci 2011;6:87–99. doi:10.1186/1748-5908-6-87
McGinn TG, McCullagh L, Kannry J, et al. Efficacy of an evidence-based clinical decision support in primary care practices: a randomized clinical trial. JAMA Intern Med 2013;173:1584–91.
Jamoom EW, Yang N, Hing E, et al. Adoption of certified electronic health record systems and electronic information sharing in physician offices: United States, 2013 and 2014. NCHS Data Brief 2016;236:1–8.
Borsky A, Zhan C, Miller T, et al. Few Americans receive all high-priority, appropriate clinical preventive services. Health Aff 2018;37:925–8. doi:10.1377/hlthaff.2017.1248
Lomas J, Sisk JE, Stocking B, et al. From evidence to practice in the United States, the United Kingdom, and Canada. Milbank Q 1993;71:405–10. doi:10.2307/3350408
McGlynn EA, Asch SM, Adams J, et al. The quality of health care delivered to adults in the United States. N Engl J Med 2003;348:2635–45. doi:10.1056/NEJMsa022615
May C, Finch T, Mair F, et al. Understanding the implementation of complex interventions in health care: the normalization process model. BMC Health Serv Res 2007;7. doi:10.1186/1472-6963-7-148
May CR, Mair F, Finch T, et al. Development of a theory of implementation and integration: normalization process theory. Implementation Sci 2009;4. doi:10.1186/1748-5908-4-29
May CR, Finch T, Ballini L, et al. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit. BMC Health Serv Res 2011;11. doi:10.1186/1472-6963-11-245
May CR, Mair FS, Dowrick CF, et al. Process evaluation for complex interventions in primary care: understanding trials using the normalization process model. BMC Fam Pract 2007;8. doi:10.1186/1471-2296-8-42
Murray E, Treweek S, Pope C, et al. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions. BMC Med 2010;8. doi:10.1186/1741-7015-8-63
Elwyn G, Légaré F, van der Weijden T, et al. Arduous implementation: does the normalisation process model explain why it's so difficult to embed decision support technologies for patients in routine clinical practice? Implementation Sci 2008;3. doi:10.1186/1748-5908-3-57
Feldstein DA, Hess R, McGinn T, et al. Design and implementation of electronic health record integrated clinical prediction rules (iCPR): a randomized trial in diverse primary care settings. Implementation Sci 2017;12. doi:10.1186/s13012-017-0567-y
Morrison D, Mair FS. Telehealth in practice: using normalisation process theory to bridge the translational gap. Prim Care Respir J 2011;20:351–2. doi:10.4104/pcrj.2011.00092
McEvoy R, Ballini L, Maltoni S, et al. A qualitative systematic review of studies using the normalization process theory to research implementation processes. Implementation Sci 2014;9. doi:10.1186/1748-5908-9-2
Joint Commission Resources Inc. Implementing and sustaining improvement in health care. Oak Brook, IL: Joint Commission Resources, 2009.
Scoville R, Little K, Rakover J, et al. Sustaining improvement. Cambridge, MA, 2016.
Henderson EJ, Rubin GP. The utility of an online diagnostic decision support system (Isabel) in general practice: a process evaluation. JRSM Short Rep 2013;4:31. doi:10.1177/2042533313476691
Gould DJ, Hale R, Waters E, et al. Promoting health workers' ownership of infection prevention and control: using normalization process theory as an interpretive framework. J Hosp Infect 2016;94:373–80. doi:10.1016/j.jhin.2016.09.015
Jones CHD, Glogowska M, Locock L, et al. Embedding new technologies in practice – a normalization process theory study of point of care testing. BMC Health Serv Res 2016;16. doi:10.1186/s12913-016-1834-3
Kanagasundaram NS, Bevan MT, Sims AJ, et al. Computerized clinical decision support for the early recognition and management of acute kidney injury: a qualitative evaluation of end-user experience. Clin Kidney J 2016;9:57–62. doi:10.1093/ckj/sfv130
Nordmark S, Zingmark K, Lindberg I, et al. Process evaluation of discharge planning implementation in healthcare using normalization process theory. BMC Med Inform Decis Mak 2016;16. doi:10.1186/s12911-016-0285-4
Finch TL, Rapley T, Girling M, et al. Improving the normalization of complex interventions: measure development based on normalization process theory (NoMAD): study protocol. Implementation Sci 2013;8. doi:10.1186/1748-5908-8-43
Finch TL, Girling M, May CR, et al. NoMAD: implementation measure based on normalization process theory [measurement instrument].