  • Systematic review
  • Open access

Hospital-based interventions: a systematic review of staff-reported barriers and facilitators to implementation processes

Abstract

Background

Translation of evidence-based interventions into hospital systems can provide immediate and substantial benefits to patient care and outcomes, but successful implementation is often not achieved. Existing literature describes a range of barriers and facilitators to the implementation process. This systematic review identifies and explores relationships between these barriers and facilitators to highlight key domains that need to be addressed by researchers and clinicians seeking to implement hospital-based, patient-focused interventions.

Methods

We searched MEDLINE, PsycINFO, Embase, Web of Science, and CINAHL using search terms focused specifically on barriers and facilitators to the implementation of patient-focused interventions in hospital settings. To be eligible, papers needed to have collected formal data (qualitative or quantitative) that specifically assessed the implementation process, as experienced by the staff involved.

Results

Of 4239 papers initially retrieved, 43 papers met inclusion criteria. Staff-identified barriers and facilitators to implementation were grouped into three main domains: system, staff, and intervention. Bi-directional associations were evident between these domains, with the strongest links evident between staff and intervention.

Conclusions

Researchers and health professionals engaged in designing patient-focused interventions need to consider barriers and facilitators across all three identified domains to increase the likelihood of implementation success. The interrelationships between domains are also crucial, as resources in one area can be leveraged to address barriers in others. These findings emphasize the importance of careful intervention design and pre-implementation planning in response to the specific system and staff context in order to increase likelihood of effective and sustainable implementation.

Trial registration

This review was registered on the PROSPERO database: CRD42017057554 in February 2017. 


Introduction

Health service interventions that are effectively implemented are associated with improved patient and staff outcomes and increased cost-effectiveness of care [1]. However, despite sound theoretical basis and empirical support, many interventions do not produce real-world change, as few are successfully implemented [2, 3], and fewer still are sustained long-term [4]. The ramifications of failed implementation efforts can be serious and far-reaching; the additional workload required by implementation efforts can add significant staff burden [3], which can reduce the quality of patient care and may even impact treatment efficacy if interventions disrupt workflow [5]. Additionally, staff who bear the burden of implementing new interventions may be reluctant to try alternatives if their first experience was unsuccessful [6]. A thorough understanding of the barriers and facilitators to implementation, as well as an ongoing assessment of the process of implementation, is therefore crucial to increase the likelihood that the process of change is smooth, sustainable, and cost-effective.

Implementation science focuses on factors that promote the systematic uptake of research findings and evidence-based practices into routine care [7]. A number of frameworks have been developed to describe and facilitate this process and can be classified into three main groups with the following aims: describing or guiding the process of translating research into practice (process models), understanding and/or explaining what influences implementation outcomes (determinant frameworks, classic theories, and implementation theories), and evaluating implementation (evaluation frameworks) [8]. As our review seeks to recognize the specific types of determinants that act as barriers and facilitators, we drew mostly from determinant frameworks such as the Promoting Action on Research Implementation in Health Services (PARiHS) framework [9] and the Consolidated Framework for Implementation Research (CFIR) [10]. The PARiHS framework highlights the importance of evidence, context, and facilitation [9], while the CFIR proposes five key domains of influence: inner setting, outer setting, characteristics of individuals, intervention characteristics, and process [10]. The focus of such frameworks is on understanding and/or explaining influences on implementation outcomes, and they are therefore often used by researchers and clinicians to plan their implementation, develop strategies to overcome barriers, and support successful delivery.

However, research in the field has also been impeded by the use of inconsistent language and inadequate descriptions of implementation strategies [11], an issue that has recently been addressed by the development of the Expert Recommendations for Implementing Change, which has resulted in a refined compilation of strategy terms and definitions [12]. In addition, recent reviews of commonly used strategies, such as nominating intervention champions, have found that they are not uniformly successful [13], suggesting that such approaches are not “one size fits all” and must instead be selected in line with the context and needs of the population. Therefore, there has been an increasing call to explore implementation frameworks by systematic review, in ways that not only identify barriers and facilitators but seek to explore the mechanisms underlying change, and the processes by which these barriers and facilitators relate to each other and to implementation success outcomes [3, 14] in the specific context in which they are trialed.

Hospitals are one such specific context, with unique populations, processes, and microsystems, which may encounter unique barriers [15]. Additionally, interventions within hospitals are often complex and multi-faceted and must contend with barriers across a wide range of settings. While systematic reviews have examined the hospital context with regard to integrated care pathways [16], no systematic review to date has focused on the implementation of patient-focused interventions in the hospital setting.

The current systematic review therefore had two key aims: first, to identify staff-reported barriers and facilitators to implementation of patient-focused interventions within the hospital context, and second, to define and explore relationships between these, in order to generate practical strategies that can assist tailoring to individual service needs. We also sought to explore the fit between existing frameworks and components of real-world implementation studies, to contribute to the growing evidence base for these frameworks and to identify those likely to be of most use to clinicians, researchers, and administrators in designing and conducting implementation studies.

Methods

Registration

This systematic review is registered on PROSPERO (registered 17 February 2017, registration number CRD42017057554) [17].

Search strategy

A search of the relevant databases (PsycINFO, MEDLINE, PubMed, Embase, CINAHL, and Web of Science) was conducted, with results limited to articles published up to 31 December 2016. A comprehensive list of search terms (see Additional file 1) was developed based on the terminology of the field and keyword lists of relevant papers (see summary in Table 1). Keywords that mapped to specific Medical Subject Headings for each database were selected to ensure an inclusive search approach. Returned search results were screened for duplicates. Ethical approval was not required for this review.

Table 1 Summary of database search terms

Eligibility criteria

A checklist of inclusion and exclusion criteria was developed to guide selection of appropriate studies (Table 2). During this process, all authors reviewed a sub-sample of articles (10%) to refine inclusion and exclusion criteria and ensure criteria could be consistently applied.

Table 2 Inclusion and exclusion criteria

A study was eligible for inclusion if (1) it was an original research study published in full, (2) it was hospital-based, (3) participants surveyed about the implementation were hospital staff, (4) the intervention involved direct patient care, and (5) it included formal collection of data from participating staff about barriers and facilitators to the implementation process.

No study design was excluded, but studies needed to meet all five criteria to be eligible. Only studies in English were assessed, and studies that could not be accessed in full (such as conference abstracts) were excluded, as there was insufficient detail to determine whether they met the additional exclusion criteria. We included studies that provided any formal data, quantitative (such as surveys and Likert ratings) or qualitative (such as interviews and focus groups), regarding implementation barriers and facilitators either anticipated pre-implementation or encountered during implementation. In assessing eligibility, included studies were required to have collected formal data related to the implementation specifically, rather than the intervention itself [11]. The need to separate assessment of implementation processes from interventions has been highlighted in the recent Standards for Reporting Implementation Studies (StaRI), which note that this distinction is crucial for allowing researchers to identify the key components that lead to effective translation of evidence into practice [18]. Therefore, our analysis focused solely on papers which identified the barriers and facilitators that affect the implementation process, rather than the intervention. This meant that all papers that reported only data about the intervention outcomes (including effectiveness data) were not considered eligible. Interventions were defined as being focused on patient care if they had either direct patient contact (such as patient-targeted behavioral interventions) or had a direct impact on patient outcomes (such as quality and safety interventions). Some studies retrieved dealt exclusively with introducing electronic records; these were not included as they had no patient-centered focus. Further detail on exclusion and examples of excluded papers for each eligibility criterion are provided in Additional file 2.

Several theories and taxonomies have been proposed to guide measurement of success that include issues of uptake, penetration, cost-effectiveness, and sustainability [19]. However, very few identified studies used a theory or framework to guide their definition of success. Therefore, for the purposes of this review, we used the barometer of success defined by each individual study.

Study selection process

Decisions regarding eligibility were made by LG and verified by co-authors. Studies were initially screened by title and abstract; the remaining articles underwent a full-text analysis. All studies were initially reviewed by the first author (LG), with a subset of articles (10%) also subject to team review to assure consistency. No formal analysis of agreement was carried out for this stage of study selection, as any disagreements were resolved by iterative discussion until consensus was reached.

Data extraction and analysis of included articles

For all included articles, we collected descriptive information comprising author, date of publication, participant group, and study design. To extract and synthesize data on barriers and facilitators, we used the Framework Analysis approach [20] and generated a data abstraction matrix to organize and display content.

Qualitative synthesis was accomplished in a series of stages as follows: (1) reviewing a subset of the included articles to familiarize the research team with the literature base, (2) deriving a series of codes and subcodes that reflected key concepts within the data, (3) developing these concepts into an overarching thematic framework of categories, and (4) systematically indexing each article according to the framework, entering summary data (quantitative studies) and verbatim quotes (qualitative studies) into the cells of the matrix. Initial codes were generated by the first author and were refined together by the team in a series of iterative reviews, to ensure clarity and synthesis of data [21].

Given the unique context being explored, we took this inductive approach rather than starting from an existing theoretical framework, as it allowed us to see which factors arose in real-world studies rather than imposing a predefined structure on the data.
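The indexing step of the Framework Analysis described above can be illustrated with a minimal sketch of the data abstraction matrix: rows are articles, columns are framework categories, and each cell accumulates summary data or quotes. The article labels and cell entries below are hypothetical placeholders, not data from the review; only the three top-level domains (system, staff, intervention) come from the findings.

```python
# Hypothetical sketch of a Framework Analysis data abstraction matrix.
DOMAINS = ["system", "staff", "intervention"]  # top-level domains from the review

def new_matrix(articles):
    """One row per included article, one empty cell per framework domain."""
    return {article: {domain: [] for domain in DOMAINS} for article in articles}

def index_finding(matrix, article, domain, entry):
    """File a summary datum (quantitative) or verbatim quote (qualitative) in its cell."""
    if domain not in DOMAINS:
        raise ValueError(f"'{domain}' is not in the thematic framework")
    matrix[article][domain].append(entry)

# Hypothetical article labels and entries, for illustration only.
matrix = new_matrix(["Article A", "Article B"])
index_finding(matrix, "Article A", "staff", "quote: lack of time to deliver screening")
index_finding(matrix, "Article A", "system", "survey: workload cited by most respondents")
```

Cells start empty and fill as each article is indexed, so unpopulated cells also make gaps in the evidence base visible at a glance.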

Quality assessment

We used the Critical Appraisal Skills Programme (CASP) checklist [22] for qualitative studies and the Mixed Methods Appraisal Tool (MMAT) [23] for quantitative and mixed methods studies. These were selected because they have an extensive scoring guide, sound psychometric properties, capture a range of key components of qualitative research (CASP), and specifically assess both quantitative descriptive and mixed methods research (MMAT).

Quality assessment was based on the implementation data provided, rather than the overall study data. All papers were reviewed against these checklists (LG), and a subset of papers (6) were reviewed by a second author (NR) to assess for agreement. We defined agreement as the proportion of items where both raters gave a positive (yes) or a negative (cannot tell, no) score. A formal analysis of agreement was carried out based on Cohen’s Kappa for inter-rater reliability, and scores varied from 0.45 to 0.61 between raters, indicating moderate to substantial agreement according to Landis and Koch’s standards [24]. Discrepancies were resolved through iterative discussions.
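The agreement statistics above can be reproduced with a short script. This is a minimal sketch assuming two raters scoring the same checklist items, with scores collapsed to binary ("yes" versus "no/cannot tell") as in the agreement definition; the ratings shown are hypothetical, not the review's data.

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Proportion of items on which both raters gave the same score."""
    assert len(r1) == len(r2) and r1, "ratings must be paired and non-empty"
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement,
    where chance is estimated from each rater's marginal score frequencies."""
    n = len(r1)
    observed = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical checklist scores from two raters over ten items.
rater1 = ["yes", "yes", "no", "yes", "no", "yes", "no", "yes", "yes", "no"]
rater2 = ["yes", "no", "no", "yes", "no", "yes", "yes", "yes", "yes", "no"]
print(percent_agreement(rater1, rater2))  # 0.8
print(round(cohens_kappa(rater1, rater2), 3))  # 0.583 — "moderate" per Landis and Koch
```

Note that kappa is lower than raw agreement because it discounts the agreement expected by chance; by the Landis and Koch benchmarks cited above, 0.41-0.60 is moderate and 0.61-0.80 substantial.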

Results

Included studies

Of the 4239 articles identified, 43 met the inclusion criteria (see Fig. 1). Study characteristics are reported in Additional file 3.

Fig. 1
figure 1

PRISMA flow diagram of study selection process. Some papers were excluded on more than one criterion, therefore total excluded N > 3684

Study characteristics

Study origin

Studies were largely based in developed countries, including the USA (12), the UK (8), Canada (6), Australia/New Zealand (6), Denmark (2), Sweden (1), Finland (1), Italy (1), and the Netherlands (1). The remaining studies originated in Uganda (1), South Africa (1), Tanzania (1), Ghana (1), and Mexico (1).

Study designs

Studies were predominantly cross-sectional in design (n = 41), with only two using a longitudinal design.

Participants

Participant response reporting varied as some interventions were carried out at the macro-level (e.g., across several hospitals) and some at the micro-level (e.g., a pilot in a single ward). Some studies reported exact numbers (n = 2 to 132) while others only included the number of hospitals participating (n = 1 to 38). Participant type was also reported inconsistently, with some studies specifying only that interviews were carried out with “project participants,” while others specified respondent type (e.g., nurses, clinical specialists, allied health professionals, and administrators).

Methods

The majority of studies (n = 37) used qualitative methods exclusively, three used mixed methods, and three used quantitative methods exclusively. Semi-structured interviews were the most common data collection strategy (in both qualitative and mixed methods studies), followed by focus groups, audit, and observation. Quantitative and mixed methods studies used questionnaires (designed for the study) or validated measures.

Types of implementation

There was great variation in the implementation of interventions and the health states targeted, as shown in Tables 3 and 4.

Table 3 Population health states targeted in included studies
Table 4 Intervention approach in included studies

Explicit use of conceptual theory or framework

Fewer than half of the studies (n = 16) reported using theory to guide their implementation; the most common were the Theoretical Domains Framework, the PARiHS framework, the Realist Evaluation framework, and the Contingency Model.

Reporting of barriers and facilitators

Most studies focused explicitly on barriers and facilitators to implementation (n = 28), while the remaining 15 reported barriers and facilitators as secondary data (with a primary focus on the effectiveness or outcomes of the intervention).

Study quality

Studies focusing on implementation processes often had a quality improvement or action research focus that did not clearly align with any of the major checklists and therefore failed to address some criteria. Where implementation data about barriers and facilitators was a secondary focus, reporting on these issues was of lower quality, despite overall high-quality reporting on other outcomes. Areas of poorer quality included a lack of detail on data collection methods, participants, response rates, and representativeness (Table 5). Few researchers discussed reflexivity, despite increasing recognition that research teams are likely to affect implementation processes [25, 26].

Table 5 Quality checklist criteria

Key findings of barriers and facilitators to implementation

Qualitative synthesis identified 12 distinct categories of barriers or facilitators, which were grouped into three main domains: system, staff, and intervention. Each domain was associated with clear sub-domains, as shown in Table 6. The detail about each domain is presented, with illustrative quotes, in Table 7.

Table 6 Identified barriers and facilitators to implementation
Table 7 Identified domains and quotes from included studies

System level barriers and facilitators

Environmental context

Barriers directly related to the hospital environment included workload and workflow, physical structure, and resources. Staff workload and lack of time for implementation were the most commonly cited barriers [27,28,29]. Staff shortages, high staff turnover, or changes in roster compounded this issue [30], resulting in burden for implementation falling on small numbers of staff who were most interested, rather than generating change at the institution level [31]. Several studies targeted this issue by hiring additional staff, such as a research coordinator [32], or delegating parts of the intervention to the research team. However, this was dependent on research team capacity and funds; sustainability of these strategies after the research team left was not addressed [32]. In contrast, support provided at the institutional level for staff to have time for implementation was believed to be a more sustainable facilitator [6].

Implementation processes were also stymied by systemic workflow organization and staff movement [33]. Hospital workflow around division of responsibilities, transfer of work between shift-working staff, and systems of care governing how and when patients were seen during changeover periods often resulted in inconsistent implementation or significant gaps in the process [5]. Movement of staff across multiple roles or areas of the site reduced staff knowledge of the intervention, and movement of patients made consistency in the implementation process challenging [34].

The physical structure of the hospital site created barriers to implementation, such as lack of private space for interventions requiring sensitive discussion [35, 36]. Implementation involving IT innovations often faced barriers related to the hospital's ability to accommodate new systems [6]. A final barrier was the sheer number of interventions being introduced in hospital wards, which resulted in staff reporting fatigue toward new initiatives [6] or feelings of tension when juggling hospital priorities alongside intervention goals [37, 38].

Culture

Barriers related to workplace culture centered on system-level commitment and change readiness. Low levels of commitment often occurred in response to structural changes, such as high turnover, which left staff feeling demoralized and unable to accept the additional challenges required by implementing the intervention [30]. Support from management regarding the importance of change and organization-level commitment to new processes was crucial to combating this [38,39,40]. Several interventions also used champions or coordinators to facilitate motivation [39], although some staff reported experiencing negativity from colleagues as a barrier to carrying out this role effectively [27].

Workplace culture barriers also included the level of role flexibility and trust between different clinicians involved in the intervention. Congruence between the intervention requirements and staff roles was important [27]. Staff who reported that implementation required them to carry out duties beyond their role reported struggling, especially if they anticipated judgment from colleagues [41]. However, other respondents felt that building trust across the team could address these concerns [41].

Communication processes

The efficacy of communication processes emerged as the third system-level factor, particularly where interventions required collaboration between staff of different disciplines [20, 42]. Lack of interdepartmental collaboration, miscommunication, and fragmentation between practitioners could serve as a significant barrier to successful implementation [28, 43]. Study environments that promoted open and clear communication motivated staff to take on challenges and feel safe about reporting errors or issues, resulting in more successful implementation [44].

External requirements

The final system-level domain related to external pressures such as pending audits, accreditation requirements, or assessments by an external body. These were strong influencers of motivation and commitment to the intervention [44], particularly if perceived as contributing to better institutional outcomes. The perception of external obligations alone was considered a source of motivation, as it encouraged management support for staff who were trying to implement the intervention [37]. Participants noted that embedding the implementation in hospital policy or standards was a strong facilitator of lasting change [6].

Staff level barriers and facilitators

Staff commitment and attitudes

While system domains focused on the overall structure and culture, staff domains were more focused on the individual, and the experiences, motivations, and beliefs of those staff directly involved with carrying out the intervention. Commitment and motivation were identified as the first staff-level factors, clearly influenced by staff attitudes regarding the proposed intervention, which directly impacted their engagement with the implementation process. In some instances, participants questioned intervention validity, for example, whether patients would respond honestly to screening [31] and whether the intervention would have any real effect on behavioral change [43]. Lack of belief in the intervention was associated with variability in adherence to intervention guidelines, causing a barrier to successful implementation [34]. Equally, if staff felt they were already equipped to address the issue targeted by the intervention, they were less likely to adopt the changes required to achieve full implementation [45].

Change readiness levels of individual staff also influenced commitment; even in cases where the overall culture was positive, individual clinicians were not always responsive to new ways of doing things, in part due to feelings of losing control in their role, or feeling that they were forced to make changes [45]. To combat this, several studies noted the impact of sharing informal intervention “success stories” in shifting staff morale and openness to change [32, 46]. A sense of ownership, and a belief in the process, was another key facilitator and was more likely to occur when staff felt engaged in the process of implementation [6, 28].

Understanding and awareness

Staff knowledge of the aims and process of the intervention was key to ensuring effective implementation. Misinterpretation of the intentions or meaning of interventions could trigger unnecessary resistance toward the implementation [37]. Confusion or disregard of intervention processes could also impact implementation, as it meant that staff did not follow procedure [35]. In some instances, this lack of awareness was addressed via additional training and education [34, 37]. Where an intervention did require additional work or resources, it was important that staff understood that it would lead to longer term positive outcomes and reduction in overall burden [38, 45].

Role identity

Motivation to adopt changes required for implementation was often decreased when staff felt the intervention was not part of their role [22] or experienced confusion regarding who should fulfill the role [6]. Where interventions called for staff to go beyond their previous role, this could also create resistance or hesitation [32]. However, role responsibility was likely to be increased in situations where participants felt a sense of duty or obligation to the intervention [47].

Skills, abilities, and confidence

In cases where the intervention required staff to implement a new approach, lack of confidence or ability proved a significant barrier, with staff who reported lower skills expressing greater resistance to the implementation [31, 41]. Participants at times felt ill-equipped to carry out the tasks of the intervention, particularly if it required skills in an area they felt they had not been trained for [31]. Participants also felt under-resourced or unable to overcome a range of patient-related barriers to the intervention, such as engaging challenging populations on difficult topics (e.g., substance use) [35]. Participants who felt they had the skills to engage and build rapport with patients described this ability as a facilitator to change [31]. Ability to carry out the intervention was further impacted by stress and time management challenges [27]. Participants at times reported that their level of responsibility was unmanageable [32], expressing concerns about the potential for burnout [29], or that the physical care of the patients needed to be prioritized over the implementation [38, 48]. However, where an intervention led to greater consistency of practice, this was reported as a facilitator, leading to increased ability and decreased stress overall [41].

Intervention level barriers and facilitators

Ease of integration

Interventions that fitted the existing hospital system and ways of working were more likely to be reported as successful [49], while interventions that required change to standard processes were more likely to report delays and gaps in implementation processes [50]. However, these issues could be overcome in interventions that were flexible and iterative, such as those that engaged in ongoing tailoring and review [50]. The use of action research methods and frameworks facilitated this process, enabling researchers to respond to concerns and make timely amendments to the intervention [34].

Intervention complexity often made integration more challenging. Where interventions required new operating systems, IT functionality and accessibility issues were commonly reported [51, 52]. Complexity also related to intervention design: interventions that involved multiple health professionals across a range of contexts increased the likelihood of delays and miscommunications [49]. Similarly, interventions involving additional forms or screening tools created extra work for staff and made errors in process more likely. This issue could often be targeted by simplifying forms and tools to make the process more user-friendly [34, 50]. Interventions that were perceived as simple and accessible were more likely to receive positive endorsement and greater engagement with the implementation process [37].

Acceptability and suitability of an intervention to system, staff, and patient influenced how easily it was integrated. Sometimes, the intervention did not suit the system, requiring staff to seek out patients normally seen in a different part of the hospital [35]. Staff sometimes identified a particular intervention was better suited to a different setting, where greater needs existed [45]. The cost and resources required by an intervention, in terms of work, time and stress, also influenced acceptability, and were often cited as reasons for withdrawing from, or having negative feelings toward, the implementation process [45]. Finally, acceptability of the intervention to the patient was key to integration; staff encountered barriers where patients perceived that the intervention was not relevant, such as in the case of lifestyle change interventions [47] or screening for problem drinking [35]. Patient populations were often highly complex and did not suit the straightforward pathways or interventions proposed [45]. Staff highlighted the importance of considering this in the pre-implementation design phase [6].

Face validity and evidence base

Many participants expressed concerns about the evidence base of interventions, and this was frequently cited as a barrier to implementation. Communicating the evidence and making it accessible to staff was considered a key facilitator, as lack of awareness of the evidence was commonly reported [49]. When participants felt confident about the evidence and the intervention rationale, this increased motivation to support the implementation overall [6, 28].

Safety, legal, and ethical concerns

Many participants raised concerns about intervention safety, particularly where change of care was required. Participants raised this as a barrier when they were asked to deliver information they did not agree with [53]. Conversely, an intervention perceived as leading to potentially decreased risks and improved care was seen as a facilitator [41]. Ethical issues concerning patient well-being and patient confidentiality were sometimes raised. For example, when interventions required shared platforms, participants noted that confidentiality relating to user privacy needed to be considered and that patient awareness of the shared platform could influence information disclosed [51]. Concerns regarding legality and fear of litigation were also commonly cited barriers when interventions called for changes in roles and responsibilities [44, 54]. Concerns about safety meant that staff were less likely to endorse or fully participate in the implementation [53].

Supportive components

Training, awareness raising, audit/feedback, and engagement with end users could all serve as barriers or facilitators. Lack of training and awareness of intervention processes was seen as a key barrier, and in cases where staff turnover was high, regular in-services were noted as crucial to facilitate implementation [5]. Repeated training and awareness campaigns were seen as necessary to reinforce new processes and behavioral patterns [44], although access and time to attend training, along with availability of professional support, were common challenges [40]. These awareness-raising activities were perceived as most useful when they highlighted the evidence and need for the intervention, as well as the likely benefits to staff and patients [6].

The importance of regular audit, such as real-time monitoring of admissions to ensure fidelity, was also reported as helpful to the implementation success [55]. These strategies were also associated with improved motivation and demonstrated the utility of the intervention [28, 44]. Finally, participants highlighted the importance of engaging with the intervention end users (i.e., themselves and their colleagues) to facilitate the process of implementation in a way that was acceptable, appropriate, and sustainable [6]. Studies which had adopted models of iterative implementation, such as participatory action research, reported greater engagement from end users [34].

Reported frequency of barriers and facilitators

The number of studies reporting barriers and facilitators in each domain is shown in Table 6. The most commonly reported domains impacting implementation success were environmental barriers at the system level, staff commitment and attitudes toward the intervention at the staff level, and supportive components at the intervention level. We note that these are only the most commonly reported barriers, which does not mean they are the most critical or important. It does, however, convey a sense of the issues most likely to arise when carrying out patient-focused interventions in the hospital setting.

Links and relationships between domains

In addition to the above domains influencing implementation success directly, associations between domains were also identified, in which facilitators from one domain were able to impact barriers in other domains (Fig. 2). This occurred most clearly at the staff level, which was easily responsive to intervention level barriers, and also highly susceptible to changes at the system level. This association was reciprocal, with staff barriers shaping elements of the intervention itself, particularly where the intervention was responsive to end user involvement [34, 48]. Staff could also impact system level barriers, providing feedback that led to changes in organizational culture and communication processes.

Fig. 2
figure 2

Bi-directional associations between key domains

Intervention domains were also responsive to system domains, particularly in times of deficiency, when the environment lacked concrete resources or a supportive workplace culture. Interventions sought to address this by increasing internal support (via additional staff or engagement meetings) [32] and by ensuring ease of integration (flexibly altering intervention components where possible) [50]. Similarly, system domains could raise significant barriers if the intervention had not foreseen and addressed them or could not respond flexibly. This was noted in cases where hospitals underwent staffing changes, renovations, or procedural changes, meaning the intervention could not proceed as anticipated or could not be sustained [30, 34].

Associations also appeared to move in cycles, where the system might influence the staff, which in turn influenced the intervention, which in response sought to influence the system. Thus, the process was continually dynamic and iterative, explaining why interventions could fail for many different reasons, even with the best grounding in theory and planning. Our findings suggest that implementation success is not simply about selecting and delivering strategies but about reflexive awareness of emergent influences that arise from the complex microcosm of the hospital environment. A clear understanding of this ever-evolving process, which includes frequent checking in with the staff and system as an in-built part of the process, is therefore key to a sustainable intervention and its implementation.

Discussion

This systematic review of staff-reported barriers and facilitators to implementation of hospital-based, patient-focused interventions highlights two crucial pieces of information for researchers, policy-makers, and health service staff. First, there are key domains that must be considered to support effective implementation in hospital settings; second, the interrelationships between these domains can be leveraged to address barriers and amplify facilitators. Our analysis indicated three overarching domains that influence the implementation process: system, staff, and intervention. The evidence of distinct domains and their interrelationships confirms prior research and theory showing that implementation success is influenced by a dynamic range of barriers and facilitators. While the wide range of relevant sub-domains may seem overwhelming, it can also be empowering, as it highlights the many avenues through which researchers, health service staff, administrators, and managers can positively shape intervention design and implementation strategies. Each of the three main domains had a significant influence on implementation success; below, we discuss each in turn, describe their interrelationships, and reflect on directions for future research.

Barriers within the system domain confirmed the importance of understanding the broader organizational context, an issue that has been raised frequently in implementation research to date [56, 57]. The influence of these macro-level barriers was particularly evident in studies that described implementation across different hospital contexts [31, 58, 59]. These studies all showed that while the intervention design and processes were the same across sites, the cultures of each site were vastly different and faced their own unique barriers and enablers. Those interventions that responded to the hospital context and worked toward ease of integration were more likely to be reported as successful, in terms of adherence, acceptability, and sustainability [58]. Therefore, a thorough understanding of the system in which an intervention will be implemented can assist in intervention design. Several studies carried out barrier analyses relating to the organization prior to implementation, commonly using qualitative interviews or informal meetings. No studies identified in this review used validated measures for pre-assessment of organizational or staff level barriers. Recent research has generated a range of validated measures to assess organizational context including the Organizational Readiness for Implementing Change (ORIC) [60], the Organizational Readiness to Change Assessment (ORCA) [61] and Alberta Context Tool (ACT) [62]. Use of these measures, in conjunction with early-stage interviews and feedback from key stakeholders, may provide useful information on the context and highlight system level challenges that need to be addressed, potentially through intervention modification or tailored implementation strategies.

Barriers within the staff domain highlighted challenges at the micro level, including motivation toward change, personal beliefs regarding the intervention, understanding of the end-goals and outcomes, and level of skill and confidence. This demonstrates the need for implementation researchers to take the time to understand staff engagement and beliefs about the intervention and to generate specific strategies to address existing barriers. Studies in this review used a range of strategies to engage staff, including involvement in intervention development, targeted education and training to support and build confidence, and integration of ongoing feedback and regular contact to continually address concerns and provide a forum for staff to share experiences [5, 6, 44]. Recognizing staff as a dynamic and central factor in intervention design, implementation and maintenance is therefore likely to be crucial to ongoing sustainability.

Finally, intervention factors were consistently reported to play a strong role in implementation success. Almost every study named barriers encountered in relation to the intervention itself. These were fairly consistent, with issues of ease of integration, face validity, safety/legality, and supportive strategies commonly reported across the wide range of interventions reviewed. While much implementation science research has focused on contextual factors such as system and staff influences, recent research has highlighted the important role that intervention design plays in implementation processes [63]. Frameworks such as the CFIR [10] outline a range of facets of the intervention and its delivery process that should be considered, and our findings support this focus. Awareness of barriers is especially important in the design and delivery of complex, multi-faceted interventions, which are commonly implemented in hospital settings. Implementation of clinical pathways, patient-focused care initiatives, and evidence-based practice guidelines frequently engages multiple health disciplines and may demand changes at the process and system levels, in contrast to current practice. Implementing change can be demanding on staff and health services, and interventions that are flexible and engage with the needs of end users are likely to produce better outcomes [57]. Therefore, researchers should consider intervention design carefully and place more emphasis on pilot testing interventions to demonstrate feasibility and acceptability prior to full-scale implementation.

This review also provided novel insights into the associations between system, staff, and intervention domains, with each domain able to influence the others. Links between barriers across domains were more clearly recognized and more consistently addressed by studies that reported using a theory or framework to guide their implementation [34]. This is likely due to the iterative review and reflection encouraged by most frameworks in the field. Interventions with inbuilt flexibility, which allowed for ongoing change and tailoring, created greater opportunities to introduce strategies and respond to unforeseen challenges. These findings can assist researchers, health service staff, administrators, and managers developing interventions to directly assess the challenges posed by context or culture and to respond by tailoring their intervention where possible.

In undertaking this systematic review, we gave consideration to the relative benefits and detriments of inductive versus deductive analysis. Given the hospital context, and recognition of the systematic review as an iterative process [21], we elected to use an exploratory approach in order to remain open to the factors that might emerge from real-world studies within hospital settings. In line with our secondary aims, we recognized that a breadth of determinant frameworks already exists, and it was useful to compare our findings against these frameworks to explore similarities and differences. The three key domains identified in this review reinforce the use of theory-based frameworks to guide and support hospital-based implementation, as the factors outlined by such frameworks were clearly borne out in this real-world data. Our findings also add to the usefulness of existing frameworks, extending the PARiHS framework by highlighting the important role of intervention factors, and the CFIR by casting light on the associations between domains. Our domains showed significant overlap with the five domains of the CFIR. However, it was at times challenging to decide where specific barriers from the studies we reviewed would best fit within pre-defined framework domains. For example, due to the limited information provided in some studies, it was sometimes unclear where a barrier fit within the CFIR sub-domains; this applied when trying to determine the role of an individual involved in engagement, as studies did not always provide sufficient detail to code this barrier as an “opinion leader” versus a formally appointed “champion.” This type of fine-grained differentiation may be most relevant where nuanced distinctions might influence the selection of implementation strategies at the development stage.

With five domains and 39 constructs, the CFIR provides a more nuanced conceptualization of factors impacting implementation success and therefore provides a means of expanding and exploring in more depth the domains identified in our analysis. In contrast, our review generated a simplified view of factors, which may be more pragmatic for busy hospital environments. In real world research, it is clear that at some points, pragmatism is required, while at other times, a more detailed understanding is needed, and this is a constant balance for implementation scientists.

We acknowledge that this review has some limitations. While every attempt was made to screen widely and inclusively, the indexing of implementation studies is inconsistent, and it is possible that some eligible studies were missed. Papers written in languages other than English were excluded, and 39 of the 43 studies were conducted in developed countries; the findings may therefore be of less relevance to hospital-based implementation in developing nations. The quality of studies was variable, and some involved very small sample sizes. The majority of studies collected qualitative data and at times did not provide significant detail about the interview methods or data analysis. Finally, in choosing to include only original research published in full, we may have been unable to include some of the newest emerging research in the field (e.g., conference abstracts). There is significant debate about the exclusion of grey literature and unpublished research from systematic reviews, and in choosing to exclude this research, there is a risk of publication bias in the findings presented [64].

Despite this, our review highlights knowledge gaps and areas for future study in the context of hospital-based implementation. Many studies published implementation results shortly after implementation, so questions about sustainability remain unanswered. This is supported by a recent scoping review by Tricco and colleagues, which showed that very few studies publish results about sustainability [65]. It is unclear whether the barriers and facilitators identified in this review will impact long-term sustainability, and further research focused on the longer-term processes of change is warranted. Our review also noted significant variability in the definitions of, and/or the outcomes used to assess, “implementation success” across studies. This variability makes it hard to assess the generalizability of findings or to make broader comparisons across studies. A greater focus on outcomes with clearer definitions of successful implementation, such as the taxonomy proposed by Proctor et al. [19], would assist researchers to generate findings that can be more easily evaluated. In addition, while there has been a proliferation of studies focused on the introduction of new interventions in recent years, we found that a significant proportion of the papers identified in our initial search addressed the implementation process only anecdotally, without the collection of any formal data. The inclusion of formal assessments of the implementation process in future research will greatly add to the body of knowledge about the specific factors that influence successful translation of evidence into practice. Finally, the need to build flexibility into interventions emerged as a key facilitating factor; however, the balance between flexibility and fidelity is an ongoing challenge in the field. Cohen et al. [66] highlight the importance of clarity in research design and reporting regarding which elements of the intervention are adapted, to increase understanding of these processes. The StaRI guidelines suggest that this issue can be explored by differentiating between core components of the intervention, to which fidelity is required, and components or strategies that may be adapted by local sites to support effective implementation [18]. Adhering to these recent guidelines when reporting results will help to improve the quality of reporting and generate results that can be more clearly understood and used by others in the field.

Conclusions

Our findings have clear practical implications for researchers and health service staff seeking to develop and implement feasible and acceptable interventions in hospital settings. They highlight the need to treat staff and system domains as active components in the change process, rather than imposing change upon them. An ongoing process of reflection and evaluation is indicated, with staff, including those at administrative and managerial levels, engaged early in intervention design and through regular dialog during pilot testing and full-scale delivery of the intervention. Implementation scientists may benefit from reflecting on the interrelationships between the three domains identified in this review, to understand the bidirectional associations between domains within the hospital setting. The greater our understanding of these associations, the more likely we are to implement interventions that are meaningful, acceptable, and positively impact health outcomes.

References

  1. Grol R, Grimshaw J. From best evidence to best practice: effective implementation of change in patients’ care. Lancet. 2003;362:1225–30.

  2. Grol R. Successes and failures in the implementation of evidence-based guidelines for clinical practice. Med Care. 2001;39:S46–54.

  3. Grimshaw J, Eccles M, Tetroe J. Implementing clinical guidelines: current evidence and future implications. Journal of Continuing Education for Health Professionals. 2004;2

  4. Gould D, Moralejo D, Drey N, Chudleigh J. Interventions to improve hand hygiene compliance in patient care. Cochrane Database Syst Rev. 2010;8

  5. Wajanga BK, Peck RN, Kalluvya S, Fitzgerald DW, Smart LR, Downs JA. Healthcare worker perceived barriers to early initiation of antiretroviral and tuberculosis therapy among Tanzanian inpatients. PLoS One. 2014;9

  6. Rankin NM, Butow PN, Thein T, Robinson T, Shaw JM, Price MA, et al. Everybody wants it done but nobody wants to do it: an exploration of the barrier and enablers of critical components towards creating a clinical pathway for anxiety and depression in cancer. BMC Health Serv Res. 2015;15

  7. Eccles MP, Mittman BS. Welcome to implementation science. Implement Sci. 2006;1

  8. Nilsen P. Making sense of implementation theories, models and frameworks. Implement Sci. 2015;10

  9. Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence-based practice: a conceptual framework. Quality in Health Care. 1998;7:149–58.

  10. Damschroder L, Aron D, Keith R, Kirsh S, Alexander J, Lowery J. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4 https://doi.org/10.1186/1748-5908-4-50.

  11. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implement Sci. 2013;8 https://doi.org/10.1186/1748-5908-8-1.

  12. Powell BJ, McMillen JC, Proctor EK, Carpenter CR, Griffey RT, Bunger AC. A compilation of strategies for implementing clinical innovations in health and mental health. Med Care Res Rev. 2012;69 https://doi.org/10.1177/1077558711430690.

  13. McCormack B, Rycroft-Malone J, DeCorby K, Hutchinson AM, Bucknall T, Kent B, et al. A realist review of interventions and strategies to promote evidence-informed healthcare: a focus on change agency. Implement Sci. 2013;8:107. https://doi.org/10.1186/1748-5908-8-107.

  14. Hack TF, Carlson L, Butler L, Degner LF, Jakulj F, Pickles JT, et al. Facilitating the implementation of empirically valid interventions in psychosocial oncology and supportive care. Support Care Cancer. 2011;19:1097–105. https://doi.org/10.1007/s00520-011-1159-z.

  15. Squires JE, Estabrooks CA, Scott SD, Cummings GG, Hayduk L, SHK, et al. The influence of organizational context on the use of research by nurses in Canadian pediatric hospitals. BMC Health Serv Res. 2013;13 https://doi.org/10.1186/1472-6963-13-351.

  16. Allen D, Gillen E, Rixson L. Systematic review of the effectiveness of integrated care pathways: what works, for whom, in which circumstances? International Journal of Evidence Based Healthcare. 2009;7:61–74. https://doi.org/10.1111/j.1744-1609.2009.00127.x.

  17. Geerligs L, Rankin N, Shepherd H, Butow P. Hospital-based interventions: a systematic review of barriers and facilitators to support successful implementation. http://www.crd.york.ac.uk/PROSPERO/display_record.asp?ID=CRD42017057554. PROSPERO; 2017.

  18. Pinnock H, Barwick M, Carpenter CR, Eldridge S, Grandes G, Griffiths CJ. Standards for reporting implementation studies (StaRI) statement. The BMJ. 2017;356

  19. Proctor E, Silmere H, Raghavan R, Hovmand P, Aarons G, Bunger A, Hensley M. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Admin Pol Ment Health. 2011;38:65–76.

  20. Cranwell K, Polacsek M, McCann TV. Improving mental health service users’ with medical co-morbidity transition between tertiary medical hospital and primary care services: a qualitative study. BMC Health Serv Res. 2016;16 https://doi.org/10.1186/s12913-016-1567-3.

  21. Moher D, Liberati A, Tetzlaff J, Altman DG, The PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. PLoS Med. 2009;6

  22. Critical Appraisal Skills Programme (CASP) Oxford: CASP; 2014 [cited 2017 12.1.17]. Available from: http://media.wix.com/ugd/dded87_29c5b002d99342f788c6ac670e49f274.pdf.

  23. Pluye P, Gagnon MP, Griffiths F, Johnson-Lafleur J. A scoring system for appraising mixed methods research, and concomitantly appraising qualitative, quantitative and mixed methods primary studies in Mixed Studies Reviews. Int J Nurs Stud. 2009;46:529–46.

  24. Landis J, Koch G. The measurement of observer agreement for categorical data. Biometrics. 1977;33:159–74.

  25. Mays N, Pope C. Assessing quality in qualitative research. BMJ: British Medical Journal. 2000;320:50–2.

  26. Malterud K. Qualitative research: standards, challenges, and guidelines. Lancet. 2001;358.

  27. McAteer J, Stone S, Fuller C, Michie S. Using psychological theory to understand the challenges facing staff delivering a ward-led intervention to increase hand hygiene behavior: a qualitative study. Am J Infect Control. 2014;42:495–9.

  28. Nithianandan N, Gibson-Helm M, McBride J, Binny A, Gray KM, East C, et al. Factors affecting implementation of perinatal mental health screening in women of refugee background. Implement Sci. 2016;11 https://doi.org/10.1186/s13012-016-0515-2.

  29. Kane JC, Adaku A, Nakku J, Odokonyero R, Okello J, Musisi S, et al. Challenges for the implementation of World Health Organization guidelines for acute stress, PTSD, and bereavement: a qualitative study in Uganda. Implement Sci. 2016;11:36.

  30. Jones A. Implementation of hospital care pathways for patients with schizophrenia. J Nurs Manag. 2000;8:215–25. https://doi.org/10.1046/j.1365-2834.2000.00175.x.

  31. Hughes R, Aspinal F, Addington-Hall JM, Dunckley M, Faull C, Higginson I. It just didn’t work: the realities of quality assessment in the English health care context. Int J Nurs Stud. 2004;41:705–12.

  32. Nollen C, Drainoni ML, Sharp V. Designing and delivering a prevention project within an HIV treatment setting: lessons learned from a specialist model. AIDS Behav. 2007;11:S84–94.

  33. Kahan D, Leszcz M, O'Campo P, Hwang SW, Wasylenki DA, Kurdyak P, et al. Integrating care for frequent users of emergency departments: implementation evaluation of a brief multi-organizational intensive case management intervention. BMC Health Serv Res. 2016;16 https://doi.org/10.1186/s12913-016-1407-5.

  34. Moody G, Choong YY, Greenwood J. An action research approach to the development of a clinical pathway for women requiring Caesarean sections. Contemp Nurse. 2001;11:195–205.

  35. Sorsdahl K, Myers B, Ward C, Matzopoulos R, Mtukushe B, Nicol A, et al. Screening and brief interventions for substance use in emergency Departments in the Western Cape province of South Africa: views of health care professionals. Int J Inj Control Saf Promot. 2014;21:236–43.

  36. Clark JB, Sheward K, Marshall B, Allan SG. Staff perceptions of end-of-life care following implementation of the Liverpool care pathway for the dying patient in the acute care setting: a New Zealand perspective. J Palliat Med. 2012;15:468–73. https://doi.org/10.1089/jpm.2011.0375.

  37. Schmied V, Gribble K, Sheehan A, Taylor C, Dykes FC. Ten steps or climbing a mountain: a study of Australian health professionals’ perceptions of implementing the baby friendly health initiative to protect, promote and support breastfeeding. BMC Health Serv Res. 2011;11

  38. Kirk JW, Sivertsen DM, Petersen J, Nilsen P, Petersen HV. Barriers and facilitators for implementing a new screening tool in an emergency department: a qualitative study applying the Theoretical Domains Framework. J Clin Nurs. 2016;25:2786–97. https://doi.org/10.1111/jocn.13275.

  39. Bergh AM, Manu R, Davy K, Van Rooyen E, Quansah Asare G, Awoonor-Williams J, et al. Progress with the implementation of kangaroo mother care in four regions in Ghana. Ghana medical journal. 2013;47:57–63.

  40. Liisa AA, Marja-Terttu T, Paivi AK, Marja K. Health care personnel’s experiences of a bereavement follow-up intervention for grieving parents. Scand J Caring Sci. 2011;25:373–82.

  41. Smid M, Campero L, Cragin L, Hernandez DG, Walker D. Bringing two worlds together: exploring the integration of traditional midwives as doulas in Mexican public hospitals. Health Care for Women International. 2010;31:475–98.

  42. Ross F, O'Tuathail C, Stubberfield D. Towards multidisciplinary assessment of older people: exploring the change process. J Clin Nurs. 2005;14:518–29. https://doi.org/10.1111/j.1365-2702.2004.01085.x.

  43. Karlsson A, Johansson K, Nordqvist C, Bendtsen P. Feasibility of a computerized alcohol screening and personalized written advice in the ED: opportunities and obstacles. Accident and Emergency Nursing. 2005;13:44–53. https://doi.org/10.1016/j.aaen.2004.10.013.

  44. Blake SC, Kohler S, Rask K, Davis A, Naylor DV. Facilitators and barriers to 10 national quality forum safe practices. Am J Med Qual. 2006;21:323–34. https://doi.org/10.1177/1062860606291781.

  45. Pace KB, Sakulkoo S, Hoffart N, Cobb AK. Barriers to successful implementation of a clinical pathway for CHF. J Healthc Qual. 2002;24:32–8.

  46. Oestrich I, Austin S, Tarrier N. Conducting research in everyday psychiatric settings: identifying the challenges to meaningful evaluation. J Psychiatr Ment Health Nurs. 2007;14:55–63.

  47. Elwell L, Powell J, Wordsworth S, Cummins C. Challenges of implementing routine health behavior change support in a children's hospital setting. Patient Educ Couns. 2014;96:113–9. https://doi.org/10.1016/j.pec.2014.04.005.

  48. Melnyk BM, Bullock T, McGrath J, Jacobson D, Kelly S, Baba L. Translating the evidence-based NICU COPE program for parents of premature infants into clinical practice: impact on nurses’ evidence-based practice and lessons learned. J Perinat Neonatal Nurs. 2010;24:74–80.

  49. Francis JJ, Duncan EM, Prior ME, MacLennan GS, Dombrowski SU, Bellingan G, et al. Selective decontamination of the digestive tract in critically ill patients treated in intensive care units: a mixed-methods feasibility study (the SuDDICU study). Health Technol Assess. 2014;18:1–170. https://doi.org/10.3310/hta18250.

  50. Belkora JK, Loth MK, Chen DF, Chen JY, Volz S, Esserman LJ. Monitoring the implementation of Consultation Planning, Recording, and Summarizing in a breast care center. Patient Educ Couns. 2008;73:536–43.

  51. Dubenske LL, Chih MY, Dinauer S, Gustafson DH, Cleary JF. Development and implementation of a clinician reporting system for advanced stage cancer: initial lessons learned. J Am Med Inform Assoc. 2008;15:679–86.

  52. Dudgeon D, King S, Howell D, Green E, Gilbert J, Hughes E, et al. Cancer Care Ontario’s experience with implementation of routine physical and psychological symptom distress screening. Psychooncology. 2012;21:357–64.

  53. Hsu C, Liss DT, Westbrook EO, Arterburn D. Incorporating patient decision aids into standard clinical practice in an integrated delivery system. Med Decis Mak. 2013;33:85–97. https://doi.org/10.1177/0272989x12468615.

  54. Clement CM, Stiell IG, Davies B, O'Connor A, Brehaut JC, Sheehan P, et al. Perceived facilitators and barriers to clinical clearance of the cervical spine by emergency department nurses: a major step towards changing practice in the emergency department. International Emergency Nursing. 2011;19:44–52. https://doi.org/10.1016/j.ienj.2009.12.002.

  55. Mello MJ, Bromberg J, Baird J, Nirenberg T, Chun T, Lee C, et al. Translation of alcohol screening and brief intervention guidelines to pediatric trauma centers. Journal of Trauma & Acute Care Surgery. 2013:S301–7. https://doi.org/10.1097/TA.0b013e318292423a.

  56. Mihalic S, Fagan A, Argamaso S. Implementing the LifeSkills Training drug prevention program: factors related to implementation fidelity. Implement Sci. 2008;3

  57. Greenhalgh T, Robert G, Macfarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.

  58. Van Os-Medendorp H, Eland-De Kok P, Linge RV, Bruijnzeel-Koomen C, Grypdonck M, Ros W. The tailored implementation of the nursing programme ‘Coping with itch’. J Clin Nurs. 2008;17:1460–70.

  59. Cheyne H, Abhyankar P, McCourt C. Empowering change: realist evaluation of a Scottish Government programme to support normal birth. Midwifery. 2013;29:1110–21. https://doi.org/10.1016/j.midw.2013.07.018.

  60. Shea C, Jacobs S, Esserman D, Bruce K, Weiner B. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implement Sci. 2014;9

  61. Helfrich CD, Li Y-F, Sharp ND, Sales AE. Organizational readiness to change assessment (ORCA): development of an instrument based on the Promoting Action on Research in Health Services (PARIHS) framework. Implement Sci. 2009;4:38. https://doi.org/10.1186/1748-5908-4-38.

  62. Estabrooks CA, Squires JE, Cummings GG, Birdsell JM, Norton PG. Development and assessment of the Alberta context tool. BMC Health Serv Res. 2009;9 https://doi.org/10.1186/1472-6963-9-234.

  63. Baker R, Camosso-Stefinovic J, Gillies C, Shaw E, Cheater F, Flottorp S, et al. Tailored interventions to address determinants of practice. Cochrane Database Syst Rev. 2015;29 https://doi.org/10.1002/14651858.CD005470.pub3.

  64. Hopewell SMS, Clarke MJ, Egger M. Grey literature in meta-analyses of randomized trials of health care interventions. Cochrane Database Syst Rev. 2007; https://doi.org/10.1002/14651858.MR000010.pub3.

  65. Tricco A, Ashoor H, Cardoso R, MacDonald H, Cogo E, Kastner M, et al. Sustainability of knowledge translation interventions in healthcare decision-making: a scoping review. Implement Sci. 2016;11. https://doi.org/10.1186/s13012-016-0421-7.

  66. Cohen DJ, Crabtree BF, Etz RS, Balasubramanian BA, Donahue KE, Leviton LC, et al. Fidelity versus flexibility: translating evidence-based research into practice. American Journal of Preventive Medicine. 2008;35:S381–9. https://doi.org/10.1016/j.amepre.2008.08.005.

  67. Bradford AN, Castillo RC, Carlini AR, Wegener ST, Frattaroli S, Heins SE, et al. Barriers to implementation of a hospital-based program for survivors of traumatic injury. Journal of Trauma Nursing. 2013;20:89–101. https://doi.org/10.1097/JTN.0b013e3182960057.

Acknowledgements

The authors acknowledge Doga Dimerr for assistance in screening preliminary articles for the review.

Funding

This systematic review was conducted as part of the PhD candidature of Liesbeth Geerligs and developed as part of the Anxiety and Depression Pathway (ADAPT) Program, led by the Psycho-oncology Cooperative Research Group (PoCoG). The ADAPT Program is funded by a Translational Program Grant from the Cancer Institute NSW. Liesbeth Geerligs is funded by a scholarship from the Australian Post-Graduate Awards Scheme (Australian Government) and additional top-up funding from Sydney Catalyst and the ADAPT Program. The funding bodies had no role in study design, data collection, analysis, or writing of the manuscript.

Availability of data and materials

Not applicable

Author information

Contributions

All authors (LG, NR, HS, and PB) were involved in conceptualizing the review, refining the search terms, and defining the inclusion/exclusion criteria. LG carried out preliminary database searches and the first screening of the results. All authors screened the articles that met the inclusion criteria. All authors contributed to the development of the thematic framework and participated in a quality review of a subsample of included articles. LG wrote the first draft of the manuscript. NR, HS, and PB made significant contributions to subsequent drafts. All authors contributed to making revisions and responding to reviewer feedback, and all authors read and approved the final manuscript.

Corresponding author

Correspondence to Liesbeth Geerligs.

Ethics declarations

Ethics approval and consent to participate

Not applicable

Consent for publication

Not applicable

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Search terms by database. (DOCX 15 kb)

Additional file 2:

Examples of excluded papers for each eligibility criterion. (DOCX 21 kb)

Additional file 3:

Summary table of included papers. (DOCX 87 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

About this article

Cite this article

Geerligs, L., Rankin, N.M., Shepherd, H.L. et al. Hospital-based interventions: a systematic review of staff-reported barriers and facilitators to implementation processes. Implementation Sci 13, 36 (2018). https://doi.org/10.1186/s13012-018-0726-9
