
Implementation Science: A Primer of Key Concepts for the Respiratory Care Practitioner

Outline

Abstract

Introduction

What are Implementation Research and Implementation Science?

Evolution of Quality Processes in Healthcare

Stages of Implementation

  Pre-Implementation

    Is your organization ready for implementing change?

    What are Models for Improvement?

      Plan-Do-Study-Act (PDSA) Cycle

        Select Example

      Lean Six Sigma – DMAIC Cycle

    Measuring Progress Toward Implementation

      Increasing Organizational Readiness: Lewin’s 3-Step Change Model

      Plan, Implement, Evaluate: PIE Cycle and GTO™

        Select Example

  Implementation

    Designing a Meaningful Intervention

  Sustainability

    Measuring Progress – Fidelity Assessment

    Sustaining Change―The Dynamic Sustainability Framework

      Select Example

Review of the Literature

Summary

References

Abstract

Background: The healthcare environment has been in a state of transformation for many years. The cost of care delivery has increased dramatically, with no substantial abatement foreseeable. Additionally, the need for quality patient interventions (the transition from volume-based to value-based care), along with provider and customer satisfaction, is a key driver for change.

Since the beginning of the 21st century, there has been a marked increase in both the quantity and quality of published dissemination and implementation research. Numerous conceptual frameworks have been developed, and key constructs have been validated in the health care arena. Various professional health delivery societies have advocated for their members to understand, embrace, and enact quality-focused initiatives within their organizations and practices. There remains, however, a paucity of data in the literature addressing quality conceptual frameworks within the Respiratory Care arena.

This article provides a brief synopsis of select, key, evolving process improvement concepts. The intent is to review specific Implementation Science concepts and strategies so that Respiratory Care Practitioners gain a better understanding of potential approaches to developing and sustaining a culture of quality, of standardized quality paradigms, and of how these strategies relate to the effective delivery of health care. Ultimately, the Respiratory Care Practitioner will be better positioned to appreciate and adapt to a changing healthcare environment, and to understand some of the implementation tools necessary to successfully implement and maintain quality processes in practice.

Key Words

implementation, implementation research, implementation science, quality improvement, quality assurance, continuous quality assurance, quality control, calibration verification

Introduction

Currently, the “Quadruple Aim”1,2,3 has been an impetus to help providers appreciate which internal and external factors may be influencing the changing healthcare setting and/or system. The four cyclical components of the Quadruple Aim are integral to, and dependent upon, each other to maximize the patient and provider healthcare encounter. The Quadruple Aim components, along with examples of each component (considered to be the drivers of today’s changing health care system), are as follows.4,5,6

  • Improved Population Health: Healthy People 2020, Preventive Care
  • Enhanced Care Experience: Accountable Care Organizations, Patient-Centered Care, Research: From Bench to Bedside
  • Reduced Cost of Care: Volume to Value
  • Satisfied Care Teams: Improved work life of providers

Understanding what some of the components of healthcare change might entail does not necessarily translate into effectively developing and employing processes that could improve the delivery of care. The development of Clinical Practice Guidelines (CPGs) by professional organizations has been viewed as instrumental in moving expert opinion and evidence-based information into clinical practice in order to improve the delivery of quality care. However, translating CPGs into actual clinical practice can be complex, and they are therefore inconsistently implemented. Guidelines describe what the right intervention might be; they do not describe how to do it.7 Sharing common goals among stakeholders and disciplines is often seen as a way to enhance cross-fertilization of ideas and patient treatment goals. Enacting departmental and, especially, multidisciplinary approaches to patient care would, theoretically, reduce redundancy and costs. These undertakings could also improve the quality of care as well as the patient experience. However, such activities may not be effectively developed, executed, and/or sustained if there is a lack of understanding of the individual and related components that are integral to program success.

What are Implementation Research and Implementation Science?

A recently developed area of research, Implementation Research, and its interconnected discipline, Implementation Science, have been embraced and endorsed by some health-focused educators, professional societies, and organizations.8,9,10,11,12,13,14,15 By definition, Implementation Research is “the scientific study of methods to promote the systematic uptake of research findings and other evidence-based practices into routine practice, and hence, to improve the quality and effectiveness of health services.”16 According to the definition proposed by the National Implementation Research Network (NIRN), Implementation Science (IS) is the study of factors that influence the full and effective use of innovations in practice; the goal is not to answer factual questions about what is, but rather to determine what is required.17 Additionally, the National Institutes of Health defines Implementation Science as the study of methods to promote the integration of research findings and evidence into healthcare policy and practice. For this purpose, therefore, IS can be considered the structured link between translational research and evidence-based medicine/clinical practice, interventions, and outcomes.

A major IS stipulation is that “An intervention will not succeed if it is not implemented properly, even if it has been shown to be effective in research.”18 For that reason, with the goal of improving the quality of health care, Implementation Science complements and builds upon a primary focus on Quality Improvement (QI) initiatives.

Evolution of Quality Processes in Healthcare

Historically, in the healthcare arena, QI interest began in earnest in the early 1990s, primarily utilizing the Deming Management Method (DMM),19 which was developed and first applied during the rebuilding of Japan after World War II. In essence, this method incorporated the Plan, Do, Check, Act (PDCA) cycle to assure the quality of processes. By design, DMM assumes that the initial plan (P) to be implemented was correct and that quality was assured if the implementation process was enacted (D) as outlined and checked (C) for exactness. If the D and C aspects were not as expected, action (A) was taken (e.g., retraining) and the plan was reinitiated. The PDCA cycle was followed, and assessed, until the intent of the original plan was achieved. While these concepts were, for the most part, satisfactory for their original purpose, they proved limited in a fluid, dynamic environment such as health care.

Further refinements followed, such that a QI program commonly included the following characteristics:20,21,22

A QI program

• is initiated because of a specific problem within a specific system

• identifies area(s) for improvement and develops strategies to address the problem

• produces goals and findings that lead to improvements within current practice paradigms

• is rarely rigorously evaluated

• may result in failed outcomes, with the reasons frequently unknown

These last two findings reflect inherent limitations of a common healthcare QI program: there is a general lack of process evaluation, follow-up, and revision of the original plan (was the original goal/assumption correct to begin with?). Continuous Quality Improvement (CQI) somewhat addressed these deficits by placing additional emphasis on the evaluation phase, and it helped spur the development of the Plan, Do, Study, Act (PDSA) implementation strategy cycle for improving healthcare quality and outcomes. The Plan-Do-Study-Act cycle was also developed by W. Edwards Deming.23

With the goal of improving the quality of health care, Implementation Science incorporates aspects of CQI but also builds upon them.20,21,22

Exploration, Installation, Initial Implementation, and Full Implementation are components commonly incorporated into the IS process for any aspect of a healthcare system.24,25,26,27,28 Table 2 outlines key IS process components.

Stages of Implementation

Pre-Implementation

Is your organization ready for implementing change?

Prior to developing a construct for a new innovation, the organization must ask itself whether the proposal is appropriate and whether the organization is ready to make the change. Scaccia et al29 proposed a basic concept to assess organizational readiness, “R = MC2”, whereby Readiness = Motivation × General Capacity × Innovation-Specific Capacity.

This group proposed that organizational readiness incorporates three distinct components, which could be assessed independently:

1. the motivation to implement an innovation,

2. the general capacities of an organization, and

3. the innovation-specific capacities needed for a particular innovation.

In general, as described by the authors, Motivation is defined as perceived incentives and disincentives that contribute to the desirability to use an innovation, which includes beliefs about (a) an innovation and (b) support for the innovation that contribute to innovation use. General Capacities comprise attributes of a functioning organization (e.g., sufficient staffing, effective organizational leadership) and connections with other organizations and the community. Innovation-specific capacities are the human, technical, and fiscal conditions that are important for successfully implementing a particular innovation with quality.
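Purely as an illustrative sketch (not part of the published R = MC2 heuristic), the multiplicative relationship can be expressed in a few lines of code; the 0-to-1 component scoring and the example values below are hypothetical assumptions made for illustration only.

```python
def readiness_score(motivation: float, general_capacity: float,
                    innovation_specific_capacity: float) -> float:
    """Illustrative R = MC2 heuristic: Readiness = Motivation x
    General Capacity x Innovation-Specific Capacity.

    Each component is assumed (hypothetically) to be scored on a 0-1 scale;
    because the terms are multiplied, a near-zero score on any one component
    drives overall readiness toward zero.
    """
    for value in (motivation, general_capacity, innovation_specific_capacity):
        if not 0.0 <= value <= 1.0:
            raise ValueError("component scores are assumed to lie between 0 and 1")
    return motivation * general_capacity * innovation_specific_capacity


# Hypothetical example: strong motivation, adequate general capacity,
# but weak innovation-specific capacity (e.g., no dedicated training resources).
print(readiness_score(0.9, 0.7, 0.3))  # 0.189 -> low overall readiness
```

The point of the multiplicative form, as the authors emphasize, is that a deficit in any single component cannot simply be averaged away by strength in the others; all three must be addressed.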

Additionally, in 2014,30 an Executive Summary entitled “Willing, Able → Ready: Basics and Policy Implications of Readiness as a Key Component for Implementation of Evidence-Based Interventions” was published by the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services. This publication highlighted the key components described by Scaccia et al and included strategies and strategic approaches for incorporating evidence-based innovations into implementation readiness preparation and, potentially, organizational practice.

Readiness has also been evaluated in an article describing four studies that assessed the psychometric properties of a measure called “Organizational Readiness for Implementing Change (ORIC).”31 The findings suggest that the ORIC measure would enable testing of theories about change commitment and change efficacy, with the goal of reducing the number of change efforts that do not achieve benefits. Ultimately, the readiness component would be streamlined to be more targeted and efficient (Table 3 illustrates a modified ORIC measure format).

What are Models for Improvement?

A model for improvement can be defined as an integrated approach to process improvement that helps deliver quick and substantial results in quality and productivity.32  While a number of healthcare improvement models have been described in the literature, below are some examples which may be suitable for assimilation into Respiratory Care practice.

Plan-Do-Study-Act (PDSA) Cycle

Oftentimes, in an effort to improve, an organization will enact a protocol without first utilizing a systematic process to develop, test, implement, and sustain the improvement(s). As illustrated, the PDSA Cycle (Fig. 3) is a model for continuous improvement and is useful when assessing and documenting a test of a change.

According to the Institute for Healthcare Improvement QI Essentials Toolkit,33 Table 4 highlights the concepts that are essential when employing the PDSA Cycle.

The PDSA Cycle is a process planning progression that, by design, continuously requires the user to document, collect, and evaluate data against the original proposal (Plan). Unlike the previously mentioned PDCA cycle, this methodology also allows for the option of abandoning the proposed plan if the collected data do not support the original prediction (Plan).

Additionally, the PDSA Cycle involves all staff in assessing problems and suggesting and testing potential solutions. This bottom-up approach increases the likelihood that staff will embrace the change(s) which is a key requirement for successful QI. 34

Select Example

PDSA Cycle: pH Blood Gas Analyzer Operability: Calibration, Calibration Verification (Quality Control [QC])  

Keeping the following concepts in mind, the PDSA Cycle can be implemented for a number of areas in Respiratory Care:

  • What are we trying to accomplish? The aim is to determine which specific outcomes you are trying to change.
  • How will we know that a change is an improvement? Identify the appropriate measures to track your success.
  • What change can we make that will result in an improvement? Identify the key changes that you will actually test.

This example (see also Fig. 1) suggests how the PDSA Cycle can be employed when assessing functionality of a pH Blood Gas analyzer:

  Plan:  Assess pH blood gas analyzer operability

  Do:    Introduce calibration agents; maintain a record of the date and time of calibration

         Troubleshoot: If QC results are unacceptable, rectify the malfunction and recalibrate

         Record the intervention

         Maintain a record of the date and time of recalibration

  Study: Calibration verification: QC product(s) analysis

         Do derived results fall within predetermined expected/acceptable ranges?

  Act:   Use the instrument if QC product run(s) are acceptable; maintain a record of date/time/QC material lot numbers and derived data

         Do not use the instrumentation if QC results are unacceptable

         Troubleshoot the source of error

         Repeat the PDSA Cycle until the problem(s) are rectified

         Perform proficiency testing as directed
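The Do/Study/Act logic above can also be sketched in code form. The sketch below is purely illustrative: the acceptable pH ranges, the `measure_qc`, `recalibrate`, and `log` callables, and the three-attempt limit are hypothetical placeholders, not a real analyzer interface or laboratory policy.

```python
from datetime import datetime

# Hypothetical acceptable ranges for two pH QC control levels (illustrative only;
# actual ranges come from the QC material's package insert and laboratory policy).
QC_ACCEPTABLE_RANGES = {"level_1": (7.15, 7.25), "level_2": (7.38, 7.46)}


def run_pdsa_qc_cycle(measure_qc, recalibrate, log, max_attempts=3):
    """One pass through the Do-Study-Act portion of the PDSA example.

    measure_qc(level) -> measured pH for that QC level (hypothetical callable)
    recalibrate()     -> performs and records recalibration (hypothetical callable)
    log(record)       -> appends a record (date/time, results, actions taken)
    """
    for attempt in range(1, max_attempts + 1):
        # Do: run the QC materials and record the results.
        results = {level: measure_qc(level) for level in QC_ACCEPTABLE_RANGES}
        log({"time": datetime.now(), "attempt": attempt, "results": results})

        # Study: do derived results fall within the predetermined ranges?
        out_of_range = {level: value for level, value in results.items()
                        if not (QC_ACCEPTABLE_RANGES[level][0]
                                <= value <= QC_ACCEPTABLE_RANGES[level][1])}

        # Act: release the instrument only if all QC levels are acceptable.
        if not out_of_range:
            log({"time": datetime.now(), "action": "instrument released for use"})
            return True

        # Otherwise troubleshoot, recalibrate, and repeat the cycle.
        log({"time": datetime.now(), "action": "recalibrate", "failed": out_of_range})
        recalibrate()

    log({"time": datetime.now(), "action": "instrument withheld from use; escalate"})
    return False
```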

Lean Six Sigma – DMAIC Cycle

While definitions vary, Lean Six Sigma is commonly described as a fact-based, data-driven philosophy that provides organizations with tools to improve the capability of their business processes. Improving performance and decreasing process variation can help reduce defects and improve output, employee morale, and the quality of products or services.

The DMAIC Cycle consists of five Phases: Define, Measure, Analyze, Improve and Control. This is the problem-solving methodology behind Lean Six Sigma.35

Fundamentally, each phase of the DMAIC Cycle focuses on the following aspects:

Define: Define the Problem.

Confirm, with data, that there is an ongoing problem, and characterize the negative impact that this problem is having on the (business) environment. Are there existing resources that can assist with resolving the problem? Development of a “Goal Statement” is recommended in this phase, which also includes mapping out processes and defining internal/external customers and their stated requirements for solving the problem.

Measure: Map out the current process

Establish a performance baseline, before instituting change, to serve as the standard against which any change can be measured. Consider the root cause(s) of current performance levels and develop a plan to collect and analyze reliable data, to ensure that decisions are based on comprehensive information. Measures indicate whether the changes being made actually lead to improvement (a brief numerical sketch of this baseline idea follows the DMAIC phase descriptions below).

Analyze: Identify the cause of the problem

Structured brainstorming sessions can be one method of developing theories about the potential causes of a problem. The team reviewing the initially collected data may also decide that additional information would be useful in the analysis phase.

Oftentimes, displaying the data as charts and/or graphs may enable team members to have a better understanding of the information and allow them to more readily verbalize findings to leadership or other interested parties.

Improve: Implement and verify the solution

Once the team is satisfied that accurate and reliable data has been obtained (to ensure informed decisions), an improvement solution can be developed, communicated, implemented and monitored (measured).

  • The team can incorporate the PDSA cycle to achieve this phase.

Control: Maintain the solution

Once a plan has demonstrated success, the team needs to develop a structured strategy on how to pass it along to other employees and ensure successful implementation, monitoring (how frequently?), documentation and fidelity.

  • Sharing project data and findings with internal customers and external teams (multidisciplinary) may be helpful in maintaining the solution, while also enabling others to recognize areas for improvement and develop similar solution-based strategies within their work environment.
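To make the Measure and Control ideas concrete, here is a minimal sketch using entirely made-up turnaround-time data and the conventional ±3-standard-deviation control limits of statistical process control; the metric, values, and thresholds are assumptions for illustration only.

```python
import statistics

# Hypothetical baseline collected in the Measure phase, e.g., blood gas result
# turnaround times in minutes before any change was introduced.
baseline = [22, 19, 25, 21, 24, 20, 23, 26, 22, 21]

baseline_mean = statistics.mean(baseline)
baseline_sd = statistics.stdev(baseline)

# Conventional control limits (mean +/- 3 SD) used during the Control phase
# to judge whether later performance has shifted relative to baseline.
upper_limit = baseline_mean + 3 * baseline_sd
lower_limit = baseline_mean - 3 * baseline_sd

# Hypothetical post-improvement observations monitored during the Control phase.
post_change = [17, 16, 18, 15, 19, 16]

print(f"Baseline mean {baseline_mean:.1f} min (SD {baseline_sd:.1f})")
print(f"Post-change mean {statistics.mean(post_change):.1f} min")

for i, value in enumerate(post_change, start=1):
    if not lower_limit <= value <= upper_limit:
        # A point outside the baseline limits signals that the process has
        # shifted -- here, in the improved (shorter turnaround) direction.
        print(f"Observation {i} ({value} min) falls outside the baseline control limits")
```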

Measuring Progress Toward Implementation

Research suggests that interventions are implemented more effectively when change models are used.36 A review published in 2012 identified 109 change models, 61 of which were presented and reviewed.37 The findings suggest that improvement models vary by discipline and level of detail. Therefore, organizations should consider various implementation models and select the one that works best for both the organization and the initiative.

Increasing Organizational Readiness: Lewin’s 3-Step Change Model

Lewin’s change management model, which describes three stages of change, was first introduced in the 1940s. Since then, process improvement innovators have embraced his findings to better understand organizational change and have incorporated them into process and change models.38,39

Lewin’s model suggests that change can be described as a three-step process of Unfreeze – Change – Refreeze, summarized below (adapted from “Lewin’s Change Management Model: Understanding the Three Stages of Change,” https://www.mindtools.com/pages/article/newPPM_94.htm).

Lewin’s 3-Step Change Model

Unfreeze (the status quo): Change can be difficult for many. Some individuals may resist change, as they may actually benefit from the status quo. Therefore, a persuasive message must be developed, and delivered, so that there is a clear understanding of why the change must be adopted.

  • To ensure organizational buy-in, it is imperative to communicate the rationale for change. When developing the communication, ensure that the message clearly and succinctly addresses the reasons for the change, its urgency, and how staff may be affected.
  • Invite participation by select individuals who will be affected by the change. This active engagement may be motivational for many, as people tend to support initiatives that they have been empowered to help create.
    • Specific actions may be necessary to “unfreeze” attitudes toward change within the organization.

Change: During this stage, people begin to understand the importance of the change and to look for new approaches to doing things; structured brainstorming may be an integral component of this phase. While not always appreciated by agents of change, extended time may be necessary for some to accept and embrace the new direction. This phase may require additional hands-on managerial involvement to reemphasize the intent, and the importance of acceptance, for those who are resistant to change.

Refreeze: This stage is achieved after people have accepted and incorporated the new change. Acceptance is measurable simply by assessing how frequently the new practice is employed in everyday business.

Plan, Implement, Evaluate: PIE Cycle and GTO™

The Plan, Implement, Evaluate (PIE) Cycle was incorporated into a 2004 RAND Corporation publication entitled “Getting To Outcomes™ 2004: Promoting Accountability Through Methods and Tools for Planning, Implementation and Evaluation (GTO™-04),” with the intent of assisting communities in developing, or improving, their substance abuse prevention programs.

However, currently, GTO™ is both a model for carrying out prevention programming with quality, and a support intervention aimed at enhancing practitioners’ capacity.40 As such, GTO™ does not advocate for any specific prevention program. Instead, GTO™ provides supports to improve the quality of a particular program with the goal of achieving positive results. By its design, GTO™ offers how-to steps for high quality, outcomes-based programs, which are flexible and applicable to numerous health-based systems improvement initiatives.

Keeping the PIE Cycle framework in mind, Respiratory Care Practitioners could effectively incorporate GTO’s™ 10 Steps into a quality-focused implementation plan strategy. A 2007 technical report jointly supported by the Centers for Disease Control and Prevention (CDC) and RAND Health, entitled “Getting To Outcomes™ 10 Steps for Achieving Results-Based Accountability,”41 provides a concise overview and potential applications of the 2004 RAND Corporation publication. This succinct document could serve as a road map for developing, implementing, monitoring, and evaluating a quality-based improvement program.

Select Example

AARC CPG 2012: Aerosol Delivery Device Selection for Spontaneously Breathing Patients Using The GTO™-04 Cycle Framework

This example illustrates how the GTO™-04 Cycle Framework can be employed when incorporating the 2012 AARC Clinical Practice Guideline to assess the effectiveness of aerosol delivery to spontaneously breathing patients.

Planning (Steps 1-6):

1.         Aerosol Delivery for Spontaneously Breathing Patients > 3 years old, utilizing a small volume nebulizer (SVN)

2.         Goal is to effectively deliver aerosolized medication.

3.         AARC CPG 2012: Aerosol Delivery Device Selection for Spontaneously Breathing Patients

4.-6.     Steps (generalized)

a. Assess indication(s)/appropriateness of medical order, including therapy frequency

i. Is there a patient contraindication for the prescribed medication(s)?

b. Patient Assessment:

i. Level of mentation and ability to follow instructions, including cough and inspiratory maneuvers (e.g., deep breathing and breath-hold)

ii. Resting respiratory rate

iii. Auscultation of lung fields

iv. Ability to position patient accordingly to maximize aerosol deposition in lungs

c. Selection of appropriate medication, medication interfaces (including mouthpiece and face mask) and assure proper functioning of equipment

i. Can patient use mouthpiece appropriately, for the expected amount of medication delivery time (with or without assistance), while incorporating a nose clip?

ii. Utilize size appropriate face mask if mouthpiece is not a viable option

d. Deliver prescribed medication(s) while coaching patient on proper breathing patterns

Evaluating and Improving (Steps 7-10)

7.         Assess and document subjective effectiveness of the therapeutic intervention, including pre- and post-treatment breath sounds, cough efficiency, secretion mobilization, perceived change in dyspnea, etc.

a. Did patient tolerate the intervention?

8.         Did the therapy achieve the intended objective(s)?

a. If yes, is the prescribed frequency appropriate?

b. If no, what alternative intervention might be more applicable?

c. Notify pertinent care team members of intervention impressions

9.         Develop an integrated CQI plan (e.g., PDSA) to assess continuation, modification, or discontinuation of prescribed therapeutic intervention as patient’s condition changes over time.

10.        Share patient findings and proposed planning with pertinent team members

a. Solicit input regarding next steps with further interventions.

While this example reads like a policy and procedure, the format incorporates the GTO™-04 Cycle Framework and key components of the 2012 AARC CPG. The format is flexible enough to allow amendments when new scientific information becomes available, and it is adaptable to a changing patient condition, all while utilizing a team approach to decision making. The PDSA Cycle can also be incorporated into the plan to continuously assess the effectiveness of the therapeutic intervention and to modify therapy as the patient’s condition changes.

In summary, for the pre-implementation stage, readiness is a necessary precursor to successful organizational change.29 When readiness is high, organizational members are more likely to initiate and support the change.31 To measure progress, leaders need to assess actual change in the practice. Ensure buy-in through effective communication and invite participation. The model for improvement is an integrated approach to process improvement.32   Some examples of Implementation Science strategies are PDSA, Lean Six Sigma – DMAIC Cycle and PIE Cycle – GTO™.  It is important to select the improvement model that works best for both your organization and your initiative to ensure acceptance, implementation and sustainability of the program.

Implementation

Designing a Meaningful Intervention

Recent research suggests that utilizing the principles of Design Thinking may be helpful when creating and implementing a meaningful organizational change.42 By definition, Design Thinking is a design methodology that provides a solution-based approach to solving problems.43 These authors suggest that solving ill-defined or unknown complex problems is best accomplished by better understanding the human needs involved in a change, re-framing the problem in human-centric terms, incorporating structured brainstorming sessions, and adopting a hands-on approach to prototyping and testing the proposed solution.

5 Stages of Design Thinking

A modified version of the 5 Stages of Design Thinking includes:43

  1. Empathize – Research the users’ needs: Since the agent of change may not be directly involved in a specific area, observing what people do and how they interact with their environment may provide clues about what they think and feel. These insights can allow for more meaningful direction when creating innovative solutions.
     • Gather context-specific insights from the user’s point of view
     • Observe, engage, watch, listen
  2. Define – State the users’ needs and problems: Take the information obtained in the Empathize stage and analyze it to determine the core problem(s) that the change agent and team have identified. Develop a problem statement, in a human-centered manner, to personalize and enhance ownership of what needs to be addressed.
     • Define the problem in a clear, meaningful, and actionable way
     • Frame the right problem to create the right solution
  3. Ideate – Challenge assumptions and create ideas: Look for alternative ways (e.g., structured brainstorming, best practices, benchmarking) to view the problem and identify innovative solutions to the problem statement created in the Define stage.
     • Push for the widest possible range of ideas; use brainstorming, mind mapping, etc.
     • Generate, then evaluate; carry forward multiple good ideas
  4. Prototype – Start to create solutions: The emphasis is on identifying the best possible solution for each of the problems identified during the first three stages.
     • Create prototypes to problem solve and learn
  5. Test – Try solutions out: Implement and evaluate/measure the prototype solution. As illustrated in Figure 5, the Test stage allows users to revisit the previous stages in the process to make further iterations, alterations, and refinements, and to rule out alternative solutions.
     • Test in a real-life context, solicit feedback, refine the prototype

Additionally, consider what might need to be incorporated into the design and planning process once full implementation is rolled out across the organization.

Implementation Drivers

Implementation Drivers are critical to implementing change; they have been described as the engine of change.44

The National Implementation Research Network (NIRN) undertook a systematic review of the qualitative literature and best practices associated with successfully implemented practices and programs, and assembled lists of the barriers and facilitators identified in those searches. The Active Implementation Drivers arose from these lists; their roles are considered dynamic and interactive, for the purpose of driving change in consistent and innovative ways.

These interactive processes are integrated to maximize their influences on staff and the organizational culture. Additionally, the Implementation Drivers augment each other by compensating for the inherent weakness of each component.

Implementation Teams

Implementation Teams play an active role by providing an internal support structure to help move innovations through the stages of implementation.45 These teams ensure that the implementation infrastructure is effectively assimilated to support the proposed practice or program.

Oftentimes, due to resource limitations, a single champion is designated to assure proper implementation of a new program. In this case, the long-term success of the program may be limited if the champion can no longer fulfill his or her responsibilities. Therefore, a team approach may be more beneficial to ensure the sustained success of new and existing programs. Additionally, when multiple teams are engaged in a larger-scale change effort, they need to be firmly connected to ensure proactive communication and to engage in problem-solving activities. The purpose and functions of each team need to be clearly defined and known to all other teams.

Sustainability

Despite the annual investment of tens of billions of U.S. tax dollars on health research and progress in developing, testing, and implementing evidence-based healthcare practices, we have limited understanding of how to sustain quality health care in routine services.46  The intent of this section is to overview some common models which support, and enhance, program sustainability.

Measuring Progress – Fidelity Assessment

From an implementation point of view, any innovation (evidence-based or otherwise) is incomplete without a good measure of fidelity to detect the presence and strength of the innovation in practice.47 Fidelity assessment measures the strength of the innovation and enables improvement in the implementation plan and process, with the goal of program sustainability.
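As a hedged illustration of one simple way a fidelity measure might be quantified (the checklist items, session data, and scoring rule below are hypothetical, not a validated fidelity instrument):

```python
# Hypothetical protocol checklist loosely based on the aerosol-delivery example
# earlier in this article; a real fidelity measure would use a validated tool.
CHECKLIST = [
    "order and indications verified",
    "patient assessed before therapy",
    "appropriate interface selected",
    "patient coached on breathing pattern",
    "response assessed and documented",
]

# Each observed treatment session is recorded as the set of checklist items completed.
observed_sessions = [
    {"order and indications verified", "patient assessed before therapy",
     "appropriate interface selected", "response assessed and documented"},
    {"order and indications verified", "patient assessed before therapy",
     "appropriate interface selected", "patient coached on breathing pattern",
     "response assessed and documented"},
]


def fidelity(sessions, checklist):
    """Return the mean proportion of checklist items completed per observed session."""
    per_session = [len(done & set(checklist)) / len(checklist) for done in sessions]
    return sum(per_session) / len(per_session)


print(f"Fidelity: {fidelity(observed_sessions, CHECKLIST):.0%}")  # 90% in this example
```

Tracking such a score over time, rather than at a single point, is what links fidelity assessment to the sustainability frameworks discussed next.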

Sustaining Change―The Dynamic Sustainability Framework (DSF)48

In real-world settings, change occurs over time. Because of this, ongoing optimization of fit (through adaptive, continuous quality improvement) is required to ensure intervention sustainability. Traditionally, implementation project sustainability was assessed with a variety of methods (needs assessments, long-term action plans, tracking of program adaptation, financial planning, etc.). However, these methods may only be applicable in a static delivery system, where assessment occurs periodically rather than in an ongoing manner.

As previously stated, all aspects of health care are undergoing tremendous, rapid transformation. In many cases, departments are cross-fertilizing procedures, personnel, and other resources. According to Chambers,48 the Dynamic Sustainability Framework (DSF) “emphasizes that change exists in the use of interventions over time, the characteristics of practice settings, and the broader system that establishes the context for how care is delivered.” This model emphasizes three focus areas: the importance of context, the need for ongoing evaluation and decision making, and the goal of continuous improvement.

The change agent should also remain cognizant of the role change plays in intervention sustainability, as well as the importance of optimizing fit to achieve maximal benefit (see Fig. 3).

The DSF assesses the fit between overlapping areas encountered in the healthcare arena. These overlapping areas are characterized as follows:

  • Intervention: Components, Practitioners, Outcomes, Delivery Platform
  • Practice Setting (Context): Staffing, Information Systems, Organizational Culture/Climate, Structure, Business Model, Training, Supervision
  • Ecological System: Other Practice Settings, Policy, Regulations, Market Forces, Population Characteristics

In summary, in the real world, change occurs over time. Therefore, ongoing optimization of fit through adaptive, continuous quality improvement is an important aspect of sustaining change. Sustainability can be viewed as a dynamic process and, as such, can remain viable if the change agent keeps the seven DSF tenets in mind when developing implementation strategies (e.g., PDSA) for new programs, as well as modifications to existing programs.

Select Example

Evaluation of an existing pulmonary rehabilitation program to reduce readmissions and emergency department visits, using the Dynamic Sustainability Framework concept.

This example incorporates concepts from the Dynamic Sustainability Framework (DSF) to evaluate the feasibility of developing, implementing and monitoring/documenting a treatment modification strategy to an existing pulmonary rehabilitation program. Employing a multi-disciplinary strategy, qualified pulmonary rehabilitation respiratory care practitioners would provide home visits to recently discharged COPD patients with the goal of reducing hospital readmissions and emergency department (ED) visits.49

Figure 4: Application of the Dynamic Sustainability Framework

This real-world pilot model illustrates how DSF concepts may be applied to the modification of an existing pulmonary rehabilitation program by utilizing (in this instance) the RCP as a key multidisciplinary driver, with the intent of reducing ED visits and hospital readmissions. Additionally, the PDSA cycle is embedded within the framework to ensure that continuous quality improvement is included and sustained within this process.

  • In this instance, the DSF model format allows flexibility in who should be the change driver, along with identifying and including additional key individuals necessary to plan and optimize care post-discharge. As always, in a multi-care patient approach, communication about the various care planning objectives, implementation steps, process monitoring and outcomes reporting is vital to ensure program and patient successes.
  • Long-term program adherence and sustainability has been described in a COPD program that was designed and implemented to reduce readmission rates using a care-bundle model.50

Review of the Literature

PubMed (accessed from inception to July 2019): Implementation Science, Implementation Science and Respiratory Care, Implementation Science and Clinical Laboratory Practices

Google (accessed to July 2019): Implementation Science, Implementation Science Organizations, Implementation Science Professional Organizations, Implementation Science Organizations Worldwide

Summary

In the U.S., enormous resources are expended on healthcare each year, oftentimes with a less than optimal return on investment. In an effort to improve the delivery of care, focus has been placed on healthcare quality improvement since the late 20th century, with mixed success. Utilization of process concepts that were designed for industry has been less than optimal in a fluid, dynamic environment such as healthcare.

Undertaking change can oftentimes be perceived as a daunting challenge, even if evidence suggests that the change is necessary.

Having a clear understanding of potential quality improvement processes can assist the change agent in successfully designing, implementing, measuring, and instituting new programs or modifications to existing ones. Implementation Science principles, and how they are intimately linked with healthcare quality programs, should enable the Respiratory Care Practitioner/change agent to develop processes that improve the sustained delivery of quality care.

Understanding organizational readiness for change is paramount before designing an improvement project and/or instituting a new program. Such an understanding reduces wasted program development time, and overall frustration, spent trying to implement a program that may not be perceived as valuable.

Some healthcare organizations have advocated for their members to embrace and enact Implementation Science principles within their scope of practice. Now is the time for the field of Respiratory Care to follow their colleagues’ example and improve quality, within their respective areas, by adopting a focus on the field of Implementation Science.

Acknowledgements:

I would like to thank the following individuals for providing valuable constructive feedback on earlier manuscript versions: Ellen A. Becker, PhD, RRT RPFT AE-C FAARC; Susan B. Blonshine, BS, RRT, RPFT, AE-C, FAARC; Christopher G. Green, MD.


References

  1. Sikka R, Morath JM, Leape L. The quadruple aim: care, health, cost and meaning in work. BMJ Qual Saf 2015;24:608-610.
  2. Bodenheimer T, Sinsky C. From triple to quadruple aim: care of the patient requires care of the provider. Ann Fam Med 2014;12(6):573-575.
  3. Berwick DM, Nolan TW, Whittington J. The triple aim: care, health, and cost. Health Affairs 2008;27(3):759-769.
  4. Rittenhouse DR, Shortell SM, Fisher ES. Primary care and accountable care–two essential elements of delivery-system reform. N Engl J Med 2009;361(24):2301-2303.
  5. https://www.cms.gov/Medicare/Quality-Initiatives-Patient-Assessment-Instruments/Value-Based-Programs/MACRA-MIPS-and-APMs/MACRA-MIPS-and-APMs.html. Accessed April 12, 2019.
  6. https://www.healthypeople.gov/2020/topics-objectives/topic/Access-to-Health-Services. Accessed April 12, 2019.
  7. Vander Schaaf EB, Seashore CJ, Randolph GD. Translating clinical guidelines into practice: Challenges and opportunities in a dynamic health care environment. N C Med J 2015;76(4):230-234.
  8. Weiss CH, Krishnan JA, Au DH, Bender BG, Carson SS, Cattamanchi A, et al. An official American thoracic society research statement: Implementation science in pulmonary, critical care, and sleep medicine. Am J Respir Crit Care Med 2016; 194(8): 1015-1025.
  9. Bender BG, Krishnan JA, Chambers DA, Cloutier MM, Riekert KA, Rand CA, et al. American thoracic society and national heart, lung, blood institute implementation research workshop report. Ann Am Thorac Soc 2015; 12(12):S213-S221.
  10. Mensah GA. Embracing dissemination and implementation research in cardiac critical care. Glob Heart 2014;9(4):363-366.
  11. Estabrooks PA, Brownson RC, Pronk NP. Dissemination and implementation science for public health professionals: An overview and call to action. Preventing chronic disease; Public health research, practice and policy 2018:15 (E162); 1-10.
  12. Ramaswamy R, Mosnier J, Reed K, Powell BJ, Schenck AP. Building capacity for public health 3.0: introducing implementation science into a MPH curriculum. Implementation Science 2019: 14(18): 1-10.
  13. Lobb R, Colditz GA. Implementation science and its application to population health. Annu Rev Public Health 2013; 34: 235-251.
  14. Siddiqi K, Sheikh A. Primary care respiratory medicine broadens its focus to include global respiratory health, tobacco control and implementation science (editorial). Npj Primary Care Respiratory Medicine June 22, 2017.
  15. Castiglione SA, Ritchie JA. Moving into action: we know what practices we want to change, now what? An implementation guide for health care practitioners. Canadian institutes of health research March 28, 2012.
  16. Eccles MP, Mittman BS. Welcome to implementation science (editorial). Implementation Science 2006; 1(1):1-3.
  17. National Implementation Research Network. NIRN https://nirn.fpg.unc.edu/learn-implementation/implementation-science-defined  Accessed April 12, 2019.
  18. Trochim W, Kane C, Graham MJ, Pincus HA. Evaluating translational research: A process marker model. Clin Trans Sci 2011;4:153-162.
  19. Walton M. The Deming Management Method. Dodd, Mead; 1986. p 86-88.
  20. Bauer MS, Damschroder L, Hagedorn H, Smith J, Kilbourne AM. An introduction to implementation science for the non-specialist. BMC Psychology 2015; 3(32): 1-12.
  21. Balasubramanian BA, Chase SM, Nutting PA, Cohen DJ, Ohman-Strickland PA, Crosson JC, Miller WL, et al. Using learning teams for reflective adaptation (ULTRA): insights from a team-based change management strategy in primary care. Ann Fam Med 2010;8(5):425-432.
  22. Balasubramanian BA, Cohen DJ, Davis MM, Gunn R, Dickinson LM, Miller WL, et al. Learning evaluation: blending quality improvement and implementation research methods to study healthcare innovations. Implementation Science 2015: 10(31) 1-11.
  23. Deming WE. The new economics for industry, government, education. Cambridge: Massachusetts Institute of Technology; 1994.
  24. National Implementation Research Network (NIRN). https://nirn.fpg.unc.edu/learn-implementation/implementation-stages   Accessed April 12, 2019.
  25. National Implementation Research Network (NIRN). https://nirn.fpg.unc.edu/learn-implementation/implementation-stages/exploration-readiness  Accessed April 12, 2019.
  26. National Implementation Research Network (NIRN). https://nirn.fpg.unc.edu/learn-implementation/implementation-stages/installation  Accessed April 12, 2019.
  27. National Implementation Research Network (NIRN). https://implementation.fpg.unc.edu/module-4/topic-5-initial-implementation-stageNIRN  Accessed April 12, 2019.
  28. National Implementation Research Network (NIRN). https://implementation.fpg.unc.edu/module-4/topic-6-full-implementationNIRN  Accessed April 12, 2019.
  29. Scaccia JP, Cook BS, Lamant A, Wandersman A, Castellow J, Katz J. A practical implementation science heuristic for organizational readiness: R = MC2. J Community Psychol 2015;43(4):484–501.
  30. Dymnicki A, Wandersman A, Osher D, Grigorescu V, Huang L. Willing, able → ready: Basics and policy implications of readiness as a key component for implementation of evidence-based interventions. Executive summary. ASPE Issue Brief. Office of the assistant secretary for planning and evaluation. Office of human services policy-U.S. department of health and human services September 2014.
  31. Shea CM, Jacobs SR, Esserman DA, Bruce K, Weiner BJ. Organizational readiness for implementing change: a psychometric assessment of a new measure. Implementation Science 2014; 9(7): 1-15.
  32. Langley GL, Moen R, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance (2nd Edition). Jossey-Bass Publishers; 2009.
  33. Institute for Healthcare Improvement. QI Essentials Toolkit: PDSA Worksheet. http://www.ihi.org/resources/Pages/Tools/Quality-Improvement-Essentials-Toolkit.aspx. Accessed April 12, 2019.
  34. Greenhalgh T, Robert G, Bate P, Kyriakidou O, Macfarlane F, Peacock R. How to Spread Good Ideas: a systematic review of the literature on diffusion, dissemination and sustainability of innovations in health service delivery and organization. Report for the National Coordinating Centre for NHS Service Delivery and Organisation R & D (NCCSDO). London: NCCSDO; 2004.
  35. Lean process. Six sigma and lean manufacturing. The DMAIC process for proven quality improvement.   http://www.leanprocess.net/dmaic-process-six-sigma/  Accessed April 21, 2019.
  36. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: recommendations for specifying and reporting. Implementation Science 2013; 8(139):1-11.
  37. Tabak RG, Khoong EC, Chambers D, Brownson RC. Bridging research and practice. Am J Prev Med 2012; 43(3):337-350.
  38. Hussain ST, Lei S, Akram T, Haidar MJ, Hussain SH, Ali M. Kurt Lewin’s Change Model: A Critical Review of the role of Leadership and employee Involvement in Organizational Change. J Innovation & Knowledge 2018; 3: 123-127.
  39. Manchester J, Gray-Miceli DL, Metcalf JA, Paolini CA, Napier AH, Coogle CL, Owens MG. Facilitating Lewin’s change model with collaborative evaluation in promoting evidence-based practices of health professionals. Evaluation and Program Planning 2014;47:82-90.
  40. Chinman M, Hunter SB, Ebener P, Paddock SM, Stillman L, Imm P, Wandersman A. The getting to outcomes demonstration and evaluation: An illustration of the prevention support system.  Am J Community Psychol 2008; 41(3-4): 206–224.
  41. Wiseman S, Chinman M, Ebener PA, Hunter S, Imm P, Wandersman A. Getting to outcomes™ 10 steps for achieving results-based accountability. Rand® Health Technical Report 2007.
  42. Plattner H. An introduction to design thinking process guide. The institute of design at stanford  2010  https://dschool-old.stanford.edu/sandbox/groups/designresources/wiki/36873/attachments/74b3d/ModeGuideBOOTCAMP2010L.pdf Accessed April 12, 2019.
  43. Rikke D, Teo S. 5 Stages in the design thinking process.  https://www.interaction-design.org/literature/article/5-stages-in-the-design-thinking-process?fbclid=IwAR0sGZp6kKlY0JKioecN7KvknkBUy0FK2SSICtQWvdznty4g2iOcK6Q5H74. Accessed April 28, 2019.
  44. National Implementation Research Network (NIRN). https://nirn.fpg.unc.edu/learn-implementation/implementation-drivers Accessed April 28, 2019.
  45. National Implementation Research Network (NIRN). https://implementation.fpg.unc.edu/module-1/implementation-teams Accessed April 28, 2019.
  46. Proctor E, Luke D, Calhoun A, McMillen C, Brownson R, McCrary S, Padek M. Sustainability of evidence-based healthcare: research agenda, methodological advances, and infrastructure support. Implementation Science 2015; 10(88):1–13.
  47. National Implementation Research Network (NIRN). https://implementation.fpg.unc.edu/module-6/topic-3/usable-innovations-and-performance-assessment Accessed April 28, 2019.
  48. Chambers DA, Glasgow RE, Stange KC. The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Science 2013; 8(117); 1-11.
  49. Connors GA, Goel N, Harper G, Poffenroth M, Kopelen RA, Greenwalt GE, et al. INOVA respiratory therapist home visits program after hospitalization to decrease readmission and ER visits for COPD patients in primary care network. Respir Care 2018;63(Suppl 10):3025854.
  50. Zafar MA, Nguyen B, Gentene A, Ko J, Otten L, Panos RJ, Alessandrini EA. Pragmatic challenge of sustainability: Long-term adherence to COPD care bundle maintains lower readmission rate. Jt  Comm J Qual Patient Saf. 2019; 45(9): 639-645 (Epub July 19, 2019).