Adaptation = Degree to which an intervention is changed or modified by a user in the process of its adoption and conduct.
Adoption = Extent of adoption measured by the number, proportion, and representativeness of settings that begin delivery of a given intervention (RE-AIM).
Channel of dissemination = Route of message delivery (e.g., mass media, community, interpersonal) (NCI/Making Health Communication Programs Work).
Channels of EBI delivery = Pathway by which an intervention is delivered to participants (e.g., face-to-face, small group, telephone).
Clinic / study clinic = Facility with defined responsibilities for recruiting, enrolling, treating, and following patients or subjects in a clinical trial (Meinert, 1986).
Compatibility = Degree to which an innovation is perceived as consistent with the existing values, past experiences, and needs of potential adopters (Rogers, 2003).
Complexity = Degree to which an innovation is perceived as relatively difficult to understand and to use (Rogers, 2003).
Core elements = Features that are responsible for an intervention's efficacy (Rogers, 2003).
Diffusion = Aggregate spread of innovations through intentional and non-intentional means; a passive process that is not targeted, is perhaps even haphazard, and is largely unplanned and uncontrolled (Lomas, 1993). The process by which a new idea or new product is accepted by the market (Rogers, 2003).
Dissemination = The purposive diffusion of an effective innovation to a targeted set of potential adopters.
Dissemination research = The study of the processes and variables that determine and/or influence the adoption of knowledge, interventions or practice by various stakeholders (Lomas, 1997).
Economic evaluation = Comparison of the relationship between costs and outcomes of alternative healthcare interventions, such as cost-benefit analysis, cost-effectiveness analysis, and cost-utility analysis.
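As one illustration of the cost-effectiveness analysis named above, results are commonly summarized by the incremental cost-effectiveness ratio (ICER), which compares a new intervention (subscript 1) against a comparison alternative (subscript 0); the symbols here are standard notation, not drawn from a cited source:

```latex
\text{ICER} = \frac{C_1 - C_0}{E_1 - E_0}
```

where $C_1, C_0$ are the costs and $E_1, E_0$ the health outcomes (e.g., cases averted, quality-adjusted life years) of the two interventions.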
Effectiveness study = Study of an intervention conducted in less controlled situations; a study of the effects of an innovation under real-world conditions (SPR, 2005).
Efficacy trial = Controlled study of an intervention conducted in highly controlled situations; the extent to which an intervention produces a beneficial result under ideal conditions (cc).
Engagement = Process of matching evidence-based intervention (EBI) characteristics with interests and needs of potential adopting organizations or their decision makers.
Evidence-based interventions (EBI) = Interventions that have been shown through scientific study to be efficacious or effective (Fixsen, 2006).
Fidelity = Faithfulness of the intervention in practice to its original design. The degree to which an EBI is delivered as originally intended.
Fit = The degree to which the characteristics of an EBI are compatible with the delivery system structure and values.
Implementation = Introduction and application of an intervention or its elements in a particular setting.
Implementation effectiveness trial = Test of the effectiveness of an efficacious program when implementation can vary, or is deliberately varied, so that both availability and acceptance can vary (Flay, 1986).
Implementation evaluation = Assessment of how, and at what level, a program is implemented, and what and how much were received by the target population (i.e., a type of process evaluation) (Flay, 1986).
Implementation (practice) = The processes or methods of delivering an intervention, including core component identification, adaptation, replication, fidelity, fit, sustainability, or evaluation. Specified set of activities designed to put into practice an activity or program of known dimensions (Cancer Control PLANET Web site, 2006). Efforts to establish the use of an innovation in an adopting site; occurs after a site makes the decision to adopt.
Implementation research = Study of the processes or methods of delivering an intervention, including core component identification, adaptation, replication, fidelity, fit, sustainability or evaluation.
Institutionalization (Sustainability) = Sustained integration of a particular intervention into a standard practice or infrastructure of a setting.
Observability = Degree to which the results of an intervention are visible to others (Rogers, 2003).
Originating group = Group that developed the intervention or conducted the original efficacy testing on the intervention (Battelle Report on Replication and Dissemination, 2002).
Pilot (See Trialability) = Preliminary study designed to indicate whether a larger study is practical (also known as feasibility study) (Meinert, 1986).
Process evaluation = Evaluation that uses a mixture of methods to identify and describe the factors that promote or impede the implementation of an intervention.
Program evaluation = Evaluation conducted to identify a program's accomplishments and effectiveness (NCI/Making Health Communication Programs Work, 1989).
Provider reminders or checklists = Intervention strategy that informs, cues, or reminds providers or other healthcare professionals that individual clients are due (reminder) or overdue (recall) for a checkup or medical procedure. Techniques for delivery include notes in client charts (written or electronic) or a memorandum or letter (Guide to Community Preventive Services).
Provider or system-oriented interventions = Category of intervention strategies intended to effect change in providers or in the systems within which providers work (Guide to Community Preventive Services).
Qualitative research = Qualitative research explores the subjective world. It attempts to understand why people behave the way they do and what meaning experiences have for people. Qualitative research relevant to effectiveness reviews may include interviews, observational studies, and process evaluations.
Quasi-experimental design with control group = An experiment in which units are not randomly assigned to conditions and that includes a control group receiving no treatment. The control group is selected to be as similar as possible to the treatment group (Shadish, Cook, and Campbell, 2002).
Randomized controlled trial (RCT) = An experiment in which two or more interventions, possibly including a control intervention or no intervention, are compared by being randomly allocated to participants. In most trials one intervention is assigned to each individual but sometimes assignment is to defined groups of individuals (for example, in a household) or interventions are assigned within individuals (for example, in different orders or to different parts of the body).
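The two allocation schemes described above (assignment to individuals versus assignment to defined groups such as households) can be sketched in a short, hypothetical Python example; the function names and arm labels are illustrative, not from any cited source:

```python
import random

def randomize_individuals(participants, arms=("intervention", "control"), seed=0):
    """Simple RCT allocation: each individual is independently
    assigned to one study arm at random."""
    rng = random.Random(seed)
    return {p: rng.choice(arms) for p in participants}

def randomize_clusters(clusters, arms=("intervention", "control"), seed=0):
    """Cluster (group) randomization: every member of a group
    (e.g., a household) receives the same assignment."""
    rng = random.Random(seed)
    allocation = {}
    for cluster_members in clusters.values():
        arm = rng.choice(arms)  # one draw per cluster, shared by its members
        for member in cluster_members:
            allocation[member] = arm
    return allocation
```

In practice trial allocation also involves concealment and often stratification or blocking; this sketch shows only the core idea that the unit of randomization (individual versus group) differs between the two designs.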
Receiving group = Group who adopt and use an evidence-based intervention (Battelle Report on Replication and Dissemination, 2002).
Re-invention (See Adaptation.) = Degree to which an innovation is changed or modified by a user in the process of its adoption and implementation (Rogers, 2003).
Replication = Testing of the same intervention to verify that it produces the same effect.
Representativeness = The degree to which the study group reflects the larger target population on a set of agreed characteristics. Use of a sampling method that deliberately samples for heterogeneity, aimed at reflecting diversity on presumptively important dimensions, even though not formally randomized (Shadish, Cook, and Campbell, 2002).
Resources = Physical, staffing, or financial supports necessary to carry out the EBI. A response of sufficient resources indicates that the study described the physical, staffing, or financial supports necessary and demonstrated that these resources were present in the setting where implementation occurred.
Setting = Where an intervention is delivered.
Standard care = Comparison group that mimics typical practice (Meinert, 1986).
Training = Level of instruction and skill development provided to those who deliver the EBI.