Analysis that misses the mark is disappointing for all parties. Conventional wisdom holds that one needs to start with a logic model and indicators when conducting an evaluation. While logic models can be helpful, they are not strictly necessary as long as there is a clear description of the program and its objectives. More troubling is the trend to reduce knowledge development to creating a list of indicators as the first step in an evaluation. This approach has a high probability of producing sub-optimal results. To be useful, indicators need to be understood in context.
For example, is low cost a marker of program efficiency or a reflection of cost shifting from the organization to the client? To be relevant, indicators need to have a shared meaning, and relevance, for key actors. Unfortunately, indicators may reflect form rather than content; that is, people come up with indicators because they think they need to do so. In addition, while the indicators developed may be interesting, they may not capture the essence of what administrators and policy makers need to know.
A preferred approach is to develop a set of key questions that clearly reflect the new knowledge administrators and policy makers want and need in order to make informed decisions. Indicators should then be a product of a process of analysis.
When this is the case, they are much more likely to have a shared meaning and relevance for decision-makers. Formal evaluations of new initiatives are conducted to ensure that the initiatives are working as planned and are achieving intended results. Process or formative evaluations are conducted to determine whether services are being delivered in a manner that is consistent with the model of care adopted and with the policies of the program.
Process evaluations can be used to improve how services are delivered. Outcome evaluations can be used to determine the relative "worth" of a program and to make decisions about whether a program will be maintained, modified or ended. Two other approaches, a proof of concept evaluation and an implementation evaluation, can also be conducted in the early stages of a new program.
The application of the proof of concept approach to healthcare evaluation was developed by Hollander Analytical Services Ltd. A description of this approach is presented in Appendix 1. These evaluations should, ideally, precede both process and outcome evaluations. The first looks at the consistency of the proposed care model with best practices for similar initiatives.
The second evaluates the implementation phase of an initiative. Thus, one can think of a progression of four types of evaluation: evaluation of the model that is developed (a proof of concept evaluation); evaluation of the implementation of the model (an implementation evaluation); evaluation of how the model is operating (a process evaluation); and evaluation of whether the model should be continued in its existing form (an outcome evaluation) (see Figure 1).
While the above provides an overview of the main types of evaluations, other approaches provide more specific domains of inquiry to be considered in conducting an evaluation. The performance domains of the Canadian Institute for Health Information (CIHI) are acceptability, accessibility, appropriateness, competence, continuity, effectiveness, efficiency and safety.
The original HTF (Health Transition Fund) was a federal program to fund and evaluate new and innovative models of care delivery. Sustainability, which refers to the extent to which a program appears to be well funded and supported and is likely to continue to exist over time, can be added to the above criteria. Table 1 presents the proposed evaluation framework and evaluation domains. Table 2 presents an initial set of generic evaluation questions for developing instruments for each domain of inquiry.
The questions in Table 2 can be used to develop a series of indicators of relevance to a particular evaluation, for new or existing programs. It is recognized that not all questions may be included in each evaluation. Rather, the generic questions can serve as a guide for thinking about how any specific evaluation could be conducted. For larger programs of research, where multiple programs are being evaluated, the generic questions could be used to develop a core set of questions which should be covered in all evaluations.
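The idea of organizing generic questions by evaluation type and domain, and drawing a core question set from them for multi-program evaluations, can be illustrated with a simple data structure. The sketch below is illustrative only: the nesting, names and the `core_questions` helper are assumptions for this example, with a few abbreviated questions taken from Table 2, not a tool described in the source.

```python
# Illustrative sketch: the evaluation framework as nested data.
# Evaluation types map to domains of inquiry, which map to generic
# questions (abbreviated examples from Table 2).
framework = {
    "proof of concept": {
        "appropriateness of the model design": [
            "Is the documentation on the model clear and comprehensive?",
            "Is the model congruent with its intended purposes and rationale?",
        ],
    },
    "implementation": {
        "efficiency and effectiveness of implementation": [
            "Was the model implemented within the anticipated time frame?",
            "How well was the new model accepted by staff and management?",
        ],
    },
    "process": {
        "appropriate care provision": [
            "To what extent is care provision consistent with program policy?",
            "To what extent are care needs met in a timely manner?",
        ],
    },
}

def core_questions(framework, selected_domains):
    """Assemble a core question set covering the selected domains,
    e.g. the questions that should appear in all evaluations of a
    larger program of research."""
    return [
        question
        for domains in framework.values()
        for domain, questions in domains.items()
        if domain in selected_domains
        for question in questions
    ]

core = core_questions(framework, {"appropriate care provision"})
```

A real instrument would carry the full question set from Table 2; the point of the structure is simply that indicators and core questions are derived from the framework rather than invented first.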
Table 1. Proposed evaluation framework and evaluation domains

Design and Implementation of the Model

1. Appropriateness of the model design and structure of the model (proof of concept evaluation). This relates to whether or not the model itself is well documented, is designed to meet the stated purposes, goals and objectives of the program, and is consistent with best practices in the field. The rationale for the model, the key characteristics of the model and the organizational structure of the model are all included in this domain of inquiry.
2. Efficiency and effectiveness of model implementation (implementation evaluation). This relates to whether or not the model was implemented in accordance with the required model design, how well or poorly the model was implemented, and the acceptance of the new model by personnel in the organization and other key actors.

Functionality of the Model (Process Evaluation)

3. Appropriate care provision. This relates to an assessment of the extent to which there are adequate staff to provide care; care provision is carried out in a consistent manner and in accordance with documented policies and procedures; and the model is "functional," that is, the process of care provision functions in an appropriate manner.
4. Continuity of care and care coordination. This refers to how well care services, and the process of providing care, are coordinated across the component parts of the continuum.

5. Competence of personnel. This relates to the professional qualifications and competence of the people managing and delivering services, for example, the care staff in a service delivery organization.

Effectiveness of the Model (Outcome Evaluation)

6. It also relates to the hours of operation and the ease of access to needed services.
7. Cost-effectiveness. This relates to the value for money obtained by the organization that adopted the care model. It relates to both the costs and outcomes of the model.

8. Health impacts. This relates to the impact, if any, of the model on the clientele served and on the health status of the broader population.

9. It refers to the extent to which a given model has the potential to be adopted more broadly across Canada, and the extent to which it has actually been adopted across organizations or jurisdictions. It is a measure of the diffusion of innovation.
10. Sustainability. This relates to how well the model can continue to operate over time into the future.

Table 2. Types of evaluation, domains of inquiry and generic evaluation questions

Design and Implementation of the Model

1. Appropriateness of the model design (proof of concept evaluation)

- Is the documentation on the model clear and comprehensive?
- Is the model congruent with its intended purposes and rationale?
- Is the model design congruent with the goals and objectives of the model and with best practices?
- What are the key characteristics of the model?
- What is the organizational structure of the model?
- What is the expenditure allocation, or budget breakdown, of the model?

2. Efficiency and effectiveness of model implementation (implementation evaluation)

- Was the model implemented within the anticipated time frame?
- Was the program implemented consistent with the description of the model and program policy?
- During implementation, were there changes to the model design? If so, were they well documented and supported?
- How well was the new model accepted by staff and management?
- Overall, how would the staff and management rate the "success" of the implementation?

Functionality of the Model (Process Evaluation)

3. Appropriate care provision

- To what extent is care provision consistent with program policy?
- To what extent are care needs met in a timely manner?
- To what extent are clients' questions answered in an appropriate and timely manner?
- To what extent is there adequate coverage for staff sick days and holidays?
- To what extent are emergency procedures in place and tested on a regular basis?
- To what extent are staff levels adequate to carry out the needed work?
4. Continuity of care and care coordination

- To what extent is there "informational continuity," that is, is information from prior events used to give appropriate care to the client?