Getting Evaluation Questions Right: The First Step in Quality Evaluation

Presented at the Office of Personnel Management Federal Employee Development Evaluation Conference, September 26, 2016

MICHAEL COPLEN
Senior Evaluator
Office of Research, Development & Technology
Federal Railroad Administration
U.S. Department of Transportation

FRA - Office of Research, Development & Technology | Moving America Forward

DISCLOSURE
The views and perspectives in this presentation are solely my own and do not purport to represent those of the Federal Railroad Administration, the Department of Transportation, or the federal government.

Overview
- Evaluation context: history, definition, roles, evaluation frameworks
- What are some common flaws in developing evaluation questions?
- What constitutes good quality evaluation questions? (What? So what? Now what?)
- Exemplar program with illustrative questions
- How do quality evaluation questions drive useful, actionable evaluation?

What is evaluation?

Evaluation is the systematic application of defensible criteria to determine the value, merit, or worth (i.e., quality, utility, effectiveness, or significance) of something.

Program evaluations answer big-picture questions about programs, like:
- How well was the program designed?
- To what extent did the program achieve its goals?
- Are the results worth what the program costs?
- Should it continue? How can it be improved?

Why Evaluation in Federal Programs?

Congressional mandates
- Government Performance and Results Act (GPRA, 1993)
- Program Assessment Rating Tool (PART, 2002)
- GPRA Modernization Act of 2010

OMB memos
- M-13-17, July 26, 2013: Next Steps in the Evidence and Innovation Agenda
- M-13-16, July 26, 2013: Science and Technology Priorities for the FY 2015 Budget
- M-10-32, July 29, 2010: Evaluating Programs for Efficacy and Cost-Efficiency
- M-10-01, October 7, 2009: Increased Emphasis on Program Evaluations
- M-09-27, August 8, 2009: Science and Technology Priorities for the FY 2011 Budget

GAO reports
- Program Evaluation: Strategies to Facilitate Agencies' Use of Evaluation in Program Management and Policy Making (June 2013)
- Program Evaluation: A Variety of Rigorous Methods Can Help Identify Effective Interventions (GAO-10-30, November 2009)
- Program Evaluation: Experienced Agencies Follow a Similar Model for Prioritizing Research (GAO-11-176, January 2011)

Federal Evaluation Working Group
- Reconvened in 2012 to help build evaluation capacity across the federal government
"[We] need to use evidence and rigorous evaluation in budget, management, and policy decisions to make government work effectively."

Assessing the Logic of Evaluation in Federal Programs

ACTIVITIES: funded activity family, e.g., scientific research, technology development
OUTPUTS: deliverables and products, e.g., technical report(s), frameworks, model(s)
   -- the research-to-impact GAP --
OUTCOMES: application of research; behavior change; data use (users adopt guidelines, standards, or regulations); emergent outcomes; changing practices; knowledge gains (positive); unintended consequences (negative effects)
IMPACTS

EVALUATION (applies across this entire chain, from activities through impacts)

The Research-Evaluation Continuum

                      Research                        Evaluation
Primary purpose:      contribute to knowledge;       program improvement;
                      improve understanding          decision-making
Primary audience:     scholars, researchers,         program funders, administrators,
                      academicians                   decision makers
Types of questions:   hypotheses; theory-driven;     practical; applied;
                      preordinate                    open-ended, flexible
Sources of data:      surveys, tests, experiments;   interviews, field observations,
                      preordinate                    documents, mixed sources;
                                                     open-ended, flexible
Criteria/standards:   validity, reliability,         utility, feasibility, propriety,
                      generalizability               accuracy, accountability

Evaluation Framework: Roles of Evaluation
                 FORMATIVE                            SUMMATIVE
When:            Before or during R&D                 After R&D projects/programs
                 projects/programs
Purpose:         To guide:                            To assess:
                 - program planning                   - completed projects or
                 - program design                       project lifecycles
                 - implementation strategies          - accomplishments
                                                      - impacts
Primary focus:   To improve programs                  To prove program merit or worth;
                                                      to meet accountability requirements

CIII Evaluation Model (Context, Input, Implementation, Impact)

Types of evaluation:
- Context (needs)
- Input (design)
- Implementation (process)
- Impact (product)

Stakeholder engagement is key.

Daniel L. Stufflebeam's adaptation of his CIPP Evaluation Model framework for use in guiding program evaluations of the Federal Railroad Administration's Office of Research and Development. For additional information, see Stufflebeam, D. L. (2000). The CIPP model for evaluation. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models (2nd ed., Chapter 16). Boston: Kluwer Academic Publishers.

Evaluation Framework: Roles and Types of Evaluation

Formative Evaluation (proactive)
- Context: identifies needs, problems, and assets
- Inputs: assesses alternative approaches
- Implementation: monitors execution of procedural plans and budget
- Impact: assesses emerging (+/-) outcomes; reassesses project/program plans

Summative Evaluation (retroactive)
- Context: assesses original program goals and priorities
- Inputs: assesses the original design
- Impact: assesses outcomes, impacts, side effects, and cost-effectiveness
Evaluative Questions Framework: Illustrative Questions

Formative Evaluation
- Context: Who asked for the evaluation? Why? What are the needs? Who are the intended users, and what are the intended uses? What are the highest priority needs? What is the existing context driving those needs?
- Inputs: Given the need, what are the most promising alternative approaches? How do they compare (potential success, costs)?
- Implementation: To what extent is the program proceeding on time, within budget, and effectively? Is the program being implemented as designed? How can this strategy be most effectively implemented? Are there potential barriers to implementation, and how can they be mitigated? If needed, how can the design be improved? Are implementation challenges being addressed?
- Impact: To what extent are intended users (states, organizations, the public) using the program? What other indicators of use, if any, have emerged that indicate the program is being used? What are some emerging outcomes (positive or negative)? How can the implementation be modified to maintain, measure, and sustain long-term success?

Summative Evaluation
- Context: To what extent did the program address the high-priority needs?
- Inputs: What strategy was chosen, and why, compared to other viable strategies (re: prospects for success, feasibility, costs)?
- Implementation: To what extent was the program carried out as planned, or modified with an improved plan?
- Impact: To what extent did this program effectively address the needs? Were there any unanticipated negative or positive side effects? What conclusions and lessons learned can be reached (e.g., cost-effectiveness, stakeholder engagement, program effectiveness)?

Exemplar Evaluation Questions: Educational Website Development and Implementation

An Educational Website: Evaluation Framework Illustrative Questions
Formative Evaluation
- Context: What are the highest priority needs for a website in the railroad industry?
- Inputs/Design: Given the need for specific education and training, what are the most promising alternatives? How do they compare (potential success, costs, etc.)?
- Implementation: How can this strategy be most effectively implemented? What are some potential barriers to implementation? To what extent is the website project proceeding on time, within budget, and effectively? If needed, how can the design be improved?
- Impact: To what extent are people using the website? What other indicators of use, if any, have emerged that indicate the website is being accessed and the information is being acted upon? What are some emerging outcomes (positive or negative)? How can the implementation be modified to maintain and measure success?

Summative Evaluation
- Context: To what extent did the website address this high-priority need?
- Inputs/Design: What strategy was chosen, and why, compared to other viable strategies (re: prospects for success, feasibility, costs)?
- Implementation: To what extent was the website carried out as planned, or modified with an improved plan?
- Impact: To what extent did this project effectively address the need to educate railroad employees on this topic? Were there any unanticipated negative or positive side effects? What conclusions and lessons learned can be reached (e.g., cost-effectiveness, stakeholder engagement, program effectiveness)?

High Priority Needs (Context)

WHAT? EDUCATION: Provide on-call railroaders with:
- scientifically valid content on ....
- proven, practical strategies to address the real-world challenges of balancing work and life
- personal tools to address the issues identified
- an anonymous assessment for employees
- diary data
WHY? BEHAVIOR CHANGE: Motivate railroaders to adjust behavior in the aspects of their lives within their individual control.

Target Audience / Intended Users (Context)

Primary: On-call train and engine crews
- On-call employees (and their families), across all classes of freight and passenger service on U.S. railroads

Secondary: Other active railroaders
- Labor, management, and others who interact with, and have influence on, these railroaders

Need for an Independent External Evaluation
- Large-scale, industry-wide project; the context is complex and contentious.
- Building the site does not mean people will come, or that they will use it.
- Integrate evaluation into project phases to ensure that multiple perspectives are reflected.

Evaluation Goals
- Facilitate good website design; understand website use and utility.
- Inform key stakeholders about the merit and worth of the project, based on systematic assessment.

Evaluation Use
- Inform project decision-making, improve design, plan the implementation strategy, and support accountability.

Implementation and Impact Evaluation

Core evaluation question: Which company-sponsored implementation approaches promote industry-wide utilization of the website as an educational resource that increases user understanding of the issues identified?

1) Identify and examine company-developed initiatives for integrating and implementing the website into ongoing training and educational programs.
   - Identify railroad sites to pilot educational efforts in different formats using a variety of approaches.
   - Review the curricular/training materials developed to support the website as the primary learning tool.
2) Determine to what extent, and in what ways, these pilot initiatives have short-term educational impact within the context of broad application across the industry.
   - Analyze data from a pre-/post-assessment of knowledge and attitudes.
   - Conduct think-aloud cognitive interviews to understand user interest, engagement, and choice processes during website use, and to obtain ongoing interface-usability feedback.

Summary

Good evaluation questions ask:
- What is happening, and why? (What?)
- How well is it working? (So what?)
- How can it be improved? (Now what?)

Sub-questions:
- Who will use the evaluation results? (Intended users)
- How will they use the results? (Intended uses)

"Yes, but what exactly is it?" said Deep Thought. "Once you know what the question actually is, you'll know what the answer means." - The Hitchhiker's Guide to the Galaxy

Guidelines for Developing Quality Evaluation Questions
- Ask big-picture questions.
- Usually 5-7 core questions is enough.
- Include sub-questions.
- Cover most or all of the following:
  - Context (what is being evaluated, and why)
  - Input/design (what are the most promising alternatives)
  - Implementation (how is it working; barriers; opportunities for improvement)
  - Impacts (lessons learned; overall value/worth)

Evaluation Resources

Affiliate evaluation associations
- Washington Research and Evaluation Network (WREN)
- Federal Evaluators Network

Evaluation journals
- American Journal of Evaluation (AJE)
- New Directions for Evaluation (NDE)
- Evaluation Review
- Evaluation and the Health Professions

Training and centers
- The Evaluators' Institute (http://tei.cgu.edu), Claremont Graduate University
- The Evaluation Center (http://www.wmich.edu/evalctr/), Western Michigan University

Evaluation Standards*

Guiding principles for conducting evaluations:
- Utility (useful): to ensure evaluations serve the information needs of the intended users.
- Feasibility (practical): to ensure evaluations are realistic, prudent, diplomatic, and frugal.
- Propriety (ethical): to ensure evaluations will be conducted legally, ethically, and with due regard for the welfare of those involved in the evaluation, as well as those affected by its results.
- Accuracy (valid): to ensure that an evaluation will reveal and convey valid and reliable information about all important features of the subject program.
- Accountability (professional): to ensure that those responsible for conducting the evaluation document, and make available for inspection, all aspects of the evaluation that are needed for independent assessments of its utility, feasibility, propriety, accuracy, and accountability.

* The Program Evaluation Standards were developed by the Joint Committee on Standards for Educational Evaluation and have been accredited by the American National Standards Institute (ANSI).

American Evaluation Association (http://www.eval.org)
- 3,000 members in 2001; over 7,100 members today
- Members in all 50 states and over 60 countries
- $95/year membership includes the American Journal of Evaluation, New Directions for Evaluation, and online access to full journal articles
Guiding Principles for Evaluators

A. Systematic inquiry: Evaluators conduct systematic, data-based inquiries.
B. Competence: Evaluators provide competent performance to stakeholders.
C. Integrity/honesty: Evaluators display honesty and integrity in their own behavior, and attempt to ensure the honesty and integrity of the entire evaluation process.
D. Respect for people: Evaluators respect the security, dignity, and self-worth of respondents, program participants, clients, and other evaluation stakeholders.
E. Responsibility for general and public welfare: Evaluators articulate and take into account the diversity of general and public interests and values that may be related to the evaluation.
(http://www.eval.org)

FRA evaluation resources
- http://www.fra.dot.gov/eLib/Find#p1_z25_gD_kEvaluation%20Implementation%20Plan
- http://www.fra.dot.gov/eLib/details/L17399#p1_z5_gD_kmanual

Books
- Stufflebeam, D. L., & Coryn, C. L. S. (2014). Evaluation theory, models, and applications (2nd ed.). San Francisco, CA: Jossey-Bass.
- Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage. ("Intended use for intended users.")
- Davidson, E. J. (2004). Evaluation methodology basics: The nuts and bolts of sound evaluation. Thousand Oaks, CA: Sage.
Thank you!

Michael Coplen, M.A.
Senior Evaluator
Office of Research, Development & Technology
Federal Railroad Administration
U.S. Department of Transportation
202-493-6346
[email protected]

EXTRA SLIDES

Evaluation Standards*

Guiding principles for conducting evaluations, with their component standards:

Utility (useful)
- Evaluator Credibility; Attention to Stakeholders; Negotiated Purposes; Explicit Values; Relevant Information; Meaningful Processes & Products; Timely & Appropriate Reporting; Concern for Consequences & Influence

Feasibility (practical)
- Project Management; Practical Procedures; Contextual Viability; Resource Use

Propriety (ethical)
- Responsive & Inclusive Orientation; Formal Agreements; Human Rights & Respect; Clarity & Fairness; Transparency & Disclosure; Conflicts of Interest; Fiscal Responsibility

Accuracy (valid)
- Justified Conclusions & Decisions; Valid Information; Reliable Information; Explicit Program & Context Descriptions; Information Management; Sound Designs & Analyses; Explicit Evaluation Reasoning; Communication & Reporting

Evaluation Accountability (professional)
- Evaluation Documentation; Internal Metaevaluation; External Metaevaluation

* Note: The Program Evaluation Standards were developed by the Joint Committee on Standards for Educational Evaluation and have been accredited by the American National Standards Institute (ANSI).

Stakeholder Input, Evaluation Questions and Findings*

* Source: Preskill, Hallie, and Jones, Nathalie. (2009). A Practical Guide for Engaging Stakeholders in Developing Evaluation Questions. Robert Wood Johnson Foundation Evaluation Series. Available for download from the Foundation's website at http://www.rwjf.org/pr/product.jsp?id=49951.

"Quantitative evidence is the bones, qualitative evidence is the flesh, and evaluative reasoning is the vital organs that bring them both to life."

Source: Davidson, E. J. (2014). How "beauty" can bring truth and justice to life. In J. C. Griffith & B. Montrosse-Moorhead (Eds.), Revisiting truth, beauty and justice: Evaluating with validity in the 21st century. New Directions for Evaluation, 142, 31-43.
Conclusion: Evaluation as a Key Strategy Tool

- Quality evaluation asks questions that matter about processes, products, programs, policies, and impacts.
  - Helped identify, develop, and design pilot safety culture implementation projects
- Evaluation monitors the extent to which, and the ways in which, projects and programs are being implemented. What's working, and why, or why not?
  - Monitored pilot implementations for ongoing improvement
- Evaluation measures the extent to which, and the ways in which, program goals are being met, and informs others about lessons learned, progress, and program impacts.
  - Documented safety and safety culture outcomes from pilot implementations
- Evaluation helps refine program strategy, design, and implementation. Where successful programs are confirmed, it supports broad-scale adoption across the industry.
  - Helped identify industry partners and inform strategy for company- and industry-wide scale-up
- Evaluation systematically engages key stakeholders to improve program success: it identifies and actively involves intended users, and clarifies intended uses and potential misuses.
  - Increased the utilization, impact, and effectiveness of pilot safety culture project outcomes for broader-scale adoption and sustainability

"Forty-two," said Deep Thought, with infinite majesty and calm, "... is the Answer to the Great Question, of Life, the Universe and Everything." - The Hitchhiker's Guide to the Galaxy