
OPM Evaluation Standards

OPM's standards for program evaluations, which are aligned with the Federal program evaluation standards in OMB Memorandum M-20-12, cover all OPM program evaluations, including evaluations performed by OPM staff, outside partners, and recipients of Federal awards performing work on behalf of the agency.

These standards do not apply to evaluations that OPM staff perform on behalf of other agencies through Interagency Agreements, as those evaluations are subject to the standards of the customer agencies. The Evaluation Officer is responsible for overseeing agency-wide implementation of these standards, in accordance with OMB Memorandum M-19-23, and communicates these standards to program evaluators throughout the agency. Program evaluators in program offices are responsible for adhering to these standards for program evaluations performed within their offices.

Relevance and Utility

OPM will conduct program evaluations that address questions of importance and serve the information needs of relevant stakeholders. Program evaluations will address questions that focus on important agency issues, such as considerations for efficiency, effectiveness, improvement, learning, or accountability. Program evaluators should build upon the existing scientific literature on the topic, assess the evaluability of policies, programs, and actions under consideration, and design evaluations that reflect the policy, program, and cultural context in which findings will be applied. To promote relevance, OPM’s Research Steering Committee identifies the knowledge gaps that inform the Learning Agenda, proposes draft research questions, and prioritizes and approves initial research questions for the Learning Agenda and Annual Evaluation Plan.

OPM will make evaluation findings understandable, easily accessible, and available in a timely manner to relevant stakeholders in order to inform agency efforts relating to budgeting, performance improvement, accountability, management, regulatory action, and policy making. To promote utility, OCFO may incorporate the reporting of relevant program evaluation progress and findings into its regular performance reporting and review processes, including its data-driven performance review (Results OPM) meetings.

Rigor

OPM will produce findings that internal and external stakeholders can confidently rely upon, while providing clear explanations of limitations. The quality of an evaluation depends on the underlying design and methods, implementation, and how findings are interpreted and reported. To promote credibility in agency findings, qualified evaluators with relevant education, skills, and experience will manage agency evaluations. Evaluations should incorporate the most appropriate design and methods possible to answer key questions, while balancing efforts with the project goals, scale, timeline, feasibility, and available resources.

Program evaluators should strengthen the methodological rigor of their evaluations by establishing credibility through peer review and consultation with relevant experts, verifying that inferences about cause and effect are well founded, collecting and using verifiable data that accurately capture the intended information and address questions of interest, and conforming to clear understandings of policy, program, cultural, and other applicable contexts.

Agency personnel should also be rigorous in how they document findings and develop reports and other deliverables relating to program evaluation. Program evaluators should document the populations, settings, or circumstances examined and to which findings can be generalized.

Likewise, program evaluators should use precise language to characterize findings accurately based on the design and methods and disclose limitations, assumptions, methods undertaken, data and sampling frame, and justifications for any changes in design and methods from the initial plan.

To promote rigor, the OCFO/Evaluation Officer provides peer review and consultation services to agency program evaluators. OPM’s Research Work Group may advise the Evaluation Officer on data, tools, methods, and/or analytic approaches for evaluations, as well as other evaluation activities.

Independence and Objectivity

OPM program evaluations must be viewed as objective for stakeholders, experts, and the public to accept their findings. OPM evaluators should operate with an appropriate level of independence from programmatic, regulatory, policymaking, and stakeholder activities. While stakeholders have an important role in identifying evaluation priorities, the implementation of evaluation activities, including how evaluators are selected and operate, should be appropriately insulated from political and other undue influences that might affect their objectivity, impartiality, and professional judgment. Evaluators should strive for objectivity in the planning and conduct of evaluations and in the interpretation and dissemination of findings by avoiding conflicts of interest, bias, and other partialities. Evaluators should acknowledge and attempt to mitigate any conflicts of interest, bias, or other partialities that may be introduced in how they frame evaluation questions, collect or analyze data, or interpret findings.

Transparency

OPM program evaluations must be transparent in the planning, implementation, and reporting phases to enable accountability and prevent aspects of evaluations from being tailored to generate specific findings. Evaluators will document decisions and develop detailed plans about an evaluation's purpose, objectives (including whether it is for internal or public use), the range of stakeholders who will have access to findings, design, methods, and the timeline and strategy for disseminating findings. Program evaluations will consider legal, ethical, national security, and other constraints regarding the disclosure of sensitive information when planning, carrying out, and distributing the findings of agency program evaluations. Evaluators should clearly document these decisions and any disclosure limitations at the outset of an evaluation in order to enable accountability and foster credibility among stakeholders.

For evaluations on the agency’s Learning Agenda, the Evaluation Officer will oversee the use and dissemination of evaluation results to facilitate the reporting of findings in a timely and adequate manner so that results can be reviewed, interpreted, or replicated by relevant internal and external stakeholders, including the public when appropriate. OCFO will incorporate relevant findings into established performance reporting and review processes, including data-driven performance review (Results OPM) meetings, as appropriate.

Evaluation reports should include full descriptions of any limitations of the evaluation, including those related to design and methods, implementation, generalizability, or interpretation of findings. Whenever possible, the data collected during an evaluation should be made available to support replication, reproduction, and secondary use. Evaluation data must be disclosed only in a manner consistent with applicable laws, regulations, and policies to support the proper protection of interests such as the security, privacy, and integrity of the data and participants.

Ethics

OPM program evaluators will conduct evaluations in line with the highest ethical standards to protect the public and maintain public trust in agency efforts. Evaluations should be planned and implemented to safeguard the dignity, rights, safety, and privacy of participants and other stakeholders and affected entities. Program evaluators will follow appropriate procedures articulated in law and policy when creating, using, processing, storing, maintaining, disseminating, disclosing, and disposing of data about individuals and entities, whether the data were provided by participants and evaluated entities or consist of administrative or other data created or obtained from other sources.

Evaluators should abide by current professional standards pertaining to the treatment of participants. Evaluations should be equitable, fair, and just, and should consider cultural and contextual factors that could influence evaluation findings or their use. To safeguard fairness and equity, evaluators should go beyond the perspectives of those usually represented and gain an understanding of the range of perspectives and interests that individuals and groups bring to the evaluation. Likewise, evaluators should inform stakeholders of the evaluation prior to its start and upon significant modification to the evaluation question or research design and communicate findings to affected individuals and entities upon an evaluation’s completion.