Understanding and improving how policymakers respond to program impact
What was the challenge?
The Foundations for Evidence-Based Policymaking Act of 2018 set an ambitious agenda for advancing the production and use of evidence across the U.S. government, such that evaluation is now considered a critical agency function.¹,² However, for the growing evidence base to effectively inform decisions about which programs to implement, policymakers must be well equipped to interpret and use evidence.
What was the program change?
To better understand whether and when policymakers incorporate evidence about program impact into funding decisions, in 2021 we administered a survey experiment to 192 federal employees across 22 U.S. government agencies. The evaluation was designed to measure policymakers’ responsiveness to program impact and to evaluate decision aids aimed at increasing that responsiveness.
How did the evaluation work?
In the evaluation, respondents reported their maximum willingness to pay for hypothetical government programs, given information about program impact that was randomized across participants. Respondents made assessments with and without two decision aids, presented in random order. The first decision aid drew on behavioral insights pointing to the benefits of joint evaluation and presented two alternative programs together (“Side-by-Side”).³ The second decision aid translated a program’s total cost into an annual cost per person impacted, based on the program’s stated features of impact (“Impact Calculator”).
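To make the Impact Calculator’s conversion concrete, the sketch below shows the underlying arithmetic in Python. The function name, signature, and figures are our own illustration, not the study’s instrument.

```python
def annual_cost_per_person(total_cost: float,
                           people_impacted: int,
                           program_years: int) -> float:
    """Translate a program's total cost into an annual cost per
    person impacted, in the spirit of the Impact Calculator aid.

    Names and numbers here are illustrative assumptions.
    """
    return total_cost / (people_impacted * program_years)

# Example: a $12M program reaching 40,000 people over 3 years
# works out to $100 per person per year.
print(annual_cost_per_person(12_000_000, 40_000, 3))  # 100.0
```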
What was the impact?
There were two main takeaways from the experiment. First, policymakers’ baseline responsiveness to program impact is low: when program impact increases by 100%, policymakers’ assessment of the program’s value increases by just 33%. Second, each of the two decision aids has a large and statistically significant effect on responsiveness. These results identify decision aids that evaluators, policymakers, and researchers can use to disseminate the results of program evaluations more effectively.
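To illustrate the responsiveness measure: if valuations moved one-for-one with impact, a 100% increase in impact would double the assessed value. The sketch below computes the ratio under a simplifying assumption that responsiveness is the percent change in willingness to pay divided by the percent change in impact; it is a minimal illustration, not the study’s estimation procedure.

```python
def responsiveness(wtp_low: float, wtp_high: float,
                   impact_low: float, impact_high: float) -> float:
    """Percent change in willingness to pay divided by percent
    change in program impact. A value of 1.0 would mean fully
    proportional responsiveness; this is an illustrative
    simplification of the study's estimand.
    """
    pct_wtp = (wtp_high - wtp_low) / wtp_low
    pct_impact = (impact_high - impact_low) / impact_low
    return pct_wtp / pct_impact

# Doubling impact (a 100% increase) raises valuations by 33%,
# implying a baseline responsiveness of about 0.33.
print(responsiveness(100, 133, 1000, 2000))  # 0.33
```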
Notes:
- 1 Pub. L. No. 115-435, 132 Stat. 5529 (2019), available at https://www.congress.gov/115/plaws/publ435/PLAW-115publ435.pdf.
- 2 Office of Management and Budget. Evidence-Based Policymaking: Learning Agendas and Annual Evaluation Plans, M-21-27 (2021), https://www.whitehouse.gov/wp-content/uploads/2021/06/M-21-27.pdf.
- 3 Hsee, C., G. Loewenstein, S. Blount, and M. Bazerman. “Preference reversals between joint and separate evaluations of options: A review and theoretical analysis.” Psychological Bulletin 125, no. 5 (1999): 576-590.