Using Evidence: Learning from Low-Cost Federal Evidence-Building Activities

October 30, 2019 at GSA

The U.S. General Services Administration’s (GSA) Office of Evaluation Sciences (OES) and numerous agency collaborators presented on how the federal government uses low-cost evaluations, unexpected results, and administrative data to inform policy and program decisions. OES staff, collaborators from multiple agencies, and distinguished academic partners shared new results and lessons learned from more than 10 OES evaluations across three sessions: Learning from Low-Cost Evaluations, Learning from Unexpected Results, and Learning from Administrative Data. All sessions included information and examples relevant to meeting the requirements of the Foundations for Evidence-Based Policymaking Act of 2018 (Evidence Act).

Presentation slides are available here.

The full agenda is available here.

Welcome and Introduction:

  • Jessica Salmoiraghi, Associate Administrator, Office of Government-wide Policy, GSA
  • Diana Epstein, Evidence Team Lead, Office of Management and Budget

Using Evidence: Learning from Low-Cost Federal Evidence-Building Activities included the following sessions:

In Learning from Low-Cost Evaluations, agency collaborators and OES team members shared new results and lessons learned from using low-cost evaluations. These leaders addressed priority topics such as health monitoring, improper payments, Veterans education, and vaccination uptake. Speakers included:

  • Stephanie Garcia (Office of the National Coordinator for Health Information Technology (ONC), Department of Health and Human Services), Hassan Ahmed (Inova Health System), and Pompa Debroy (OES) discussed what ONC and Inova learned in their partnership to increase use of patient-generated health data. Results available here and here.
  • Bob Weathers (Office of Retirement and Disability Policy, Social Security Administration) and Elana Safran (OES) discussed what SSA learned from testing an intervention to increase timely wage reporting among Supplemental Security Income recipients. Results available here.
  • Lucas Tickner (Education Service, Department of Veterans Affairs) and Dennis Kramer (OES) discussed what the VA learned about using proactive communication to increase college enrollment of Veterans. More results here.
  • Vincent Marconi (Atlanta VA Health Care System, Department of Veterans Affairs; Emory University School of Medicine) and Pompa Debroy (OES) discussed what the Atlanta VA learned from testing an intervention to increase vaccine uptake among Veterans. Results available here.

In Learning from Unexpected Results, agency experts in evaluation discussed how to learn from and act on results that may be surprising. The session presented examples of projects with different types of unexpected results, along with perspectives from agency evaluation leads on how they have learned from such results. Panelists then described studies that yielded null results and answered questions about how they have used unexpected results to advance learning agendas in their agencies:

  • Rekha Balu (Office of Evaluation Sciences, GSA)
  • Calvin Johnson (Office of Policy Development and Research, Department of Housing and Urban Development)
  • Susan Wilschke (Office of Research, Demonstration, and Employment Support, Social Security Administration)

Unexpected Results Handout

In Learning from Administrative Data, agency and academic collaborators shared the varied ways they have used administrative data in programs and evaluation. This session highlighted how to use administrative data to address the requirements of the Evidence Act in meaningful and useful ways. Speakers included:

  • Jagruti Rekhi (Office of Policy Development and Research, Department of Housing and Urban Development) and Michael DiDomenico (OES) discussed how interagency access to data supported HUD in better understanding and addressing agency priorities. More results here and here.
  • Adam Sacarny (OES; Mailman School of Public Health, Columbia University) discussed how obtaining the right expertise for utilizing administrative data allowed HHS to build evidence around reducing inappropriate prescribing. More results here.
  • Alan Sim (Defense Health Agency, Department of Defense) and Elana Safran (OES) discussed how intra-agency data compilation led to the development of a tool DHA can use for clinical decision-making and evidence-building.