What was the challenge?
The Higher Education Emergency Relief Fund (HEERF), created as part of the Coronavirus Aid, Relief, and Economic Security Act (CARES Act) signed into law on March 27, 2020, authorized the Department of Education (ED) to provide $14 billion for institutions of higher education (IHEs). Half of HEERF aid was intended to go to students to help offset the costs of COVID-19-related campus disruptions, resulting in the disbursement of hundreds of thousands of one-time cash grants directly to students.
Given the urgency and severity of the COVID-19 pandemic, it was likely infeasible to prospectively embed rigorous evidence building into HEERF. However, quasi-experimental evaluations designed around available administrative data may make it possible to build evidence retrospectively. Results from an evaluation of the effects of HEERF aid on student academic outcomes can inform the design of future emergency aid efforts, and providing an evaluation design may support institutions' own efforts to build evidence about effective strategies for encouraging academic success.
What did we do?
We designed an impact evaluation using a regression discontinuity (RD) design — a type of quasi-experimental method — to help ED and IHEs answer the question: What is the effect of providing HEERF aid on short- and medium-term student outcomes? We found that many institutions used strict eligibility cutoffs, which lend themselves to an RD design that could be applied across many IHEs.
How is the evaluation designed to build evidence?
Several institutions determined aid eligibility using Estimated Family Contribution (EFC), which is part of a student’s financial aid record if they have submitted a Free Application for Federal Student Aid (FAFSA). Students just above and below an EFC threshold are likely similar on observable characteristics (e.g., gender, race, age, other support received) and unobservable characteristics (e.g., motivation, career aspirations).
In most cases, an ordinary least squares (OLS) regression of academic outcomes on an indicator of aid receipt and a function of distance from the threshold can yield unbiased estimates of the effect of HEERF aid among students close to the threshold (the local average treatment effect). Primary outcomes observable in administrative data held by IHEs may include continued enrollment, credits earned, and enrollment intensity (credits as a proportion of full-time enrollment) in the Spring 2020 term; enrollment for the Fall 2020 term; and time to degree completion.
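The estimation strategy above can be sketched in a few lines. This is a minimal illustration, not the evaluation's actual specification: the EFC cutoff, outcome, bandwidth, and all data below are simulated assumptions, and a real analysis would use institutional records and more careful bandwidth and functional-form choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated student records (illustrative only; a real analysis would use
# institutional FAFSA and transcript data). Cutoff and effect size are made up.
n = 5000
cutoff = 5000.0                          # hypothetical EFC eligibility cutoff
efc = rng.uniform(0, 10000, n)           # Estimated Family Contribution
treated = (efc <= cutoff).astype(float)  # aid goes to students at or below cutoff
dist = efc - cutoff                      # running variable, centered at the cutoff

# Outcome: credits earned, with a smooth trend in EFC plus a
# true jump of 1.5 credits at the cutoff for aid recipients.
credits = 12.0 + 0.0003 * dist + 1.5 * treated + rng.normal(0, 2, n)

# Sharp RD via OLS: outcome ~ treated + dist + treated*dist,
# restricted to a bandwidth around the cutoff so the linear
# approximation of the trend is local.
bandwidth = 2000.0
mask = np.abs(dist) <= bandwidth
X = np.column_stack([
    np.ones(mask.sum()),
    treated[mask],
    dist[mask],
    treated[mask] * dist[mask],
])
beta, *_ = np.linalg.lstsq(X, credits[mask], rcond=None)
effect = beta[1]  # local average treatment effect at the cutoff
print(f"Estimated effect of aid at the cutoff: {effect:.2f} credits")
```

The interaction term lets the slope of the outcome in EFC differ on either side of the cutoff; the coefficient on the treatment indicator is then the discontinuity at the threshold, i.e., the local average treatment effect.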
What did we learn?
The RD design can provide credible evidence of the effect of one-time emergency aid on student outcomes, indicating how much emergency aid programs change outcomes such as continued enrollment and credits earned.
Building robust evidence may benefit from coordinated evaluations with multiple IHEs. Because of variation between institutions in how funds were disbursed, results from an evaluation at any one institution may not be generalizable to other institutions. Developing a coordinated effort—for example, with a consortium of IHEs—to conduct RD evaluations at multiple institutions may yield a fuller picture of how HEERF aid affects student academic outcomes.
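One way such a consortium might combine site-level results is a fixed-effect (inverse-variance) pooling of each institution's RD estimate. The sketch below assumes each IHE reports an effect estimate and standard error; all numbers are hypothetical.

```python
import math

# Hypothetical site-level RD estimates (effect on credits earned) and
# standard errors from three IHEs; all values are illustrative.
site_estimates = [(1.2, 0.4), (1.8, 0.5), (0.9, 0.6)]

# Fixed-effect (inverse-variance) pooling: weight each site's estimate
# by its precision, so more precise sites contribute more.
weights = [1.0 / se ** 2 for _, se in site_estimates]
pooled = sum(w * est for (est, _), w in zip(site_estimates, weights)) / sum(weights)
pooled_se = math.sqrt(1.0 / sum(weights))
print(f"Pooled effect: {pooled:.2f} credits (SE {pooled_se:.2f})")
```

Because institutions varied in how funds were disbursed, a consortium might instead prefer a random-effects model that allows true effects to differ across sites; the fixed-effect version above is the simplest starting point.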