Increasing AHS response rates
What was the challenge?
The American Housing Survey (AHS) provides multiple indicators of housing quality and supply in the United States. It is administered by the Census Bureau on behalf of the U.S. Department of Housing and Urban Development. The survey is fielded every two years, and data collection continues until a sufficiently high response rate (around 80 percent) is achieved, in order to reduce the impact of non-response bias. The survey must be completed in person, and an enumerator often must visit a household multiple times, increasing the cost of administering the survey.
What was the program change?
An advance letter is sent to all housing units selected for the survey shortly before the field period begins, and field staff then follow up in person at selected units. The Census Bureau has identified a number of barriers to survey response, including the hassle of scheduling and completing an in-person survey, a lack of understanding of why completing the AHS is relevant to an individual, and trust and data privacy concerns. To reduce hassle and address trust, the evaluation tested several versions of an additional advance letter incorporating insights from behavioral science. These versions variously included an implementation-planning tear-off tool, a simple link to verify an enumerator's employment, simplified language describing how the data are used to plan locally relevant services such as schools and roads, a social norm stating that more than 86 percent of households respond, language addressing privacy concerns, and respected organizations as additional messengers.
How did the evaluation work?
A total of 84,879 housing units were randomly assigned to five groups: four groups were each sent one version of a second advance letter, and one group served as a control group that did not receive a second letter. Due to implementation timelines, letters were sent at the beginning of August 2017, about one month into the field period (after enumerators had already begun to visit some addresses).
What was the impact?
There were no significant differences between any of the experimental groups and the control group in the overall response rate, the average number of in-person contact attempts, or the average number of days cases were in the field. However, there is some indication that the second advance mailer containing simplified language describing the purpose of the AHS, a short link to verify an enumerator's employment, and an implementation-planning tool may have led to slightly quicker responses.