
Target a priority outcome

The American public spends approximately 11.5 billion hours per year filling out federal forms.1 Form complexity can result in forms not being completed or submitted, and errors on forms can cause processing delays and affect whether a form is accepted, which can have far-reaching consequences. Complex federal forms also place a large burden on the government agencies responsible for processing responses, investigating errors, and verifying information. This evaluation takes an incremental step toward building evidence and capacity for testing digital federal forms. While this study tests only one intervention, a central goal of the study was to serve as a "proof of concept" for building evidence to improve federal form design in the future.

Translate behavioral insights

The burdens associated with completing a form can be reduced by providing clear instructions and using effective formatting.2 However, many respondents do not carefully read or follow instructions about how to complete a form, and when written instructions conflict with examples, respondents consistently use the example information and disregard the instructions.3,4 Embedding instructions alongside questions may make them more accessible (physically closer), more obviously relevant, and less demanding to process (shorter blocks of text). In contrast, embedding instructions alongside questions may also mean that people start answering questions without having read the full set of instructions. Providing instructions alongside questions may also entail using a link that pops up those instructions, which may further decrease the likelihood that instructions are read, since accessing them would require additional effort. Rigorous evidence on how the positioning of instructions on a form affects responses is limited, yet the question is relevant to most, if not all, federal forms.

Embed evaluation

We implemented a randomized controlled trial (RCT) to build evidence on the magnitude and direction of the effect of instruction positioning in federal forms. We evaluated two versions of a brief digital form that included questions typical of federal forms. One version placed the form instructions on the first page, while the other embedded the form instructions within each page of the form.

To generate a sample of users, we conducted outreach among the general public and federal employees. Outreach included tweets, email newsletters sent to federal employees (i.e., GSAToday) and the general public (i.e., USAGov), listservs, a pop-up on forms.gov, and a posting on challenge.gov. Individuals voluntarily chose to participate by clicking a link to the form included in the tweet, on the site, or in the email.5 Between July 19 and August 19, 2022, there were 3,203 instances in which an individual clicked on a link to fill out the form. An online platform randomly assigned individuals who clicked on the form link to either the form with embedded instructions or the form with instructions at the front.6
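
The abstract does not describe the platform's assignment mechanism in detail, but note 6 indicates that repeat visits from the same browser were routed to the same version rather than re-randomized. As a minimal sketch, the Python snippet below shows one way a browser-persistent 50/50 assignment could be implemented; the browser_id input, the version labels, and the hash-based approach are illustrative assumptions, not the platform's actual logic.

    # Illustrative only: one way to assign a browser to a form version
    # deterministically, so repeat visits see the same version (cf. note 6).
    # The platform's actual mechanism is not documented in this abstract.
    import hashlib

    FORM_VERSIONS = ("instructions_at_front", "instructions_embedded")

    def assign_version(browser_id: str) -> str:
        """Map a stable browser identifier (e.g., a cookie value) to a version."""
        digest = hashlib.sha256(browser_id.encode("utf-8")).digest()
        return FORM_VERSIONS[digest[0] % 2]

    # Repeat visits from the same browser always receive the same version.
    assert assign_version("example-cookie-123") == assign_version("example-cookie-123")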

Analyze using existing data

The primary outcome of the evaluation is form submission, defined as starting and submitting the form.7 The online platform identified the total number of individuals randomized to each of the two forms, and form response data were used to measure form submission.8

Results

The results offer initial evidence that form completion is affected by instruction placement.9 Across both versions of the form, almost two-thirds of users who started the form did not submit it. Individuals randomly assigned to the form with embedded instructions were 3.2 percentage points (36.2% versus 33.0%) more likely to submit (p=0.054, 95% CI [-0.001, 0.065]) than individuals assigned to the form with instructions at the front, a result significant at the p<0.10 level.
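
As a rough check on these figures, the calculation below reproduces the reported estimate from a simple two-proportion comparison. The arm sizes are approximations inferred from the abstract (3,203 total starts, with 93 more in the embedded arm), and the prespecified analysis may have used a different estimator, so treat this as illustrative rather than a replication.

    # Illustrative two-proportion comparison using approximate arm sizes
    # inferred from the abstract; the prespecified analysis may differ.
    import math

    n_embedded, n_front = 1648, 1555      # approximate arm sizes
    p_embedded, p_front = 0.362, 0.330    # reported submission rates

    diff = p_embedded - p_front           # about 0.032 (3.2 percentage points)
    se = math.sqrt(p_embedded * (1 - p_embedded) / n_embedded
                   + p_front * (1 - p_front) / n_front)
    z = diff / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided
    ci_low, ci_high = diff - 1.96 * se, diff + 1.96 * se

    print(f"difference = {diff:.3f}, p = {p_value:.3f}")
    print(f"95% CI = ({ci_low:.3f}, {ci_high:.3f})")    # roughly (-0.001, 0.065)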

The pilot study also showed that conducting an A/B test of a typical federal form with a sample of voluntary users recruited from the general public is possible. Over 3,000 users started the form and 1,110 users submitted it in a one-month period. More control over the assignment mechanism and access to more complete outcome data would be beneficial in future studies. Over the study period, 93 more users (3% of the total sample) were assigned to the version of the form with embedded instructions than to the version with instructions at the front. One possible contributor to this imbalance was internal testing of the form design and mechanics; however, this cannot be confirmed without additional outcome data and direct observation of the assignment process.
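
One way to gauge how unusual that imbalance is, assuming the intended assignment was a fair 50/50 split, is a simple check on the arm sizes inferred from the abstract. This is not part of the study's prespecified analysis; it is a sketch of the kind of diagnostic a future study with fuller data could run, using a normal approximation to the binomial.

    # Illustrative allocation-balance check under an assumed 50/50 design;
    # the arm size is inferred from the abstract (3,203 starts, 93 more embedded).
    import math

    n_total = 3203
    n_embedded = (n_total + 93) // 2                  # about 1,648
    expected = n_total / 2
    se = math.sqrt(n_total * 0.25)                    # binomial SD under 50/50
    z = (n_embedded - expected) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))        # two-sided
    print(f"embedded arm: {n_embedded} of {n_total}, z = {z:.2f}, p = {p_value:.2f}")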

Build evidence

This evaluation, a first of its kind in the federal government, brought together multiple GSA offices and the American public to learn about the feasibility of incorporating A/B testing into federal forms and to show that form design matters for form completion. The empirical findings indicate that where instructions are placed affects form submission, a substantive finding on the most fundamental outcome of filling out a form. Agencies could help substantiate this finding by continuing to evaluate the effects of instruction placement on high-priority forms.

This pilot on the feasibility of incorporating A/B testing uncovered opportunities (e.g., public enthusiasm for improving federal forms) and challenges (e.g., limited control over the randomization process and limited access to desired outcome data, such as time to completion and incomplete responses). This evaluation showed that federal forms are ripe for improvement and that evidence-building activities informing form design could reduce burdens on the public. While interest in improving federal forms is high among both the federal government and the public, more work is needed to improve the testing infrastructure for federal forms in order to build and apply rigorous evidence that improves form design.

Notes:

  1. Office of Management and Budget, Information Collection Budget of the United States, FY 2018.
  2. Barnard, P. J., P. Wright, and P. Wilcox. “Effects of response instructions and question style on the ease of completing forms.” Journal of Occupational Psychology 52, no. 3 (1979): 209-226. (Experiment 1a)
  3. Ibid. (Experiments 1c, 3b)
  4. LeFevre, J. A., and P. Dixon. "Do written instructions need examples?" Cognition and Instruction 3, no. 1 (1986): 1-30.
  5. When users clicked on the link to start the form, they were taken to a page on the OES website that provided additional details about the form and a link to start it.
  6. We cannot determine whether these are unique individuals or whether some people participated multiple times. If an individual participated multiple times in the same browser, they would be routed to the same form rather than re-randomized.
  7. A secondary analysis, not reported here, examines form completeness, defined as the percent of answers that were blank or listed as “undefined” among users who submitted the form.
  8. Additionally, data from online platforms were analyzed to document sample recruitment and the effectiveness of different outreach strategies.
  9. Unless noted otherwise, all of the analysis reported in this abstract was prespecified in an analysis plan, which can be found on GitHub.