Choice Architecture and the Common Application: Ordering Effects in Online Settings

The proliferation of online college applications, paired with application aggregation services such as the Common Application (CA), dramatically increases the scope for choice architects to affect application behavior. Choice architects shape the information students use to decide where to apply. I provide evidence from two settings in which students are affected by the way choices are ordered. First, when local schools join the CA, the amount of additional college search undertaken by ACT-takers depends on the options presented on the local school's website. If the CA is the only way to apply to the local school, or if the CA option is listed first, applicants send more than two additional ACT score reports (a 44% increase in volume). Applicants do not increase score report volume when the local school lists the CA option second. In the second part of the paper, I show that CA usage induced by a local school joining the CA makes ACT-takers more likely to send score reports to schools higher in the alphabet. This behavior is driven by the CA website's search functionality, which returns alphabetized lists. These results imply that college applicant behavior deviates from rational choice theory in systematic ways. They also indicate that applicant decision-making is sensitive not only to the amount of information provided but also to how that information is presented.

Effects of the Common Application on Individual Application and Enrollment Decisions

College applicants are highly sensitive to small changes in the costs incurred during the application process. The Common Application (CA) reduces the non-monetary costs of additional applications by allowing students to send the same application to any member school with the click of a button. Using survey data from the National Center for Education Statistics and administrative ACT microdata, I show that applicants induced to use the CA by membership changes at local public universities send 1.4 additional applications (a 39% increase) and 2.3 additional ACT score reports (a 47% increase). CA usage has no effect on the probability of enrolling in any four-year institution but increases the probability of enrolling in a CA member school by 15 percentage points. The effects of CA usage are strongest for low-income applicants, who send 2.3 additional applications and enroll in private and out-of-state schools at higher rates. Further estimates, while imprecise, suggest that induced CA users are more likely to enroll in more selective, higher-quality schools. The estimated changes in enrollment decisions are too large to be explained solely by small decreases in the time costs of additional applications; rather, college-going students' enrollment decisions respond to changes in application methods for behavioral reasons. Applicants are oversensitive to small, short-run costs relative to the long-run investment of college attendance, and they lack information about costs of attendance. Centralized application systems that lower application costs may therefore be beneficial: applicants respond by expanding their college search, and universities can provide information directly through acceptance letters.

One Email to Students: Can a Light-Touch Intervention Make a Difference? (with Travis Williams)

Poor performance in introductory courses and a lack of individualized assistance may contribute to college non-completion. This research aims to identify the effects of increased, personalized instructor feedback on performance in introductory college courses. We use an experimental design similar to Carrell and Kurlaender (2020), in which poorly performing students in large lecture courses are randomized to receive additional emailed feedback on their course performance, along with encouragement and concrete suggestions for improvement. Half of the treated students receive feedback from their teaching assistant, while the other half receive feedback from their instructor, allowing us to compare the efficacy of the two feedback sources. We test the intervention's effects on overall course performance, frequency of help-seeking, and perceptions of instructor and teaching assistant quality. The estimated effects have implications for the role of feedback in student success and for whether feedback is more effective coming from instructors or from teaching assistants. Because the intervention is low-cost, it could be easily implemented elsewhere and scaled rapidly.