There is a growing interest in the study of elite behavior in political science (Kertzer and Renshon, 2022). Scholars of political representation, party politics, and legislative politics increasingly rely on surveys of elected politicians (e.g., Pereira, 2021; Sheffer et al., 2018) and bureaucrats (e.g., Bækgaard et al., 2019; Brierley, 2020; Heinzel et al., 2025) to test underlying assumptions of existing theories or to gather new insights into the motivations, attitudes, and behavior of policymakers. Elite surveys have become more common with the blurring lines between the study of political behavior and institutions in political science, the growing interest in subnational politics, and the normalization of survey experimental designs (Butler and Pereira, 2025; Walgrave and Joly, 2018).
A key challenge of conducting surveys of political elites, as well as other hard-to-reach populations, is recruitment (Bailer, 2014; López, 2023; Maestas, Neeley, and Richardson, 2003). Quantitative surveys of political elites increasingly rely on non-incentivized online surveys, including survey experiments (e.g., Butler et al., 2017; Dynes, Hassell, and Miles, 2023; Mayne and Peters, 2023; Lee, 2022; Lucas et al., 2024; Pereira et al., 2025). Response rates in these online surveys are typically low and depend on the population targeted, the geographical scope, and the timing of fieldwork (Miller, 2022; Krause et al., 2024; Vis and Stolwijk, 2021; Walgrave and Joly, 2018). Low response rates can result in biased samples and underpowered designs, threatening the validity of descriptive and experimental scholarship. Some recent scholarship examines how incentives in political elite surveys can affect sample sizes and survey responses (Butler and Pereira, 2018; Heinzel et al., 2025). Sending formal invitation letters has a long history in elite studies and remains an established practice when soliciting face-to-face interviews with national politicians (Walgrave and Joly, 2018). Formal letters add to the credibility of the research, which seems particularly needed at a time when unsolicited emails are increasingly met with skepticism: they either end up in the spam folder or go unanswered because respondents hesitate to click on the included link for cybersecurity reasons (Krause et al., 2024). Finally, earlier research on mass citizen surveys suggests that formal invitation letters, combined with email reminders, are effective in boosting response rates, although the effects are modest (Porter and Whitcomb, 2007; Kaplowitz et al., 2012).
In this research note, we assess whether recruiting participants to an online survey via a postal letter increases response rates among local elected officials. We expect formal invitation letters to increase response rates through a number of complementary mechanisms. Letters signal the credibility of the research endeavor and circumvent the spam and security concerns specific to emails. Moreover, given that older generations are overrepresented in local offices, letter invitations may better fit the daily routines of the population studied.
Method
We embedded this study in the European Panel of Local Officials, a survey of mayors and councilors in six Western countries fielded between December 2022 and February 2023. During recruitment, we randomly assigned officials in Germany and the United Kingdom to receive formal invitation letters in addition to our standard recruitment strategy via email. The letter, identical in content to the email, described the purpose of the study and included a URL as well as a QR code for direct access to the online survey. Finally, the letter was signed by the two project leaders and included their university affiliations. See Figure A1 for an example of the letter.
Since the letters were randomly assigned, differences in response rates across groups provide causal estimates of the effect of sending invitation letters on response rates. Randomization was performed within each country; hence, the pooled analyses include country fixed effects.
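To illustrate the estimation approach, the following is a minimal sketch in Python (pandas and statsmodels), not the exact replication code: country-specific effects as simple differences in means and the pooled effect from a linear probability model with country fixed effects. The file name and variable names (responded, letter, country) are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one row per sampled local official.
# 'responded' = 1 if the official completed the online survey,
# 'letter'    = 1 if randomly assigned to also receive a postal invitation,
# 'country'   = "DE" or "UK" (the randomization block).
df = pd.read_csv("recruitment_experiment.csv")

# Country-specific effects: difference in response rates within each country.
for country, sub in df.groupby("country"):
    effect = (sub.loc[sub.letter == 1, "responded"].mean()
              - sub.loc[sub.letter == 0, "responded"].mean())
    print(country, round(effect, 3))

# Pooled effect: linear probability model with country fixed effects
# and heteroskedasticity-robust standard errors.
pooled = smf.ols("responded ~ letter + C(country)", data=df).fit(cov_type="HC2")
print(pooled.params["letter"], pooled.bse["letter"])
```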
Results
Figure 1 presents the effects of different recruitment modes on response rates to the online survey. We find that invitation letters produced a significant boost in response rates in both countries. Response rates increased by 4.6 percentage points in Germany (s.e. = 0.01) and by 2.1 percentage points in the United Kingdom (s.e. = 0.004). Letters were significantly more effective at increasing response rates in Germany (difference in differences = 0.025; p-value = 0.01), despite the lower baseline response rate in the UK. This suggests important heterogeneity in the effectiveness of this recruitment strategy. When pooling results across countries, response rates improved by 2.8 percentage points (s.e. = 0.004).
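For transparency, the difference-in-differences estimate is simply the gap between the two country-specific effects reported above:

$$\Delta_{\text{DE--UK}} = 0.046 - 0.021 = 0.025$$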

Figure 1. Response rates in local politicians' survey by recruitment mode. Note: Bars describe response rates by randomly assigned recruitment mode in Germany, the United Kingdom, and in both countries combined. Difference-in-means estimates from linear models (with country FEs in the pooled model) are reported over the bars for each group: **(p < 0.001), *(p < 0.01).
But did the different recruitment modes affect the type of officials recruited, i.e., the characteristics of those taking the survey? Table 1 shows descriptive differences between subjects who completed the survey (post-treatment) and were recruited either by email or by mail. We find that representatives recruited via letters tend to come from slightly smaller municipalities and are more likely to be men without a university education. There are no significant differences for the remaining observables. We also find no differences in response quality. Table 2 shows the effects of sending letter invitations on the quality of responses among those who participated in the survey. We find no relationship between recruitment mode and survey progress (column 1), whether officials completed the survey (column 2), or time spent on the survey (column 3).
Table 1. Descriptives of survey respondents by recruitment mode

Table 2. The effects of recruitment mode on response quality

Note: *p < 0.01; **p < 0.001.
Cost-effectiveness of letter invitations
We have shown that invitation letters increase response rates of local politicians in Germany and the United Kingdom. However, this recruitment mode is costly. We paid 83 euro cents per letter (printing and postage). Table 3 shows the number of letters sent and responses obtained by country. The cost per response in Germany was around 16 euros and increased to 37 euros in the United Kingdom. Using within-country postal services or targeting specific subpopulations of politicians that are more responsive to letter invitations could improve the cost-effectiveness of this recruitment mode.
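As a back-of-the-envelope illustration (not the exact calculation behind Table 3), the cost per completed survey can be approximated as the total letter cost divided by the number of completed surveys obtained via the letter; the counts below are hypothetical placeholders, not the actual figures.

```python
# Hypothetical illustration of the cost-per-response calculation.
# Counts are placeholders, not the actual figures from Table 3.
COST_PER_LETTER = 0.83  # euros (printing and postage)

samples = {
    # country: (letters sent, completed surveys in the letter condition)
    "Germany": (2000, 105),
    "United Kingdom": (2000, 45),
}

for country, (letters_sent, completions) in samples.items():
    total_cost = letters_sent * COST_PER_LETTER
    cost_per_response = total_cost / completions
    print(f"{country}: {cost_per_response:.2f} euros per completed survey")
```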
Table 3. Costs per completed survey by mail

Discussion
We find that sending letter invitations in addition to emails increases response rates in online surveys of local elites in Germany and the United Kingdom. Effect sizes are more than twice as large among German local officials. This suggests important heterogeneity in the effectiveness of this recruitment mode. Further research should explore the conditions that make different recruitment modes more effective. The effects of recruitment mode might differ if another elite population is targeted (Heinzel et al., 2025), if the country selection is different, or if additional recruitment methods such as phone calls are added to the mix (Walgrave and Joly, 2018). We also find mild differences in the socioeconomic composition of respondents recruited via mail or email, and no differences in response quality. Overall, the results provide a promising path to improve response rates in elite surveys, although the costs are non-negligible. For experimental research, this additional effort can meaningfully increase statistical power.
Supplementary material
The supplementary material for this article can be found at https://doi.org/10.1017/xps.2025.10004
Data availability
The data, code, and any additional materials required to replicate all analyses in this article are available in the Journal of Experimental Political Science Dataverse within the Harvard Dataverse Network, at: https://doi.org/10.7910/DVN/NC8DIP (Giger and Pereira, 2025). The analyses reported in the manuscript follow all the applicable reporting standards recommended by the APSA Organized Section on Experimental Research.
Competing interests
We report no conflicts of interest.
Ethics statement
This study received ethical approval from the University of Southern California (UP-22-00464) and the University of Geneva (CUREG-2022-08-83). The experimental design was not preregistered. This research adheres to APSA’s Principles and Guidance for Human Subjects Research.