
Invitation Letters Increase Response Rates in Elite Surveys: Evidence from Germany and the United Kingdom

Published online by Cambridge University Press:  10 June 2025

Nathalie Giger
Affiliation:
University of Geneva, Geneva, Switzerland
Miguel M. Pereira*
Affiliation:
London School of Economics, London, UK
Corresponding author: Miguel M. Pereira; Email: m.m.pereira@lse.ac.uk

Abstract

A key challenge when surveying political elites is recruitment. Low response rates can lead to biased samples and underpowered designs, threatening the validity of descriptive and experimental scholarship. In a randomized controlled trial, we test the effects of sending postal invitations in a large survey of local elected officials. We find that German and UK local politicians are more likely to complete the survey if invited by postal mail, rather than simply by email. Recruitment mode does not impact the quality of responses but shapes the population of local officials recruited. Officials invited via postal letter were more likely to come from smaller municipalities and less likely to have a college degree. Costs per response are relatively high but can be reduced as we learn more about selection into elite surveys.

Information

Type
Short Report
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of American Political Science Association

There is growing interest in the study of elite behavior in political science (Kertzer and Renshon, 2022). Scholars of political representation, party politics, and legislative politics increasingly rely on surveys of elected politicians (e.g., Pereira, 2021; Sheffer et al., 2018) and bureaucrats (e.g., Bækgaard et al., 2019; Brierley, 2020; Heinzel et al., 2025) to test underlying assumptions of existing theories or to gather new insights into the motivations, attitudes, and behavior of policymakers. Elite surveys have become more common with the blurring lines between the study of political behavior and institutions in political science, the growing interest in subnational politics, and the normalization of survey experimental designs (Butler and Pereira, 2025; Walgrave and Joly, 2018).

A key challenge of conducting surveys of political elites, as well as other hard-to-reach populations, is recruitment (Bailer, 2014; López, 2023; Maestas, Neeley, and Richardson, 2003). Quantitative surveys of political elites increasingly rely on non-incentivized online surveys, including survey experiments (e.g., Butler et al., 2017; Dynes, Hassell, and Miles, 2023; Mayne and Peters, 2023; Lee, 2022; Lucas et al., 2024; Pereira et al., 2025). Response rates in these online surveys are typically low and depend on the population targeted, the geographical scope, and the timing of fieldwork (Miller, 2022; Krause et al., 2024; Vis and Stolwijk, 2021; Walgrave and Joly, 2018). Low response rates can result in biased samples and underpowered designs, threatening the validity of descriptive and experimental scholarship. Some recent scholarship examines how incentives in political elite surveys can impact sample sizes and survey responses (Butler and Pereira, 2018; Heinzel et al., 2025). Sending formal invitation letters has a long history in elite studies and is still an established practice when soliciting face-to-face interviews with national politicians (Walgrave and Joly, 2018).
Formal letters add to the credibility of the research, which seems particularly important at a time when unsolicited emails are increasingly met with skepticism: they either end up in the spam folder, or respondents do not dare click the included link because of cybersecurity concerns (Krause et al., 2024). Finally, earlier research on mass surveys of citizens suggests that formal invitation letters, together with email reminders, are effective in boosting response rates, although the effects are modest (Porter and Whitcomb, 2007; Kaplowitz et al., 2012).

In this research note, we assess whether recruiting participants to an online survey via a postal letter increases response rates among local elected officials. We expect formal invitation letters to increase response rates through several complementary mechanisms. Letters signal the credibility of the research endeavor and circumvent the spam and security concerns specific to emails. Moreover, given that older generations are overrepresented in local offices, letter invitations may fit better with the daily routines of the population studied.

Method

We embedded this study in the European Panel of Local Officials, a survey of mayors and councilors in six Western countries fielded between December 2022 and February 2023. During recruitment, we randomly assigned officials in Germany and the United Kingdom to receive formal invitation letters in addition to our standard recruitment strategy via email. The letter, identical in content to the email, described the purpose of the study and included a URL as well as a QR code for direct access to the online survey. Finally, the letter was signed by the two project leaders and included their university affiliations.[1] See Figure A1 for an example of the letter.

Since the letters were randomly assigned, differences in response rates across groups provide causal estimates of the effects of sending invitation letters on response rates.[2] Randomization was performed within each country; hence, the pooled analyses include country fixed effects.
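The pooled estimation strategy can be sketched on simulated data. This is a minimal illustration, not the study's analysis: the sample size, baseline response rates, and the 3-point letter effect below are assumptions for the example. A linear model with a treatment indicator and a country dummy (the country fixed effect) recovers the pooled letter effect.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: two countries with different baseline response
# rates and a common letter effect (all numbers are illustrative).
n = 20000
country = rng.integers(0, 2, n)            # 0 and 1 label the two countries
letter = rng.integers(0, 2, n)             # letter invitation, randomized within country
baseline = np.where(country == 0, 0.20, 0.10)
p_respond = baseline + 0.03 * letter       # assumed 3-point treatment effect
responded = (rng.random(n) < p_respond).astype(float)

# OLS with intercept, treatment indicator, and country fixed effect.
X = np.column_stack([np.ones(n), letter, country])
beta, *_ = np.linalg.lstsq(X, responded, rcond=None)
print(round(float(beta[1]), 3))  # pooled letter effect, close to the assumed 0.03
```

Because assignment is random within country, the coefficient on the letter indicator is the difference-in-means estimate, and the country dummy absorbs the baseline-rate difference between the two samples.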

Results

Figure 1 presents the effects of different recruitment modes on response rates to the online survey. We find that invitation letters produced a significant boost in response rates in both countries.[3] Response rates increased by 4.6 percentage points in Germany (s.e. = 0.01) and by 2.1 points in the United Kingdom (s.e. = 0.004). Letters were significantly more effective at increasing response rates in Germany (difference-in-differences = 0.025; p-value = 0.01), despite the lower baseline response rate in the UK. This suggests important heterogeneity in the effectiveness of this recruitment strategy. When pooling results across countries, response rates improved by 2.8 percentage points (s.e. = 0.004).

Figure 1. Response rates in local politicians’ survey by recruitment mode. Note: Bars describe response rates by randomly assigned recruitment mode in Germany, the United Kingdom, and both countries combined. Difference-in-means estimates from linear models (with country FEs in the pooled model) are reported over the bars for each group (**p < 0.001; *p < 0.01).

But did the different recruitment modes impact the type of officials recruited, that is, the characteristics of those taking the survey? Table 1 shows descriptive differences between subjects who completed the survey (post-treatment) and were recruited either by email or by mail. We find that representatives recruited via letters tend to come from slightly smaller municipalities and are more likely to be men without a university education. There are no significant differences in the remaining observables. We also find no differences in response quality. Table 2 shows the effects of sending letter invitations on the quality of responses among those who participated in the survey. We find no relationship between recruitment mode and survey progress (column 1), survey completion (column 2), or time spent in the survey (column 3).

Table 1. Descriptives of survey respondents by recruitment mode

Table 2. The effects of recruitment mode on response quality

Note: *p < 0.01; **p < 0.001.

Cost-effectiveness of letter invitations

We have shown that invitation letters increase response rates of local politicians in Germany and the United Kingdom. However, this recruitment mode is costly. We paid 83 Euro cents per letter (printing and postage).[4] Table 3 shows the number of letters sent and responses obtained by country. The cost per response was around 16 Euros in Germany and rose to 37 Euros in the United Kingdom. Using within-country postal services or targeting specific subpopulations of politicians that are more responsive to letter invitations could improve the cost-effectiveness of this recruitment mode.
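The cost-per-response arithmetic above can be sketched in a few lines. The 0.83 EUR unit cost is from the study; the response rates plugged in below are assumed round numbers chosen only to reproduce the reported order of magnitude, not the study's exact figures.

```python
# Cost per completed survey: letters are paid for every recipient,
# but only a fraction respond, so cost per response = unit cost / response rate.
UNIT_COST_EUR = 0.83  # per letter, printing and postage (from the study)

def cost_per_response(unit_cost_eur: float, response_rate: float) -> float:
    """EUR spent on letters per completed survey."""
    return unit_cost_eur / response_rate

# Assumed illustrative response rates of 5.2% and 2.2%:
print(round(cost_per_response(UNIT_COST_EUR, 0.052), 2))  # ~16 EUR per response
print(round(cost_per_response(UNIT_COST_EUR, 0.022), 2))  # ~38 EUR per response
```

The same formula shows why targeting more responsive subpopulations helps: doubling the response rate halves the cost per completed survey at a fixed unit cost.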

Table 3. Costs per completed survey by mail

Discussion

We find that sending letter invitations in addition to emails increases response rates in online surveys of local elites in Germany and the United Kingdom. Effect sizes are more than twice as large among German local officials, which suggests important heterogeneity in the effectiveness of this recruitment mode. Further research should explore the conditions that make different recruitment modes more effective. The effects of recruitment mode might differ if another elite population is targeted (Heinzel et al., 2025), if the country selection is different, or if additional recruitment methods such as phone calls are added to the mix (Walgrave and Joly, 2018). We also find mild differences in the socioeconomic composition of respondents recruited via mail or email, and no differences in response quality. Overall, the results provide a promising path to improving response rates in elite surveys, although the costs are non-negligible. For experimental research, this additional effort can meaningfully increase statistical power.
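The link between response rates and statistical power can be made concrete with a back-of-the-envelope minimum detectable effect (MDE) calculation. This is a generic two-arm power formula with standard rounded critical values (1.96 for a two-sided 5% test, 0.84 for 80% power); the sample sizes and outcome standard deviation below are assumptions for illustration, not figures from the study.

```python
import math

def mde(n_per_arm: int, sd: float = 0.5, z_alpha: float = 1.96, z_power: float = 0.84) -> float:
    """Minimum detectable effect of a two-arm experiment with equal arms:
    MDE = (z_alpha + z_power) * sqrt(2) * sd / sqrt(n_per_arm)."""
    return (z_alpha + z_power) * math.sqrt(2) * sd / math.sqrt(n_per_arm)

# A higher response rate means more recruited subjects per arm,
# which shrinks the smallest effect the experiment can reliably detect.
print(round(mde(500), 3))  # email-only recruitment (assumed n)
print(round(mde(650), 3))  # ~30% more respondents via letters (assumed n)
```

Even a modest boost in recruitment visibly tightens the MDE, which is why a few percentage points of extra response rate can matter for experimental designs.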

Supplementary material

The supplementary material for this article can be found at https://doi.org/10.1017/xps.2025.10004

Data availability

The data, code, and any additional materials required to replicate all analyses in this article are available in the Journal of Experimental Political Science Dataverse within the Harvard Dataverse Network, at: https://doi.org/10.7910/DVN/NC8DIP (Giger and Pereira, Reference Giger and Pereira2025). The analyses reported in the manuscript follow all the applicable reporting standards recommended by the APSA Organized Section on Experimental Research.

Competing interests

We report no conflicts of interest.

Ethics statement

This study received ethical approval from the University of Southern California (UP-22-00464) and the University of Geneva (CUREG-2022-08-83). The experimental design was not preregistered. This research adheres to APSA’s Principles and Guidance for Human Subjects Research.

Footnotes

1 In Germany, this meant that neither signatory was affiliated with a local university, while in the United Kingdom there was a local connection.

2 Table A1, in the Appendix, reports balance tests and reveals no problems in the randomization across a range of demographic and political variables.

3 Subjects who received the letter invitation overwhelmingly used the link or QR code provided in the letter to access the survey, suggesting that the change in recruitment strategy indeed attracted a different set of respondents.

4 We used the same company based in Portugal to send letters across Europe.

References

Bækgaard, Martin, Christensen, Julian, Mondrup Dahlmann, Casper, Mathiasen, Asbjørn and Bjørn Grund Petersen, Niels. 2019. “The Role of Evidence in Politics: Motivated Reasoning and Persuasion among Politicians.” British Journal of Political Science 49(3):1117–40.
Bailer, Stefanie. 2014. “Interviews and Surveys in Legislative Research.” In The Oxford Handbook of Legislative Studies, pp. 167–93.
Brierley, Sarah. 2020. “Unprincipled Principals: Co-Opted Bureaucrats and Corruption in Ghana.” American Journal of Political Science 64(2):209–22.
Butler, Daniel M. and Pereira, Miguel M.. 2018. “Are Donations to Charity an Effective Incentive for Public Officials?” Journal of Experimental Political Science 5(1):68–70.
Butler, Daniel M. and Pereira, Miguel M.. 2025. “Innovations in the Study of Elite Behavior: The Role of Information in Representation and Decision-Making.” In Handbook of Innovations in Political Psychology, pp. 279–318. Edward Elgar Publishing.
Butler, Daniel M., Volden, Craig, Dynes, Adam M. and Shor, Boris. 2017. “Ideology, Learning, and Policy Diffusion: Experimental Evidence.” American Journal of Political Science 61(1):37–49.
Dynes, Adam M., Hassell, Hans J.G. and Miles, Matthew R. 2023. “Personality Traits and Approaches to Political Representation and Responsiveness: An Experiment in Local Government.” Political Behavior 45(4):1791–811.
Giger, Nathalie and Pereira, Miguel M.. 2025. “Replication Data for: ‘Invitation Letters Increase Response Rates in Elite Surveys: Evidence from Germany and the United Kingdom’.” Harvard Dataverse.
Heinzel, Mirko, Weaver, Catherine and Briggs, Ryan. 2025. “Incentivizing Responses in International Organization Elite Surveys: Evidence from the World Bank.” Journal of Experimental Political Science 12(1):17–26.
Kaplowitz, Michael D., Lupi, Frank, Couper, Mick P. and Thorp, Laurie. 2012. “The Effect of Invitation Design on Web Survey Response Rates.” Social Science Computer Review 30(3):339–49.
Kertzer, Joshua D. and Renshon, Jonathan. 2022. “Experiments and Surveys on Political Elites.” Annual Review of Political Science 25(1):529–50.
Krause, Rachel M., Fatemi, S. Mohsen, Nguyen Long, Le Anh, Arnold, Gwen and Hofmeyer, Sarah L. 2024. “What is the Future of Survey-based Data Collection for Local Government Research? Trends, Strategies, and Recommendations.” Urban Affairs Review 60(3):1094–115.
Lee, Nathan. 2022. “Do Policy Makers Listen to Experts? Evidence from a National Survey of Local and State Policy Makers.” American Political Science Review 116(2):677–88.
López, Matias. 2023. “The Effect of Sampling Mode on Response Rate and Bias in Elite Surveys.” Quality & Quantity 57(2):1303–19.
Lucas, Jack, Sheffer, Lior and John Loewen, Peter. 2024. “Pathways to Substantive Representation: Policy Congruence and Policy Knowledge Among Canadian Local Politicians.” Political Behavior 1–20. https://doi.org/10.1007/s11109-024-09982-2
Maestas, Cherie, Neeley, Grant W. and Richardson, Lilliard E.. 2003. “The State of Surveying Legislators: Dilemmas and Suggestions.” State Politics & Policy Quarterly 3(1):90–108.
Mayne, Quinton and Peters, Yvette. 2023. “Where You Sit is Where You Stand: Education-based Descriptive Representation and Perceptions of Democratic Quality.” West European Politics 46(3):526–49.
Miller, David R. 2022. “On Whose Door to Knock? Organized Interests’ Strategic Pursuit of Access to Members of Congress.” Legislative Studies Quarterly 47(1):157–92.
Pereira, Miguel M. 2021. “Understanding and Reducing Biases in Elite Beliefs about the Electorate.” American Political Science Review 115(4):1308–24.
Pereira, Miguel M., Coroado, Susana, Sousa, Luís de and Magalhães, Pedro C. 2025. “Politicians Support (and Voters Reward) Intra-Party Reforms to Promote Transparency.” Party Politics 31(1):40–54.
Porter, Stephen R. and Whitcomb, Michael E. 2007. “Mixed-Mode Contacts in Web Surveys: Paper is not Necessarily Better.” Public Opinion Quarterly 71(4):635–48.
Sheffer, Lior, John Loewen, Peter, Soroka, Stuart, Walgrave, Stefaan and Sheafer, Tamir. 2018. “Nonrepresentative Representatives: An Experimental Study of the Decision Making of Elected Politicians.” American Political Science Review 112(2):302–21.
Vis, Barbara and Stolwijk, Sjoerd. 2021. “Conducting Quantitative Studies with the Participation of Political Elites: Best Practices for Designing the Study and Soliciting the Participation of Political Elites.” Quality & Quantity 55(4):1281–317.
Walgrave, Stefaan and Joly, Jeroen K.. 2018. “Surveying Individual Political Elites: A Comparative Three-Country Study.” Quality & Quantity 52(5):2221–37.