Aneesh Rai

  • Doctoral Candidate

Contact Information

  • Office Address:

    532.1 Jon M. Huntsman Hall
    3730 Walnut St.
    Philadelphia, PA 19104

Research Interests: Diversity, Discrimination, Behavior Change

Links: CV, Personal Website


Aneesh Rai is a fifth-year PhD student in the Decision Processes group. Prior to his doctoral studies at Wharton, he graduated summa cum laude from Princeton University with a BA in Psychology, and minors in Computer Science and Cognitive Science.

His research is primarily focused on using insights from the judgment and decision-making literature to better understand what drives decisions to diversify organizations. Within this space, he examines how people’s perceptions of group diversity affect personnel selection decisions and influence how diverse organizations become. He also explores how the salience of diversity influences people’s decisions, and leverages this knowledge to design interventions for organizations to increase their gender and racial diversity. Finally, his secondary research interest is using large-scale field experiments to test interventions to promote positive behavior change in organizations.



  • Aneesh Rai, Marissa A. Sharif, Edward H. Chang, Katherine L. Milkman, Angela Duckworth (2022), A Field Experiment on Subgoal Framing to Boost Volunteering: The Trade-Off Between Goal Granularity and Flexibility, Journal of Applied Psychology.

    Abstract: Research suggests that breaking overarching goals into more granular subgoals is beneficial for goal progress. However, making goals more granular often involves reducing the flexibility provided to complete them, and recent work shows that flexibility can also be beneficial for goal pursuit. We examine this trade-off between granularity and flexibility in subgoals in a preregistered, large-scale field experiment (N = 9,108) conducted over several months with volunteers at a national crisis counseling organization. A preregistered vignette pilot study (N = 900) suggests that the subgoal framing tested in the field could benefit goal seekers by bolstering their self-efficacy and goal commitment, and by discouraging procrastination. Our field experiment finds that reframing an overarching goal of 200 hr of volunteering into more granular subgoals (either 4 hr of volunteering every week or 8 hr every 2 weeks) increased hours volunteered by 8% over a 12-week period. Further, increasing subgoal flexibility by breaking an annual 200-hr volunteering goal into a subgoal of volunteering 8 hr every 2 weeks, rather than 4 hr every week, led to more durable benefits.

  • Erika Kirgios, Aneesh Rai, Edward H. Chang, Katherine L. Milkman (2022), When Seeking Help, Women and Racial/Ethnic Minorities Benefit From Explicitly Stating Their Identity, Nature Human Behaviour, 6, pp. 383-391.

    Abstract: Receiving help can make or break a career, but women and racial/ethnic minorities do not always receive the support they seek. Across two audit experiments—one with politicians and another with students—as well as an online experiment (total n = 5,145), we test whether women and racial/ethnic minorities benefit from explicitly mentioning their demographic identity in requests for help, for example, by including statements like “As a Black woman…” in their communications. We propose that when a help seeker highlights their marginalized identity, it may activate prospective helpers’ motivations to avoid prejudiced reactions and increase their willingness to provide support. Here we show that when women and racial/ethnic minorities explicitly mentioned their demographic identity in help-seeking emails, politicians and students responded 24.4% (7.42 percentage points) and 79.6% (2.73 percentage points) more often, respectively. These findings suggest that deliberately mentioning identity in requests for help can improve outcomes for women and racial/ethnic minorities.

  • Katherine L. Milkman, Dena Gromet, Hung Ho, Joseph S. Kay, Timothy W. Lee, Predrag Pandiloski, Yeji Park, Aneesh Rai, Max Bazerman, John Beshears, Lauri Bonacorsi, Colin Camerer, Edward H. Chang, Gretchen B. Chapman, Robert Cialdini, Hengchen Dai, Lauren Eskreis-Winkler, Ayelet Fishbach, James J. Gross, Samantha Horn, Alexa Hubbard, Steven J. Jones, Dean Karlan, Tim Kautz, Erika Kirgios, Joowon Klusowski, Ariella Kristal, Rahul Ladhania, George Loewenstein, Jens Ludwig, Barbara Mellers, Sendhil Mullainathan, Silvia Saccardo, Jann Spiess, Gaurav Suri, Joachim H. Talloen, Jamie Taxer, Yaacov Trope, Lyle Ungar, Kevin Volpp, Ashley Whillans, Jonathan Zinman, Angela Duckworth (2021), Megastudies Improve the Impact of Applied Behavioural Science, Nature, 600, pp. 478-483.

    Abstract: Policy-makers are increasingly turning to behavioural science for insights about how to improve citizens’ decisions and outcomes. Typically, different scientists test different intervention ideas in different samples using different outcomes over different time intervals. The lack of comparability of such individual investigations limits their potential to inform policy. Here, to address this limitation and accelerate the pace of discovery, we introduce the megastudy—a massive field experiment in which the effects of many different interventions are compared in the same population on the same objectively measured outcome for the same duration. In a megastudy targeting physical exercise among 61,293 members of an American fitness chain, 30 scientists from 15 different US universities worked in small independent teams to design a total of 54 different four-week digital programmes (or interventions) encouraging exercise. We show that 45% of these interventions significantly increased weekly gym visits by 9% to 27%; the top-performing intervention offered microrewards for returning to the gym after a missed workout. Only 8% of interventions induced behaviour change that was significant and measurable after the four-week intervention. Conditioning on the 45% of interventions that increased exercise during the intervention, we detected carry-over effects that were proportionally similar to those measured in previous research. Forecasts by impartial judges failed to predict which interventions would be most effective, underscoring the value of testing many ideas at once and, therefore, the potential for megastudies to improve the evidentiary value of behavioural science.

  • Edward H. Chang, Erika Kirgios, Aneesh Rai, Katherine L. Milkman (2020), The Isolated Choice Effect and Its Implications for Gender Diversity in Organizations, Management Science, 66 (6), pp. 2752-2761.

    Abstract: We highlight a feature of personnel selection decisions that can influence the gender diversity of groups and teams. Specifically, we show that people are less likely to choose candidates whose gender would increase group diversity when making personnel selections in isolation (i.e., when they are responsible for selecting a single group member) than when making collections of choices (i.e., when they are responsible for selecting multiple group members). We call this the isolated choice effect. Across six preregistered experiments (n = 3,509), we demonstrate that the isolated choice effect has important consequences for group diversity. When making sets of hiring and selection decisions (as opposed to making a single hire), people construct more gender-diverse groups. Mediation and moderation studies suggest that people do not attend as much to diversity when making isolated selection choices, which drives this effect.