Celia Gaertig

  • Doctoral Candidate

Contact Information

  • Office Address:

    527.7 Jon M. Huntsman Hall
    3730 Walnut St.
    Philadelphia, PA 19104

Research Interests: consumer behavior, judgment and decision making, decision making under uncertainty

Links: CV, Personal Website

Overview

Celia Gaertig is a fourth-year PhD student in the Decision Processes Group. Prior to her PhD studies at Wharton, she completed a BA in Business (DHBW Karlsruhe) and a BSc in Psychology (University of Freiburg), both in Germany, and worked as a Research Assistant at the Harvard Kennedy School.

In her research, Celia explores the psychology of consumer judgment and decision-making. Much of her work focuses on understanding how consumers make judgments and decisions in situations that involve uncertainty.

How Does Uncertainty Affect Consumers’ Judgments and Decisions?

Several of Celia’s projects focus on understanding how uncertainty affects consumers’ judgments and decisions. Here are some of the research questions that she has investigated:

  • Do people inherently dislike uncertain advice? (Gaertig & Simmons, 2018, Psychological Science)
  • When are uncertain price promotions effective, and why? (Gaertig & Simmons, under review at the Journal of Marketing Research)
  • How do people combine multiple pieces of uncertain advice that contain numerical vs. verbal likelihood statements? (Mislavsky & Gaertig, working paper)

Second Guessing

Celia is also interested in understanding how people make quantitative judgments under uncertainty. Recent research suggests that averaging two estimates from the same person can improve such judgments, an effect dubbed the “wisdom of the inner crowd.” It is not obvious, however, why or when this works. Celia and her advisor, Joe Simmons, are examining the conditions under which making a second guess leads to superior judgments (Gaertig & Simmons, under review at Management Science).

Extremeness Aversion

People’s estimates of uncertain quantities are influenced by values that they previously considered, a phenomenon known as anchoring. Celia and her collaborators demonstrate that extremeness aversion causes people to insufficiently adjust from anchor values (Lewis, Gaertig, & Simmons, forthcoming in Psychological Science).

Additional Research Projects

Celia is also interested in how people judge others based on the advice those others give or on the information available about them. She has explored the following research questions:

  • Do people prefer advisors who provide paternalistic advice over those who provide decisional autonomy? (Kassirer, Gaertig, & Levine, data collection in progress)
  • How does the magnitude of an anger expression influence perceptions of competence and status conferral decisions? (Gaertig, Barasch, Levine, & Schweitzer, working paper)

Research

  • Celia Gaertig and Joseph Simmons (2018), Do People Inherently Dislike Uncertain Advice?, Psychological Science, 29 (4), pp. 504-520.

    Abstract: Research suggests that people prefer confident to uncertain advisors. But do people dislike uncertain advice itself? In eleven studies (N = 4,806), participants forecasted an uncertain event after receiving advice, and then rated the quality of the advice (Studies 1-7, S1-S2) or chose between two advisors (Studies 8-9). Replicating previous research, confident advisors were judged more favorably than advisors who were “not sure.” Importantly, however, participants did not prefer certain over uncertain advice: They did not dislike advisors who expressed uncertainty by providing ranges of outcomes or numerical probabilities, or by saying that one event is “more likely” than another. Additionally, when faced with an explicit choice, participants were more likely to choose an advisor who provided uncertain advice over an advisor who provided certain advice. Our findings suggest that people do not inherently dislike uncertain advice. Advisors benefit from expressing themselves with confidence, but not from communicating false certainty.

  • Joshua Lewis, Celia Gaertig, and Joseph Simmons (Forthcoming), Extremeness Aversion Is a Cause of Anchoring, Psychological Science.

    Abstract: When estimating unknown quantities, people insufficiently adjust from values they have previously considered, a phenomenon known as anchoring. We suggest that anchoring is at least partially caused by a desire to avoid making extreme adjustments. In seven studies (N = 5,279), we found that transparently irrelevant cues of extremeness influenced people’s adjustments from anchors. In Studies 1-6, participants were less likely to adjust beyond a particular amount when that amount was closer to the maximum allowable adjustment. For example, in Study 5, participants were less likely to adjust by at least 6 units when they were allowed to adjust by a maximum of 6 units than by a maximum of 15 units. In Study 7, participants adjusted less after considering whether an outcome would be within a smaller distance of the anchor. These results suggest that anchoring effects may reflect a desire to avoid adjustments that feel too extreme.

  • Celia Gaertig and Joseph Simmons (Under Review), The Psychology of Second Guesses: Implications for the Wisdom of the Inner Crowd.

    Abstract: Prior research suggests that averaging two guesses from the same person can improve quantitative judgments, an effect dubbed the “wisdom of the inner crowd.” In this article, we suggest that this effect hinges on whether people (1) resample their second guess from a distribution similar to the one from which their first guess was sampled (what we call a Resampling Process), or (2) explicitly decide in which direction their first guess had erred before making their second guess (what we call a Choice Process). We report the results from seven studies (N = 5,768) in which we manipulated whether we asked participants to explicitly indicate, right before they made their second guess, whether their first guess was too high or too low, thereby inducing a Choice Process. We found that asking participants to decide whether their first guess was too high or too low before they made a second guess increased their likelihood of making a more extreme second guess. When the correct answer was not very extreme (as was often the case), this reduced people’s likelihood of making a second guess in the right direction and diminished the benefits of averaging, thus rendering the inner crowd less wise. When the correct answer was very extreme, asking participants to indicate whether their first guess was too high or too low improved the wisdom of the inner crowd. Our findings suggest that the wisdom-of-the-inner-crowd effect is not inevitable, but rather that it hinges on the process by which people generate their second guesses.

  • Celia Gaertig and Joseph Simmons (Under Review), Why (and When) Are Uncertain Price Promotions More Effective Than Equivalent Sure Discounts?

    Abstract: Past research suggests that offering customers an uncertain promotion, such as an X% chance to get a product for free, is always more effective than providing a sure discount of equal expected value. In seven studies (N = 11,238), we find that uncertain price promotions are more effective than equivalent sure discounts only when those sure discounts are or seem small. Specifically, we find that uncertain promotions are relatively more effective when the sure discounts are actually smaller, when the sure discounts are made to feel smaller by presenting them alongside a larger discount, and when the sure discounts are made to feel smaller by framing them as a percentage discount rather than a dollar amount. These findings are inconsistent with two leading explanations of consumers’ preferences for uncertain over certain promotions (diminishing sensitivity and the overweighting of small probabilities), and suggest that people’s preferences for uncertainty are more strongly tethered to their perceptions of the size of the sure outcome than they are to their perceptions of the probability of getting the uncertain reward.

  • Robert Mislavsky and Celia Gaertig (Working), 60% + 60% = 60%, but Likely + Likely = Very Likely.

    Abstract: How do we combine others’ probability forecasts? Prior research has shown that when advisors provide numeric forecasts, people typically average them. For example, if two advisors think an event has a 60% chance of occurring, people also believe it has a 60% chance (more or less). However, what happens if two advisors say that an event is “likely” or “probable”? In four studies, we find that people combine verbal forecasts additively, making their forecasts more extreme than each advisor’s forecast. For example, if two advisors say something is “likely,” people believe that it is “very likely.”

Awards and Honors

  • Jay H. Baker Retailing Center Doctoral Student Grant, 2018
  • Wharton Risk Center Russell Ackoff Fellowship, 2015, 2016, 2017, 2018
  • SPSP Graduate Student Travel Award, 2017
  • Paul R. Kleindorfer Scholar Award, 2017
  • Winkelman Fellowship, 2016-2019
  • Marjorie Weiler Prize for Excellence in Writing, 2015
  • Wharton Doctoral Student Travel Grant, 2015
  • Graduate and Professional Student Research Travel Award, University of Pennsylvania, 2014
  • German Academic Exchange Service DAAD Scholarship, 2013
  • Erasmus Program Scholarship, 2010
