Abstract: Listening to the other side is essential for communication and conflict resolution. However, even when a listener listens well, the speaker may still exclaim, “You are not listening to me!” We reason that this occurs because speakers think someone who disagrees with them simply has not listened. Across seven studies with 2,090 participants, we find that speakers think a listener listened better when the listener agrees with them than when the listener does not. This effect emerged even though the conversation was the same and the listener signaled that they understood the speaker. The effect held across communication contexts and discussions of moral and non-moral issues, and it persisted when controlling for other positive impressions of the listener. We find that naïve realism explains this effect: speakers believe their views are correct, so they think listeners who agree with them are better able to process objective information. We discuss the implications of these findings for managing conflict and group decision-making.
Abstract: Across a pilot study and three preregistered experiments (N = 4,128), we demonstrated that people knowingly shared conspiracy theories to advance social motives (e.g., to receive “likes”). In addition to accuracy, people seemed to value social engagement (e.g., “likes” and reactions). Importantly, people not only expected most conspiracy theories to generate greater social engagement than factual news, but they were also more willing to share conspiracy theories when they expected those theories to generate sufficiently greater social engagement than factual news would. In an interactive, multi-round, content-sharing paradigm, we found that people were highly sensitive to the social feedback they received. When they received greater social feedback for sharing conspiracy theories than for sharing factual news, participants were significantly more likely to share conspiracy theories, even when they knew these theories to be false. Our findings advance our understanding of why and when individuals are likely to share conspiracy theories and identify important prescriptions for curbing the spread of conspiracy theories.
Abstract: Deception scholarship has focused on deceivers and has largely conceptualized targets as passive victims. We integrate the articles in this special issue, along with a broad body of literature on deception, moral judgment, and blame, to introduce the Shared Responsibility Model of deception (SR Model). The SR Model conceptualizes deception as a social process to describe how both communicators and targets are responsible for deception. Observers’ perceptions of targets’ responsibility are a function of (1) whether targets should have expected deception, (2) whether targets took preventive actions, (3) targets’ inferred motives, and (4) targets’ characteristics. The SR Model also challenges the implicit assumption that as communicators’ responsibility for deception increases, targets’ responsibility decreases. The SR Model has important implications for research on ethics, communication, and behavioral decision making.
Abstract: Although many virtuous leaders are guided by the ideal of prioritizing the needs and welfare of their subordinates, others advance their self-interest at the expense of the people they purport to serve. In this article, we discuss conspiracy theories as a tool that leaders use to advance their personal interests. We propose that leaders spread conspiracy theories in service of four primary goals: 1) to attack opponents; 2) to galvanize followers; 3) to shift blame and responsibility; and 4) to undermine institutions that threaten their power. We argue that authoritarian, populist, and conservative leaders are most likely to spread conspiracy theories during periods of instability.