Emotion and Reason in Moral Judgment

Author

  • David A. Pizarro
Abstract

A great deal of research in moral psychology has focused on the interplay between emotion and reason during moral judgment, characterizing the two as forces working in opposition to influence judgment. Below, we review recent psychological research on morality, with a special focus on disgust and the nature of its role in moral and political judgment. We review behavioral, neuroscience, and physiological data on the role of disgust in moral judgment, with a particular emphasis on emotion regulation—the process of shifting emotional responses in order to meet one's goals. We suggest that dual-process theories of moral judgment are not well-suited to understanding the role of emotion regulation in influencing moral judgments and decisions. Theories that emphasize the primacy of one process over another may ultimately be missing the complexity of how these processes interact to influence moral judgment.

Keywords: emotion, reason, disgust, moral judgment, emotion regulation

"It was on the moral side, and in my own person, that I learned to recognize the thorough and primitive duality of man; I saw that, of the two natures that contended in the field of my own consciousness, even if I could rightly be said to be either, it was only because I was radically both."
-Robert Louis Stevenson, 1886, p. 56

Stevenson's 1886 classic novel "The Strange Case of Dr. Jekyll and Mr. Hyde" tells the tale of Dr. Jekyll, a man who wants to end the struggle between good and evil within him. In order to do so, he formulates and ingests a potion designed to split his mind into two—one personality to house his baseness and immorality (Mr. Hyde), and one to house the morally pure traits he values most. The inner conflict that humans experience between their moral selves and their more unrestrained, egoistic selves has been a consistent theme in literature for centuries. While (largely) discarding the good-versus-evil aspects of this dichotomy, moral psychology has nonetheless embraced the basic division of mental processes into two general types—one mental system that is cold, rational, and deliberative, and another that is emotional, intuitive, and quick. This characterization has served as a basic organizational framework for understanding the processes involved in human judgment across a variety of domains, including moral and ethical judgments (e.g., Kahneman, 2011).

This "dual-process" approach to the mind has motivated a great deal of research on moral judgment within the last decade, and has led psychologists to reconsider the historically dominant approaches to moral judgment, approaches which emphasized the primacy of reason. But the division of the mind into two systems, while fruitful, has encouraged researchers to characterize emotion and reason as forces acting upon moral judgment in opposing directions, and to focus on factors that give rise to the dominance of one over the other. This chapter will focus on a particular emotion—disgust—in order to illustrate that the simplicity of the dual-process approach may hide some of the more nuanced ways in which emotion and reason interact to produce moral judgment and behavior. In particular, it will highlight research on emotional regulation as an example in which reason guides emotion rather than battles against it.
Taken together, these two bodies of research suggest that characterizing reason and emotion as separate, opposing forces in moral judgment is a caricatured description of two processes that often interact in complex ways to motivate judgment and action (Pizarro, 2000).

The dethroning of reason

For the majority of the twentieth century, psychologists viewed decision-making as a "cold" process in which individuals calmly and rationally weigh pros and cons to arrive at a decision that maximizes their utility (Loewenstein & Lerner, 2003). Research in moral psychology echoed this emphasis on reason: early studies of moral judgment focused on the importance of cognitive abilities across development, and on how the emergence of specific abilities shaped the child's understanding of moral rules (Kohlberg, 1963; Piaget, 1932). As individuals matured, they would approach a higher "stage" of moral thinking, and while many would never reach the highest stages of moral reasoning, the majority of individuals would reach a stage of moral sophistication required to uphold the norms and laws of a functional society.¹

¹ Notably, some early researchers suggested that children's morality was largely emotionally driven (Wendorf, 2001; Kline, 1903). The insights from this approach were all but lost in the coming dominance of the cognitive-developmental approach.

However, as research on emotion grew, it began to transform the behavioral sciences, and the study of judgment and decision making in particular. No longer was judgment characterized as a "cold," emotionless process, but as infused with emotion at every turn. In an influential paper, Haidt (2001) built upon this emerging affective science and applied it to moral judgment by making a radical claim—that reason played a much smaller role in ethical and moral judgment than psychologists thought. Taking a cue from the philosopher David Hume (1777/1960), Haidt argued that intuitions (often in the form of emotional responses) were, in fact, the primary causes of moral judgment. In what he dubbed "social intuitionism," Haidt proposed that when individuals are faced with a moral question (e.g., "Julie and Mark are a brother and sister who had consensual sex"), and are asked to make a moral judgment ("How morally wrong was Julie and Mark's behavior?"), it is their experience of emotion (in this example, disgust) that gives rise to the moral judgment ("What Mark and Julie did was wrong!"), not their ability to formulate reasons. Notably, in examples like this, individuals are actually at a loss to justify their moral judgment and become "morally dumbfounded"—reduced to offering something like "it's just gross" as their only reason for their judgment. For Haidt, ethical judgment is a great deal like aesthetic judgment: it is made quickly, effortlessly, and without a great deal of conscious deliberation (Greene & Haidt, 2002). More radically, Haidt claimed that if deliberate reasoning played a role at all in moral judgment, it was most likely as a post hoc attempt to justify an individual's intuition-driven moral judgments (Haidt, 2001; Haidt & Bjorklund, 2008).
As support for his claim, Haidt also drew on growing evidence from social psychology demonstrating the power of nonconscious influences on judgment, as well as from cross-cultural research showing that cognitive-developmental theories (such as that of Kohlberg) did a poor job of predicting moral judgments in non-Western cultures (Haidt, Koller, & Dias, 1993; Shweder, Much, Mahapatra, & Park, 1997). Haidt's social intuitionist model happened to coincide with a new approach to studying morality pioneered by Greene and his colleagues, who paired modern neuroimaging techniques with classic philosophical moral dilemmas to arrive at a similar conclusion—that emotions played a much larger role in moral judgment than previously thought. Greene and colleagues (2001) demonstrated that when faced with classic sacrificial moral dilemmas (in which one individual must be sacrificed to save a greater number of people), individuals often responded in a manner consistent with their gut, emotional reactions, and argued that this could be seen in the patterns of neural activation observed while individuals were reading emotionally evocative dilemmas. However, when given enough time to deliberate, individuals could overturn their initial gut reactions and reply with a more calculating (i.e., utilitarian) response. Yet for both Greene and Haidt, the emphasis was on the divided mind—one system that produces an initial moral judgment, and another capable of intervening occasionally to alter the judgment.

While these approaches were influential in shifting moral psychology away from the historically dominant rationalist theories of moral judgment, the pendulum swung swiftly in the direction of emotional primacy—the view that reasoning played little or no role in most moral judgments. Indeed, combined with the research emerging in other areas of judgment and decision making, it seemed fairly obvious that human judgment was driven, if not solely then at least primarily, by emotional and non-rational processes (Bargh & Ferguson, 2000).

Yet a number of more recent findings have demonstrated that reason is not as inert as these accounts (or, at least, the stronger versions of these accounts) implied. For instance, there is evidence that even temporarily encouraging rational thought can have a demonstrable effect on moral judgment. Researchers recently demonstrated that engaging in a cognitively challenging task prior to making a moral judgment can cause individuals to go against their initial intuitive moral judgment when responding to moral dilemmas (Paxton, Ungar, & Greene, 2011). In addition, when given sufficient time to deliberate, individuals are more likely to be persuaded by reasoned arguments that a seemingly wrong, yet harmless, act was not, in fact, immoral (Paxton, Ungar, & Greene, 2011). Encouraging individuals to adopt a rational mindset can also lead them to make judgments of moral blame that go against their intuitive initial reaction. For instance, individuals intuitively reduce their judgments of blame for intended harmful acts when the causal link between the intention and the harm occurred in an unexpected fashion (Pizarro, Uhlmann, & Bloom, 2003). However, when instructed to think rationally and deliberately, individuals become insensitive to extraneous information about the causal link and focus on the intention of the actor and the harmful outcome.
In other words, a simple prompt to respond rationally is enough to shift the nature of moral judgment toward answers that are more careful, deliberate, and free of normatively extraneous information.

One final source of evidence for the ability of reason to influence moral judgment comes from research demonstrating that the likelihood of engaging in moral reasoning changes based on the cognitive resources available to the individual at the time the moral judgment is made. Occupying an individual's mind with a task (i.e., placing them under "cognitive load") has been shown to interfere with the moral reasoning process, with individuals under load more likely to favor intuitive moral decisions (Greene, Morelli, Lowenberg, Nystrom, & Cohen, 2008). Cognitive load can also derail the moral justification process, reducing the likelihood that individuals will rationalize their own immoral behavior (Valdesolo & DeSteno, 2008). The corollary of this evidence is that, in the absence of constraints on the ability to deliberate, individuals do indeed make use of careful deliberation when arriving at moral judgments.

In short, there is plenty of evidence that individuals can (and do) engage in moral reasoning when motivated or prompted to do so. Yet even research emphasizing the influence of reason adheres to a fairly simplistic dichotomy pitting reason against emotion/intuition. This may be because researchers tend to utilize methods designed to pit one process against the other, or that favor one process over the other (Monin, Pizarro, & Beer, 2007). The answers to emerge from these methods will, by design, appear to provide evidence for or against one side of the dichotomy. For instance, Kohlberg's (1963) methodology involved presenting participants with moral dilemmas that pit two equivalent courses of action against each other, and asking participants to verbally justify their decisions. When presented with the famous "Heinz" dilemma (in which participants must determine whether it is better for Heinz to steal an expensive medicine in order to keep his wife from dying), participants are not only asked to determine which is the better decision, they are also asked detailed questions about their reasoning process. Given its explicit prompting of reason and its use of dilemmas with equally compelling courses of action, this method makes it almost inevitable that a researcher would conclude that reasoning is at the heart of moral judgment (Monin, Pizarro, & Beer, 2007).

On the other hand, researchers who favor emotional/intuitive accounts of moral judgment often ask participants to evaluate the moral infractions committed by others, frequently focusing on extreme scenarios involving bestiality, child pornography, and incest (for reviews see Pizarro & Bloom, 2003; Monin, Pizarro, & Beer, 2007). Unlike the Kohlbergian dilemmas, these moral scenarios create a strong, immediate affective reaction (and are not "dilemmas" in the traditional sense of the word, since one course of action seems clearly worse than the other). Faced with such scenarios, participants rarely need to deliberate about competing moral principles in order to arrive at a judgment. Moreover, rather than being asked to reason aloud to justify their moral judgments, participants are generally asked to assess moral wrongness after the putative moral infraction has been committed (rather than to debate the possibilities for a future course of action).
These kinds of reaction-focused questions tend to stack the deck in favor of an emotion-based account of moral judgment. Despite these methodological limitations, however, there is still a great deal of evidence pointing to a more complex interrelation between reason and emotion/intuition. Shifting the question from simply asking whether reason influences moral judgment toward asking when and how reasoning influences moral judgment may yield more nuanced insight. Take one example of a more subtle interaction between reason and intuition: studies of expertise have shown that a learned process can be made intuitive over time (Zajonc & Sales, 1966). Similarly, moral intuitions themselves may arise from prior instances of moral reasoning. A person may reason their way to a particular moral view (such as that animals should not be killed and eaten, or that slavery is wrong), and over time this moral view becomes intuitive (Pizarro & Bloom, 2003). The mechanism by which a reasoned choice may become an intuition is, unfortunately, not well captured by a dual-process approach that does not take into account the ways in which intuition and reason interact over time.

Disgust and Moral Judgment: A Brief Overview

When it comes to the regulation of human behavior, many moral codes extend far past the concerns over harm and justice that have traditionally been the focus of the cognitive-developmental tradition in moral judgment. A growing body of work has demonstrated that moral codes emphasize a number of other domains—respect for authority, group loyalty, and purity. For instance, large sections of the Bible, the Koran, and many other religious texts focus on the importance of keeping oneself clean in body and in spirit (Haidt, Koller, & Dias, 1993; Shweder, Much, Mahapatra, & Park, 1997). The motivational fuel that enforces moral norms across these domains appears to be emotions such as anger, empathy, contempt, guilt, shame, gratitude, and (of particular relevance to this discussion) disgust (Rozin, Lowery, Imada, & Haidt, 1999; Decety & Batson, 2009; Lazarus, 1991; Trivers, 1971).

Disgust—an emotion that likely evolved to prevent individuals from coming into contact with dangerous pathogens—has recently been strongly linked to moral, social, and political judgments (Schnall, Haidt, Clore, & Jordan, 2008; Inbar, Pizarro, Knobe, & Bloom, 2009; Inbar, Pizarro, & Bloom, 2009). A great deal of research has been conducted in the past two decades in an attempt to classify when, where, and to whom disgust is expressed (Haidt, McCauley, & Rozin, 1994; Rozin et al., 2000; Olatunji et al., 2008; Tybur, Lieberman, & Griskevicius, 2009). It has become clear that disgust is strong, easy to induce, and provides immediate motivation to avoid the target of disgust. There also appears to be a set of near-universal elicitors—bodily and animal products such as feces, urine, vomit, and rotten meat (all potential transmitters of pathogens) seem to elicit disgust in most people. In addition, often all it takes is a single picture or word to make an individual feel full-blown disgust. In this sense, it is one of the least "cognitive" emotions—it can often seem more like a reflex.

Research on individual differences in disgust has also shed light on the nature of disgust elicitors. The most widely used measure of individual differences in disgust is the Disgust Sensitivity Scale (DS-R; Haidt, McCauley, & Rozin, 1994, modified by Olatunji et al., 2007).
This scale divides disgust into three unique sets of elicitors: core, animal reminder, and contamination disgust. A more recent scale proposes a different set of subdomains of disgust: pathogen disgust, sexual disgust, and moral disgust (Tybur, Lieberman, & Griskevicius, 2009). The authors suggest that each facet of disgust serves a specific evolutionary function: pathogen disgust (analogous to core/contamination disgust) is meant to protect an individual from disease, sexual disgust is meant to protect an individual from actions that would stand in the way of one's evolutionary fitness (e.g., incest, or individuals that one does not find aesthetically attractive), and moral disgust is meant to protect an individual from those who would hurt the success of the individual or the group (such as acts of selfishness). Nonetheless, both scales emphasize that disgust appears to be a response to potential contamination from a substance, individual, or action.

While disgust may not be the most relevant moral emotion, nor even the most common, we focus on disgust because the wealth of research on this emotion helps shed light on a more general point—that the interaction between reason and emotion in moral judgment is far more complex than one might expect (Pizarro, Inbar, & Helion, 2011). A great deal of evidence has accumulated in the last decade showing that disgust easily extends its influence to the sociomoral domain—individuals use disgust terminology to describe both the revolting and the reviled (Rozin, Haidt, & McCauley, 2008). While it may have originated as an emotion meant to keep us from ingesting something dangerous, it now seems to motivate us to keep away from individuals or entire social groups, and to evaluate certain kinds of acts as morally wrong. For instance, feeling disgust at the time of moral judgment increases the severity of a moral judgment—people think that an act is more wrong, and that an individual is more blameworthy—even when the source of disgust is completely unrelated to the target of judgment (Schnall et al., 2008).

While Schnall and colleagues (2008) found a domain-general increase in moral severity, recent research has shown that feeling disgust may play an especially strong role in moral judgments having to do with purity: acts seen as wrong not because of their direct harm but because of their symbolic degradation or contamination of the self or society. Feeling disgust in response to purity violations (such as consensual incest) has been linked to more severe punishment for moral transgressors (Haidt & Hersh, 2001). Indeed, disgust is especially powerful in influencing judgments in the domain of sexual mores. Inbar, Pizarro, Knobe, and Bloom (2009) found that people who are more easily disgusted (as measured by the disgust sensitivity scale; Olatunji et al., 2007) have more negative implicit attitudes about homosexuality. Echoing this, individuals who are easily disgusted are more likely to show negativity toward homosexuals, but not toward other out-groups such as African-Americans (Inbar, Pizarro, & Bloom, 2012; Tapias, Glaser, Keltner, Vasquez, & Wickens, 2007). In addition, inducing disgust increases people's explicit and implicit bias against homosexuals (Dasgupta, DeSteno, Williams, & Hunsinger, 2009; Inbar, Pizarro, & Bloom, 2012).
Individual differences in the tendency to feel disgust (i.e., disgust sensitivity) have also been linked more generally to political conservatism, specifically on political issues related to purity and divinity—such as abortion and gay marriage (Inbar, Pizarro, & Bloom, 2009; Inbar, Pizarro, Iyer, & Haidt, 2012). This is consistent with work showing that liberals and conservatives rely upon different kinds of moral intuitions for their judgments—liberals rely on assessments of harm and fairness when making moral judgments, whereas conservatives also rely on purity, loyalty, and authority (Graham, Haidt, & Nosek, 2009). Indeed, simply reminding people that there is disease in the environment (consistent with the motivation induced by disgust) can lead individuals to temporarily report being more politically conservative (Helzer & Pizarro, 2011).

One of the more interesting ways in which disgust has been implicated in moral judgment comes from work on what some researchers have dubbed "moralization." Within a generation (and perhaps not even that) we can observe concrete changes in societal views concerning the morality of certain acts. For instance, while in the 1960s smoking was ubiquitous, today smoking is confined to select areas, smokers are shown the door and asked to partake elsewhere, and they are morally judged for their behavior (Rozin, 1999). How did a behavior like smoking—which was so commonplace fifty years ago—become moralized over time? Rozin (1999) implicates disgust in this process of moralization—the process of bringing a behavior that was previously seen as non-moral into the moral domain. Rozin and Singh (1999) showed that the targets of moralizing disgust can even change across one's life span. They surveyed college students, their parents, and their grandparents, and found that all three groups reported being equally disgusted by, and equally expressive of negative attitudes toward, cigarette smoking, even though the grandparents indicated that they had grown up in an age that was more tolerant of cigarette smoking.

Researchers are increasingly learning about the neural and physiological correlates of disgust. Experiencing and recognizing disgust has been linked to activation in the anterior insula and putamen (Moll et al., 2002; Calder et al., 2000), although this relationship is not consistently found across all disgust studies that utilize neuroimaging techniques (Phan, Wager, Taylor, & Liberzon, 2002). Disgust has also been associated with greater facial muscle tension, both increased and decreased heart rates, and increased skin conductance (Demaree et al., 2006). Olatunji and colleagues (2008) found differences in the physiological reactions associated with different kinds of disgust: core and contamination disgust (such as disgust over rotten meat, or at sipping out of a stranger's beverage by mistake) were associated with increased facial muscle tension and heart rate while watching a video of an individual vomiting, whereas watching a video of someone having blood drawn was associated with higher facial muscle tension and decreased heart rate in individuals sensitive to "animal reminder" disgust (disgust related to gore, body-envelope violations, and dead bodies).

Emotion Regulation: The intersection of reason and emotion

Knowing how disgust works to influence moral, social, and political judgments is informative, but it paints an incomplete picture of how emotions (like disgust) influence individuals over time.
A key limitation of many studies on emotion is that they do not take into account the various strategies individuals employ in everyday life to avoid feeling certain emotions, to feel them less strongly, or to feel them more strongly. In fact, it is fairly evident that individuals engage in this sort of emotional regulation fairly frequently (Gross, 2002). This regulation is necessary, in part, because the environment in which emotions evolved is in many ways quite dissimilar to the current environment, making emotional responses in the modern world poor guides to achieving goals (Tooby & Cosmides, 1990; Gross, 1998). This ability to regulate emotions allows, more generally, for a rich interaction between an individual's long-term, deeply valued goals and her short-term emotional reactions. In the case of moral judgment, the need for emotional regulation should be clear—individuals often need to alter their emotional states to coincide with their moral goals.

Researchers investigating the regulation of emotion have proposed five different categories of emotional regulation (Ochsner & Gross, 2008): 1) situation selection—selecting situations that are conducive to attaining one's goals, or avoiding ones that are not (for example, a married man declining an invitation to grab a drink with an ex-girlfriend); 2) situation modification—taking steps to alter one's current situation to bring it in line with one's goals (if the man does accept the invitation, choosing to talk about his happy marriage instead of reminiscing about the past relationship); 3) attentional deployment—focusing one's attention on something else (choosing to focus on how gray his ex-girlfriend's hair has become rather than on her ample cleavage); 4) cognitive change—changing one's emotional understanding of the situation at hand by cognitively reappraising features of the situation (reframing the situation as catching up with an old friend rather than drinking with a former lover); and 5) response modulation—regulating the physiological response of an emotional state while it is currently being experienced (the man telling himself that his sweaty palms are due to the crowded bar rather than to any feelings of attraction). The first four components of emotional regulation have been referred to as antecedent-focused regulation strategies, and the fifth is referred to as a response-focused regulation strategy (Gross, 1998).

Previous research has indicated that regulating negative emotions, and specifically disgust, can have downstream cognitive and physiological consequences. Multiple studies have asked participants to adopt an antecedent-focused (e.g., reappraisal) or response-focused (e.g., suppression) regulation strategy, and have demonstrated that each makes different contributions to altering one's emotional experience. Gross (1998) had participants watch a disgust-eliciting video and found that, though both reappraisal and suppression reduced disgust-expressive behavior, reappraisal decreased ratings of subjective disgust while suppression had no effect on subjective disgust and was instead linked to increased activation of the cardiovascular system.
Recent research has demonstrated that this type of reappraisal process can be automated via the use of implementation intentions—regulatory strategies that take the form of an if-then plan—and that different implementation intentions can affect which aspect of the disgust experience is regulated (Schweiger Gallo, Keil, McCulloch, Rockstroh, & Gollwitzer, 2009; Gallo, McCulloch, & Gollwitzer, 2012). Gallo and colleagues (2012) had participants form either a goal intention ("I will not get disgusted!"), an antecedent-focused implementation intention ("I will not get disgusted, and if I see blood, I will take the perspective of a physician!"), or a response-focused implementation intention ("I will not get disgusted, and if I see blood, I will stay calm and relaxed!") before reporting on valence and arousal while viewing a series of disgusting and non-disgusting images. They found that individuals who had formed an antecedent-focused implementation intention reported that the disgusting images were significantly less negative, but that there were no differences between this group and the goal-intention group on reported arousal, suggesting that this antecedent-focused strategy changed the meaning of the emotional experience without altering the physical experience. Individuals who had formed a response-focused implementation intention reported significantly less arousal when viewing disgusting images as compared to the other two groups; however, there were no differences between this group and the goal-intention group on assessments of valence. Taken together, these studies suggest that different emotion regulation strategies can alter different components of the emotional experience. Within the moral domain, it remains unclear which aspects of the disgust experience (valence, arousal, appraisal), working alone or in tandem, contribute to moral judgment, and using different antecedent- or response-focused strategies to regulate disgust may help illuminate this process.

The neuroscience of emotion regulation

We now know a great deal more about the neural underpinnings of emotion regulation, and about how emotion and reason interact within the brain. For instance, a study that used functional neuroimaging to examine different regulatory strategies showed that emotional suppression and reappraisal work on different time courses. When participants were asked to regulate disgust, reappraisal was linked to increased activation in prefrontal areas (the medial and left ventrolateral prefrontal cortex—areas associated with cognitive control) during early stimulus presentation, and was correlated with decreased activity in regions known to be implicated in affective responses (the left amygdala and insula) during the later stages of stimulus presentation. Emotional suppression showed a distinctly different pattern, and was linked to activation of prefrontal control areas during the later stages of stimulus presentation, accompanied by increased amygdala and insula responses (Goldin, McRae, Ramel, & Gross, 2008). This suggests that different regulatory strategies may play out over different time courses, and that they have a differential impact on the subjective, physiological, and neural components of an emotional experience.
A great deal of research has implicated the prefrontal cortex, a region associated with volition, abstract reasoning, and planning, as playing a primary role in the process of emotion regulation (Ochsner & Gross, 2005; Wager et al., 2008). Emotion regulation appears to engage multiple areas of the prefrontal cortex, including the dorsolateral prefrontal cortex (dlPFC), the ventrolateral prefrontal cortex (vlPFC), and the dorsomedial prefrontal cortex (Ochsner & Gross, 2005). In addition, research suggests that successfully reappraising emotional stimuli involves both cortical and subcortical pathways, roughly illustrating that the process recruits areas of the brain associated with both "cognitive" and "affective" processes (Wager et al., 2008). For instance, the amygdala, a subcortical structure heavily implicated in affective responses, plays an integral role in guiding attention to and forming evaluations of affective stimuli (Ochsner, 2004). The amygdala's detection of affective stimuli can happen rapidly and can even occur non-consciously (Amodio & Devine, 2006). Further supporting its role as a key player in emotional regulation, amygdala activation and deactivation have been linked to the augmentation and reduction (respectively) of an affective response (Ochsner & Gross, 2005). Other affective brain regions involved in emotional regulation include the ventral striatum, the mid-portion of the cingulate cortex, and the insula—an area that has been implicated in the subjective experience of disgust and has been of particular importance in linking disgust with moral judgment (Lieberman, 2010). A greater understanding of the interactions between affective brain regions and higher cognitive brain regions during emotional regulation may help shed light both on the psychology of regulatory behavior and on how emotion regulation may inform moral judgment.

A great deal of research in emotion regulation and neuroimaging has focused on cognitive reappraisal, an antecedent-focused regulation strategy that involves reframing emotionally evocative events. Many of these studies involve presenting participants with aversive images during a functional MRI scan, while giving them instructions on how to view the images in ways that may encourage the up- or down-regulation of their emotional response. Using this method, Wager and colleagues (2008) demonstrated that cognitive reappraisal relies on a bidirectional relationship between affective and cognitive regions. They found that the cognitive reappraisal of emotion involves the successful recruitment of areas associated with memory, negative affect, and positive affect/reward. Specifically, they found that the relationship between the left vlPFC (an area involved in higher cognition) and reappraisal success involves two mediating pathways: 1) a path predicting reduced reappraisal success involving areas implicated in negative emotion, such as the amygdala, lateral orbitofrontal cortex (OFC), and anterior insula, and 2) a path predicting increased reappraisal success involving areas implicated in positive affect/reward, such as the ventral striatum, the pre-supplementary motor area (pre-SMA), the precuneus, and the subgenual and retrosplenial cingulate cortices.
The positive association between left vlPFC activation and the activation of both of these networks suggests that the left vlPFC plays a role in both the generation (path 1) and the regulation (path 2) of negative affect during cognitive reappraisal. In short, the ability to successfully regulate emotion relies on structures implicated in the generation of both negative and positive affect, as well as on the ability of these same structures both to reduce negative appraisals and to generate positive ones. What this suggests is that the regulatory strategy of cognitive reappraisal has properties that overlap significantly with both systems—affective and cognitive. This echoes a claim made by Pizarro and Bloom (2003), who pointed to the importance of cognitive appraisals in guiding moral responses that are typically described as emotional. Taken together, this research suggests that the emotional reactions that accompany a moral evaluation can be regulated via cognitive reappraisal, allowing for a great deal of flexibility in the influence that emotions (like disgust) have on the formation of moral judgments.

This bidirectional relationship between emotion and cognition makes sense within the context of moral judgment. In the classic trolley dilemma, an individual is asked to imagine that they are a trolley car operator and that a runaway trolley is hurtling down the track (Foot, 1967; Thomson, 1985). The trolley has to go down one of two diverging tracks: 1) a track where five people are working, or 2) a track where one person is working. In a typical moral psychology experiment, participants are then asked about the permissibility of killing one to save five. Using this dilemma, Greene and colleagues (2001) uncovered the role of emotional engagement in moral judgment. To manipulate emotional engagement, participants were presented with two versions of trolley-style dilemmas: 1) in the impersonal version, participants are told that they can hit a switch that will put the trolley onto a different track, where it will only hit one person; 2) in the personal version, participants are asked to imagine that they are standing next to a large stranger on a footbridge that goes over the trolley tracks, and that if they push the stranger, the trolley will stop, thus saving the five people. The researchers found that increased emotional engagement (personal vs. impersonal) elicited greater activation in regions of the brain that had been implicated in affective processing (the bilateral medial frontal gyrus, the bilateral posterior cingulate gyrus, and the left and right angular gyrus). In the impersonal-moral condition, they observed significantly more activation in regions associated with working memory (the right middle frontal gyrus, the left parietal lobe, and the right parietal lobe). Greene and colleagues (2004) extended this result, showing that participants exhibited greater activation in the amygdala when resolving personal moral dilemmas than when resolving impersonal moral dilemmas. During the personal moral dilemmas, participants also exhibited increased activation in brain regions implicated in theory of mind: the medial prefrontal cortex (mPFC), the precuneus, the posterior superior temporal sulcus (pSTS), and the temporoparietal junction (TPJ).
The researchers used this as evidence for the claim that personal moral dilemmas are more affectively charged, and further suggested that personal moral dilemmas engage a network that focuses attention away from the "here and now" and instead directs attention to predicting future events and considering the mental states of others (Greene, 2009). More recent work has modified the impersonal/personal distinction, focusing instead on psychological and neural differences between deontological judgments (which some have posited are automatic responses driven by emotion) and utilitarian or consequentialist judgments (which some claim are the product of conscious moral reasoning; Greene, 2008; Greene et al., 2008). Nonetheless, the revised distinction retains the contrast between "emotional" and "cognitive" processing that gives rise to different kinds of moral judgments.

Though the existence of two distinct systems is a plausible account of the observed pattern of results, reconciling this work with research in emotion regulation perhaps prompts a slightly different description of the processes involved in guiding these sorts of judgments. Rather than characterizing judgment as driven by two opposing processes fighting over which answer is morally correct, these dilemmas may be prompting individuals to reconcile their affective responses with their moral goals through the regulation of their emotional reactions. Though we tend to think of the typical instance of emotional regulation as the down-regulation of an emotional response, there are times when individuals up-regulate their affective responses in order to meet their goals. Within the moral domain, this is particularly the case for empathy, where taking the perspective of another is often accompanied by increased emotional arousal for the self (Batson, 1998). The personal and impersonal versions of the trolley dilemma may just as easily be described as involving cognitive appraisals that facilitate the up- and down-regulation of emotional experiences; on this view, those who are able to regulate their emotions effectively can suppress or increase the affective response that they view as appropriate for the dilemma at hand.

One source of evidence for the importance of such up- and down-regulatory strategies comes from research demonstrating that manipulating the self-relevance of emotional stimuli (akin to the personal/impersonal distinction in the moral research) can influence one's affective experience (Ochsner et al., 2004). In one study, participants were asked to up- or down-regulate their emotions using a self-focused reappraisal strategy (i.e., to think about the personal relevance of each image as it appeared). For example, if participants were shown a picture of a gruesome car accident, they were asked either to imagine themselves or a loved one in the negative situation (up-regulation) or to think of the situation from a detached, third-person perspective (down-regulation). Participants reported that down-regulating emotion was significantly more difficult than up-regulating emotion. In addition, amygdala activation was modulated by reappraisal, with up-regulation linked to increased activation in the left amygdala and down-regulation linked to bilateral amygdala deactivation.
This self-focused reappraisal strategy may be analogous to the personal/impersonal moral distinction, in which individuals are asked either to put themselves in the situation of physically pushing a man to his death or to physically distance themselves from the event by imagining themselves flipping a switch. Asking participants to imagine themselves causing or being personally involved in a situation is similar to a self-focused up-regulatory strategy, whereas asking participants to imagine flipping the switch is a self-focused down-regulatory strategy. Thus, it seems possible that the differences observed in the personal/impersonal moral dilemmas may reflect the effects of up- and down-regulation of emotion, rather than the workings of two distinct processes. This would be consistent with the suggestion that individuals who favor utilitarian solutions to affectively charged sacrificial dilemmas are either simply less likely to feel a negative affective reaction to the dilemma in the first place, or are able to down-regulate their negative emotional reactions in order to meet their utilitarian moral goals (Bartels & Pizarro, 2011).

The most plausible account of the processes involved in emotion regulation, especially, we think, in the domain of moral judgment, is what researchers have termed an "instrumental" account of regulation. This account breaks from the tradition of straightforward psychological hedonism—the view that individuals are always motivated to feel positive emotions and minimize negative emotions—and instead suggests that emotion selection and regulation can be described as maximizing the utility of a particular goal, even if that goal is best served by feeling a negative emotion (Tamir, 2009). Certain emotions may be more useful in some contexts than others: pleasant emotions may be selected when immediate benefits are greater than long-term benefits (e.g., smiling when one's child presents them with a homemade drawing), but when long-term benefits are greater, individuals may instead want to feel a helpful emotion, one that will help them meet long-term goals (e.g., expressing anger when said drawing has been scrawled in permanent marker on the living room wall).

Applying this framework to disgust, it seems possible that individuals may encourage their feelings of disgust when evaluating particular moral acts or individuals in order to effectively communicate disapproval and rejection of immoral behaviors (Hutcherson & Gross, 2011). Gaining a better understanding of when individuals feel disgust within moral contexts, and of how this response relates to the individual's long- and short-term goals, may help us understand the role that individual differences in the tendency to experience certain emotions (such as disgust sensitivity) play in forming moral judgments. For example, individuals may up-regulate their disgust within moral contexts (e.g., when a vegetarian intensifies their disgust in order to fuel their moral indignation about animal cruelty) or down-regulate it (e.g., when a liberal reappraises two men kissing as an act of love) based on the current context of the judgment and on their specific moral beliefs and goals. The fact that two individuals who experience strong disgust can arrive at different moral judgments makes more sense when taking into account the ability to regulate emotional responses, rather than assuming a static, linear relationship between emotion and judgment.
This may be true of other emotional reactions as well: individuals may up-regulate their anger when they are making judgments about punishment or assigning moral blame. It seems likely that one of the contributing factors to moral judgment is the ability to up- and down-regulate emotion depending on the context of the moral situation and on the cognitive and motivational resources that are available to the individual at the time of moral judgment—something that simple dual-process theories do not accommodate well. Yet the growing body of research on emotional regulation—which we believe should play a larger role in our psychological theories of morality—suggests that emotion and cognition are best viewed as a set of processes so deeply intertwined that they cannot be captured within a simple dichotomy. Individuals, using a variety of strategies, are able to selectively dampen or heighten emotional experiences—often in the service of their higher-order goals—and thus shape the contribution of their emotions to their moral judgments. In the same way that Jekyll cannot divorce himself from Hyde, human beings cannot divorce the cognitive from the affective. It appears that they are, quite literally, formed of the same stuff.
