Published by Konrad Bocian
SWPS University of Social Sciences and Humanities, Poland
These findings are described in the article entitled The mere liking effect: Attitudinal influences on attributions of moral character, recently published in the Journal of Experimental Social Psychology (79 (2018), 9-20), and in Self-Interest Bias in Moral Judgments of Others' Actions, published in the journal Personality and Social Psychology Bulletin. This work was conducted by Konrad Bocian, Wieslaw Baryla, Wojciech M. Kulesza, and Bogdan Wojciszke from SWPS University of Social Sciences and Humanities, and Simone Schnall from the University of Cambridge.
Imagine that you are visiting the university library. Unfortunately, you have missed the return deadline and will have to pay a fine for overdue books. When you approach the librarian, she smiles at you and says that because her boss is away and she knows you a bit, she will waive the fine. Of course, you are baffled, but you also feel relieved, so you thank the librarian and leave the library.
A couple of minutes later, a student from the marketing department asks you whether you would like to answer a short marketing survey about the university. You agree, and then answer some questions about the quality of the university infrastructure and the staff working at the dean's office and the library. At some point, you have to respond to the question, "To what extent do you like the person you last dealt with in the library, and how much do you agree that this person is honest and fair?" What you do not know is that the survey is part of a psychological experiment, one that aims to examine how people judge the morality of others when they benefit from those others' immoral behavior.
Presumably, right now, you think you would condemn the librarian in the survey and regard her as a dishonest and unfair person. The same answers were given by students who merely had to imagine the situation described above. However, in the experiment we conducted in the university library, we found the opposite pattern of responses among students who actually saved money as a result of the librarian's actions. When the librarian waived the fine, students liked her more and judged her as more moral than a librarian who upheld the fine and collected the money. Even though the librarian broke university rules by waiving the fine, students perceived her as moral. Why?
We develop a positive attitude toward people who help us reach our goals, and when we like someone, we also tend to see that person as moral. This association between liking and morality is bidirectional and arises from the automatic system of information processing. Therefore, as we observed in our experiments, we see an individual who acts in line with our interests (even one who breaks the norms) as more moral because we like him or her better. We called this phenomenon the self-interest bias. But why do people think they would condemn the librarian who helped them save some money? Because they are not aware of how this bias operates.
People think that their moral judgments are as rational and objective as scientific statements, but science does not confirm that belief. Within the last two decades, scholars interested in moral psychology have discovered that people produce moral judgments based on fast and automatic intuitions rather than on rational and controlled reasoning. For example, research on moral cognition showed that moral judgments arise within approximately 250 milliseconds, before we are even able to explain them. Developmental psychologists demonstrated that babies as young as 3 months of age, who have no language skills, can distinguish a good protagonist (a helping one) from a bad one (a hindering one). But this does not mean that people's moral judgments are based solely on intuitions. We can use deliberative processes when conditions are favorable - when we are both motivated and able to engage in conscious reasoning.
When we imagine how we would morally judge other people in a specific situation, we refer to existing rules and norms. If the rules are violated, the act itself is immoral. But we forget that intuitive processing also plays a role in forming a moral judgment. It is easy to condemn the librarian when our interest is involved only on paper, but the whole picture changes when real money is on the table. We have known this for a very long time, but we still forget it when we predict our own moral judgments.
Based on previous research on the intuitive nature of moral judgment, we decided to test how far our attitudes can shape our perception of morality. In our daily life, we meet many people who are to some degree familiar, and we hold either a positive or negative attitude toward them. For example, we could meet someone whose socioeconomic beliefs are similar or dissimilar to our own. From previous studies we know that we like people who resemble us, so we should like individuals who agree with us more than those who do not. But how does this attitude influence our judgments about the moral character of these people?
In line with our assumptions, we observed that participants liked strangers whose socioeconomic beliefs were similar to their own more, and perceived them as more moral and trustworthy. When strangers' beliefs were dissimilar to our participants', ratings of liking, morality, and trust decreased. In a different experiment, we used a mimicry manipulation. Past research showed that mimicking another person's gestures or facial expressions influences attitudes - unconsciously, we like people who imitate us more. We confirmed that effect in our experiment and again found that the mimicry manipulation biased participants' perception of the target person's moral character.
In the last experiment, we used a classical attitude manipulation - mere exposure. This time, we showed our participants photos of men whom they did not know. However, we manipulated how many times they saw each man: in one condition, for example, they saw the same man 20 times, while in another only 5 times. As we predicted, after 10 and 20 exposures participants started to like the man in the photo more and judged him as more moral, compared to when they saw him only once or five times.
Taken together, we have enough evidence to confirm that interpersonal attitudes - an exemplary case of subjective preferences - influence moral judgments and perceptions of moral character. This knowledge could be essential for a better understanding of how moral judgments shape our social life. For example, in research sampling some 13,000 everyday events, people reported that 29% of them were related to morality. Based on these judgments, we decide whom we trust and whom we should avoid, with whom we can work, and who is our enemy.
We believe that these judgments are objective and, therefore, morally justified, even when they are harmful to others. Thus, as a society, we often disagree about morality. Yet these disagreements are usually not about rational arguments, but about attitudes, which are undoubtedly subjective and inextricably connected to our social life. Ignoring the influence of our personal preferences on moral judgment may lead to a severe misrepresentation of reality.