“1004.5€ a month in cash – to do nothing. Still unsure?” This post received 2,207 likes and 8,130 shares. It was posted on the Facebook page of Aufwachen Deutschland (Wake up Germany). It’s not a falsified update from the regional government of Rostock – it’s real. But the fact that this money is allocated not to a single refugee but to a total of four people is often denied. Thanks to social networks, rumors like these spread rapidly, whether from the left or the right. For now, we’re going to refrain from naming the actual political camps or organisations that such posts come from.
Information and opinions can be spread with just a click of a button and are often illustrated with allegedly representative photos. A group of men kicking a person who’s lying on the ground. Or the disfigured face of a person who, according to the photo caption, had been beaten and raped by refugees. Photos that stir feelings of disgust and scorn. And this generates clicks and likes, because emotions are key to the dissemination of content on social networks.
If someone is moved by the photos and words they see, those posts will quickly spread. If a post validates their own opinions, it’s unlikely that its sources and its substance will be scrutinized. Probably because there isn’t much interest in doing so, says Klaus Kamps, professor of communication studies at the Hochschule der Medien (College for Media) in Stuttgart. “Generally, humans have the problem of being resistant to other positions and outlooks. If we form an opinion – because everyone around us has the same opinion, for example – then we’ll hold onto it for a relatively long time, even in the face of setbacks or opposing arguments.” But the dissemination of posts through images and emotions naturally works in the other direction as well. Relief organisations post photos of children and families rather than of men standing in front of a refugee shelter.
How a topic is perceived on Facebook depends largely on which pages people like and on what makes their friends tick. It is shaped not least by people’s own liking habits. If I like a lot of posts from friends who care about human and asylum rights, Facebook will show me more of these types of posts in the future. Behind social networks are algorithms that calculate what to show me. The algorithm knows what I like and what opinions and positions I “share” with others, and it keeps validating my worldview with similar content. “Things get really interesting when people isolate themselves, when people stew in the juices of their own opinions in specific areas and never learn that there are different perspectives on a given issue”, says Kamps.
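The feedback loop described here can be sketched as a toy ranking function. This is a deliberately simplified illustration with invented names and data – real feed algorithms weigh hundreds of signals – but it shows how ranking purely by past engagement pushes familiar viewpoints up and unfamiliar ones down:

```python
from collections import Counter

def rank_feed(liked_topics, candidate_posts):
    """Rank candidate posts by how often the user has already liked
    their topic -- a toy model of engagement-based feed ranking.
    Posts on familiar topics float to the top; never-liked topics sink."""
    counts = Counter(liked_topics)
    return sorted(candidate_posts,
                  key=lambda post: counts[post["topic"]],
                  reverse=True)

# A hypothetical user whose like history is dominated by one topic:
history = ["asylum rights"] * 8 + ["sports"] * 2
posts = [
    {"id": 1, "topic": "sports"},
    {"id": 2, "topic": "asylum rights"},
    {"id": 3, "topic": "local news"},  # never liked before
]

feed = rank_feed(history, posts)
# The asylum-rights post is ranked first; the never-liked topic comes last,
# so the user sees ever more of what they already agree with.
```

Each like reinforces the counts, so the next ranking is even more skewed – the “blinders” effect the article describes.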
The combination of the algorithm and people’s own behavior can act as blinders to other opinions on social networks. A “Like” may be just a click on the web, but it significantly influences people’s own perceptions. Because of their brevity, posts on the web emphasize simple answers, blunt statements, and a lot of emotion. They are often generalisations, terse, and in some cases completely untrue, as platforms like mimikama or Facebook groups like Zuerst Denken – dann klicken (Think first – then click) document.
Letting unsophisticated arguments guide you on multifaceted topics like refugees and migration rarely leads anywhere useful. Reading articles that deal with a topic in depth can sometimes shape one’s opinion better than simply sharing the opinions of others. “Now, not all refugees are criminals or dangerous. But just picture a bowl of M&Ms where 10 per cent are poisonous. Would you eat a handful?” LIKE! Wait . . . really?
You can read Ella Buchschuster’s fact check on three common prejudices against refugees here.