You could get the impression that fake news is the substructure of social media, that one can’t work without the other. And every debate, especially about Facebook’s influence, comes with a side track about the company’s responsibility to stop the spread of fake news. The solution to those problems is quickly settled on: social media companies just don’t put enough effort into the fight against fake news. And although these critics might not be wrong, they fall short of describing the big picture. Facebook, though the main platform for spreading fake news, is only one wheel in the whole fake news machinery. More regulation might diminish the spread, but it will never put out the fire, because the deep structural problem originates in how we handle knowledge in the first place.
Primarily, we see knowledge as a commodity. It is precious. This is why we keep it, why we hide it, why we sell it. Take academic knowledge. Science is the foundation of fact-based, rational and logical decisions. With climate change being one of the most pressing topics of our century, you would think access to scientific knowledge would have the rank of a fundamental right. Yet science is parked behind the paywalls of big scientific publishing houses.
And sites that want to change that, for example Sci-Hub, are fought as if they were criminals. The same goes for most of the big newspapers, which allow access to their articles only after the purchase of a monthly subscription and lobby for strict regulation of who can share what from their news sites. Meanwhile, fake news is accessible for free.
Furthermore, we use knowledge as a means to create separation. Knowledge is power. I know something that you don’t. Scientific communities build entire jargons around their knowledge; only those who learn this vocabulary are “worthy” of joining the conversation. Language is used to set insiders apart from those who don’t speak it and who, by that very fact, are presumed to lack the knowledge.
Last but not least, every one of us has become a sender. We can share whatever message we want with a broad audience. The means to spread our opinions are exponentially more powerful, as are the means to spread the opinions of others. This comes with a certain form of responsibility: we allow a message to spread, we allow a message to reach more people. Many people are not used to that form of responsibility.
What could solve that problem, though? Could social media itself be the solution? After all, it claims to be there for everyone. And making information more accessible and easily understandable for a broader range of people might sound like a solid plan.
However, the same phenomena I just described also apply to technology. When was the last time you *really* understood what the black box that we call “Facebook’s algorithm” is doing? Facebook has no interest in using accessible language to make what it does understandable, nor does it want to open-source its code. And when the company is questioned about its responsibility on basically any topic, it is quick to push that responsibility onto the users.
Fake news falls on fertile ground in social networks. The issuers of those kinds of posts know how to play the game very well: they keep their message simple and in easy language, they exploit the fact that people can’t or don’t want to spend money on information, and they game the attention-boosting mechanics of social media.
This makes it hard for other players, for example NGOs or political parties, to reach new user groups with posts that aren’t engineered for sensationalism or topped with a clickbait headline. Forcing social media networks to filter more, to delete more, to change their algorithms will not get rid of the problem. Your Uncle Bill still won’t know which of the remaining posts are fake news and which come from a serious source. And who is to say what a serious source is, anyway? Freedom of the press is a very precious right to keep.
One thing that would definitely take some pressure off would be investment in government-led programs for digital media literacy: teaching people how to judge the information they see online, just as we learned to judge news in the Daily Mirror differently from news in The Guardian, to take the UK media landscape as an example. Feel free to fill in the tabloid news peddlers of your own country.
We also need to rethink our general relationship to knowledge. How do we evaluate what we know? Are we open to sharing it with the world, or do we want to exploit it? Do we want knowledge to be broadly comprehensible, or do we choose to set ourselves apart from the “normal” people? Most scientific communities, in an attempt to prove their raison d’être, choose the latter, and thereby maneuver themselves into an ivory tower of practical insignificance where they do not reach the people on the ground.
Most tech companies choose that path as well. The fancy tech words you never quite knew what to do with are more often than not used to separate “the futurists” from the masses and to wrap the technology in an aura of myth.
What would happen if both science and tech started working more with the general public instead of creating something that becomes an end in itself? Would the world be free of fake news? Probably not. Fake news has always been around and always will be. But if everybody worked toward a broader awareness of how knowledge is handled, maybe we could reach a state where we openly exchange opinions and expertise through discussion rather than shouting at each other from inside our own convictions.