Trump winning the US election was surprising. No polling institute predicted this result. There are many articles attempting to explain how this could happen, and it feels like if you search long enough you will find every possible reason: it was Hillary's fault, Trump used "the art of war" principles (not kidding), social media failed systematically.
Nowadays, social media plays a major role. This was the case in the last two Obama elections, it was the case in this election, and it will be even more so in the future. What is new is the discussion about how a failure in the system of social media could lead to a "wrong" outcome. More specifically: how can fake news lead voters to believe wrong statements are facts, and therefore to vote for a candidate they would otherwise never have voted for?
I think about this almost every time I use Spotify. Based on what it thinks I like, it recommends and shows me music I should like. I hate it. I want to discover new music, just hear random songs I do not know and find out whether I like them. Maybe I would like Schranz, but I would never discover it if Spotify only showed me what it thinks I like based on what I already listen to. (Spoiler: I do not like it.)
Without actively steering against these algorithms, you end up in a never-ending downward spiral of what the algorithm thinks you like, but which you only like to a certain degree. And you get tired of steering against it. After a while you just let it happen, because the algorithm never gets tired. And this is how "news" showing up on Facebook works.
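This downward spiral is easy to see in a toy model. The following sketch is purely illustrative (it is not how Spotify or Facebook actually work): the algorithm recommends genres in proportion to learned weights, the simulated user only ever clicks on two genres, and every click reinforces those weights, so everything else is shown less and less.

```python
import random

# Toy model of a recommendation feedback loop (illustrative only,
# not any real recommender system).
random.seed(42)

genres = ["pop", "rock", "jazz", "schranz", "classical"]
weights = {g: 1.0 for g in genres}  # the algorithm's belief about the user

def recommend():
    # Pick a genre proportionally to the learned weights.
    return random.choices(genres, weights=[weights[g] for g in genres])[0]

# The user only clicks on pop and rock; every click reinforces the weight.
for _ in range(500):
    g = recommend()
    if g in ("pop", "rock"):
        weights[g] += 1.0

total = sum(weights.values())
share_shown = {g: round(weights[g] / total, 3) for g in genres}
print(share_shown)
```

After a few hundred rounds, pop and rock dominate the recommendations and the other genres, Schranz included, shrink toward zero: the user never clicked on them, so the algorithm stops showing them, so the user never can click on them.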
It is easy to let Facebook influence you in only one direction, and you pull friends into it, too. It is also very easy to isolate yourself from other opinions. In the past this was hardly possible: reading a newspaper without seeing other headlines is almost impossible, and discussing events and opinions is usually enriched by even just a few dissenting views. Blocking those opinions used to be hard. Now it is just a few clicks away.
An opinion should not be based on only one voice or one side of the coin, but on hearing every side, thinking about what is right for yourself, and then deciding what your opinion really is. With social media this is not the case. What it lacks is an editorial service like traditional newspapers have: giving readers a differentiated picture of what happened, checking which "facts" are right and wrong, and exposing lies and fabricated stories.
For many years Facebook has seen itself as just a platform where people can share content. But simply distancing itself from responsibility for what is shared is not an option. Facebook realized this when it decided not to show nipples on the platform, it realized it when it faced lawsuits in Germany over incitement to hatred, and it realizes it now. What is missing is a solution to the problem. And firing all human editors and trusting only AI is maybe not the right one, because there is often a large context to consider. Is a nipple porn? What about a breast cancer awareness campaign? (Fun fact: in Germany there is supposedly a fixed angle up to which a penis may stand without counting as porn; I think it was 45°.)
For Spotify and Amazon this decision is an easy one: they tune the degree to which they show products they think the user will like, and compare the variants in an A/B test based on usage and revenue. For Facebook it is more complicated. It is not just about how much revenue they generate and how long users stay on Facebook; it is about how much they care about their users, and whether they care enough to give them fact-checked, differentiated information to base their own decisions on.
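The mechanics of such an A/B test can be sketched in a few lines. Everything here is hypothetical (the numbers, the "personalization share" knob, and the assumption that more personalization means slightly longer sessions); it only illustrates the comparison, not any real company's pipeline.

```python
import random
import statistics

# Hypothetical A/B test for a recommendation knob (illustrative only).
random.seed(0)

def simulate_session(personalization_share):
    # Toy assumption: more personalization means slightly longer sessions,
    # with some noise. Returns minutes spent in one session.
    return max(0.0, random.gauss(10.0 + 5.0 * personalization_share, 2.0))

# Control group sees 30% personalized content, the variant sees 70%.
group_a = [simulate_session(0.3) for _ in range(1000)]
group_b = [simulate_session(0.7) for _ in range(1000)]

mean_a = statistics.mean(group_a)
mean_b = statistics.mean(group_b)
print(f"A: {mean_a:.1f} min, B: {mean_b:.1f} min, lift: {mean_b - mean_a:.1f} min")
```

If engagement is the only metric, the test will always push toward more personalization, which is exactly the problem: nothing in this comparison measures whether the user saw a differentiated picture.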