What does OORP mean on social media

Mark Zuckerberg's announcement sounds like the result of an epiphany. "There is too much sensationalism, misinformation and polarization in the world today," the Facebook founder wrote in a blog post on Friday. "Social media enables people to spread information faster than ever before, and if we don't specifically tackle these problems, then we end up amplifying them." Anyone who has not been living under a rock for the past few months will nod in agreement.

Zuckerberg now wants Facebook's two billion users to see the most trustworthy news possible in their timelines and then debate it with their contacts. The only question is: what counts as trustworthy news? And what is fake news? Zuckerberg does not want to make that call himself ("We would not be comfortable with that"), nor does he want to leave it to experts ("That would not solve the objectivity problem"). Instead, he is handing it to the users of his platform. "As part of our ongoing quality surveys, we will ask people whether they are familiar with a news source and, if so, whether they trust it." Such "trusted sources" are to be given priority in the USA starting next week. This means people on Facebook will be more likely to see posts from, and links to, the sites that users themselves consider trustworthy. After the announcement, the share prices of well-known media brands such as the New York Times and Rupert Murdoch's News Corp rose.

The change has a direct impact on which news and analyses are given priority. It thus shapes the worldview of the many people who get their information mainly via Facebook. The total amount of news will not change, Zuckerberg wrote; only the weighting will shift toward those providers the community considers credible. So: what could possibly go wrong?

On Facebook, as Zuckerberg has discovered, people do not just debate objectively or share photos of funny unicorns. The election campaigns in the USA, France and Germany showed just how aggressive and cruel people can be there. Users of social networks are often surrounded only by people who think like them. In these "echo chambers," worldviews harden, until users decide for themselves which news to believe and which not, entirely independent of the facts. It was this phenomenon that made fake news big in the first place. On top of that comes deliberate manipulation: the short-message service Twitter just announced that during the US presidential election campaign, far more user accounts than previously assumed were linked to a troll farm connected to the Russian government.

"Wanting to separate the wheat from the chaff is a welcome step," says Jason Kint, head of the nonprofit industry association Digital Content Next, about Zuckerberg's idea. "The devil is in the details of how the company wants to do it: Can we really trust the ranking? What happens if it is manipulated?" With this change, Zuckerberg is in any case withdrawing the network from the responsibility politicians have demanded of it for the content distributed there and its weighting. The signal: Facebook is just a platform, just a distribution technology. Zuckerberg is choosing a path at the end of which, if things go wrong, the blame rests with the users. And as recent history has shown, a lot can go wrong.