YouTube will leave misinformation in the search results, but will no longer recommend it.
The Earth is actually flat, Americans never landed on the Moon, and the world is ruled by a secret government: the most popular conspiracy theories are very easy to find on YouTube. Because of filter bubbles, psychological conservatism, selective perception, and the illusory truth effect, people search the Internet for information that confirms their point of view and tend to ignore new information that contradicts established beliefs.
YouTube intends to counteract this unpleasant trait of human nature. On Friday, the largest video service on the Internet announced that it plans to exclude videos promoting conspiracy theories from its recommendations.
YouTube has been criticized for many years for recommending videos that spread disinformation. Now that policy is being tightened. On January 25, 2019, YouTube announced on its official blog that it would no longer recommend "borderline content" and videos that "misinform users in a harmful way", even if the videos themselves do not violate the Community Guidelines and therefore cannot be deleted.
YouTube says the policy change affects less than 1% of all videos on the platform. But with billions of videos in the YouTube library, that is still a very large number.
Recently, Facebook, YouTube, Twitter, and other UGC platforms have come under serious criticism for helping to spread misinformation, fakes, conspiracy theories, and other viral content that, by its nature, spreads easily on social networks. The situation cuts both ways. On the one hand, censorship of user-generated content conflicts with fundamental human rights. On the other hand, no one forbids people from expressing their opinions; the platform simply has the right to apply its own rules and, at the very least, not to help distribute fakes, even if they are profitable for it (more views, more advertising, etc.).
Social media are forced to heed these demands. For example, Facebook recently banned hundreds of accounts belonging to Sputnik employees who were engaged in propaganda in the Baltic states and Eastern Europe. Twitter is taking similar action, deleting accounts used for coordinated manipulation of public opinion.
The main complaint about social media is that they recommend dubious content even to users who have expressed no interest in it. For example, YouTube recently recommended, for no apparent reason, that millions of users watch footage of the September 11 terrorist attacks. The service is also accused of widening the political divide in the country by pushing already biased viewers toward more extreme points of view. Studies show that YouTube's algorithms really do steer a person toward ever more extreme videos on a topic he or she was previously interested in. The recommendation algorithms probably arrived at this behavior in the course of machine learning: it may simply be an effective strategy for maximizing the number of views (a toy sketch of this feedback loop follows below). But this is how YouTube turned into "one of the most powerful tools for radicalizing society in the 21st century," critics write.
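To make the claimed mechanism concrete, here is a minimal, purely illustrative Python sketch. Everything in it is an assumption invented for the example, not YouTube's actual system: engagement is modeled as growing with how sensational a video is, a viewer only watches videos reasonably close to their current taste, and watching shifts that taste toward what was watched.

```python
# Toy sketch (all numbers and models are assumptions, not YouTube's code):
# a recommender that greedily maximizes engagement can walk a viewer,
# step by step, toward ever more extreme content.

def engagement(extremeness: float) -> float:
    """Assumed engagement model: more sensational content is watched longer."""
    return extremeness

def recommend(catalog: list[float], taste: float, tolerance: float = 0.1) -> float:
    """Pick the highest-engagement video the viewer is still willing to watch,
    i.e. one close enough to their current taste."""
    watchable = [v for v in catalog if abs(v - taste) <= tolerance]
    return max(watchable, key=engagement)

catalog = [i / 100 for i in range(101)]  # extremeness scores from 0.00 to 1.00
taste = 0.30                             # a mildly interested viewer

for step in range(10):
    video = recommend(catalog, taste)
    taste += 0.5 * (video - taste)       # watching shifts taste toward the video
    print(f"step {step}: watched {video:.2f}, taste is now {taste:.2f}")
```

In this toy run the viewer's taste climbs steadily from 0.30 toward the most extreme content, even though nothing in the code aims at radicalization: the drift emerges from the greedy engagement objective combined with the feedback loop, which is the dynamic the critics describe.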
The new policy is also the latest example of YouTube's more aggressive approach to content that many find objectionable, even when it does not violate the Community Guidelines.
At the end of 2017, YouTube began placing "controversial religious or supremacist" content into a "limited state", in which videos are not monetized with advertising and comments and likes are disabled. Some videos are preceded by a short message warning that they may be inappropriate or offensive.
YouTube has named three examples of videos that it will no longer recommend:
- videos promoting phony miracle cures for serious illnesses;
- videos claiming the Earth is flat;
- videos making blatantly false claims about historical events, such as the September 11 attacks.
Of course, this is not enough to filter out misinformation completely, but censorship must be handled carefully, because it is a double-edged sword.
YouTube will continue to recommend dubious videos to users who have explicitly subscribed to the channels that post them, and misinformation will remain in search results: "We believe this change strikes a balance between maintaining freedom of speech and living up to our responsibility to users," YouTube wrote in its official blog.