Posted on February 9, 2018

How YouTube Drives People to the Internet’s Darkest Corners

Jack Nicas, Wall Street Journal, February 7, 2018

{snip}

People cumulatively watch more than a billion YouTube hours daily world-wide, a 10-fold increase from 2012, the site says. Behind that growth is an algorithm that creates personalized playlists. YouTube says these recommendations drive more than 70% of its viewing time, making the algorithm one of the biggest single deciders of what people watch.

The Journal investigation found YouTube’s recommendations often lead users to channels that feature conspiracy theories, partisan viewpoints and misleading videos, even when those users haven’t shown interest in such content. When users show a political bias in what they choose to view, YouTube typically recommends videos that echo those biases, often with more-extreme viewpoints.

A search for ‘the pope’ this week returned conspiracy theories and sensationalist videos alongside mainstream clips. The Journal conducted all searches while logged out of YouTube with history cleared, leaving the site with little user data on which to base its recommendations.

Such recommendations play into concerns about how social-media sites can amplify extremist voices, sow misinformation and isolate users in “filter bubbles” where they hear largely like-minded perspectives. Unlike the sites of Facebook Inc. and Twitter Inc., where users see content from accounts they choose to follow, YouTube takes an active role in pushing to users information they likely wouldn’t otherwise have seen.

“The editorial policy of these new platforms is to essentially not have one,” said Northeastern University computer-science professor Christo Wilson, who studies the impact of algorithms. {snip}

{snip}

YouTube sometimes surfaces conspiracy theories on innocuous queries. A search last week for ‘lunar eclipse’ returned a video, with just 3,000 views, that suggested the earth is flat.

{snip}

YouTube has been tweaking its algorithm since last autumn to surface what its executives call “more authoritative” news sources to people searching about breaking-news events. YouTube last week said it is considering a design change to promote relevant information from credible news sources alongside videos that push conspiracy theories.
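
The article doesn’t describe the mechanics of that tweak. As a minimal sketch, assuming the change amounts to re-ranking, such a system might gate a credible-source boost on whether a query resembles breaking news; the classifier, channel list and boost factor below are all invented for illustration:

```python
# Hypothetical sketch of boosting authoritative sources on breaking-news
# queries. AUTHORITATIVE_CHANNELS, looks_like_breaking_news and the 1.5x
# boost are assumptions; the article says only that YouTube surfaces
# "more authoritative" sources for such searches.

AUTHORITATIVE_CHANNELS = {"established-broadcaster", "major-newspaper"}

def looks_like_breaking_news(query: str, recent_headlines: set[str]) -> bool:
    """Crude stand-in for a news classifier: flag the query if any of its
    terms appear in recently published headlines."""
    return any(term in recent_headlines for term in query.lower().split())

def rerank(results: list[dict], query: str, recent_headlines: set[str]) -> list[dict]:
    """Each result is {'channel': str, 'score': float}. On breaking-news
    queries, multiply scores from authoritative channels by a fixed boost."""
    if looks_like_breaking_news(query, recent_headlines):
        for result in results:
            if result["channel"] in AUTHORITATIVE_CHANNELS:
                result["score"] *= 1.5  # assumed boost factor
    return sorted(results, key=lambda r: r["score"], reverse=True)
```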

{snip}

YouTube engineered its algorithm several years ago to make the site “sticky,” recommending videos that keep users watching still more, said current and former YouTube engineers who helped build it. The site earns money selling ads that run before and during videos.
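
The engineers don’t spell out the objective, but a stickiness-driven ranker can be sketched as ordering candidates by expected additional watch time; the Video fields and model estimates here are assumptions, not YouTube’s actual system:

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    predicted_click_prob: float     # model's estimate the user clicks the suggestion
    predicted_watch_seconds: float  # model's estimate of how long they would watch

def rank_for_stickiness(candidates: list[Video], k: int = 20) -> list[Video]:
    """Order candidates by expected additional watch time.

    A watch-time objective scores each video by click probability times
    expected watch duration, i.e. the viewing time the recommendation is
    expected to add. Nothing in the score reflects accuracy or source
    quality.
    """
    return sorted(
        candidates,
        key=lambda v: v.predicted_click_prob * v.predicted_watch_seconds,
        reverse=True,
    )[:k]
```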

{snip}

Cristos Goodrow, YouTube’s lead recommendations engineer, said this week that the algorithm struggles with news and political recommendations partly because “it’s basically the same system that’s working for people who come to YouTube for knitting or quilting or cat videos or whatever.”

There is another way to calculate recommendations, demonstrated by YouTube’s parent, Alphabet Inc.’s Google. It has designed its search-engine algorithms to recommend sources that are authoritative, not just popular.
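
As a rough illustration of the difference, a popularity-only score amplifies relevance by engagement, while an authority-aware score blends in a per-source quality signal. The weights and authority values below are invented; neither company publishes its actual signals:

```python
# Invented per-source quality scores in [0, 1]; in practice these might
# come from link analysis (PageRank-style) or editorial review.
AUTHORITY = {
    "established-newspaper": 0.9,
    "fringe-channel": 0.1,
}

def popularity_score(relevance: float, views: int) -> float:
    """Popularity-only ranking: relevance amplified by raw engagement."""
    return relevance * views

def authority_score(relevance: float, source: str,
                    w_rel: float = 0.6, w_auth: float = 0.4) -> float:
    """Authority-aware ranking: a weighted mix of query relevance and
    source quality, with unknown sources given a neutral 0.5."""
    return w_rel * relevance + w_auth * AUTHORITY.get(source, 0.5)
```

Under the first score, a fringe video with enough views outranks an accurate one; under the second, the authority term limits how far raw popularity alone can carry a low-quality source.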

{snip}

Consider the results of a search for “FBI memo” on Friday, several hours after Republicans released a memo on how intelligence officials sought a warrant authorizing surveillance of a former Donald Trump adviser.

On YouTube, after small thumbnails from mainstream news sources, the top result came from BPEarthWatch, which describes itself as “Dedicated to Watching the End Time Events that Lead to the Return of Our Lord Jesus Christ. Comets, Asteroids, Earth Quakes, Solar Flares and The End Time Powers.” There were also videos from Styxhexenhammer666, whose informational page simply says, “I am God,” and from Alex Jones, the founder of Infowars, a site that often promotes conspiracy theories.

In contrast, a Google search led users only to mainstream news sources.

Google spokeswoman Crystal Dahlen said that Google improved its algorithm last year “to surface more authoritative content, to help prevent the spread of blatantly misleading, low-quality, offensive or downright false information,” adding that it is “working with the YouTube team to help share learnings.”

{snip}

In October, YouTube tweaked its algorithm to return more mainstream sources on breaking-news queries after searches about the deadly Las Vegas shooting yielded videos claiming the government was involved.

YouTube’s results for a search for ‘Las Vegas shooting’ on Oct. 3, two days after the attack. The fifth result suggested the shooting was a government hoax.

In recent weeks, it has expanded that change to other news-related queries. Since then, the Journal’s tests show, news searches on YouTube return fewer videos from highly partisan channels.