Internet

Facebook and Google showcase Las Vegas fake news

People stand behind police barrier tape outside the Luxor hotel obelisk and the Mandalay Bay Resort and Casino, following a mass shooting at the Route 91 Harvest festival in Las Vegas, Nevada, on October 2, 2017.

Facebook and Google are facing fresh criticism for failing to hold back the tide of fake news online, as the aftermath of the mass shooting in Las Vegas once again revealed shortcomings in their algorithms.

Early on Monday the two leading online media companies helped showcase inaccurate reports that wrongly identified a man with strong leftwing leanings as being connected to the killings. The reports circulated on rightwing news sites before slipping through the automated filters used by Facebook and Google.

Both companies said the problems were short-lived and that they were working to fix the failures, but not before they had exposed themselves to a new round of criticism for not doing enough to prevent the spread of false and damaging information.

“There’s a way — the fact is, they don’t have the will,” said Scott Galloway, a professor of marketing at New York University and author of The Four, a new book about Amazon, Apple, Facebook and Google. He said the recent hiring of more staff to identify and remove false information was too limited to have an effect: “It’s pi**ing in the ocean — it’s a series of half measures.”

For Facebook, already under intense political pressure over the use of its network by Russian operatives during the US election, the latest slip has come at a difficult time. The misinformation, spread by a site called Alt-Right, appeared on Facebook’s “Safety Check” page, which people use to make sure their friends and family are safe after a crisis.

Facebook said the offending post was spotted by its global security operations centre but that “its removal was delayed by a few minutes”. In that time, it added, the post was “screen-captured and circulated online”.

The social networking company did not explain how its algorithms had allowed the fake information to be posted. “We are working to fix the issue that allowed this to happen in the first place and deeply regret the confusion this caused,” it said.

In Google’s case, a search for the name of the man wrongly accused of the shootings brought up a page of search results topped by three prominent boxes labelled “Top Stories”. One of these was a post from 4Chan, a site known for its online hoaxes and misinformation, which contained the false claim.

Google’s Top Stories are drawn both from its News service, which has some degree of curation, and from a general web search. The 4Chan result was drawn from the web.

While Facebook manually removed its post, Google said the 4Chan post was “algorithmically replaced”, and that this had taken “hours” from the time it first appeared. To protect itself from accusations of subjectively favouring some search results over others, Google relies on the weight of “good information” to drive out the bad from its results, or on making changes to its algorithms that affect all searches equally.

“This should not have appeared for any queries and we’ll continue to make algorithmic improvements to prevent this from happening in the future,” Google said.

Meanwhile, Twitter also came under fire on Monday after a user posted a screenshot of a search that returned a result from Infowars, a site frequently criticised for peddling conspiracy theories, as the top result. The post reported a claim from militant Islamist group Isis that it was behind the Las Vegas shootings.

Though Isis had made the claim, reporting its statement without pointing out that it was unsubstantiated was seriously misleading for readers, said Dan Gillmor, a digital media expert who teaches at Arizona State University. “If a responsible news organisation is going to mention it, it should be in context,” he said.

Twitter was unable to say how many users saw the search result, but said the personalisation in its system meant that people who searched for the same thing often saw different results.
