Facebook & Google scramble to battle fake news as outcry rises
Over the past two weeks, Google, Twitter and Facebook have been stepping up the fight against fake and malicious content, as critics say that the proliferation of such content on their platforms could have been one of the factors influencing the outcome of the US election in favour of Republican candidate Donald Trump
Published: Nov 30, 2016 8:27 AM
If the US election brought one thing to the fore, it was the immense role that the likes of Facebook, Twitter and Google play in shaping public opinion simply through the kind of content and articles that are shared or circulated on them.
Recently, both Facebook and Google announced that they will be taking new steps to curb websites that publish false or misrepresented content. Media reports say that Google has vowed to work on a policy change to prevent websites that misrepresent content from using its AdSense advertising network. Meanwhile, Facebook said that it has updated its advertising policies so that its misleading-content rules now explicitly cover fake news.
“We have already launched work enabling our community to flag hoaxes and fake news, and there is more we can do here. We have made progress, and we will continue to work on this to improve further,” said Facebook founder Mark Zuckerberg in a post on November 13, 2016.
While he admitted that hoax news was a concern, he noted that it amounted to only a small portion of all the content shared on Facebook.
“Of all the content on Facebook, more than 99% of what people see is authentic. Only a very small amount is fake news and hoaxes. The hoaxes that do exist are not limited to one partisan view, or even to politics. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other,” he wrote.
In a follow-up post on November 19, Zuckerberg explained in more detail how Facebook was looking to combat misinformation:
“Stronger detection. The most important thing we can do is improve our ability to classify misinformation. This means better technical systems to detect what people will flag as false before they do it themselves.
Easy reporting. Making it much easier for people to report stories as fake will help us catch more misinformation faster.
Third party verification. There are many respected fact checking organizations and, while we have reached out to some, we plan to learn from many more.
Warnings. We are exploring labeling stories that have been flagged as false by third parties or our community, and showing warnings when people read or share them.
Related articles quality. We are raising the bar for stories that appear in related articles under links in News Feed.
Disrupting fake news economics. A lot of misinformation is driven by financially motivated spam. We're looking into disrupting the economics with ads policies like the one we announced earlier this week, and better ad farm detection.
Listening. We will continue to work with journalists and others in the news industry to get their input, in particular, to better understand their fact checking systems and learn from them.”
But Facebook is not alone in the eye of the storm. Recently, The Verge reported that a Google search for the US election results was returning an obviously biased and factually inaccurate blog as its top search result, one that showed Donald Trump leading in both the popular vote and the electoral college (Trump ended up winning the electoral college but not the popular vote).
According to one report, as part of its ongoing fight against hoax content, Google will soon replace the ‘In The News’ section at the top of desktop search results with a carousel of “Top Stories” similar to the one seen on mobile. The idea behind the change, says the report, is to reduce confusion among users.
For those wondering how much of an impact false content could have on a major event like the presidential election, consider this: Pew Research, in a recent report, states that 62 per cent of US adults get their news from social media. In another report, the institute says that nearly 14 per cent of US adults cite social media as their most important source of information regarding the US election, second only to cable news (24 per cent) and tied with local TV (14 per cent).
John Oliver, host of Last Week Tonight, in his post-election episode pointed to one meme showing a disparaging quote purportedly made by Donald Trump about the Republican electorate that became popular on social media. “He never said that. It is not true, just as it is not true that the Pope endorsed Trump,” he said, referring to another popular news item that did the rounds in the run-up to the election.
“There is no longer a consensus on what a fact is,” Oliver further stated, possibly echoing what a lot of observers have been saying about social media for a long time. Even on a normal day, it is not unusual to find posts making the most bizarre claims in one's newsfeed, just as it is not unusual to see social media regularly "killing off" celebrities who are in fact hale and hearty.
The ease with which unverified, untrue and potentially damaging information spreads online is something that not only the tech companies but also users need to be wary of.