Mark Zuckerberg shares what Facebook is doing about fake news

Facebook and other tech companies continue to face accusations that fake news on their platforms may have influenced the recent presidential election, which ended in Donald Trump's historic win over Democratic rival Hillary Clinton.

Just a week ago, Mark Zuckerberg was calling it a "crazy idea" that a proliferation of fake news stories on Facebook, most of them with a conservative slant, helped Donald Trump win the election.

"The bottom line is: we take misinformation seriously," the Facebook CEO said in his post. Zuckerberg had initially dismissed the notion as "pretty insane", but this week Facebook and Google both said they would change their ad policies to prevent fake news websites from using their advertising systems.

Now, in a new post published on Facebook, he is taking the issue seriously.

Facebook has decided to put in place better technical systems to detect content that people are likely to flag as false before they flag it themselves.

He also said that, because Facebook depends on its community to flag fake content, it would make it easier for people to report fake news. That followed a similar step by Google, which acknowledged that it had let a false article about the election results slip into its list of recommended news stories.

"A lot of misinformation is driven by financially motivated spam", Zuckerberg acknowledged in his post.


For a long time, Facebook has heavily relied on input from the platform's community to cut down on the spread of misinformation or fake news.

Facebook co-founder Mark Zuckerberg recently claimed that 99% of what appears in people's feeds is authentic, but he has admitted that steps need to be taken to stop the dissemination of false news.

Zuckerberg called the problem "complex, both technically and philosophically".

"Some of these ideas will work well, and some will not", he concludes.

Facebook's concern with fake news predates the 2016 elections.

As Aarti reported Thursday, Facebook has long relied on users to flag suspicious or offensive stories - and it relies on subcontractors in the Philippines, Poland, and elsewhere to make quick yes-no rulings on those cases, often within 10 seconds.

Among the measures Zuckerberg outlined is working with more third-party fact-checking organizations. He noted that Facebook did not want to discourage the sharing of opinions or become "arbiters of truth". The company is also considering updated policies, along with stronger "ad farm detection".

Included in the post were seven points Zuckerberg laid out as a roadmap for how Facebook now plans to tackle the issue, even though, he said, revealing plans this early goes against the company's usual practice. Fake stories do, however, serve as click-bait for readers, who are then exposed to a site's advertising along with the misinformation, generating revenue for the site while fooling readers into thinking the news is real.
