Google set to crack down on terrorism

For example, university advertisements were placed alongside videos from Isis sympathisers.

On Sunday, the company said in a blog post that it would implement more measures to identify and remove extremist and terrorist content from YouTube. "Terrorism is an attack on open societies, and addressing the threat posed by violence and hate is a critical challenge for us all", the company said, adding that it thinks "this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints".

Google said it would rely on the specialized knowledge of groups with experts on issues like hate speech, self-harm and terrorism.

Walker said Google is also increasing its counter-propaganda efforts, employing technology that "harnesses the power of targeted online advertising to reach potential Isis recruits and redirects them towards anti-terrorist videos that can change their minds about joining".

The steps it plans to take were outlined in an editorial published in the Financial Times newspaper. Some of the most high-profile brands had quit the video-streaming platform because their ads were being placed on or next to extremist content. The company is bulking up on technology and further adding to its human team of reviewers, who will be able to catch what filters cannot. Videos with nudity, graphic violent footage or copyrighted material are usually taken down quickly.

While the platform is already using image-matching technology and has invested in systems that use content-based signals, Google is now pledging four additional steps.

During the final six months of 2016, Twitter suspended nearly 377,000 accounts for promoting terrorism.

In the wake of recent terrorist attacks in England, U.K. lawmakers have placed blame on the Internet for allowing extremism to spread unchecked. According to Walker, since 2012, one hour of content has been uploaded to YouTube each second.

YouTube has long struggled with how to handle videos that encourage violent behaviour and hateful ideologies.

A spokesman for YouTube said the new policies were not the result of any single violent episode, but part of an effort to improve its service.

"We will also expand our work with counter-extremist groups to help identify content that may be being used to radicalise and recruit extremists", Walker said.

Acknowledging that more needs to be done to tackle content that violates its policies, Google said it was taking four fresh steps to help do so.

Earlier this year, YouTube lost millions in advertising revenue as major brands temporarily paused spending after it was revealed that their ads were appearing next to videos with extremist views.

In the wake of the atrocities at London Bridge and in Manchester, Theresa May has urged social media companies such as Google, Facebook and Twitter to take down terrorist content.
