Google/YouTube: Unsafe At Any Speed

On the margins of this month’s UN General Assembly, Silicon Valley’s vainglorious social media emissaries were taking heat from European leaders — and they were deservedly sweating.

The occasion was an unprecedented event called “The Leaders Meeting on Preventing Terrorist Use of the Internet,” a gathering convened by European government leaders fed up with the irksome shortcomings and broken-record alibis social media companies offer up for why they are unable (more accurately, unwilling) to self-police dangerous content and expeditiously remove terrorist and extremist videos and accounts from their platforms.

The meeting was hosted by the UK’s Prime Minister Theresa May, and one European leader after another scolded social media companies for failing to live up to their public commitments to up their game against terrorist content. PM May, whose country had endured its seventh terrorist attack, the latest on the London Underground a few weeks ago, pressed social media companies to get their acts together:

“Industry needs to go further and faster in automating the detection and removal of terrorist content online and developing technological solutions which prevent it from being uploaded in the first place.”

European leaders indicated that over two thirds of the dissemination of new terrorist content takes place within two hours of release. Consequently, the combined leadership of Great Britain, France, the Netherlands and Italy on behalf of the European Union issued a direct two-part challenge to social media giants:

1. Develop solutions to remove such content within one to two hours of upload, with the wider objective of preventing such content from being uploaded in the first place; and

2. Develop technical tools to ensure that persons tempted by violent extremism are not exposed to content that reinforces their extremist inclinations (so-called “algorithmic confinements”).

Europeans lose men, women, and children to terror (as do Americans). Social media moguls, on the other hand, comfortably isolated in their glass-encased Silicon Valley ivory towers, don’t even lose a good night’s sleep over these dangers to the safety and well-being of Americans and their allies abroad.

Three months ago, Google-owned YouTube’s General Counsel pledged to speed up YouTube’s efforts to take down extremist content.

Since then, YouTube’s management team has spent more time deflecting and less time doing, and whatever it is doing proceeds at a snail’s pace. YouTube is failing so far to meet its commitments to the public: instructional ISIS terrorist videos (e.g., how to ram pedestrians or mix chemicals for bomb making) remain up far too long before being flagged by people like myself. Its management is tongue-tied about why it still refuses to deploy new and PROVEN software fixes to remedy its shortcomings.

Worse, instead of devoting resources to remedy its neglect of the public’s safety, Google’s executives are lavishly spending millions of dollars on lobbying efforts in Washington to prevent Congress from shining a public spotlight on the best ways to solve the problem. It seems that Google would just as soon hide under a rock rather than be transparent with the American people.

Just two weeks ago, Google dispatched its lobbyists to the Senate to prevent a Senate Judiciary Subcommittee chaired by Lindsey Graham from convening a Congressional hearing on social media and extremism. It succeeded! What was supposed to be a full-fledged hearing reverted to a closed door off-the-record briefing because Google did not want the media or the public in attendance. It reminded me of the tobacco industry doing everything possible to squelch Congressional hearings into the dangers of smoking.

Had there been an open public Congressional hearing this is what the public and the media would have discerned:

· According to the Counter Extremism Project (CEP) (www.counterextremism.com), YouTube is still inconsistently applying “warning labels” to the unquestionably inciting terrorist inspirational content of the dead radical cleric Anwar al Awlaki (https://www.counterextremism.com/press/anwar-al-awlaki%E2%80%99s-violent-legacy-continues-years-after-his-death).

· Despite appeals from governments and counterterrorism experts, Google-owned YouTube has refused to remove Awlaki’s most incendiary sermon, “Battle of the Hearts and Minds.” YouTube’s search, autofill, and recommendation features easily guide viewers to extremist content.

· American corporate ads are still appearing on extremist content, even though Google, which was boycotted by advertisers and lost over $700 million in ad revenue earlier this year, promised that would cease.

· Google refuses to invite third-party software developers who are not on its payroll to any meeting of its Global Internet Forum to Counter Terrorism, for fear it will be revealed that third-party software already exists to expedite a solution to the challenge.

· Terrorist groups, such as ISIS, are outsmarting social media companies by co-opting multiple platforms, such as Google Plus, Twitter, and Facebook, to avoid having any one platform serve as a dissemination point.

Eric Feinberg, a developer of GIPEC software that almost instantaneously identifies uploaded ISIS content and who monitors YouTube’s performance metrics, told me:

“The most consistent thing about Google YouTube fighting extremist content is its inconsistency.”

While social media companies dabble and dawdle fighting online extremist content, our terrorist adversaries are running circles around them.

Even on the run as its Caliphate collapses, ISIS is still able to upload about 20 inspirational and tactical videos to YouTube daily, the latter providing specific instructions on how to mow down pedestrians with a truck, how to construct high-explosive devices from chemicals available at your local hardware store, and how to construct suicide vests.

These instructional videos have been catalysts for recent terrorist attacks in Nice, London, Barcelona, and in Finland.

YouTube continues to shirk its duties by relying on third parties to come across these instructional terrorist videos at their leisure, rather than deploying available technology that would short-circuit their wide dissemination.

As CEP has proposed in its August 2017 report on Anwar al Awlaki [https://www.counterextremism.com/anwar-al-awlaki-counter-narrative]:

“The logical action for YouTube – and other companies controlling private content supporting services – is to remove such extremist content in the same manner as it removes…pornography.”

So why hasn’t Google deployed the available technology to take down terrorist content the same way it prevents pornography from being uploaded? Simple: it wants to avoid any financial exposure from assuming a duty of care to remove the content, even when that content violates the terms of service it sets for its customers.

Under current US law, Google can be held criminally liable and fined for enabling child porn to be uploaded to YouTube, whereas it is shielded under existing law from liability for terrorist content uploaded to YouTube. In other words, until Congress amends the law shielding social media companies from liability for disseminating terrorist content, there is simply no public burden on them to act decisively. And when they do act, it is on their terms and at their own miserly pace.

Too many lives have been lost that might have been saved had social media companies been compelled to act.

The New York Times stated above the fold in its Business Section on September 21st: “Pressure is Growing to Rein in Tech Titans.” The issue at hand is pending legislation that would hold social media companies accountable for hosting sex trafficking on their websites. Leading the charge against the bill, Google’s representatives asserted that changing the law (referring to the Communications Decency Act of 1996 – the very law YouTube hides behind to absolve itself of terrorist content liability) “jeopardizes bedrock principles of a free and open internet.”

So, I issue my own challenge to Congress: why stop at sex trafficking? Why not add a provision to Senator Portman’s legislation to compel social media companies to cease hosting terrorist incitement and tactical bomb-making instructional videos as well, and to allow state and local authorities to prosecute social media sites that host such content when they have the available means to expeditiously remove it? Aren’t flagrant terrorist incitement and instructional videos showing how to build terrorist bombs as dangerous, if not more dangerous, than sex trafficking?

Sooner or later, the public interest must prevail. Social media companies, such as Google, should no longer be permitted to have their cake and eat it, too. The time is long overdue to consider whether these companies should be regulated like public utilities – free to operate, but subject to limited regulatory oversight. Unless they are prepared to fulfill the pledges set out by our European allies, they must be held accountable.

ISIS and Al Qaeda continue to freely deploy dangerous content on social media platforms to incite and inspire terrorism against us. The first step in preventing more terrorism at home and abroad is to suffocate their ability to do so.
