Thursday, March 28, 2019

Radicalization and Big Tech

A bit of news I didn’t post a week ago, and it was delightfully swift: New Zealand has banned the military-style semiautomatic and assault guns used in the mosque massacres. They can no longer be sold, and there will soon be a buyback program for existing weapons. Owners who don’t comply will be fined. Many gun shop owners had already refused to sell what they had to people wanting to stock up before the ban took effect. Many owners have already turned in these kinds of guns.

NPR had a segment speculating on the difference between American and New Zealand gun culture. Second Amendment! A potent gun lobby! Yeah, but I think there is another component. America has a much stronger level of white supremacy. Our country was built on slavery.

Yes, there is white supremacy in NZ. Just ask any Maori.



The *Marketplace Tech* program on NPR runs about five minutes every weekday. Last week they did a series on how people get radicalized online and what Big Tech can do about it. Before I get into that, here are some sobering and scary statistics reported as part of Tuesday’s episode.

The gunman in the NZ massacres wore a helmet-mounted camera to record his carnage. The footage got posted online. YouTube worked to find and remove tens of thousands of copies of the video; YouTube said a copy was being uploaded every second. Facebook said it removed or blocked 1.5 million copies. I later heard they missed 20% of them, or about 300,000 copies.

Wow! My heart goes out to the content moderators who were traumatized as they encountered this violence again and again.

So we have a guy so into enforcing social hierarchy that he records video of himself killing 50 people. And we have perhaps tens of thousands more so into enforcing social hierarchy that they want to make sure the world sees (and is repeatedly terrorized by) this violence. The message is clear: This is what one of ours just did. Submit or you’re next.

Back to what Big Tech is doing. On Tuesday host Molly Wood talked about it with Becca Lewis of the nonprofit research institute Data & Society.

Extremist communities use humor both as a recruitment tool and because it offers plausible deniability (it’s just a joke!). Lewis’s group learned this from a leaked style guide. Extremists know how to stay just inside the lines by masking real meanings and using dog whistles.

There is definitely a playbook these groups use. They target disillusioned men who feel left behind by the system. They provide a community. They feed them explanations of who is to blame for their situation. Then they shift into blaming particular groups.

What can Big Tech do? Treat it as a serious issue. In the same way they tamped down ISIS propaganda videos, they need to tamp down supremacist videos. They should adjust their operations so that a person watching one extremist video isn’t offered more under the guise of recommending something similar. There are consequences, some unintended, of the way they do business.

As I understand it, these white men are disillusioned because they’ve been told all their lives that they are at the top of the hierarchy. But they look at their circumstances, and that doesn’t look like the top of the hierarchy. These men are also missing other things, such as being an integral part of a community. They are radicalized by others saying: you’re not at the top because those other people stole your spot.

On Wednesday Wood talked to Fathali Moghaddam, a psychology professor at Georgetown University. Radicalization takes place within online echo chambers (or media that acts like echo chambers), where the only voices are those of the group reinforcing each other. An attempt to simply shut it down will prompt users to move to a different platform. The reinforcement can be disrupted by refusing to amplify extreme content and instead pointing users to positive content. Perhaps platforms can also suggest offline help.

On Thursday Wood talked to Dipayan Ghosh, who used to work for Facebook and is now a researcher at the Harvard Kennedy School. Yes, it is hard for Big Tech to block or minimize radicalizing content. But not that hard. They’ve invested dollars into AI to better target ads; they should invest the same number of dollars into AI to identify supremacist content. They already sort out spam because they have an economic incentive to do so. But for other content they have an engagement model: show whatever keeps the viewer engaged, which usually means more of what was just shown. That business model needs to change. If they don’t do it themselves, Europe may force them to.
