This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

After last week’s mob at the Capitol, Facebook, Twitter and Reddit shut down accounts where people spread false narratives of voter fraud or plotted the attack. Some of the discussions of conspiracy theories and potential violence have moved to lesser-known fringe websites and apps including Gab, Telegram and 4chan.

I spoke with my colleague Sheera Frenkel about the risk of driving people away from the mainstream internet, and what she’s seeing from online conversations about possible further violence.

Shira: What are these lesser-known networks like Gab or Telegram like?

Sheera: Sometimes, like in Telegram groups, it can feel like a disorganized family group text with people talking over one another. But the conversations are usually off the rails. There is a lot of profanity.

And while these online forums typically say they’re havens for people to express any view, there’s a lot of intolerance for ideas that go against the groupthink. If someone in the comments says something like, “Let’s be open to the possibility that Joe Biden will be inaugurated as president,” that person is verbally attacked.

Is it counterproductive for mainstream social networks like Facebook to shut down groups discussing conspiracy theories or planning violence? Does it make people angrier and push them elsewhere online?

It’s complicated. It’s helpful to push conspiracists and extremists off Facebook, Instagram, YouTube and Twitter, which have been fertile ground for them to recruit mainstream followers. But yes, when people move to fringe websites, there are fewer opportunities to dissuade them from extreme beliefs.

People who study extremist movements say that the moment when someone starts to believe in a conspiracy theory or terrorist propaganda is the most effective time for others to step in and have a conversation about it.

If you see your cousin questioning on Facebook whether dead people voted in the election, you can have a conversation about the evidence that those claims aren’t true. That probably can’t happen if people are talking about false claims of voter fraud on websites where almost everyone else agrees with them.

Since last week’s Capitol attack, what have people discussed on these lesser-known networks?

The Capitol breach emboldened people to think about what might come next. I’ve seen debate in these fringe groups about whether people should try to disrupt the inaugural proceedings or — and this is becoming more prevalent — whether they should bide their time. It’s important for people to understand that there’s a risk of more violence, even if the inauguration goes on without incident.

(Also catch Sheera’s interview on “The Daily” podcast about the online organizing after the Capitol attack. And my New York Times Opinion colleagues have an analysis of people who shifted over time from banal Facebook posting to sharing inflammatory views.)

From your reporting on the Islamic State and far-right groups in America, what have you learned are effective tactics against extremism?

A lesson from ISIS is that countering extremism requires cohesive action against both online and real-world behavior. Tech companies, supported by the U.S. government, worked together to kick ISIS out of mainstream social networks. That was paired with initiatives in the Muslim world to deradicalize people and military action against ISIS.

Experts say that the fight against extremists in America can’t just be social media bans. It takes expertise, funding and a commitment to reach people in schools and other places in their community to counter those beliefs.

If you don’t already get this newsletter in your inbox, please sign up here.


Sheera also wrote an article with Jack Nicas and Mike Isaac about the reasons behind a recent surge of new people using Telegram and Signal, messaging apps that give users the option for encrypted communications. That technology garbles the content of messages or phone calls so that no one but the sender and recipient can snoop on them.
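To make the idea concrete, here is a deliberately simplified sketch of how a shared secret key can garble a message so that only the key holders can read it. This toy one-time-pad example is for illustration only; real messaging apps like Signal and Telegram use far more sophisticated, vetted protocols.

```python
# Toy illustration of message encryption (NOT a real protocol):
# a shared secret key scrambles the text so only key holders can restore it.
import secrets

def encrypt(message: bytes, key: bytes) -> bytes:
    # XOR each message byte with the matching key byte to scramble it.
    return bytes(m ^ k for m, k in zip(message, key))

def decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same operation restores the message.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # random key as long as the message

ciphertext = encrypt(message, key)  # looks like random noise without the key
restored = decrypt(ciphertext, key)
```

Anyone intercepting `ciphertext` without `key` sees only scrambled bytes; only the sender and recipient, who share the key, can recover the original text.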

Whenever there is attention on people using encrypted tech, it’s a chance to look at the good and the harm. Many pro-democracy activists in Hong Kong have organized on Telegram, in part to avoid detection by the authorities. But terrorists and child abusers also use encrypted technology to hide their tracks.

For years, those dangers have led law enforcement agencies to demand that tech companies create a way, a so-called back door, for them to peer into encrypted messages or burrow into encrypted iPhones. But security and privacy experts say that there’s no way to let good guys tap into encrypted technology without bad guys abusing it.

“The moment you create a back door, it’s an opportunity for oppressive governments to spy on journalists or pro-democracy activists,” Sheera told me. “I use encrypted apps every day to speak with sources.”

Jack has written before about the benefits of a messy middle ground between encryption absolutists and law enforcement.

That involves law enforcement focusing on targeted forms of intelligence gathering, including hacking encryption in individual cases — which the police do often — and doubling down on traditional investigative techniques when they don’t have access to every piece of digital flotsam.

Some technologists have also said that to balance the downsides of encryption, it might not be appropriate to use it in all circumstances.


Muji the cat hid for 11 days in the ceiling at La Guardia Airport before she was reunited with her owner. Here’s the complicated rescue mission that involved Abby the tracking dog and canned tuna.


We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.


Originally published in The New York Times.
