
Mike Benz (Part 1): The West’s Burgeoning Censorship Industry and the Government Funds Pouring In

“Whoever can control the Department of Dirty Tricks is able to use it to remove all opposition,” says Mike Benz.

He is the executive director of the Foundation for Freedom Online and a former State Department diplomat under the Trump administration.

The Twitter Files were just the tip of the iceberg, says Benz, who has been tracking the rise of the West’s censorship industry for years.

“22 million tweets were categorized as misinformation for purposes of takedowns or throttling through [the Election Integrity Partnership],” Benz said.

“It wasn’t just government individual takedown requests. It was government pressure … to create whole new categories of things to censor and then arming them with the artificial intelligence to then automatically scan and ban the new thought violations.”

In this comprehensive two-part interview, Benz breaks down the major players in today’s censorship regime and how tactics once used abroad were deployed to target Americans and so-called election “delegitimization” or COVID “misinformation” online.

“Graphika was immediately working with NATO’s essentially psychological warfare branch—the Hybrid Center of Excellence—in January 2020 … They had this sophisticated topography of what right-wing media was saying, what left-wing media was saying, what was being shared, the nodes and links between nodes of all the different narrative discourses on social media.”

“They will have a revolving door at the professional level. That is, people who are in government roles, for example, in Misinformation, Disinformation, and Malinformation at DHS, will get their next jobs at the German Marshall Fund or the Atlantic Council’s Digital Forensic Research Lab … It is a career path. It is a path to power,” Benz says.




Jan Jekielek:

Mike Benz, it’s such a pleasure to have you on “American Thought Leaders”.

Mike Benz:

Thanks for having me.

Mr. Jekielek:

Mike, in our conversations, you told me that you have a mission of fostering a free and open internet. Where are we now? You’re basically saying that this is not what we have.

Mr. Benz:

We’re very far removed from the days of what I consider to be the golden age of the internet, between 2006 and 2016, when you had a mature social media ecosystem where people could share information, basically a pure information meritocracy. After that, the political turbulence of the events of 2016 instituted a revenge of the gatekeepers, an increasingly regimented system of censorship that we are now in the process of negotiating our opposition to.

Mr. Jekielek:

You’re saying that something profound happened in 2016 that changed the ecosystem dramatically. You said it was political turbulence, but what actually happened? How did the system change?

Mr. Benz:

There were two enormous and unexpected political events that year. In June 2016, you had Brexit. Brexit at the time was not just a small isolated domestic issue within the United Kingdom, it was viewed as an existential threat to the integrity of the European Union. Because at the time there was a fear that France would then go through Frexit with Marine Le Pen’s movement. Italy would go through Italexit with Matteo Salvini’s movement. You would have Grexit in Greece, and Spexit in Spain. The EU would come undone, and NATO would fall apart. The entire rules-based international order would collapse if something urgent wasn’t done about it.

And then, in quick succession, you had a candidate who at the time was an almost 20-to-one underdog in the New York Times’ forecast. On the morning of the 2016 election, you had Trump at about 5 per cent and Hillary Clinton at 90 per cent, with a little bit left over for the stragglers. But basically, it was this idea that this couldn’t happen, and yet it did. And it seemed like everything was going to fall apart with the rules-based international order unless the information ecosystem was radically and permanently altered. Because both of these events were viewed as being internet elections, if you will.

Social media was the reason that Nigel Farage developed the popularity of the Brexit movement. It was through his viral YouTube speeches to the European Parliament. It was the domination of Twitter hashtags and Facebook groups that was responsible for Donald Trump’s popularity at the base level. So, you had an organized effort to contain populism by containing the means through which populists could distribute their messaging and mobilize politically.

Mr. Jekielek:

Populist seems like a catchall term. Is it actually populists that we’re talking about?

Mr. Benz:

That’s their terminology. It’s fair to use because it captures the idea that base level opposition to elite institutions can come from both the Right and the Left. It’s not necessarily a Right-wing or a Left-wing thing. Left-wing populists like Bernie Sanders in the U.S. or Jeremy Corbyn in the UK were targeted with equal ferocity. It’s just that they didn’t come as close to power as Trump and the Brexit movement did.

Mr. Jekielek:

Why don’t we just sketch out where we are today? You describe it as a whole of society effort, which just sounds massive and unbelievable. You’re saying that a lot of people are beginning to understand what this is. They might know, “Oh, the Twitter files have exposed a lot of censorship.” They might have themselves experienced something, but they can’t necessarily see the whole picture. The whole of society, what does that mean?

Mr. Benz:

That’s actually the terminology of basically every mainstream censorship industry professional.


“Addressing disinformation requires a whole of society approach.”

“Disinformation is not going to be fixed by governments acting alone. I think we’ve seen that a whole of society effort is really key to the solution.”

“This is a whole of society challenge.”

“A whole of society approach. This is a whole of society problem.”

Mr. Benz:

This is something that is now such a well-worn phrase within the censorship industry that they often apologize at conferences for using the term. What it means is four categories of institutions in society all working together toward the common goal of censorship: government, the private sector, civil society, and news media and fact checking. So, let’s break down these four elements.

You’ve got DHS, the FBI, the DOD, the State Department, the National Science Foundation, the CIA, and the National Endowment for Democracy. On specific issues like Covid censorship, you’ve got HHS, NIH, CDC, and NIAID, all playing various roles at the government level.

Then, you’ve got the private sector, and you’ve got the tech platforms where the censorship actually occurs. That is where the button gets pressed, so to speak, or where the algorithms play out. You’ve also got private sector censorship technology development, which is the private companies whose job is to create machine learning and artificial intelligence to incorporate the training data to create the tools that are used for the active censorship.

And then, you’ve got corporate social responsibility, the CSR money that pours into it from the private sector. In fact, there’s a whole new impact investing angle, VCs investing in censorship companies, because there’s such a gold rush into this field. On the civil society side, you’ve got universities, NGOs, activists, nonprofits and foundations.

And then finally, at the news media and fact checking level, you’ve got the politically like-minded within the media who are propped up by the government, by the private sector, and by civil society so that they can manage public narratives about various issues and amplify pressure for censorship, by creating negative press on the tech companies, for example. You’ve got the fact checking conglomerates within that sphere who flag the individual posts for the tech companies to manage. So, all four of those, working in concert, have been fused into basically the nucleus of a single atom.

Mr. Jekielek:

It’s hard to conceive how all of this works.

Mr. Benz:

When they have disinformation conferences, there will be representatives from all four institutions there. They will negotiate what their own preferences and needs are, and they will talk with each other about doing favors for favors. They will work out common terminologies, and common problems that they’re having.

They will have a revolving door at the professional level. People who are in government roles in misinformation, disinformation, and malinformation at DHS will get their next jobs at the German Marshall Fund, at the Atlantic Council’s Digital Forensic Research Lab, or at the Alliance for Securing Democracy. Stanford University has a fellowship in this space.

It is a career path; it is a path to power. We’re now going on essentially year five or six of this industry being created, so it’s reaching a stage of maturity, as it would for a technology space or an energy space. It’s becoming much more seamless as these roles become more interchangeable.

Mr. Jekielek:

What is it that unites these people, is it ideology?

Mr. Benz:

Different people are in it for different reasons. What I find most fascinating is the young people. It’s my contention that censorship is the fastest growing major on college campuses for ambitious young people who want jobs in Washington DC, or in Silicon Valley. Traditionally, a top career path was to go to Georgetown, major in international relations, and aspire to get a job on the Hill, then work your way up; or maybe you’d start in finance and then transition over.

What has happened with the rise of the censorship industry is this: they don’t call it that, of course. You don’t get your degree in censorship; you get it in something like computational data science, advanced linguistics, the internet research lab, or the media lab. There are so many different ways to launder the concept, but essentially what they’re doing day-to-day in these majors and in these PhDs is fusing the social sciences with the computer sciences to help both Silicon Valley and big government control public discourse and the political momentum of various ideas.

This puts young people right at the nexus of Google, Facebook, Washington DC, and Congress. So, you can shortcut making a tiny salary on the Hill out of Georgetown. You can parlay that pedigree into the long term by going directly over to Google’s content moderation team or public policy team and working directly with Congress there, or essentially working directly with congressional cutouts. It is a path to power that is stunning both in the salaries these folks make and in how glitzy it is.

You really do get the cocktail party invitations, you really do get access to a beautiful life, and you get impact. You’re not a sort of desk jockey who’s correcting typos for the first five years of your career, you’re in the action. So, I think it’s very exciting for people, and I think they become very intoxicated with the power, the god-like power, if you will, that total censorship capacity gives you.

Mr. Jekielek:

As I’m listening to you speak, I’m still having trouble imagining how in 2016 this whole industry suddenly launches or is created. You’re saying it’s not out of nothing. You’re saying it’s maturing at this time, and it happened without most people being entirely aware, even though they were aware that there was more censorship, especially if they were targeted, of course. But you never imagined it would be something so grand as what you’re portraying here.

Mr. Benz:

These things were not on the front page of the New York Times or the Wall Street Journal. You pick it up in strange vibrations. For me, I came to it through the artificial intelligence space. I was an avid chess player as a kid, and I lived through that period when computers overtook humans in the capacity to play chess well.

I remember all the naysayers saying, “Chess computers will never be able to beat Garry Kasparov,” or “There will always be this ability to have the purity of the human spirit pierce through the dead soul of a chess computer.” And then, I remember the existential dread that came over the chess community when Garry Kasparov lost to Deep Blue, and it was like humans would never be able to compete against computers again. It was like this existential question, “What do we do in a world where you’ve got no hope?”

I remember in late 2016, when I first came across literature around the deployment of artificial intelligence for purposes of content moderation, it gripped me. I became fixated at the cognitive level on the existential threat that this posed. Every time I would try to have conversations with folks about it, both socially and politically, nobody took the concern seriously and laughed it off, in a very similar way that people did in 1996 before the Garry Kasparov match.

And so, for me, none of what’s happened has been a surprise to me. I only wish that folks had taken the issue much more seriously before the infrastructure became consolidated. Because now, it’s like trying to stop a cancer once it has already metastasized into the brain and the lungs, it’s much harder to do. It’s still essential to do, and that’s what I consider to be my purpose.

Mr. Jekielek:

What is it that you saw exactly? What did you realize that no one else realized?

Mr. Benz:

The power of control over words was very similar to the power of control over chess pieces. The way chess engines work is they condense everything into a number system, so that you can grade every aspect of a chess position on a number scale and spit out a clean number that tells you who’s winning and by how much. For example, if the computer says the position is -0.5, it means that the computer assesses the person playing the black pieces to be up by approximately half a pawn.

When I started looking into what was being done with artificial intelligence and natural language processing and machine learning training models that were being developed, they were using a very similar system to map linguistically what was happening in the human language on social media. If someone was talking about a Trump policy, you could map the linguistic topography of that narrative and you could grade all the different words and slogans and memes and concepts into essentially what looked like a chess computer readout for whether you want to play knight to F3 or bishop to C5.

The power this gives you is to be able to automatically trip varying levels of interventions, as they call it, which means censoring things. If the score goes above 1.5, the thing just gets banned. If it’s between 1 and 1.5, we’re going to shadow ban it. If it’s between 0.5 and 1, we’re going to just affix a fact check to it. It gives you perfect control over the ability to determine the popularity of a narrative.
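The tiered intervention logic Benz describes here can be sketched in a few lines of code. This is purely an illustration of the idea: the score scale, the cutoffs, and the function and action names are hypothetical, not taken from any actual moderation system.

```python
# Illustrative sketch only: the thresholds and action names are hypothetical,
# mirroring the tiers described in the interview.

def intervention(score: float) -> str:
    """Map a model-assigned narrative score to a moderation action."""
    if score > 1.5:
        return "ban"         # outright removal
    if score > 1.0:
        return "shadow_ban"  # quietly suppress distribution
    if score > 0.5:
        return "fact_check"  # attach a fact-checking label
    return "none"            # no intervention


if __name__ == "__main__":
    for s in (1.7, 1.2, 0.8, 0.3):
        print(s, "->", intervention(s))
```

In this sketch, a post scoring 1.2 would be shadow banned while one scoring 0.3 is left alone; the point is that a single scalar, like a chess evaluation, mechanically determines a narrative's fate.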

Mr. Jekielek:

Let’s talk about the Twitter Files. Okay, we’ve known about censorship for a while. At the Epoch Times, we’ve experienced hit pieces, and the deplatforming and demonetization associated with such hit pieces. This is some of what we’ve been talking about here. But what the Twitter Files revealed to me was the extent of the censorship happening.

The thing that really hit me at one point, as we were looking at these dumps, was the ability to shape the perceptions of a significant portion of society just by excluding information. This is what you’re making me think of right now as you describe this chess analogy. But you say that the Twitter Files are just the tip of the iceberg?

Mr. Benz:

A very tiny tip of it. The fact is, my foundation, the Foundation for Freedom Online, had already covered a lot of the things that ended up coming out in the Twitter Files. A lot of this was available just by listening to the folks involved in their own public meetings. A lot of these things were on YouTube, or were posted as Facebook videos, or were on their own websites. What the Twitter Files revealed was basically the presence of censorship operatives at virtually every national security-related institution in the U.S. government, as well as in the intelligence and public health spheres.

There were Twitter Files for the FBI, for the DHS, for the DOD, and for the State Department. I saw that at the State Department myself, everything from funding censorship-themed video games to promoting censorship of populist groups around the world, often with a conscious view of it having a boomerang effect on limiting the popularity of populist groups in the U.S. What the Twitter Files tended to focus on, even in their most explosive cases, were one-off requests for censorship takedowns.

For example, the FBI would send a message to the Twitter Trust and Safety Team saying, “Here’s a batch of six or seven tweets that we don’t like, and we want you to take down. They violate your terms of service, so you may want to take them down.” That only captures the tiniest fraction of censorship that was actually done in each of the major geopolitical events that we’ve experienced in the past few years.

Look at these six or seven takedowns in the context of something like the Election Integrity Partnership [EIP], which had a formal partnership with the Department of Homeland Security to operate as its formally designated disinformation flagger. 22 million tweets were categorized as misinformation for purposes of takedowns or throttling through the EIP.

Compare that to the six or seven tweets highlighted in a Twitter Files dump. That’s six or seven orders of magnitude apart; it’s not even the same ballpark. This is because it wasn’t just government individual takedown requests, it was government pressure and coordination with the changing of policies in the private sector itself, to actually coerce the tech companies to create whole new categories of things to censor, and then arming them with the artificial intelligence to automatically scan and ban the new thought violations that they themselves had helped install. So, they did a one-two punch behind the scenes that the Twitter Files still have not even come close to touching.

Mr. Jekielek:

How are you cataloging all this? Where are you discovering all this, and the evidence of this happening?

Mr. Benz:

What we just covered was stated very frankly and directly by an individual named Alex Stamos, who was the head of the Stanford Internet Observatory, the anchor entity of the Election Integrity Partnership.

Speaker One:

My suggestion is, if people wanted to get the platforms to do stuff, first you got to push for written policies that are specific and that give you predictability. And so, this is something we started in the summer, in August: as Kate talked about, Carly Miller led a team from all four institutions to look at the detailed policies of the big platforms and to measure them against situations that we expected to happen. Now, we’re not going to take credit for all the changes they made, but we had to update this thing like eight or nine times, right? And so, putting these people in a grid to say, “You’re not handling this, you’re not handling this, you’re not handling this,” creates a lot of pressure inside of the companies and forces them to grapple with these issues, because you want specific policies that you can hold them accountable for. The second is, when you report stuff to them, report how it’s violating those written policies, right? So, there’s two steps here: get good policies, and then say, “This is how it’s violating it.” We will have our statistics, right? But I think we were pretty effective in getting them to act on things that they hadn’t acted on before.

Mr. Benz:

The November 9th, 2022 report has about 20 to 25 embedded videos of censorship professionals confessing what they did. What I just cited here is how EIP, using DHS’s clout and pressure on the backend, coerced the tech companies to create a new category of censorship called delegitimization, which was anything in the 2020 election that delegitimized public faith or confidence in mail-in ballots, early voting drop boxes, or ballot tabulation issues on election day. 100 per cent of their targets were Trump voters and Right-wing populist groups.

It was the tech companies that didn’t want these policies initially, but they were pressured by EIP and EIP’s friends in the legislature: Amy Klobuchar, Elizabeth Warren, Mark Warner, Adam Schiff, and this whole intelligence committee and foreign affairs committee faction, as well as others in the DNC, who pushed the tech companies to create the censorship category.

And then, he laid out in that video the two-step process: one, you get them to change the policies by putting them in the grid, threatening them, and creating negative news media. And two, you engage in mass documentation and help capture all the violations of the new policies you just got put in. Now, the reason they do all these confessions on video is because you have to understand, censorship is not just an industry, it is a mercenary business.

Everyone in the censorship industry is competing for the same pool of government grant funds and donor dollars. It is a competitive industry at this point, we’re not in 2018, 2019 anymore. It is a mature industry with many players in it. You need to stand out. You need to prove what a good mercenary you are, what a good censor you are, how effective you are at silencing the opposition to the donors and the grant organizations.

You need to brag about it on video, so that you appear more qualified than your opposition and your competitors at getting more government grants. In fact, right after Alex Stamos made this confession, not just on video but in a 292-page public report, he and the lab that he partnered with got a $3 million government grant from the Biden administration. They became government-funded for the first time ever, right after he made that confession.

Mr. Jekielek:

So many things are coming out of what you just said. But the first one is that this is now actually a competitive market for censorship that you’re talking about.

Mr. Benz:

It is an industry. It is a business subsidized by the federal government and by large entrenched commercial and political interests who all have varying investment in neutralizing opposition to their concerns, which can be done through censorship. Because social media is the great equalizer when it comes to creating social and political momentum.

Mr. Jekielek:

What you’re describing is really interesting. You’re talking about it in the context of election integrity, you used that term. But it also applies directly when it comes to Covid misinformation. Are the exact same tools essentially being used in the same way?

Mr. Benz:

Actually, it’s funny you say that, because we just covered the Election Integrity Partnership, EIP. It’s the entity that DHS formally partnered with as its disinformation flagger. When the 2020 election ended, they had censored their 22 million tweets. They had 120 staffers censoring Trump supporters for the 2020 election for DHS. There was no more election cycle until 2022, when they came back and partnered with DHS again for the midterms.

But in between then, they folded up briefly and then rebranded and renamed themselves as a new entity consisting of the same censorship entities. But instead of calling themselves EIP, they called themselves VP, the Virality Project. They did the exact same system of coordinating the government, the civil society, the private sector, and the news media and fact checking organizations.

Instead of doing election censorship, they did Covid censorship, but they did it with the exact same ticketing system. They had the exact same relationships with Facebook, with Google, with YouTube, with Twitter, with TikTok, with Reddit, and with the 15 different platforms they monitored. They had the same system of chopping up conceptual opposition, which in the election context was opposition to mail-in ballots and drop boxes and ballot tabulation