‘Misinformation’ panel dismissed censorship concerns as ‘bad faith’

‘It should come as no surprise that this activity was facilitated by unaccountable bureaucrats in government’

WND News Center

Members of a Department of Homeland Security (DHS) advisory committee on combating “misinformation” privately cast critics of their work, including those who raised the alarm over government censorship of free speech, as malign actors, according to documents obtained by the Daily Caller News Foundation.

An advisory panel under the Cybersecurity and Infrastructure Security Agency (CISA), called the Protecting Critical Infrastructure from Misinformation and Disinformation Subcommittee, issued recommendations to CISA in June on how to address threats to “critical functions” of democracy, including public health measures, the financial system, elections and the court system. The subcommittee recommended CISA detect “informational threats,” work with “non-governmental” sources to dispel mis- and disinformation and expand misinformation research.

However, behind the scenes, subcommittee members indicated their intent to expand CISA’s role over speech, characterizing critics of their work as bad faith actors, according to emails and meeting notes obtained by the DCNF through a public records request. The communications demonstrate the attitudes that members of the subcommittee held regarding speech, at the same time that they were advising the DHS on how to crack down on misinformation.

The subcommittee included University of Washington professor Kate Starbird, Center for Strategic and International Studies senior adviser Suzanne Spaulding, and former Twitter chief legal officer Vijaya Gadde, who played a key role in censoring the Hunter Biden laptop story, along with CISA officials.

The DCNF previously reported that the subcommittee sought to enlist left-wing research groups and pro-censorship organizations in its efforts to crack down on misinformation. Additionally, CISA Director Jen Easterly partly accepted the subcommittee’s recommendation that CISA fund outside research into misinformation and disinformation, with guidelines to be fleshed out in 2023.

“These censorship laundering schemes involving the federal government, social media companies, and other private and quasi-governmental actors are unlawful and an affront to the First Amendment,” Republican North Carolina Rep. Dan Bishop, chair of the House Homeland Security Committee’s Oversight subcommittee, told the DCNF. “Those who perpetuate these schemes seem dead set on destroying freedom of speech and suppressing any speech that they perceive as ‘harmful’ to their preferred narrative.”

‘Bad faith’

Much of the subcommittee’s discussions centered on the role CISA should play in regulating information. For instance, subcommittee members sought to expand CISA’s responsibilities regarding “malinformation,” defined in the committee’s June 2022 report as “information that may be based on fact, but used out of context to mislead, harm, or manipulate.”

CISA currently works with social media companies to flag “disinformation concerns” as part of its larger project to combat mis-, dis- and malinformation (MDM), according to its website, with a focus on elections and COVID-19.

In a May 17, 2022 exchange with Spaulding and Gadde, Starbird noted free speech concerns regarding CISA’s jurisdiction over malinformation, and appeared to blame bad actors for the popular notion that true information is within the bounds of democratic discourse. Additionally, she acknowledged that cracking down on malinformation might make the committee subject to criticism, though she dismissed such criticism as in “bad faith.”

“Hacked/stolen/deceptively obtained materials that are strategically leaked into the public sphere are technically malinformation – but unfortunately current public discourse (in part a result of information operations) seems to accept malinformation as ‘speech’ and within democratic norms,” she wrote. “By invoking malinformation, we may open up a potential vector of bad faith criticism to undermine our work. By not invoking it, we leave out a critical dimension of influence operations.”

Starbird then suggested firmly establishing that malinformation was within CISA’s purview, invoking alleged threats to democracy.

“So, do we bend into a pretzel to counter bad faith efforts to undermine CISA’s mission?” Starbird asked. “Or do we put down roots and own the ground that says this tactic is part of the suite of techniques used to undermine democracy?”

Spaulding agreed with the latter suggestion.

“The censorship bureaucrats acknowledged free speech concerns but said that anyone who objected on behalf of free speech was acting in ‘bad faith,’” Mike Davis, founder of the online watchdog Internet Accountability Project, told the DCNF. “It should come as no surprise that this censorship activity was facilitated by unaccountable bureaucrats in government.”

Starbird defended her comments to the DCNF, noting she was open to “good faith” criticism.

“We were concerned, however, about ‘bad faith’ criticism — i.e., criticism employed for strategic reasons to undermine the work of the subcommittee, score political points, and make it more difficult for society (broadly) to address misleading claims about elections,” Starbird said.

However, she characterized allegations of censorship as “bad faith.”

“We felt that the term ‘malinformation’ was too broad and ill-defined to be useful to what we were recommending and that it would be an easy focal point for bad faith criticism that would attempt to equate efforts to address informational threats with ‘censoring content that we don’t like,’” Starbird told the DCNF.

During a March 29, 2022 meeting with the subcommittee, Maricopa County Recorder Stephen Richer cited citizens filing public records requests to “abuse” state resources as an example of what he claimed was malinformation, according to minutes obtained by the DCNF.

The subcommittee did not appear to push back on the suggestion.

Additionally, in closed meeting notes obtained by the DCNF, the subcommittee appeared to cast those who criticized the involvement of the federal government in regulating information as “bad faith,” while acknowledging “good faith” concerns about the board’s purview.

“[W]e have been working to address concerns about releasing [recommendations] amidst the outcry about the Disinformation Governance Board – that outcry includes good faith criticism (re: about the board’s mandate, potential partisanship, freedom of speech); it also featured bad faith attacks (by some who benefit from MDM … and meant to reduce our collective response to MDM).”

In an earlier email exchange with subcommittee members, Spaulding reiterated that malinformation, while “not false,” was within CISA’s mission to address, but that their recommendations should stick to mis- and disinformation as that was easier to defend; other subcommittee members eventually agreed.

Indeed, in the subcommittee’s June 2022 recommendations, “malinformation” is mentioned as a threat to critical functions but is not addressed through any substantial recommendations.

“Future recommendations may seek to address the potential impacts on other critical functions and some of the unique challenges in identifying and countering malinformation,” the report reads.

Overseeing social media

In its June recommendations, the subcommittee also advocated for CISA to keep information channels ranging from social media to talk radio “in view.”

“CISA should approach the MD problem with the entire information ecosystem in view. This includes social media platforms of all sizes, mainstream media, cable news, hyper partisan media, talk radio, and other online resources,” the recommendation read.

In a March 16, 2023 statement in response to media criticism of its efforts, the University of Washington’s Center for an Informed Public argued that this language should not be construed as recommending CISA “monitor” these information sources.

“To be clear, the MDM subcommittee explicitly never advocated for CISA to monitor or ‘closely monitor’ anything,” the center said. “During internal discussions, the members noted that questions about social media monitoring by CISA and other government offices were beyond the capacity of the subcommittee and, though initially tasked with providing recommendations on that aspect of CISA’s work, the subcommittee intentionally did not provide recommendations about this.”

However, in the same document, the subcommittee recommended the agency take a proactive approach to “detect” misinformation and disinformation.

“In this work, CISA’s activities should be similar to the Agency’s actions to detect, warn about, and mitigate other threats to critical functions,” the recommendation reads.

The subcommittee defined “critical functions” broadly in a previous section to include “election participation,” the court system, the financial system, disaster response and, notably, public health measures.

CISA Director Easterly partially accepted the recommendation to consider “critical functions,” but noted that resource constraints meant the agency would stay focused on election-related threats.

In closed meeting notes obtained by the DCNF, the subcommittee also intended to give guidance to CISA on “monitoring” at some point in the future, though it’s unclear what this would entail.

Additionally, draft recommendations obtained by the DCNF indicate the subcommittee, at least at one point, envisioned that CISA should “maintain awareness of activity across the broad spectrum of social media platforms.”

An email from Starbird to other members of the subcommittee in August reveals an awareness that the United States government was already communicating with social media platforms on certain “threats,” and that CISA could best support this work by “identifying” threats and informing local and state officials of these threats.

“CISA does not and has never censored domestic political speech and we do not request that content be taken down by social media companies,” a CISA spokesman told the DCNF. “CISA provides broad guidance on foreign influence and disinformation tactics, mitigates the risk of foreign influence and disinformation by sharing accurate information, and amplifies the voices of state and local election offices on issues of election security.”

However, CISA describes its role in combating MDM as flagging “disinformation concerns” to social media platforms, according to the agency’s website. The CISA spokesman acknowledged that the agency performed this function in the 2018 and 2020 elections.

Additionally, DHS, along with the FBI, would routinely email social media companies with a list of accounts to review under the platforms’ terms of service, according to journalist Matt Taibbi. These accounts would often be deleted.

Gadde and Spaulding did not respond to the DCNF’s request for comment.

This story originally was published by the Daily Caller News Foundation.
