Safeguarding Democracy against Disinformation: Insights from Taiwanese Civil Society


Disclaimer: The opinions expressed in this article belong to the author only and do not represent the views or policies of the U.S. government.

On August 2, 2022, pedestrians near convenience stores and other public spaces around Taiwan began noticing that the content on digital billboards had been replaced with strident messages in Chinese, such as “Greater China will ultimately be unified!” That same month, the websites of a Taiwanese university and other organizations were defaced with messages including “There is only one China in the world!” Meanwhile, the military of the People’s Republic of China (PRC) was conducting days of drills and missile launches on a scale previously unseen, though Taiwan-based fact-checking organizations later proved that certain photos and videos claiming to show Chinese military exercises near Taiwan had been doctored or fabricated.[i]

Left: A digital billboard in Taiwan was hacked in August 2022 to display the message “Blood relations between people of the same race cannot be broken, Greater China will ultimately be reunified!” (Credit: Zhushan Town Hall of Nantou County) Right: A Taiwan organization’s website was hacked in August 2022 to display the message “Unification of China, return of Taiwan”

This combination of disinformation, military saber-rattling and hacking was the latest example of what Taiwan’s authorities, scholars and civil society have termed “cognitive warfare.” Taiwan’s experience in protecting the integrity of its information environment against such influence attempts merits close attention from all societies, and from the United States in particular: Taiwan provides a model for a multifaceted, grassroots response to disinformation, and a large proportion of disinformation in Taiwan aims to discredit the United States as a reliable security partner.[ii] Based on interviews with a select group of Taiwan non-governmental organizations (NGOs), this article illustrates how parts of Taiwan’s civil society are dealing with the challenges posed by foreign and homegrown disinformation.

Cognitive Warfare, as Defined by Taiwan Scholars and Civil Society

In a 2021 report, Taiwan’s Institute for National Defense and Security Research defined cognitive warfare as the integration of psychological warfare, information warfare and public opinion warfare, and it described cognitive warfare as “the core of the Chinese Communist Party’s overall strategy.”[iii] An analysis by researchers from two prominent Taiwan-based think tanks also viewed the Chinese government as being actively engaged in cognitive warfare, which in their view encompasses military intimidation, influence via bilateral exchange, religious interference and disinformation. Of these activities, they wrote that disinformation via commercial websites and social media accounts called content farms[iv] has been the most effective in Taiwan because these platforms mix “useful daily information with political placement marketing,” creating “a China-friendly information space that affects Taiwan’s readers without their being conscious of it.”[v]

However, not all Taiwan researchers and analysts are comfortable with the term “cognitive warfare.” Two analysts interviewed from a Taiwan-based cybersecurity firm believe that the term implies a coordinated campaign directed by a foreign government, yet, based on the disinformation campaigns they have observed so far, they have not seen sufficient evidence to support that claim. The co-founder of a research group focusing on Taiwan’s information environment said that, although there is evidence of PRC state and private actors attempting to influence public opinion in Taiwan, “the term ‘cognitive warfare’ is also sometimes misused in Taiwan to discredit political opponents or dismiss any views that one disagrees with.”

Civil Society Responses to Disinformation: Varied but Synergistic Approaches

Regardless of how concepts related to influence operations are defined, the challenge posed by these activities inspired a grassroots response by Taiwan’s civil society to strengthen the public’s resilience to false and misleading information. Many of the leading Taiwan-based organizations devoted to countering disinformation formed after 2018, when false information surrounding Taiwan’s local elections and a referendum on legalizing same-sex marriage appeared to reach an unprecedented level. By that time, Taiwan’s civil society already had some of the foundations in place to deal with the challenges of influence operations, in particular a tech movement advocating for information transparency, called “g0v,” that began in 2012. The networks and connections fostered by this community helped spawn many of the Taiwanese organizations that are now leading the fight against disinformation.

The approaches taken by Taiwan-based civil society organizations can be broadly divided into three areas: fact-checking; monitoring and analyzing the information environment; and public outreach and media literacy education.

Approach #1: Fact-checking

Cofacts, a Taiwan-based civic tech community that supports fact-checking, was founded in 2016 by a pair of young entrepreneurs in Taiwan who believed that, while Taiwan’s journalists were producing high-quality fact-checking, these reports were not always accessible. Billion Lee, a co-founder of Cofacts, believes that “fact-checking is a form of communication, rather than just a question of right and wrong.” She and her co-founder decided to crowdsource fact-checking content, under the theory that this will expose people to more opinions and data. Anyone can submit a fact-check report of news or social media content to Cofacts, and then Cofacts users can respond or vote such reports up or down, depending on whether they find them convincing. The open nature of the Cofacts platform offers the potential for rapid responses to false information before it can spread widely. Although the Cofacts database is occasionally subject to attempts by malicious users to generate false fact-check reports, Lee notes that users and moderators work together to identify and remove this content according to a transparent process.

Cofacts has built a database of over 100,000 user-submitted fact-check reports, and its database is used by researchers and other counter-disinformation groups in Taiwan. In addition, Cofacts created a fact-checking chatbot on LINE, Taiwan’s most popular messaging app (used by nearly 96% of Taiwan internet users[vi]), that can receive text queries and generate responses based on Cofacts’ reports. LINE’s own fact-checking bot and other organizations in Taiwan utilize reports that Cofacts writers have produced. Lee believes that most of Taiwan’s public would prefer to discuss politics in closed-circle chat groups rather than on more public platforms, so in her view, LINE is the most crucial platform for combating disinformation in Taiwan. User activity on Cofacts platforms has grown tenfold since 2019, Lee notes, and the Cofacts webpage now receives over 5 million views per month.
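The crowdsourced workflow described above — anyone submits a fact-check report, and readers vote it up or down depending on how convincing they find it — can be sketched in miniature as follows. This is an illustrative toy under assumed names and data, not Cofacts’ actual code or API.

```python
# Hypothetical sketch of a crowdsourced fact-check lookup, loosely modeled
# on the workflow described above. All class names, methods and sample data
# are invented for illustration.

from dataclasses import dataclass


@dataclass
class Report:
    claim: str        # the rumor or claim being checked
    verdict: str      # e.g., "false", "misleading", "true"
    upvotes: int = 0
    downvotes: int = 0

    @property
    def score(self) -> int:
        # Net community score: readers vote a report up or down
        # depending on whether they find it convincing.
        return self.upvotes - self.downvotes


class FactCheckDB:
    def __init__(self) -> None:
        self.reports: list[Report] = []

    def submit(self, claim: str, verdict: str) -> Report:
        # Anyone can contribute a report; moderation happens afterward.
        report = Report(claim, verdict)
        self.reports.append(report)
        return report

    def lookup(self, query: str) -> list[Report]:
        # Naive keyword overlap; a real system would match user messages
        # against claims with fuzzy or embedding-based search.
        words = set(query.lower().split())
        hits = [r for r in self.reports
                if words & set(r.claim.lower().split())]
        # Surface the most-trusted reports first.
        return sorted(hits, key=lambda r: r.score, reverse=True)


db = FactCheckDB()
r1 = db.submit("video shows prc live-fire drills near taiwan", "false")
r1.upvotes = 12
r2 = db.submit("video shows routine fishing boats", "misleading")
r2.upvotes = 2
results = db.lookup("is this video of prc drills near taiwan real")
```

Because reports are ranked by net votes, a debunked claim with strong community support floats above weaker or malicious reports — a rough analogue of how open moderation can keep a crowdsourced database usable.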

Billion Lee, co-founder of Cofacts, engaging in public outreach on the use of fact-checking tools with partner organization Fake News Cleaner

In contrast to Cofacts’ decentralized, user-led approach, the Taiwan fact-checking organization MyGoPen has a small team that takes a more selective approach to what information to verify. The staff members at MyGoPen meet daily to determine what stories and claims to fact-check based on factors such as the level of public interest in a topic and whether false information is putting human life at risk. They then conduct research and consult relevant experts to produce fact-checks that meet standards set by the International Fact-Checking Network.[vii] MyGoPen’s fact-checking bot on LINE, which can process image, video and voice content, has over 400,000 users.

Charles Yeh, who co-founded MyGoPen in 2015, said that the organization focuses on debunking scams and health-related claims but has also debunked PRC-related disinformation. For example, in late July 2022, a video labeled “Live Fire Exercises on Pingtan” began circulating on Taiwan social media and YouTube, claiming to show the PRC military conducting exercises in the Taiwan Strait. MyGoPen used image and video search tools to show that the video was in fact from Taiwan military drills that took place on one of Taiwan’s islands in November 2020.[viii]

After social media users claimed that the above images showed PRC military drills taking place near Taiwan in July 2022, fact-checking organization MyGoPen determined that the videos were in fact taken during Taiwan military drills in 2020.

Approach #2: Monitoring and Analyzing the Information Environment

Another key part of Taiwan’s response to disinformation is the work of organizations that monitor Taiwan’s information environment and identify potential threats. TeamT5 is a Taiwan-based cybersecurity firm whose analysts began researching disinformation to increase public awareness of the problem. For example, in 2019 they helped a Taiwan-based news outlet with investigative reporting that revealed that many seemingly unrelated content farms, websites and Facebook accounts known to be disseminating pro-PRC disinformation were actually run by the same people.[ix] In addition, TeamT5 regularly shares its findings about foreign-influence operations with Taiwan authorities, such as when it discovered that the second-most-popular Facebook fan page for a 2020 Taiwan presidential candidate was actually run by a marketing firm based on the Chinese mainland.[x]

Complementing the efforts of cyber threat specialists like TeamT5 are research groups like the Information Operations Research Group (IORG), founded in 2019 by a pair of young Taiwan professionals from tech and social activism backgrounds. IORG maintains an archive of content from Mandarin-language social media, news outlets and Taiwan web forums, as well as PRC state outlets and social media accounts, and it uses machine learning and statistical models to identify patterns and trends in the information environment. “Our goal is to go beyond fact-checking, to look at narratives and stories,” said Chihhao Yu, one of IORG’s founders. For example, after a celebrity from Taiwan posted on Facebook in May 2022 that many children in Taiwan were dying from COVID-19, IORG noticed that a cluster of posts making similar claims suddenly appeared.[xi] The appearance of this cluster was consistent with narratives that IORG was tracking that attempted to discredit Taiwan’s response to the pandemic.[xii]
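The kind of pattern IORG noticed — a sudden cluster of posts making nearly identical claims — can be surfaced with simple text-similarity measures. The sketch below groups near-duplicate posts by Jaccard similarity over character shingles; the function names, sample posts and threshold are assumptions for illustration, not IORG’s actual models.

```python
# Illustrative sketch of flagging clusters of near-identical posts, in the
# spirit of the pattern analysis described above. Data and threshold are
# invented; a real pipeline would run clustering over a full archive.

def shingles(text: str, k: int = 3) -> set:
    # Character k-grams work for both English and Chinese text,
    # since they don't depend on word boundaries.
    text = text.lower().replace(" ", "")
    return {text[i:i + k] for i in range(max(1, len(text) - k + 1))}


def jaccard(a: set, b: set) -> float:
    # Fraction of shared shingles: 1.0 for identical texts, ~0 for unrelated.
    return len(a & b) / len(a | b) if a | b else 0.0


def find_cluster(posts: list[str], threshold: float = 0.5) -> list[int]:
    # Greedily collect posts that are near-duplicates of the first post.
    sigs = [shingles(p) for p in posts]
    return [i for i, s in enumerate(sigs)
            if jaccard(sigs[0], s) >= threshold]


posts = [
    "many children in taiwan are dying from covid",
    "so many children in taiwan dying from covid now",
    "weather will be sunny in taipei tomorrow",
]
cluster = find_cluster(posts)  # indices of near-duplicate posts
```

The first two posts share most of their character shingles and cluster together, while the unrelated post does not — a rough stand-in for how statistical models can flag coordinated amplification of a narrative.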

Approach #3: Public Outreach and Media Literacy Education

The Taiwan-based NGOs interviewed for this article believe strongly in the need to engage with the public to build trust and educate. For example, Cofacts holds events to teach the public how to use its fact-checking tools and to recruit volunteers to contribute to the site, while IORG has trained over 2,000 teachers in Taiwan to incorporate information literacy into school curricula. IORG’s co-founder explained his focus on public outreach, stating, “We’ll never be able to keep up with disinformation through fact-checking, so we have to inoculate the population against disinformation with education.”

Complementing these efforts, Fake News Cleaner (FNC) is an organization founded in 2018 whose mission is face-to-face engagement with Taiwan’s public on media literacy. FNC volunteers interact with people in parks and public spaces and discuss how to identify online scams. Through these events, they build trust with neighborhood leaders and receive invitations to give trainings at community centers and schools. The content of their trainings varies depending on the audience. For less tech-savvy audiences, the trainings cover basics such as how to use smartphones and social media. With other audiences, they show how to fact-check using resources such as Cofacts and MyGoPen and explain how social media algorithms and content farms influence the content people see.

One aspect of FNC’s public education is examining case studies of scams and conspiracy theories that spread on Taiwan social media. For example, in July 2022, sensationalist stories about people from Taiwan being lured to Cambodia and forced to sell their organs there began spreading in both new and traditional Taiwan media. In response, FNC volunteers incorporated into their trainings a presentation showing how the supposed photo evidence of organ harvesting could be traced to photos from other countries. The conspiracy theory itself, FNC points out, was a rehash of accusations by some Taiwan media and opposition figures in 2018 that the Taiwan authorities were failing to prevent people from Taiwan from being defrauded in Cambodia. FNC uses this example to illustrate how news stories can be distorted to undermine trust in Taiwan’s authorities, so news readers must critically analyze and verify the stories they see.

Staff members from Fake News Cleaner conducting a training on the use of mobile phones for fact-checking

Is Taiwan’s Approach Working?

Although the response of Taiwan’s civil society to the disinformation challenge has been innovative and commendable in many respects, the question of exactly how effective Taiwan has been in countering cognitive warfare requires further study. In a survey conducted by researchers at National Taiwan University in the lead-up to Taiwan’s 2020 presidential election, only about 54% of respondents said they perceived disinformation to be prevalent in Taiwan’s media environment, and more than 61% said they do not use fact-checking mechanisms.[xiii]

Nonetheless, the recent surge in the number of users of resources like Cofacts and MyGoPen, and the case studies referenced in this article, illustrate growing interest in Taiwan in improving the information environment, as well as the vital role that civil society can play in preventing the spread of false information. The organizations interviewed for this article acknowledged that it is difficult to evaluate the outcomes of their efforts, but they point to subtle indicators of enhanced public awareness about disinformation. These include the increased sophistication of questions they receive from participants in trainings and public outreach, or a decline in shares of a false piece of information after it has been debunked through fact-checking.

What Can Other Societies Learn from Taiwan’s Experiences?

The founders of Taiwan organizations battling disinformation have brought together talent from fields such as journalism, civic engagement, coding and data analytics to develop a comprehensive approach to improving the information environment and public discourse. Members of the anti-disinformation community in Taiwan maintain frequent communications and often collaborate and share information with each other and with international disinformation experts. These groups also invest heavily in research and public outreach to understand which audiences are most susceptible to disinformation, and what types of disinformation are most alluring to these groups. Importantly, Taiwan-based organizations meet these audiences where they are, adjusting their approaches depending on their level of tech-savviness, the area in which they live or the languages they speak.

Taiwan, like many other democracies, often hosts fierce and contentious debates about politics. These discussions can cause parts of its society to become highly distrustful of the local government and politicians. Nonetheless, the Taiwan-based activists interviewed for this article firmly believe that the fight against disinformation has to be led by civil society, because it has the potential to build credibility in ways that the government or tech companies will always find more difficult. Civil society organizations in Taiwan demonstrate the value of continually cultivating public trust so that in times when disinformation spikes, such as elections or pandemics, the public has reliable tools and networks to deal with pollutants in the information environment.

Like U.S.-based fact-checking organizations, the Taiwan-based civil society organizations referenced in this article are subject to accusations that they are politically biased or have ulterior motives. They have tried to counter this view through constant public outreach and transparency about how they operate, as well as through democratizing and demystifying the process of fact-checking. Fact-check reports on Cofacts, for example, represent a broad spectrum of political views. MyGoPen staff members noted that they take a lighter tone when debunking politically related disinformation, with the goal of presenting the facts and letting readers make up their own minds. A founder of Fake News Cleaner said that her organization has largely avoided the perception of political bias by focusing on scams and misinformation relating to health, safety and children.

As seen in the examples of what some Taiwan professionals describe as “cognitive warfare” cited in this article, the nature of disinformation is constantly evolving, and thus fact-checking and media literacy groups will also need to adapt to increasingly sophisticated content. In the face of this growing deluge, many Taiwan professionals interviewed for this article noted that they are motivated to fight disinformation out of a sense of duty, whether because they believe that their democratic system is under threat, or because they feel a civic responsibility to reduce the gaps in Taiwan’s society between people of different education levels and social classes. Taiwan provides an example of how a highly motivated grassroots movement can improve public discourse using varied approaches and engaging the audiences who are most susceptible to disinformation.

[i]
[ii] See, e.g.,
[iii] Ya-Ying Lin (2021). China's Cognitive Warfare Strategy and Taiwan's Countermeasures [translated from Chinese by the author of this article].
[iv] One Taiwanese news outlet defined “content farms” as websites and social media accounts that “produce massive amounts of low-quality news articles in order to collect money from page clicks.”
[v] Hung, Tzu-Chieh, and Tzu-Wei Hung (2020, updated 2022). How China’s Cognitive Warfare Works: A Frontline Perspective of Taiwan’s Anti-Disinformation Wars. Journal of Global Security Studies.
[vi]
[vii]
[viii]
[ix]
[x] Chang, Che, and Yeh, Silvia. Information Operation White Paper: Observations on 2020 Taiwanese General Elections.
[xi]
[xii]
[xiii] Tai-Li Wang, Yung-Ting Chen, Duff Na, Josh Su, Milly Tsai (2020). Disinformation Studies in Taiwan’s 2018 and 2020 Elections. Graduate Institute of Journalism, National Taiwan University.


NELSON WEN is a U.S. Department of State Foreign Service Officer and a 2021-2022 Kathryn W. Davis Public Diplomacy Fellow. He is preparing to begin his next assignment as the Branch Public Affairs Officer at the American Institute in Taiwan, Kaohsiung Branch Office.
