Audrey Tang prefers precise language. During an interview, Taiwan's minister without portfolio - Tang's name card simply says "digital minister" - makes a swift correction when we mention the term "fake news." The preferred term is "disinformation" because, Tang says, it has a legal definition in Taiwan: "That is to say, intentional, harmful untruth, and most importantly, harmful is to the public, to the democratic system, not harmful to the image of a minister," she says, laughing, "That's just good journalism, right?"
But unlike other governments in Asia, such as Singapore, Taiwan has decided to fight the scourge of disinformation without resorting to censorship or takedowns.
Taiwan emerged from martial law in 1987, and held its first presidential election in 1996. Even as Taiwan's economy depends increasingly on trade and investment with China, which claims Taiwan as its own territory, democracy has continued to set Taiwan apart from the political system on the mainland. Taiwan's voters alternate supporting candidates from the Kuomintang (KMT), which tends to be more open to closer relations with China, and the Democratic Progressive Party (DPP), which stresses autonomy if not outright independence from China. Current President Tsai Ing-wen is from the DPP.
Tang, a software programmer who emerged from the hacker community, sat with CPJ last week in Taipei to talk about how Taiwan tries to maintain the integrity of its media and democratic system in the face of a much larger adversary - China - that severely controls its own media and has the means potentially to sow havoc in Taiwan's open system.
The interview has been edited for length.
China is making an effort to influence public opinion in Taiwan through the media. How can Taiwan maintain freedom of the press and openness?
I think in Taiwan we essentially rely on the society itself to tell disinformation - that is to say, intentional, harmful untruth - apart from journalistic work. It's not always easy because just 30 or 35 years ago Taiwan was where the PRC [People's Republic of China] is now.
A lot of people, especially the elderly, have difficulty telling disinformation apart from truly journalistic work, simply because the state-run media at the time was the only media and there was, frankly speaking, lots of propaganda around, so it's not very easy to tell. People who were born or educated after the lifting of martial law, which is after the '80s, have a broad swathe of information sources to choose from. Our democracy, with the first presidential election in '96, coincides with the World Wide Web, so people associate democracy with the democratization of information sources.
How do you counteract disinformation?
Disinformation is a threat, especially for open societies. Lots of jurisdictions around Taiwan, not just the PRC, use disinformation as an excuse for state censorship. We don't want to go there, because we still remember martial law.
First, before a propaganda campaign or disinformation spreads, we usually observe a point where they are doing some kind of limited testing, or A/B testing, before it becomes really popular. They're just testing the meme and its variations to see whether it will go viral, so to speak.
Each of our ministries now has a team charged with detecting disinformation campaigns and, before they reach the masses, producing within 60 minutes an equally or more convincing narrative. That could be a short film, a media card, or a social media post. It could be the minister herself or himself doing a livestream. It could be our president going on a standup comedy show. It could be our deputy premier watching a livestream of a video game.
Our observation is that if we do that, then most of the population receives this message, like an inoculation, before they encounter the disinformation, and so it protects them like a vaccination.
How does it work?
We keep a kind of scorecard and measure how long each ministry takes to respond. The Ministry of the Interior maybe takes 60 minutes on average; the Ministry of Health and Welfare maybe takes 70 minutes on average. They compete with each other in a kind of friendly rivalry, and so they all respond faster and faster until they reach the 60-minute mark.
They also pass the clarification through the [social media platform] LINE, Facebook, and other social media accounts of Dr. Tsai Ing-wen, our president, of our premier, our deputy premier, and so on. Each of them has a large number of followers, and they basically ask their followers to spread the clarification so that it reaches their friends and families before the disinformation does.
The mainstream media, of course, then picks up this counter-narrative and does a balanced report. What we have witnessed is that if we don't come up with this counter-narrative and have videos or films ready, or at least picture cards, then after six hours - that's after one news cycle - it's hopeless.
Truth be told, it is actually very exhausting.
Can you counter all disinformation?
Some of them do go viral without our noticing. Usually that takes place on end-to-end encrypted channels. What end-to-end encryption means is that, unlike Facebook, Twitter, or anything that's google-able, you cannot use a search engine to discover these channels. It's like a closed room, an echo chamber, and so it's really easy for a message to mutate into a more potent meme before it is released out in the wild, so to speak.
We have developed, in conjunction with LINE and Facebook and friends, a system we call Notice and Public Notice. This system is akin to a spam filter. If you receive an email and you think it's spam or junk mail - theoretically it's personal communication, and the state has no business inspecting your email - say this email is from some random country with a princess who has $5 million and asks for your account details or whatever, then you can flag it as spam.
Back in the early 2000s we, the Internet community, convinced each and every mail operator to add that flag button to its interface, so that when you flag a message as spam you're essentially donating the signature of that message. It's not involuntary; it's a voluntary donation to global systems such as Spamhaus, the domain blocklists, and so on.
There's a whole system for that. It's like email's immune system: after sufficient people flag a message, the system does a correlation. Once it has correlated the sender of the spam, when the sender sends another email it still reaches the recipient - it's not censorship - but it goes to the junk mail folder by default, so it doesn't waste people's time.
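The flag-correlate-and-junk mechanism Tang describes can be sketched in a few lines of Python. This is a minimal illustration, not how Spamhaus or any mail operator actually implements it; the threshold, function names, and in-memory data structures are all assumptions for the example. The key ideas it shows are that users donate only a signature (a hash) of a message rather than its content, and that a correlated sender's mail is still delivered, just routed to junk by default:

```python
import hashlib
from collections import defaultdict

# Hypothetical threshold: number of user flags before a sender is correlated as a spammer.
FLAG_THRESHOLD = 3

flag_counts = defaultdict(int)  # message signature -> number of voluntary user flags
flagged_senders = set()         # senders correlated with heavily flagged messages

def signature(message: str) -> str:
    """Users donate only a hash of the message, not its content (privacy-preserving)."""
    return hashlib.sha256(message.encode("utf-8")).hexdigest()

def flag_as_spam(sender: str, message: str) -> None:
    """A user voluntarily donates the message signature to the shared list.
    Once enough people flag the same message, the sender is correlated."""
    sig = signature(message)
    flag_counts[sig] += 1
    if flag_counts[sig] >= FLAG_THRESHOLD:
        flagged_senders.add(sender)

def deliver(sender: str, message: str) -> str:
    """Mail from a correlated sender still reaches the recipient -- it is routed
    to the junk folder by default rather than being censored."""
    return "junk" if sender in flagged_senders else "inbox"
```

The same shape carries over to the "Notice and Public Notice" idea: flagged disinformation is deprioritized by default rather than deleted, which is the distinction Tang draws between this approach and takedowns.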
We're developing a very similar system here, where people on LINE and other instant message systems can forward a suspicious piece of disinformation to a bot. Currently the most popular bot for that is called CoFact, for collaborative fact-checking, but very soon, in June, LINE will build that in as core functionality, so all you have to do is long-press a message and then you can flag it as disinformation.
For example, there was a really popular rumor that said whenever an earthquake is larger than magnitude 7.0, nearby jurisdictions can send in their rescue teams without the approval of the country that suffered the earthquake. An excuse for invasion, you see? God knows why they spread this message. In any case, it was a popular message, and so the Taiwan FactCheck Center goes to the conventions - the actual treaties that our Ministry of Foreign Affairs signs, and things like that - cites its sources, and finally says this is false.
Once they do that, Facebook promises that by June this will inform Facebook's algorithm so that the post will stop being preferentially shown in people's news feeds - but it's not censorship. If you look specifically for that friend's post, it is still there, but with a warning that says it has already been fact-checked as false.
We are reaching very similar agreements with other social media as well. And so LINE, Facebook, and so on, they are all on board to implement this "Notice and Public Notice" system, which is definitely not a notice and takedown system. [Editor's note: A LINE spokesperson told CPJ by phone that it is working with fact-check groups, including CoFact, MyGoPen, Taiwan FactCheck Center, and Rumor & Truth, to build a "LINE Fact Checker Official Account," where users can paste questionable information and request fact checking. Facebook did not immediately respond to an emailed request for comment.]
And for elections?
During elections, we have a special set of rules that [limit campaign donations]. Because of that, we do witness that foreign money prefers not to enter through this route, because it would get revealed. It prefers to buy precision-targeted advertisements on social media or even in regular media. So we're basically saying no: it's the same as a campaign donation. You have to reveal it in exactly the same way, and only domestic people get to spend money to sponsor political advertisements. Each advertising agency, each intermediary, needs to reveal where its funding sources come from, just like anti-money-laundering rules. If, at the end of the chain, it points to a foreign national, or to a PRC, Macau, or Hong Kong source, then that is actually a crime.
What we're saying is that elections are special, and we're protecting elections in a way that is much higher than the usual notice and public notice system.
At the end of the day, I think the most useful education tool is media literacy.
[Reporting from Taipei, Taiwan.]
Steven Butler is CPJ's Asia program coordinator. Iris Hsu is CPJ's China correspondent.