The CCP has become synonymous with surveillance. On an increasingly authoritarian trajectory, Xi Jinping has bet his and the Party’s longevity on its ability to not only neutralize challenges to authority but also anticipate them before they arise. Enormous investments in China’s surveillance apparatus over the past few decades, particularly during Xi’s tenure, have given the state an unprecedented level of access to the personal lives of its citizens, thereby unlocking new opportunities for social governance and control.
These experiments in state control extend from Beijing to the borderlands and beyond. The most infamous example is in Xinjiang, where mass surveillance has helped the CCP subjugate millions of ethnic minority members in the name of ostensible antiterrorism policies that the UN declared may constitute crimes against humanity. Nationwide, the pandemic has allowed the CCP to bring surveillance more prominently into the mainstream via ubiquitous health-code apps that restrict movement for both political and public-health ends. The government has also exported its surveillance tools to regimes eager to replicate China’s success in tracking dissidents and stifling any opposition.
While the CCP tries to sell a vision of technological utopia to its citizens, growing pockets of popular resistance against government intrusion are challenging the expansion of the CCP’s surveillance project. It remains to be seen whether the world’s largest, most sophisticated surveillance state will succeed in shaping the will of its people.
Joining CDT to discuss these issues is Josh Chin, co-author with Liza Lin of Surveillance State: Inside China’s Quest to Launch a New Era of Social Control. Josh Chin is deputy China bureau chief at The Wall Street Journal—a role he plays from Seoul following his expulsion from China in February 2020—and was previously an editor at CDT. Our interview touches on China’s use of American technologies and rhetoric to cultivate its surveillance state; the AI advances that have helped make it a reality; the methods and motivations for exporting surveillance technology abroad; the CCP’s influence over conceptions of privacy; the gap between myths and realities of modern surveillance; and the moral dilemma that surveillance poses to societies around the world.
China Digital Times (CDT): What role did the U.S. War on Terror have in influencing the CCP’s ethnic policies in Xinjiang?
Josh Chin (JC): The War on Terror created the global market for digital surveillance, which ultimately funded the technological advances that made the Communist Party’s campaign in Xinjiang possible. It also popularized the notion of “integrated joint-operations,” or the coordination of multiple branches of a military through information technology to confront a nimble enemy. Surveillance data in Xinjiang is stored on an “integrated joint-operations platform”—built by a Chinese defense contractor—that allows the various arms of the local government to track and categorize Uyghurs and other Turkic Muslim minorities.
CDT: How did Western technologies and companies provide inspiration for the CCP’s early surveillance tactics?
JC: Western companies were there at the birth of the Chinese surveillance state in the late 1990s and early 2000s, when the Communist Party first started trying to track and censor internet traffic under what it called “the Golden Shield” project. They included names like Cisco Systems and Sun Microsystems. Their technology allowed the Chinese government to build a far more effective version of the Great Firewall and helped the Ministry of State Security build a nationwide network connecting all of its branch offices.
CDT: How did AI bridge the gap between the largely theoretical, patchwork, nascent forms of surveillance in the 1990s-2000s, and the more efficient, comprehensive realities of surveillance that we see today?
JC: Communist Party theorists had toyed with the idea of using behavioral data analysis to engineer a more perfect society since at least the early 1990s. Starting in the late 2000s, Silicon Valley companies demonstrated how the internet made it possible to harvest the vast amounts of data necessary to make that vision plausible. And an evolutionary leap in artificial intelligence around the same time opened the door to putting that data to work. A technique called deep learning, which enables machines to “study” huge quantities of data in order to learn patterns, suddenly made it much easier to build cameras that could recognize faces, or to try to predict when a protest might break out based on sentiment being expressed on social media.
CDT: By what means has the Chinese government supported Chinese technology companies in expanding their reach into police forces abroad, and how has that differed from the support other states give to domestic tech companies marketing their services abroad?
JC: China’s technology companies are like other companies in that they are constantly seeking new markets. There isn’t a government out there that isn’t eager to help its domestic tech companies expand abroad. China is noteworthy, though, for how directly involved it can become in helping its tech companies land overseas business. In Uganda, for example, the Chinese ambassador arranged for local police to travel to the Ministry of Public Security headquarters in Beijing as a way to help Huawei sell a $127 million “safe city” surveillance system there.
CDT: What motivations does the Chinese government have for facilitating the export of Chinese surveillance technology? How does this affect the democratic nature of countries that import this technology?
JC: China is already home to more than 400 million surveillance cameras—roughly half of all the cameras in the world—so Chinese makers of tracking hardware need new markets to sell into. For the Party itself, the question is whether it wants to remake the global political order in its image. That’s a very tall order, and the evidence suggests Beijing realizes that. Instead, it seems content to promote the idea that governments should be free to use surveillance technology as they see fit—similar to its view of internet censorship. We’re still early in this story, but so far it looks like borderline democracies that adopt Chinese systems tend to move in more authoritarian directions. That makes sense, since surveillance technology by its nature lends itself to authoritarian outcomes.
CDT: In the book, you trace the parallels between American companies’ involvement in the CCP’s surveillance state and IBM’s involvement in the Holocaust. Could you describe some of those similarities and differences? What surprised you most in your research into how Western companies profited from China’s surveillance market?
JC: Even knowing beforehand that American companies had been helping China’s surveillance companies, I was surprised at both the depth and breadth of their involvement. IBM’s enabling of the Holocaust was driven by a lot of motivations, though the dominant one—no surprise—appears to have been profit. Silicon Valley’s participation in China’s surveillance state is similar in that respect. Corporate boardrooms have an insatiable appetite for the China market, while being deathly allergic to considering the consequences of doing business there. The difference with Nazi-era IBM is that today’s companies have more plausible deniability. Because of how mind-bendingly complex today’s tech supply chains are, they can supply critical components like advanced AI chips to Chinese surveillance startups in indirect ways that are extremely difficult to trace.
CDT: How has the Chinese concept of privacy evolved in its relationship to authority, and how has the CCP tried to co-opt it to bolster the surveillance state?
JC: Baidu CEO Robin Li infamously observed a few years ago that Chinese tech companies benefited from the unusual willingness of Chinese consumers to trade privacy for convenience. The fact that one of the country’s most experienced tech founders offered that observation so casually suggests it was probably true at one point. But the immense backlash that followed his comments shows how privacy consciousness is growing in China. The Communist Party has been remarkably savvy in shaping the conversation around privacy. Rather than censor it, authorities have instead encouraged it, but they’ve made sure it stays directed at companies like Baidu, rather than the government. That has given the Party one more cudgel it can use to make sure tech companies keep in line.
CDT: In what ways is the CCP’s surveillance state “a propaganda exercise as much as an infrastructure project,” and how does this benefit the CCP? To what extent do China’s real-world and online surveillance systems complement one another to the benefit of the party?
JC: One of our most surprising discoveries in working on state surveillance was how unconcerned the Party seemed to be with our reporting, at least until we started writing about Xinjiang. If anything, Beijing seemed to embrace this notion that it was all-seeing. State media started producing a stream of stories about AI surveillance catching criminals and finding lost children. When we looked into them, a lot of those stories turned out to be exaggerated, or fabricated entirely. But it didn’t matter. A lot of people we talked to still believed in the power of the technology, both to keep them safe and to keep them in line. To a large degree, that’s the outcome the Party is aiming for.
CDT: Now that pandemic-era, app-based health codes have become ubiquitous, what future surveillance technologies are on the horizon for the CCP, and how likely are they to come into operation?
JC: We’re at a stage in the development of AI where we’re not likely to see major new capabilities—at least not until there’s another quantum leap in the technology. Instead, we’re probably going to see current applications like facial recognition and behavioral analysis grow ever more refined. Where we could see big changes is in the scale of surveillance. Thanks to the pandemic, nearly everyone in China is subject to the sort of 24-hour tracking that used to be applied only to people like the Uyghurs. Will the Party use that national infrastructure to expand from assessing public-health risk to assessing political or social risk? There have already been attempts at the local level, though they’ve run into public opposition. Covid has also made many more people in China aware of the downsides of state surveillance. It bears watching whether that coalesces into genuine pushback.
CDT: Why is a tolerance for complexity and a valorization of transparency necessary for insulating ourselves against the worst effects of surveillance technologies? What societal and political adjustments can we make to attain these ends?
JC: The allure of China’s tech authoritarian model rests on simplicity: It involves trading choice and self-determination for security, convenience, and predictability. That fundamental trade-off is seductive outside China’s borders. Just look at all the people who give their web-browsing data to Google in exchange for a more manageable number of ostensibly more accurate search results. The Google example also illustrates how these technologies can serve positive ends. The question then becomes how to maximize the positive uses of data-fueled AI while ensuring that it doesn’t tip society into embracing the “leave it to me” message of authoritarianism. There’s no one clear solution here. More likely than not, it will involve a lot of mess, deliberation, and constant recalibration. That sounds like a lot of work, but the ramifications of ignoring it could be dire for democracies. All of it starts with transparency—with knowing as much as we can about how these tools are being used, and by whom.