Better Living through Algorithm Regulation: Less Exploitation and Profit-Seeking, More “Mainstream Values” and “Positive Energy”

The latest target in the Chinese government’s regulatory blitz against big tech is algorithms. Last week, a host of ministries announced a new plan to introduce the world’s first comprehensive, nationwide regulatory system for algorithms. The decision follows the release of a set of draft guidelines in late August elaborating provisions meant to bolster this future regulatory system. While the system aims to promote social stability and redistribute power from tech companies to internet users and delivery workers, its implementation is far from guaranteed, and would also tighten the CCP’s control over internet content. Xinmei Shen and Tracy Qu from the South China Morning Post described the government’s full-court press on algorithms:

China has laid out a three-year plan to rein in the use of algorithms, marking Beijing’s latest effort to bring the country’s internet industry firmly under state control.

Under the new policy guidelines, local governments are urged to tighten the regulation of algorithms, while companies are told that they will be held accountable for misusing the technology, according to a notice published on Wednesday by the Cyberspace Administration of China (CAC), the country’s internet watchdog.

[…] The guidelines were jointly signed by the CAC and eight other regulators, including the Propaganda Department of the Communist Party, the Ministry of Education, the Ministry of Science and Technology, the Ministry of Industry and Information Technology, the Ministry of Public Security, the Ministry of Culture and Tourism, the State Administration for Market Regulation and the National Radio and Television Administration. [Source]

Sapni G K and Mihir Mahajan from the Diplomat explained the significance of algorithm regulation and which algorithms will be regulated:

Data and algorithms are the fundamental blocks of cyberspace, but while data practices are increasingly being regulated around the world[,] algorithm regulation is relatively untouched. The EU General Data Protection Regulation (GDPR), for example, remains the groundbreaking model for data protection regulations in most parts of the world. However, there is a void in the regulation of algorithms.

In August, China issued the Draft Internet Information Service Algorithmic Recommendation Management Provisions, with an interest in standard setting in this space.

The provisions provide a framework for the regulation of recommendation algorithms. The provisions apply to “search filters” and “personalized recommendation algorithms” as used in social media feed algorithms (Weibo), content services (Tencent Music, streaming), online stores (e-commerce, app stores), and so on. It also regulates “dispatching and decision making” algorithms, such as those used by gig work platforms (like transport and delivery services) and “generative or synthetic-type” algorithms used for content generation in gaming and virtual environments, virtual meetings, and more.

Through the provisions in this draft, China seeks to address multiple concerns, such as the spread of mis- or disinformation, lack of user autonomy, perceived economic harms of price discrimination, online addiction, and issues regarding platformized gig work. They also reflect China-specific concerns like the fear of disaffection and consequent social mobilization. [Source]

The guidelines will form another pillar of China’s overall cybersecurity regulation, in tandem with the 2017 Cybersecurity Law. On September 1, China’s Data Security Law went into effect, regulating the collection and transfer of data across borders. On November 1, the Personal Information Protection Law will go into effect; modeled on the EU’s General Data Protection Regulation, it regulates the use and processing of individuals’ personal information. As for regulation of algorithms, Zheping Huang at Bloomberg summarized the main provisions in the draft document:

Here are some of the key proposed regulations.

  • Companies must disclose the basic principles of any algorithm recommendation service, explaining the purpose and mechanisms for recommendations in a “conspicuous” manner.
  • They must provide users convenient options for turning off algorithm recommendations and “immediately” implement any requests to opt out.
  • Algorithms should not be used for price discrimination based on users’ preferences and habits.
  • Providers must regularly assess and test their algorithms and data to avoid models that will induce users’ obsessive behaviors, excessive spending or other behaviors that violate public order and morality.
  • They must adhere to “mainstream values” and “actively spread positive energy, and promote the application of algorithms for the better.”
  • Algorithms cannot be used to set up fake accounts or falsely influence rankings and search results to benefit the provider, influence online discourse or avoid regulatory oversight.
  • Algorithms cannot endanger national security, disrupt economic and social order or infringe on the legitimate rights and interests of others.
  • Algorithm providers who can influence public opinion or mobilize the masses need to submit their services for the CAC’s approval. Those without approval could be fined up to 30,000 yuan and ordered to terminate service. [Source]

Some of these new regulations aim to protect citizens from real-world dangers and exploitation. Delivery workers have long been overworked by algorithms that dictate their working lives with ruthless efficiency. Recent reports exposing their suffering have prompted stricter measures on delivery companies such as Meituan and Ele.me. After the government released the draft regulations on algorithms, Meituan published new rules describing how its platform’s algorithm would alleviate the burden on its workers by expanding the time allotted to complete orders. 

Similarly, the government has singled out online addiction among youth as a serious, if not socially constructed, problem. In late August, the National Press and Publication Administration limited minors’ screen time for online gaming to no more than three hours per week, as state-run media called Tencent’s video games “spiritual opium.”

Algorithms are also a source of online price discrimination against consumers. In January, a government-backed consumer group criticized Chinese big tech firms’ algorithms for providing targeted search results that obscure true costs and negative reviews of products, and for using consumers’ personal data to calculate different prices based on how much individual consumers might be willing to pay. In early July, China’s market regulator issued draft rules that would ban online price discrimination and force businesses that violate these rules to pay a fine of 0.1 to 0.5 percent of their annual sales, or even to suspend their operations. 

A related objective of the government’s algorithm regulation plan is to prevent companies from manipulating search results for end users. This entails exposing existing search algorithms and revising them to prevent “abuses such as interference in public opinion.” In doing so, the CCP will be adding to its censorship arsenal. As Yun Jiang points out in China Neican, limiting algorithms to only those that “orient towards mainstream values” and “actively transmit positive energy” could in practice mean censoring those that spread values unacceptable to the CCP and content that is critical of the CCP. Shen Lu at Protocol described how the draft guidelines aim to control public opinion and moderate content:

[T]he drafted rules also evince CAC’s strong desire to control public opinion and moderate social media content. Service providers are asked to “uphold mainstream value orientations” and optimize algorithmically-based recommendation mechanisms to “disseminate positive energy.” Tech companies are blocked from “entering illegal or undesirable keywords as user interests or as user tags” and then “push information content accordingly,” nor may companies “set discriminatory or biased user labels.” And they are also required to allow autonomous user choices and build mechanisms that allow humans to manually curate hot topics, popular search terms, trending topic charts and pop-up windows to “vigorously present information content that conforms to mainstream value orientations.” [Source]

However, achieving the desired effects of the regulatory system is no simple task. Since algorithms are often shaped by constantly shifting dynamic inputs and unpredictable organic content, making them transparent to users and codifying them into law is complex. Eliminating “addictive” features from platforms such as Douyin, whose entire business model depends on users spending more time watching an endless stream of videos, might destroy such apps. And netizens faced with information overload might not be willing to renounce the convenience of content curation algorithms.

In an interview with China Media Project, Rogier Creemers listed other factors that could explain the CCP’s concerted push for greater regulation in cybersecurity:

[…] because the Chinese government regulates according to purposes and not to principles, it can actually be more agile. It can say: “This wasn’t a problem when these businesses were relatively small and serving 100 million people. Now, these businesses are really big and serving a billion people. So the game has changed, and we now need different rules to deal with this situation.”

On the one hand, you might say this is a centralized campaign. On the other hand, you might say that the stars have aligned — that political conditions are now so that it is logical for all these bureaucracies to intervene because the problems are just getting so big.

The People’s Bank of China is looking at the financial stability risks that emerge from having unregulated money creation through Ant Financial’s services. The Cyberspace Administration of China is looking at the fact that suddenly you have smart cars with lots of sensors, generating all kinds of data. And then you have a situation where if one ministry does something the others don’t want to be seen to be doing nothing.

Another aspect is that for several regulators, this is actually an opportunity to raise their profile. They can raise their bureaucratic standards, get bigger budgets, bigger staff allocations, more prestige, and so on. All of these things are factors. It’s all part of that complex witch’s brew. [Source]
