
Rule of Law Guarantee for Common Governance of Network Information Content Ecology


Author: Zhao Zerui


PhD Candidate, KoGuan School of Law, Shanghai Jiao Tong University

Research Assistant, China Institute for Socio-Legal Studies, Shanghai Jiao Tong University


Abstract: Amid the government management dilemma caused by the platform revolution, the deterioration of the network information content ecology has made the participation of network platform companies in governance a global consensus. To remove the obstacles that existing laws pose to the active management of network platform companies, China and the United States have promoted the common governance of the network information content ecology by establishing content supervision obligations and by creating legal liability exemptions, respectively. However, the existing legal reforms stop at encouraging the active management of network platform companies and do not coordinate, through unified legal rules, the different management modes of the government and the network platform companies, which leads to power disputes between the two. To build a legal system that supports the common governance of the network information content ecology, the law should transform the business management means of network platform companies into a formal public management mechanism through procedural rules, and adjust the government's management responsibilities and the standards for determining the liability of network platform companies, so that the two management modes can operate freely.


Keywords: network platform; network information content ecology; common governance; rule of law guarantee


1. Introduction


In the past 20 years, driven by Internet technology, our way of life has changed, and network platforms have become important venues for economic and social activities. The penetration of network platforms into social life has, on the one hand, made it easier for people to transmit and create information, but on the other hand it has also led to the proliferation of illegal and undesirable information, and the network information content ecology in countries around the world has begun to deteriorate. In response, the United Kingdom, France, Russia, Australia and other countries have issued documents to combat illegal and undesirable information. Since the 18th National Congress, the Party Central Committee has also attached great importance to building China into a strong cyber power, emphasising that a clear and healthy cyberspace ecology is in the people's interest, while a foul and deteriorating one is not, and that it is necessary to 'build a good online ecology and give play to the role of the network in guiding public opinion and reflecting popular will'. Under this guidance, China promulgated the Provisions on Ecological Governance of Network Information Content in 2020, which clearly distinguish the positive-energy information to be encouraged, the illegal information to be combated, and the undesirable information to be resisted in the network information content ecology, and which set the goal for China's ecological governance of network information content. To achieve this goal, in 2021 the State Internet Information Office, together with the Ministry of Industry and Information Technology and the Ministry of Public Security, publicly solicited opinions on the Measures for the Administration of Internet Information Services (Revised Draft for Comment), in preparation for establishing China's network information content supervision system.


In this context, in order to better understand the global legal changes concerning network information content in recent years, and to analyse how the modern rule of law system should guarantee the good governance of China's network information content ecology, this paper discusses in turn: the necessity of common governance of the network information content ecology; the legal obstacles to, and the current state of reform regarding, the participation of network platform companies in common governance; the common governance dilemma that the network information content ecology still faces under the current legal reforms; and how future reform of the rule of law system should support common governance of the network information content ecology by the government and network platform companies.


2. The Need for Common Governance of Online Information Content Ecology


The concept of "governance" originates from the public management reforms that emerged globally in the 1990s. Unlike "administration", which relies on mandatory governmental directives and prohibitions, "governance" places more emphasis on the equal and joint organisation and management of society by multiple subjects under the coordination of rules. The order-creating mechanism under the concept of "governance" does not rely on the authority and sanctions of the government, but focuses on integrating the self-organisation mechanisms of different governance subjects and on respecting and trusting society's self-generated order. This conceptual shift from government administration to pluralistic common governance has also occurred in the field of the network information content ecology.


2.1. Government Management Dilemma of Network Information Content Ecology


Prior to the advent of online platforms, governments, as the main institutions of state administration, had always sought to control the flow of certain information in society, and all countries allow some such control. For example, even in democracies that claim broad protection for freedom of expression and association, there is a general consensus that child pornography, speech that violates the privacy and reputation of others, and instructions for producing high-risk items such as weapons and drugs should be subject to strict government control. Governments therefore relied on the administrative management of information intermediaries such as postal services, publishing houses, and radio and television stations to control the publication and dissemination of such illegal and undesirable information. However, after the fifth technological revolution pushed mankind into the information age, states gradually realised that decentralised digital networks made it difficult for the government to effectively control the flow of information. With the help of information and communication technologies such as the Internet, people no longer need the editorial gatekeeping of traditional intermediaries such as post offices, publishing houses, and radio and television stations to publish and disseminate information, and all kinds of illegal and undesirable information have gradually escaped the government's administrative supervision and begun to spread virally online. As a result, the long-standing and stable consensus on government management of information flows has begun to waver, and a recurring controversy has emerged around the globe: whether governments still have the ability to manage the online information content ecosystem on their own in the information age. The United States and China, the two countries with the highest global Internet development indexes, have both experienced a deterioration of their online information content ecosystems in the wake of the platform revolution owing to insufficient government regulatory capacity.


In 1997, the first social networking platform, SixDegrees, was born in the United States, and by 2007 MySpace, with more than 68 million online users, and Facebook, with 32 million users, had emerged. However, the rise of online platforms in the U.S. also induced a deterioration of the online information content ecosystem. A study of user comments on news platforms by American researchers found that incivility is a common feature of public discussion on online platforms, especially when platform users can comment anonymously. According to the statistics, 55.5 percent of online content items have at least one uncivil user comment beneath them. The Pew Research Center also released a study on online harassment, which reported that 73 percent of adult users had seen someone being verbally harassed in some way on an online platform, and 40 percent had experienced it personally.


China faces the same problem. As of 2016, the number of Internet users in China exceeded 700 million, and Internet penetration had surpassed the world average. In this enormous online information content ecosystem, content publishers frequently exploit sensational subject matter to attract attention, leading to the proliferation of illegal and undesirable information content. Official media have repeatedly criticised such vicious incidents in the online information content ecosystem by name, but this has instead attracted more attention, in effect bringing more traffic to these content publishers and the platforms that spread them. In this regard, General Secretary Xi Jinping made clear at important meetings such as the 2016 Work Conference on Cybersecurity and Informatisation and the National Conference on Propaganda and Ideology that great importance should be attached to the construction of the online information content ecology and to strengthening the guidance of online public opinion.


In the face of the global deterioration of the online information content ecosystem triggered by the platform economy, all countries have generally realised that the administrative management model relying on government monitoring alone can no longer adapt to the decentralised and complex network society. As a government report from the United States puts it: the speed of development of modern information technology has completely outpaced the ability of governments to manage online information content on their own, and to manage the flow of information on the Internet through an administrative regulatory model would be like trying to regulate a transport system in which new roads, vehicles and fuels are appearing all the time. Therefore, governments are turning their attention to online platform companies, which have mastered new management models, in the hope of leveraging their technological innovations to manage the deteriorating online information content ecosystem.


2.2. "Boosting" by web platform companies based on web architecture design


Unlike the government, which relies on administrative agencies to issue mandatory directives and bans, online platform companies manage the online information content ecosystem by using code to design a network architecture that guides people's behavioural choices, so that people are "boosted" into actively publishing and disseminating positive information content. This architecture-based "boosting" by network platform companies does not require coercion; on the premise of fully guaranteeing people's freedom of choice, it adjusts their choice space or behavioural motivations, which reduces the side effects of behavioural constraints to a minimum, or even to zero. Moreover, this management model is continually updated and improved as network architecture design innovates. This paper takes only the "modular protocol" and "user comments" architectures as examples.




Existing network platform companies have generally adopted the "modular protocol" to manage the information content posted by users. A "modular protocol" works like a filter: it allows certain types of information to flow more easily while hindering the free flow of other types. The development of "modular protocols" initially stemmed from the commercial competition of online platform companies, which wanted to use the network architecture and user profiles to provide each user with an information interface matching his or her specific preferences, in order to increase user stickiness and maintain user numbers. While this meets users' customisation needs, it also gives platform companies the ability to manage content. With a "modular protocol" network architecture, a platform company can make it easier for users to receive and share positive information content, while making it more difficult for them to publish illegal and undesirable information content. For example, Twitter, YouTube and other U.S. social media platforms have developed "hash value" recognition technology that can automatically identify the type of an image or video in order to recommend content to specific users; it can be thought of as "fingerprint" recognition for images and videos. With the help of this technology and "modular protocols", these platforms can match uploads against a database of child pornography hashes collected by the National Center for Missing and Exploited Children (NCMEC), impede the distribution of child pornography, and provide the police with information about offenders.
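To make the mechanism concrete, the following is a minimal illustrative sketch (not any platform's actual system) of hash-based matching against a database of known illegal material. It assumes exact cryptographic hashing for simplicity; real deployments use perceptual hashes such as PhotoDNA that tolerate resizing and re-encoding, and the set of known hashes shown here is hypothetical.

```python
import hashlib

# Hypothetical stand-in for a database of known illegal-image hashes
# (e.g., the kind maintained by NCMEC). Plain SHA-256 only matches
# byte-identical files; production systems use perceptual hashing.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_hash(path: str) -> str:
    """Compute the SHA-256 hex digest of an uploaded file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def screen_upload(path: str) -> str:
    """Return a moderation decision for an uploaded image or video."""
    if file_hash(path) in KNOWN_ILLEGAL_HASHES:
        # Block distribution and flag the upload for reporting to the authorities.
        return "blocked_and_reported"
    return "allowed"
```

The point of the sketch is only that the filtering decision is made by the architecture itself, before any human editor intervenes.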


In addition, driven by commercial competition, network platform companies have developed the "user comment" architecture in order to surface the more popular and interesting items from a large amount of information content. Under this architecture, users find a like or rating button next to each piece of content and can post comments, via pop-ups or comment areas in the middle or at the bottom of the content, which can themselves be liked or rated. Using this architecture, platform companies can quickly and automatically rank the more popular and interesting posts and comments in an easy-to-view position, which in turn helps users find the content they are likely to enjoy. This also opens up the possibility of controlling illegal and objectionable content. For example, social media platforms often place a report button next to content, allowing users to send reports of illegal content to the platform company, which then arranges for its employees to review the reported content and decide whether to remove it. Some news media platforms in the US even require users, before commenting on a news story, to rate the civility of two randomly selected existing comments, and staff then manually review user comments that the majority of users deem uncivil.
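The "user comment" architecture described above can be sketched, under hypothetical names and a deliberately simplified in-memory data model, roughly as follows: likes determine ranking, while reports do not remove content but queue it for manual review.

```python
from dataclasses import dataclass, field

@dataclass
class Comment:
    comment_id: int
    text: str
    likes: int = 0
    reports: int = 0

@dataclass
class CommentSection:
    comments: list[Comment] = field(default_factory=list)
    review_queue: list[Comment] = field(default_factory=list)

    def like(self, comment: Comment) -> None:
        comment.likes += 1

    def report(self, comment: Comment) -> None:
        # A report does not delete the comment automatically; it only queues
        # the comment for manual review by the platform's staff.
        comment.reports += 1
        if comment not in self.review_queue:
            self.review_queue.append(comment)

    def ranked(self) -> list[Comment]:
        # Surface the most-liked comments first, mirroring how platforms
        # place popular comments in an easy-to-view position.
        return sorted(self.comments, key=lambda c: c.likes, reverse=True)
```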


As Lessig predicted, even without government requirements or help, online platform companies, driven by market forces, will use code to design network architectures that facilitate their control over the content of users' information.



3. Legal Impediments to the Participation of Online Platform Companies in Shared Governance and Current Status of Reform


After entering the age of information technology, promoting the participation of online platform companies in governance has become a general consensus among countries. However, the publisher liability that grew out of the government's sole-management model has impeded active management by online platform companies.


3.1 Obstacles of Publisher Liability to Online Platform Companies' Management of the Online Information Content Ecology


After the industrial revolution, the state's management of the information content ecosystem relied mainly on the government's administrative supervision of traditional information intermediaries, and post offices, publishing houses, radio and television stations, as the few subjects with the ability to publish information content, were subject to strict control by the administrative authorities. The legal system, in order to match this administrative mode, has developed a system of liability determination based on the criterion of "editorial control", the so-called publisher's liability. It requires those who have the ability to edit information content (collectively called "publishers") to bear joint and several liability for the damage caused by illegal and undesirable information, so as to urge these publishers to actively cooperate with the supervision of the administrative departments. However, with the emergence of online platforms, the publication and dissemination of online information content is no longer under the control of these publishers, and the person who publishes the illegal or undesirable information content may use anonymity services to hide his or her true identity, making it difficult for victims to locate the original publisher of the information. Moreover, even if the original publisher of the offending information is found, these individuals may often not have sufficient assets to compensate for the damages caused by the widespread dissemination of the offending information. This has led to online platform companies becoming more attractive defendants in cases of dissemination of illegal information on the Internet, resulting in a large number of information infringement cases globally in which online platform companies are defendants, and in which courts in various countries can only hold them jointly and severally liable in accordance with the liability of the publisher.



Under Section 578 of the Restatement (Second) of Torts, "a person who republishes a defamatory writing is liable as if he had originally published it." U.S. common law thus treats each repetition of infringing material, whether oral or written, as a new publication that independently grounds tort liability, even if the republisher attributes the infringing content to the original publisher. Publisher liability has traditionally been applied to print media such as newspaper and book publishers and to broadcast media such as radio and television, because these republishing entities exercise strong editorial control over the content they publish; in judicial practice, U.S. courts therefore usually require these organisations to assume the same degree of legal liability as the original publisher of the infringing content. The first time an online platform company was recognised as a publisher and required to bear publisher's liability was the 1995 case of Stratton Oakmont, Inc. v. Prodigy Services Co. The case arose when a Prodigy user posted on a bulletin board what the plaintiff believed to be defamatory information. The main reason the court held Prodigy liable for the user's content was that Prodigy openly declared to the public and its users that it would moderate the bulletin board in order to make it easier for users to keep up with the latest content. In addition, the court's investigation found that Prodigy updated the bulletin boards using a specific code-designed network architecture to ensure that bulletin board messages were up to date and easy to read. The court therefore found that Prodigy's platform already exercised editorial control over users' information.


The earliest "Measures for the Administration of Internet Information Services" of China also adopted the same idea of liability determination, and Articles 15 and 20 of the regulations adopted the publisher's liability, stipulating that network platform companies should be held legally liable as long as they copy, publish, or disseminate unlawful or undesirable information due to their control of user content. Under the publisher's liability, which takes editorial control as the criterion for determining liability, the courts in the United States and China, when faced with a large number of cases of illegal information infringement caused by the deterioration of the online information content ecology, can only determine that the online platform company and the publisher of the illegal information content are jointly and severally liable together. This seriously affects the incentive of network platform companies to develop a network structure that is easy to control, and adds a huge legal cost for network platform companies to manage the network information content ecology. In order to avoid the infringement liability that may arise from controlling the online information content ecosystem, online platform companies have refused to review and edit the content posted by users. With the government's lack of regulatory capacity and the online platform companies' fear of regulation, the online information content ecosystem continues to deteriorate.


In order to remove the impediment of publisher liability to the active management of online platform companies, the United States and China have taken different paths of legal reform, and as a result two different configurations of shared governance of the online information ecosystem have emerged.


3.2 U.S. Legal Reforms to Promote Platform Autonomy with Liability Immunity



The United States, as the birthplace of online platforms, was the first country to realise that publisher liability was an obstacle to the active management of online platform companies, and it took the lead in reforming its legal system with liability immunity at the core. On 4 August 1995, less than two months after the decision in the Stratton Oakmont case, U.S. Congressman Cox spoke on the floor of the House of Representatives, highlighting the importance of online platform companies to the management of the information content ecosystem and criticising publisher liability as an impediment to the state's control of illegal and objectionable information. His speech can be said to have set the basic course for subsequent institutional reform in the United States. Shortly after Cox's speech, the U.S. Congress, in order to protect freedom of speech and encourage online platform companies to take the initiative in controlling illegal and undesirable information such as obscenity and violence, passed the Communications Decency Act (hereinafter the "CDA") in 1996, whose impact continues to this day. Section 230 of the CDA is widely regarded by scholars as an important legal guarantee for the development of the U.S. platform economy and for the high degree of autonomy of online platform companies.


Section 230(c)(1) and (c)(2) of the CDA provide the basis for the exemption of online platform companies from liability. Section 230(c)(1) provides that an online platform company will not be treated as the publisher of information posted by others. This gives online platform companies strong immunity, protecting them from claims arising from users' publication of illegal content. Section 230(c)(2), in turn, provides that online platform companies will not be held liable for actively controlling illegal and objectionable information posted by users. The first time a U.S. court interpreted Section 230 of the CDA was in Zeran v. America Online, Inc. In that case, the plaintiff sought to hold America Online liable for defamatory information posted on its bulletin board. At Zeran's request, America Online removed the initial postings, but the defamatory messages were re-posted to the same bulletin board, with minor modifications, by other users. Because of these posts, Zeran received threatening phone calls, and he sought to impose common law publisher's liability on America Online. The Fourth Circuit Court of Appeals rejected this argument, holding that Zeran's claim undermined the purpose of the CDA and that online platform companies should not be held jointly and severally liable, even if they exercised editorial control over the content of the offending messages.


After Zeran, in order to further incentivise the autonomous management of online platform companies, U.S. courts interpreted Section 230 of the CDA ever more broadly, excluding not only state law remedies but also injunctive relief. For example, in Doe v. America Online, Inc., a mother filed suit under state law against America Online after photographs of her son being sexually assaulted were sold in America Online's chat rooms. The trial court granted America Online's motion to dismiss the plaintiff's claim under Section 230 of the CDA, the dismissal was affirmed on appeal, and the Florida Supreme Court likewise affirmed, declaring that state law remedies inconsistent with Section 230 were expressly pre-empted by it. Even online platform companies that edit content posted by users have been granted immunity by the courts, so long as the edits themselves are not illegal and infringing. For example, in Batzel v. Smith, the court held that as long as the online platform company did not alter the basic form and general content of the illegal and objectionable information, Section 230 of the CDA shielded it from any liability that might arise from its management of users' information content.


Under this broad framework of legal liability exemption, the United States has formed a shared-governance system for the online information content ecology in which online platform companies enjoy a high degree of autonomy. Take Google's user agreement and management policy as an example: a document of roughly 1,200 words prohibits a wide range of illegal and undesirable user content. Facebook, the largest social platform company in the United States, is no exception. As of April 2009, 150 of Facebook's 850 employees were dedicated to reviewing user-reported content and fostering a positive communication environment. Any content that contradicts the standards of discourse promoted by Facebook is blocked through the "modular protocol"; users are given an extremely easy way to report content, and once a user is verified as having posted unlawful content, he or she is blacklisted and cannot create a new page.


3.3 Chinese Legal Reforms to Supervise Platform Management with Content Regulation Obligations


After the U.S. chose to establish broad exemptions from liability to encourage independent management by online platform companies, China also granted online platform companies certain exemptions from tort liability in the domain of private law. For example, China's Tort Liability Law, promulgated in 2009, provides a "notice and deletion" rule and a "red flag rule" for online platform companies faced with infringing information, and the Legislative Affairs Commission of the Standing Committee of the National People's Congress has pointed out that online platform companies have no general obligation to review the content posted by their users.


However, in the field of public law, China has created a large number of content regulation obligations to urge online platform companies to maintain the online information content ecology. For example, Article 47 of China's 2016 Cybersecurity Law explicitly requires online platform companies to strengthen supervision of users' information content; if they find information whose publication or dissemination is prohibited by law, they must immediately take relevant measures and report it, or face corresponding administrative penalties. Moreover, the Interpretation on Several Issues Concerning the Application of Law in Handling Criminal Cases of Illegal Use of Information Networks and Assisting Criminal Activities on Information Networks, jointly issued by the Supreme People's Court and the Supreme People's Procuratorate, explains and refines the application of the crime of "network service providers refusing to fulfil the obligation of information network security management" stipulated in Amendment IX to the Criminal Law, and clarifies the criminal liability of network platform companies that violate their content supervision obligations. The Provisions on Ecological Governance of Network Information Content, issued and implemented by the State Internet Information Office in 2020, likewise stipulate the legal responsibility of network platform companies that refuse to fulfil their content supervision obligations.


Taken together, the content supervision obligations established for network platform companies in China mainly include the following: (1) requiring network platform companies, upon discovering unlawful and undesirable information, to immediately take measures such as warning, refusing to publish, or deleting it, and to keep records and report to the competent government authorities, for example Article 7 of the Decision of the Standing Committee of the National People's Congress on Safeguarding Internet Security, Article 4 of the Measures for the Administration of Internet Information Services, Articles 4 and 7 of the Provisions on the Administration of Information Services for Mobile Internet Applications, and Article 16 of the Provisions on the Administration of Internet News Information Services; (2) requiring online platform companies to set up complaint and report portals, requiring the local Internet information offices to review the content of these complaints and reports, and requiring online platform companies to assist government departments in supervision and inspection, for example Article 24 of the Provisions on the Network Protection of Children's Personal Information, Article 19 of the Provisions on the Administration of Internet Live Broadcasting Services, Article 10 of the Provisions on the Administration of Internet Follow-up Comment Services, Article 11 of the Provisions on the Administration of Internet Forum and Community Services, and Article 12 of the Provisions on the Administration of Internet Group Information Services; and (3) requiring network platform companies to provide user information and technical support for the supervision of network information content by the national and local Internet information offices, for example Article 8 of the Cybersecurity Law, Article 14 of the Measures for the Administration of Internet Information Services, Article 19 of the Provisions on the Administration of Internet News Information Services, and Article 16 of the Provisions on the Administration of Internet Group Information Services. Through these legal reforms, China's information content management capacity has been strengthened, forming an ecological governance model of network information content under unified government leadership and control.


In summary, faced with the obstacles that the established legal system posed to the participation of network platform companies in the common governance of the network information content ecology, both the United States and China have taken the reform direction of encouraging active management by network platform companies. The difference is that the US tends to trust market mechanisms and, by establishing legal liability exemptions, gives online platform companies broad space for autonomous management, whereas China focuses on the leading role of the government, emphasising content regulation obligations that require online platform companies to assist the government in fulfilling its management responsibilities.


4. Co-Governance Dilemma in Chinese and U.S. Legal Reforms


However, the existing legal reforms in China and the United States go no further than promoting the active management of online platform companies, and do not reconcile the conflicting management modes and powers of the government and online platform companies. This is mainly reflected in the fact that the existing legal reforms are caught in an "either/or" dilemma of conflict and confrontation between the management modes of the government and the network platform companies: the government's regulatory mode, based on the control of administrative organs, and the boosting mode of the network platform companies, based on network architecture, cannot be coordinated through the legal system, and even produce conflicts in which they cannot coexist. The current states of shared governance in China and the United States are two extreme cases of this conflict between the management modes and powers of the government and online platform companies, and the existing legal reforms of both countries have failed to achieve the desired effect of shared governance, which confirms the necessity of further reform of the rule of law system.


4.1 Abuse of Power and Social Polarisation Due to Broad Exemptions from Legal Liability


The U.S. has provided legal space and trial-and-error opportunities for the independent management and innovation of online platform companies through the establishment of broad exemptions from legal liability. However, this has also led to the government's inability to effectively participate in the common governance of the online information content ecosystem, which in turn has led to the abuse of the management power of online platform companies and exacerbated the phenomenon of social polarisation.


The U.S. courts' consideration of the abuse of management power by online platform companies originated in the 2004 case of MCW Inc. v. Badbusinessbureau.com ("BBB"), an online forum that allows users to browse business complaints and grievances posted by the public. Some users submitted numerous negative messages about MCW, which led MCW to sue the BBB for removal of the content and damages; the BBB defended itself under Section 230 of the CDA. The BBB would not have been liable had it merely provided space for users to comment, but it did not stop there: it used a web architecture with derogatory headlines that induced people to post negative information, in order to cater to users' appetite for negative comments. Although the judge in the case ultimately rejected the plaintiff's claim, it led the courts to reflect on the immunity granted by Section 230 of the CDA. The BBB case is not unique: some online platforms, in an effort to attract users and increase traffic, have designed web architectures that entice users to post pornographic, defamatory, and extremist content, and some have even encouraged users to post nude or sexually explicit photos of their ex-partners to exact pornographic revenge. Law professor Mary Anne Franks, who has led the fight against non-consensual pornography, criticised Section 230 of the CDA: "Given the ease with which providers of non-consensual pornography can anonymously access or distribute the images, it is difficult to ascertain and prove who they are (especially for the purposes of a lawsuit), and under CDA Section 230, victims are in turn barred from bringing civil claims against websites that distribute such material, which has led to a proliferation of pornographic retaliation."


Faced with such abuse of code-based regulatory power by online platform companies, U.S. courts began to limit the scope of the immunity granted by Section 230 of the CDA. The United States Court of Appeals for the Ninth Circuit issued its first judgment limiting the scope of Section 230 in Fair Housing Council of San Fernando Valley v. Roommates.com. The judge in that case posed a question: if a website encourages users to publicly post the name, address, and credit card information of the person they are commenting on, together with embarrassing facts or stories about that person, for comfort or to vent their anger, encourages users to fabricate as much as they like in order to satisfy readers' taste for such content, and has no mechanism for reviewing that content, is such a website the kind of online platform that Section 230 of the CDA seeks to protect? A majority of the judges on the panel concluded that, under certain circumstances, a website should be found to be an information content provider. Chief Judge Alex Kozinski noted that Roommates.com had itself created the personal questions about the gender, sexual orientation, and family status of prospective roommates, so the platform company could be considered an "information content provider" with respect to those questions; and because users were required to answer those questions on the website in order to use its services, this was tantamount to compelling users to answer them as a condition of use, and the company should therefore not enjoy the exemption. After Roommates.com, the Ninth Circuit Court of Appeals, in Barnes v. Yahoo!, Inc. the following year, again rejected a district court decision granting Yahoo! immunity under Section 230 of the CDA on similar grounds.


Indeed, since the Ninth Circuit issued its much-watched opinions in Roommates.com and Barnes v. Yahoo!, U.S. courts have reversed their previous stance and begun to limit dramatically the scope of the immunity afforded by Section 230 of the CDA. An empirical analysis published in the Columbia Technology Law Review in 2017 reported: "U.S. courts issued written opinions in 2001 and 2002 in 10 cases in which online platform companies claimed Section 230 immunity. In eight of those 10 cases, the courts concluded that the online platform companies were not liable for the offending user-generated content. The remaining two cases involved intellectual property claims, which are expressly excluded from Section 230 of the CDA. In contrast, a review of all written court opinions involving CDA Section 230 issued between 1 July 2015 and 30 June 2016 found that in 14 of the 27 cases, the courts declined to grant the online platform companies an exemption from liability. Of those 14 cases, only one was an intellectual property claim; in the remainder, the defendants' invocation of CDA Section 230 was rejected because the judge found that the online platform company had abused its regulatory power in ways that triggered the generation of illegal and objectionable information content." However, the courts' continued restriction of Section 230 has now made it difficult for online platform companies to obtain immunity from legal liability, leading many scholars to argue that this is a step backward for legal reform in the U.S. Ultimately, the U.S. will still have to face the problem of how to promote good governance by online platform companies.


In addition to the abuse of power, the US reform model has exacerbated pre-existing social polarisation. Owing to the long-term effects of the commercially driven management of online platform companies, the proportion of Americans who hold strongly negative feelings towards those with opposing views has risen sharply since the introduction of Section 230 of the CDA. When the online information content ecosystem is left entirely in the hands of private companies, they constantly push their preferred news media content to users in pursuit of maximum profit, which makes it difficult for people to see news content that opposes their own viewpoints and in turn exacerbates conflicts of opinion among the population. For example, in the months leading up to the 2016 U.S. presidential election, social media platforms peddled "fake news": a fabricated story alleging that Democratic nominee Hillary Clinton ran an underground child-pornography ring in Washington, D.C. was widely circulated, and some online platforms earned millions of dollars in advertising revenue as a result. While, according to their own statements, some of these online platform companies had no particular political agenda and were simply spreading carefully crafted "clickbait" to generate the clicks, views, shares and retweets that bring advertising revenue, it has to be acknowledged that others were sponsored by capital with greater ambitions and objectives. As those sponsors hoped, some groups were inclined to believe the worst about Clinton and her team, and they shared, voted on, and retweeted these stories. Data from a Pew Research Center study show that the political divide between Democrats and Republicans in the United States reached a 20-year high in 2017.


4.2 Impediment of Content Regulation Obligations to Platform Management Mode and Platform Economic Development


Although China's existing legal reforms have simultaneously addressed the problems of insufficient government management capacity and the abuse of power by online platform companies, they impose a large number of content regulation obligations on online platform companies in an administrative manner, forcing them to regulate online information content as if they were subordinate departments of the administrative organs. This deprives them of the legal space for managerial innovation and trial and error, and at the same time hinders the development of the platform economy.


First of all, as mentioned earlier, the management of the online information content ecology by online platform companies is a boosting mode based on network architecture, which is better suited to the governance objective of guiding online public opinion and effectively reflecting popular will. Relying purely on administrative supervision can only hinder the dissemination of illegal and undesirable information; it cannot lead people to develop the habit of caring for the online information content ecology and spreading positive content, still less turn the online information content ecosystem into a place where the public can express public opinion in an orderly manner. Moreover, government suppression of undesirable information content through direct regulation tends, on the contrary, to give such content wider and more lasting attention. Take the Ma Baoguo incident, which was popular on various online platforms for some time: after People's Daily Online criticised it and the government ordered a boycott of the hype around Ma Baoguo, the incident attracted even more attention. If the government had asked the online platforms to delete the postings, it would only have drawn more attention to the illegal and undesirable content. Similarly, the Guo Meimei incident in 2011 received more sustained attention and discussion because of the government's pursuit of accountability. Beyond its inability to manage illegal and undesirable information content effectively, the administrative model also struggles to guide online public opinion. By contrast, the architecture-based boosting mode of network platform companies can, while fully protecting people's freedom of expression, guide people in a softer and more effective way to jointly maintain the network information content ecology. Overly strict content regulation obligations would instead force online platform companies to adopt a hard-edged information content regulation model.


Second, China's existing shared-governance model is also not conducive to the development of the platform economy. (1) The large number of content regulation obligations exposes online platform companies to the enormous cost of comprehensive censorship. This does not fundamentally solve the government management dilemma created by information technology; it merely shifts the government's responsibility for information control onto online platform companies and hinders their commercial development. Hundreds of millions of pieces of information flow through a network platform every day; if a platform company were required to determine the legality of each piece of information in the way a traditional government would, the personnel and review burden would be enormous. (2) The content regulation obligations imposed on online platform companies and the private law liability exemptions are liable to conflict in their legislative objectives. In order to avoid administrative penalties, online platform companies have no choice but to curb the content innovation of platform users: all user creation involving sensitive words is rejected, and all user comments discussing politics and people's livelihoods are blocked, so that the legislative purpose of the liability exemptions established in China's private law cannot be realised. (3) China's existing system places too much emphasis on the public law censorship obligations of network platform companies and does not reasonably limit the scope of their administrative and criminal liability for user content, leading to uncertainty and expansion in the determination of such liability. Network platform companies lack reasonable expectations when reviewing user content and can only pile on layer after layer of controls over any information content that might give rise to infringement. (4) Excessive government interference in the management of online platform companies will lead foreign countries to regard them as subordinate departments of the administrative authorities, which will seriously hinder the development of China's platform economy and the internationalisation of its online platform companies. The Snowden incident is the best example of government interference in the management of online platform companies triggering an international boycott. In June 2013, former National Security Agency (NSA) contractor employee Edward Snowden copied and disclosed to the media a large number of documents concerning the NSA's illegal surveillance of citizens' communications worldwide, including surveillance programmes involving many US online platform companies. In the wake of the Snowden revelations, US online platform companies were met with widespread public resistance and strong accusations from other countries. In an effort to dispel public anger, platform giant Apple took the lead in resisting government interference by strictly encrypting its voice and text communication services. In response, the U.S. government, citing the need for information control, has sought access to decrypted communications and devices by urging Congress to pass mandatory decryption legislation. But online platform companies have fought back, arguing that mandatory decryption "backdoors" created in the name of protecting national security would make the Internet unsafe for everyone.


In sum, this reform model, which forces the management capacity of network platform companies into the government management system through a large number of content regulation obligations, has maintained the unity of the ecological governance of network information content, but it has suppressed the architecture-based boosting mode of the network platform companies and hindered the development of China's platform economy.


5. Reform of the Rule of Law System to Support Joint Governance by the Government and Online Platform Companies


In summary, the reform of the rule of law system for joint governance of the online information content ecosystem cannot stop at encouraging online platform companies to participate in governance. Giving network platform companies overly broad management freedom would repeat the mistakes of the United States, while continuing to impose too many content regulation obligations is not conducive to governance innovation or to the platform economy. This paper therefore argues that the reform goal of the rule of law system should shift from encouraging the active management of online platform companies to supporting the pluralistic management modes of the government and online platform companies, and that the reform path should shift from conferring legitimacy on the management power of online platform companies to mediating the conflicts of management power between the government and online platform companies. After all, "the principle of the modern rule-of-law state boils down to one sentence: a unified legal system supports a pluralistic power structure, so that the institutional design of separation of powers and checks and balances can operate freely and in mutual coordination under unified legal rules". This paper will therefore further elaborate how reform of the rule of law system can mediate these power conflicts.


5.1 Reform of the Rule of Law System to Eliminate Power Struggles


Maurice Hauriou, discussing the relationship between national governance and law, observed that in times of social change the power of the subjects of national governance often precedes the provisions of positive law on power, and that the origin of law lies not in the self-evolution of legal texts but in the state's decision to eliminate power strife through order. Therefore, when the national governance model is transformed by the changing times, rule-of-law states adjust their legal systems to eliminate the power conflicts caused by that transformation, so that the state can evolve in step with social development under the guidance of law. The reform of the rule of law system in the United States in response to the nineteenth-century industrial revolution is a case in point.


In the early days of the United States, under the political framework of the "separation of powers", the government enjoyed only very limited "law enforcement" power, and because social relations and economic interactions were simpler, people attached greater importance to the role of the legislature in state management; government departments of the time therefore had to obtain authorisation from Congress in order to impose sanctions on private parties. However, under the influence of the industrial revolution, the state's economic affairs became more and more complicated, the demand for state management grew in every field of society, and a series of social problems concerning railways, securities, labour relations, and housing credit urgently needed state regulation. To cope with these social demands triggered by the industrial revolution, the United States enacted the Interstate Commerce Act in 1887 to encourage active management by government departments. From then on, the number of administrative agencies began to increase dramatically, and a large number of government bodies, such as the Interstate Commerce Commission, the Federal Reserve Board, the Federal Trade Commission, and the Board of Tax Appeals, came to prominence as they managed various areas of society. Government power also expanded dramatically in response to the management needs brought by social change: it was no longer confined to the "law enforcement" role within the separation-of-powers framework, but came to include certain quasi-legislative and quasi-judicial powers, such as the power to formulate administrative rules and regulations and to issue administrative rulings. The source of these powers was the demand of state management in a changed era, not a deduction from existing legal norms. This rapid expansion of governmental power, however, also led to power disputes among Congress, the courts, and the government departments: a large number of cases against government departments were brought to the Supreme Court, and there were constant debates within Congress over how to deal with administrative power that kept expanding on the basis of social needs. In order that the government, Congress, and the courts could govern the country cooperatively after this expansion of power, and to eliminate the power strife triggered by the changing times, the United States introduced a series of administrative law norms, such as the Administrative Procedure Act, the Freedom of Information Act, and the Government in the Sunshine Act, which formed the current United States administrative law system.


5.2 Procedural Rules for Transforming the Commercial Management of Network Platform Companies into Public Management


In order to transform the informal management mechanisms that online platform companies have spontaneously generated in commercial competition into a public management system recognised by the modern rule of law, we need to use procedural rules of law to achieve two effects: first, to change the network architecture design of online platform companies from purely commercial behaviour into formal public management; and second, through procedural safeguards for user comments, likes, complaints, reports, and other feedback behaviours, to transform private commercial interactions into abstract public decisions, thereby enabling online community autonomy. Specifically, legal norms can establish procedural rules in the following areas:


5.2.1 Disclosure Obligations of Network Architectures


As mentioned above, online platform companies can use "modular protocols", automated recommendation, and other network architecture designs to guide people's behavioural choices. Although this architecture-based management mechanism is not coercive, it imposes specific restrictions on the scope and form of users' expression. Therefore, in order to effectively constrain the architecture-based management of user content by network platform companies, the law should require them to disclose relevant information whenever they use network architecture to interfere with the publication and reception of user content, and to explain in what ways this architecture will affect users' content. Through such statutory disclosure and explanation, the commercial management behaviour of network platform companies can be transformed into a formal public management mechanism, and the tendency of network platform companies to misuse code-based network architecture for illicit profit can be effectively curbed.
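One illustrative way such a disclosure obligation might be operationalised is sketched below; the field names and the example entry are hypothetical, not drawn from any existing regulation or platform.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ArchitectureDisclosure:
    """Public disclosure attached to an architecture feature that shapes user content."""
    feature_name: str        # e.g. "keyword_filter", "ranking_downgrade"
    affected_behaviour: str  # which user activity it touches: posting, receiving, searching
    effect_description: str  # plain-language explanation of how content is affected
    legal_basis: str         # the published rule or agreement clause relied on

DISCLOSURES = [
    ArchitectureDisclosure(
        feature_name="ranking_downgrade",
        affected_behaviour="receiving",
        effect_description="Posts matched by the vulgar-content classifier are shown "
                           "lower in followers' feeds rather than removed.",
        legal_basis="Community Guidelines, Section 3 (hypothetical)",
    ),
]

def published_disclosures() -> list[dict]:
    """Render the disclosures for the platform's public transparency page."""
    return [vars(d) for d in DISCLOSURES]
```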


5.2.2 "Notice and Choice" Rules for Customised Push


Network platform companies provide customised information push services for commercial gain. This gives them the opportunity to block illegal and undesirable information, but it also makes the network information content ecology prone to social polarisation and kitsch under commercialised management. To attract large volumes of user traffic, social media platform companies push only those contents that capture attention, and use various reward mechanisms to stimulate authors to create the content that users favour. As a result, creators on social media platforms can survive only by continuously producing vulgar entertainment that caters to and pleases a mass audience, and in-depth content that takes time to create is crowded out by a flood of vulgar and merely amusing material. Therefore, the law should establish a procedural safeguard of "notice and choice" for users, requiring network platform companies to inform users when they collect user data and push customised information, and allowing users to refuse such push or to choose the types of information they wish to receive, so as to counteract the social polarisation and kitsch aggravated by commercialised management.
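As a purely illustrative sketch of how a "notice and choice" safeguard might be recorded and applied in code, consider the following TypeScript example; the preference fields and content categories are assumptions introduced for illustration, not prescribed by any law or platform.

```typescript
// Hypothetical sketch of a "notice and choice" preference record; the types and
// category labels are illustrative assumptions, not a prescribed standard.

type ContentCategory = "news" | "entertainment" | "education" | "commerce";

interface PushPreferences {
  consentToDataCollection: boolean;     // user was notified and agreed to data collection
  allowCustomisedPush: boolean;         // user may reject customised push entirely
  allowedCategories: ContentCategory[]; // categories the user chose to receive
}

interface PushCandidate {
  id: string;
  category: ContentCategory;
}

// Only items the user has affirmatively chosen are eligible for customised push.
function filterPush(candidates: PushCandidate[], prefs: PushPreferences): PushCandidate[] {
  if (!prefs.consentToDataCollection || !prefs.allowCustomisedPush) {
    return []; // fall back to non-personalised distribution
  }
  return candidates.filter((c) => prefs.allowedCategories.includes(c.category));
}
```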


5.2.3 Procedural Safeguards for User Feedback


In order to allow users to participate actively in the governance of the network information content ecology, the law should establish procedural safeguards for user feedback and evaluation. First, network platform companies must provide feedback channels, such as comments and complaints, for the information content on their platforms; this should be transformed from a business practice into a legal obligation. Second, network platform companies should clearly disclose, in user agreements or management measures, the rules governing management actions such as blocking, deletion and banning taken on the basis of user feedback, and any change to these rules must follow a statutory procedure of prior disclosure, solicitation of opinions and reasoned justification. Finally, for management actions of blocking, deletion and banning, the law should require network platform companies to state the rules relied upon, give explanations, and provide space for users to argue their case, so that the management of network information content is no longer a commercial act decided unilaterally by the platform but becomes public decision-making in which platform users participate.
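A minimal, hypothetical sketch of how such procedural safeguards might be represented on a platform is given below in TypeScript; the record structure, field names and admissibility check are illustrative assumptions only, not a description of any existing system.

```typescript
// Hypothetical sketch of a moderation decision that carries its rule basis, an
// explanation, and an appeal channel; field names are illustrative assumptions.

type ModerationAction = "block" | "delete" | "ban";

interface ModerationDecision {
  action: ModerationAction;
  contentId: string;
  ruleId: string;          // the published management rule relied upon
  explanation: string;     // the reasons the platform must give the user
  appealDeadline: Date;    // window in which the user may contest the decision
}

interface Appeal {
  decision: ModerationDecision;
  userArgument: string;    // the user's counter-argument, kept on the record
  submittedAt: Date;
}

// An appeal is admissible only if it states an argument and is filed in time.
function isAdmissible(appeal: Appeal): boolean {
  return appeal.userArgument.trim().length > 0 &&
    appeal.submittedAt.getTime() <= appeal.decision.appealDeadline.getTime();
}
```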


5.2.4 Judicial Remedy Channels for Account Penalties


In the information age, network platforms have become indispensable online venues for people's ordinary lives, and penalties imposed by network platform companies on user accounts can seriously affect users' freedom of expression and social interaction; for example, WeChat's banning and blocking penalties may leave users unable to carry out normal work communication and everyday expression. Therefore, in addition to establishing the corresponding procedural rules for platform penalties, the law should also provide certain judicial remedies, so that platform users' personal rights and interests are not harmed by excessive management, and so that a legal channel is opened for the judiciary to supervise the public management of network platform companies.


Through the above procedural rules, the rule of law system will be able to transform the commercial management behaviours of online platform companies into a formal management mechanism, thus positively granting online platform companies the legitimacy to manage the online information content ecosystem, as well as restricting the abuse of power by online platform companies, and avoiding the problem of social polarisation that may be triggered by the exemptions from legal liability in the existing reforms.


5.3 Adjustment of the Government's Regulatory Model and Online Platform Companies' Content Regulation Obligations


In addition to transforming the commercialised management of online platform companies into a formal management mechanism, the rule of law system also needs to adjust the government's administrative supervision model and the legal responsibilities of online platform companies, so that the two management modes can operate freely and harmoniously in common governance.


5.3.1 Government Procedural Supervision of Online Platform Companies


Existing legal reforms have made clear the advantages of online platform companies in the ecological management of online information content. The law should therefore impose on the government obligations of procedural supervision over platform management, so as to preserve the platforms' autonomous space for management while restraining the use of their management power. Such procedural supervision centres on two tasks: requiring online platform companies to make mandatory disclosures and combating their misrepresentations. Since the content that online platform companies should publicly disclose in their management has been clarified above, the law should give the government the power to compel platforms to disclose that information. In addition, because of the obvious information asymmetry between online platform companies and users, the law should also require the government to crack down on platform misrepresentation. For example, over the past decade the U.S. Federal Trade Commission has launched numerous investigations and enforcement actions against misrepresentation by online platform companies, punishing Facebook for failing to handle user data in accordance with its privacy terms, Snapchat for falsely disclosing where users' images and videos were stored, and Twitter for failing to safeguard users' information as it had promised. The basis for these penalties is that these online platforms did not manage in accordance with the rules they had disclosed.


5.3.2 Government Antitrust Regulation of Social Media Markets


With the emergence of online platform companies, governments can no longer steer public discourse by administratively controlling a small number of information publishers. In recent years, governments have therefore changed their approach to guiding public opinion and have taken the initiative to enter major social media platforms, leading online public opinion through market competition. This not only makes full use of the platforms' modes of dissemination to spread political discourse, but also allows the people and government departments to communicate on an equal footing, reflecting the government's closeness to the people. To ensure that market forces drive the healthy development of the online information content ecology, the government should strengthen supervision and management of the social media market. Under the existing legal system, antitrust enforcement against private enterprises is difficult to apply to the social media sector, because users pay with traffic and attention rather than money, so many existing market regulation mechanisms cannot be applied there. Social media platforms such as Facebook and Tencent have in effect formed market monopolies in this area, which requires the government to regulate these online platforms with new antitrust tools such as "scope-of-business restrictions" and "structural separation", thereby ensuring a fair and healthy market order in the online information content ecosystem.


5.3.3 Liability Standards for Inducements by Online Platform Companies


As discussed above, the management model of online platform companies is not administrative in nature; their constraints on users' behavioural choices are effected through the network architecture. Therefore, while exempting online platform companies from a large number of content regulation obligations, we can introduce "inducement liability" and the "substantial contribution" standard into the determination of civil liability for infringement by illegal information. In FOSTA, the United States Congress established the "inducement liability" of online platform companies in the field of pornographic information dissemination: where a platform's network architecture is sufficient to induce ordinary users to publish illegal information, the platform bears joint and several liability. This has addressed the proliferation of online platforms used for prostitution and sex trafficking, and Congress has repeatedly discussed whether to extend "inducement liability" to areas such as housing discrimination, fraud and child sex trafficking. In addition, in Fair Housing Council v. Roommates.com, the court provided a generalisable test for deciding whether an online platform company is liable for intentionally guiding users to post discriminatory rental information: whether the platform's rules for information publication make a "substantial contribution" to the infringing content posted by users, or are merely a "neutral tool". On this basis it can be determined whether the platform company should bear joint and several liability or may claim exemption from legal liability.


In summary, the future reform of the rule of law system for the common governance of network information content ecology should be based on the different operating principles of multiple management modes, and reconcile the management power of the government with that of the network platform companies. Only in this way can the two management modes operate freely in the common governance of network information content ecology.


6. Conclusion


"The rule of law in modern countries can provide legitimacy to decisions recognised by all parties through procedural justice that dissipates confrontations caused by differences in substantive value judgements." In terms of online information content ecological governance, legal system reforms focusing on procedural rules can effectively support the dual power structure between the government and online platform companies, and find a middle way of governance that can reach an international consensus among the existing Chinese and American institutional reform models. Perhaps the government and online platform companies play different roles in the governance systems of different countries, but the dilemma of power conflict brought about by social pluralism is the same. Therefore, how to set up procedural rules to delineate and coordinate the power between the government and online platform companies is an issue that countries around the world need to face together. Although China and the United States have adopted different institutional reform paths, they can jointly provide valuable historical experience for the modernisation and reform of global national governance.