Public-Private Dualistic Regulation of Private Power on Social Platforms
Wang Yan
Professor, School of Law, Guangdong University of Foreign Studies
Abstract: Social platforms have developed a unique form of private power through digital content governance in cyberspace, and their abuse of this private power calls for legal regulation. Public law norms such as constitutions face the dilemma that social platforms are not qualified subjects of public law regulation, while private law norms such as civil codes remain entrenched in the misconception that platform services are free of charge and that service contracts embody the parties' free will. The European Union has therefore enacted the Digital Services Act, which imposes public law obligations such as information disclosure, prudent content moderation, risk assessment and improved remedies to restrain the arbitrary exercise of platforms' private power; it assesses the legality of platform-user service contracts in both form and substance through the Unfair Contract Terms Directive; and it adopts a public-private co-governance approach to strengthen external supervision of social platforms' private power and to compensate for users' lack of bargaining power. Social platform regulation in China focuses on the control of illegal content and lacks awareness of restraining platforms' private power and safeguarding users' rights. China needs to impose public law obligations on social platforms commensurate with their private power, improve the substantive legality review of platform-user service contracts, and optimize the public-private co-governance route to enhance the effectiveness of the regulation of social platforms' private power.
Key Words: social platform, private power, public law regulation, private law regulation, public-private co-governance
Scholars have pointed out since the early days of the Web that the order of cyberspace depends on the complex interaction of four regulatory forces: law, the market, social norms, and private structures or norms. Among these four forces, platform norms are not only a key element of cyber governance but also shape the other three regulatory forces. With the help of these norms, platforms allocate information resources, exploit the borderless nature of the network and the opacity of technology to displace traditional gatekeepers as the principal regulators, perform the function of a "public law enforcer" in the name of self-governance, and create a "private power" comparable to the state's public regulatory power by virtue of their strong economic influence and capacity for social mobilization. Platforms have provided users with an adaptive cyberspace order through the exercise of private power, but at the same time they have repeatedly abused that authority in the governance of digital content. Among platforms of different business types, social platforms provide media and communication services directly to users, and the nature of their business requires them to formulate content management guidelines, implement content management measures, and handle disputes in which users challenge those measures. As rule makers, enforcers and adjudicators, social platforms face the greatest risk of abuse of power compared with other types of platforms, and the harmful consequences are easily magnified by their user scale and network effects. For example, social platforms such as Facebook, Twitter and YouTube have repeatedly triggered controversy by setting broad exemption clauses in service agreements, arbitrarily taking down user content to restrict users, and infringing users' rights to personal data, copyright and freedom of expression. The experience of social platform users at home and abroad in defending their rights shows that public law norms are usually held inapplicable to social platforms because of their status as private subjects, while the principle of autonomy in private law further indulges social platforms in overstepping their boundaries in civil conduct; structural reform of the legal system is therefore needed to restrain and regulate the private power of social platforms.
1. Abuse of Private Power on Social Platforms and Regulatory Expectations
There is no consensus in the academic community on whether private power originates from the practice of self-governance, from empowerment by the market or by technology, or from the obligations that positive law imposes on platforms; it is, however, widely agreed that private power is not an ordinary civil or commercial right but a power of private subjects that carries the character of management and supervision. Social platforms, as private enterprises, play the role of regulators in cyberspace even though they are not state regulators. They have formed a new set of institutions, or an ecosystem, outside the existing law, and if the exercise of their power is not restrained, it will be transformed from a force of social mobilization into a destructive force. Current practice shows that social platforms frequently abuse their private power and infringe users' rights and interests when formulating platform guidelines, implementing management measures and resolving disputes.
First, social platforms use "micro-legislative procedures" to formulate platform guidelines and service agreements, setting red lines for online speech and thereby shaping norms and controlling order in cyberspace. For example, the Facebook Platform Guidelines apply not only to the 2.6 billion Facebook users but also radiate out to a large number of small and medium-sized social platforms, and have come to be known as the "Facebook Rules". Rather than a social management contract reached through collective action, the Facebook Platform Guidelines and Service Agreement are formed through the vertical control of the power holder over individual members: they take the form of an agreement but lack its substance. Owing to the absence of consultation with users, the guidelines and service agreements of social platforms such as Facebook, YouTube and Twitter are riddled with platform exemption clauses and clauses derogating from users' rights. For example, the service agreements of YouTube and Twitter provide that the platforms enjoy absolute authority to publish, edit, modify and delete users' content, and that users may not hold the platforms liable for wrongfully editing or deleting content even where the platforms have been negligent in reviewing and processing it. Damages recoverable in any lawsuit initiated by a user are capped at $500 by YouTube and $100 by Twitter.
Second, social platforms have been described as private "bureaucracies" with surveillance functions and police-like powers to enforce rules against non-compliant users, yet their enforcement processes are self-interested and their content review is inconsistent. Because social platforms derive advertising opportunities from users' attention and data, their digital content governance focuses on content curation rather than on controlling non-compliant content; as a result, targeted advertising and content recommendation systems interfere with users' access to information and decision-making, and legitimate content is taken down by platforms because it touches on the interests of advertisers. Some social platforms also favour shadow bans that hide user activity, and "deprioritized" or "banned" searches that lower the ranking of searched information. These measures are usually not listed in the platform's service agreements or guidelines and are not easily noticed by users; they reduce the probability of disputes between users and social platforms, but they violate users' rights to freedom of expression and to information.
Finally, social platforms have constructed dispute resolution mechanisms between platforms and users that charge no fees for internal dispute resolution, and some, such as Facebook, have even taken the anthropomorphic "governmentalisation" of the platform to the extreme by creating a quasi-supreme court, the Facebook Oversight Board. However, the judicial order shaped by the platforms is characterised by unilateral exemption of the platforms from responsibility for managing digital content, limitations on users' statutory remedies, and a lack of due process constraints. Social platforms adjudicate their own internal disputes without explaining the reasons for their decisions; in external litigation, they choose the forum and applicable law most favourable to themselves, prohibit users from initiating group litigation, raise users' litigation costs, and limit the amount of damages users can claim.
It can thus be seen that, compared with e-commerce and search engine platforms, social platforms are more likely to exercise quasi-legislative, quasi-enforcement and quasi-judicial power through the formulation of platform service agreements, the management of user content and the handling of disputes when providing social media services to users. As a new economic and social model of the twenty-first century, social platforms have an ever-growing capacity to mobilise society and intervene in the economy, and the empowerment of algorithms and big data technology obscures how social platforms supply information and control speech, making legal intervention and regulation necessary to prevent the abuse of private power.
2. The Dilemma of Public-Private Dualistic Regulation on Private Power of Social Platforms
The private power enjoyed by social platforms combines the private law status of its subject with the coercive character of power, and an effective legal response to this special form of power is needed. However, most national laws strictly separate public power from private rights and regulate or protect them through public law and private law respectively, which creates an obvious dilemma in the legal regulation of platforms' private power, a hybrid of "public" and "private" forms of power: public law stops short of regulating it because social platforms are not qualified subjects, while private law struggles to regulate it because of the illusion that platform services are free of charge and that user agreements reflect the parties' concordant free will.
2.1 Subject ineligibility of the regulation of private power of social platforms under public law
In order to constrain infringements of private rights caused by the improper exercise of public power, public law norms such as domestic constitutions and administrative law impose requirements of due process, checks and balances, and accountability, and allow individuals whose rights have been infringed by public power to seek public law remedies. When social platforms censor users' speech, the abuse of their private power often encroaches on users' fundamental rights to freedom of expression and access to information. Yet public law norms are often unable to regulate them because of the platforms' status as private subjects.
In John v. Twitter, Inc., a 2018 case in the United States, the plaintiff, whose account had been terminated by the platform, sued Twitter for violating his right to freedom of expression under the First Amendment to the United States Constitution. The plaintiff argued that Twitter was the modern equivalent of a public forum for debate, and that users' right to express their views on the platform should be guaranteed rather than monopolised by the website. The United States court held, however, that Twitter, as a private business, is not bound by the First Amendment, that the law does not treat Twitter as a state actor merely because it uses private property as a venue for the expression of popular opinion, and that a private party is subject to state actor liability only when it acts on behalf of the government or exercises governmental authority. In 2020, in Prager University v. Google LLC, the United States court reiterated that the First Amendment binds governments and public bodies, and that Google's YouTube platform is not subject to its restrictions on interference with freedom of expression. In fact, both public international law and domestic public law have formulated special rules for imposing public law constraints on private subjects: the identification of "public bodies" under the World Trade Organization countervailing regime and the state action doctrine under United States law can both subject private actors that exercise governmental authority and functions to public law obligations. The exercise of private power by social platforms, however, is carried out in the name of self-governance and on the basis of platform service agreements, and is difficult to bring within the interpretation of "state actor" under statute or prior case law. Courts are therefore not immune to path dependence when interpreting the applicability of public law norms to platforms, and they exclude social platforms from public law norms in order to guard against an expansive reading of the public forum doctrine.
This tradition of public law regulation, however, proves inadequate when applied to the private power of social platforms. Traditionally, public law norms aim at restraining the boundaries of the exercise of public power; private subjects have no right to decide on public affairs or to allocate public resources, and in order to prevent public power from squeezing private rights, public law strictly limits the class of subjects it constrains. Today the boundary between the political state and civil society is increasingly blurred in cyberspace. Social platforms are comparable to a sovereign state in both profitability and capacity for social mobilisation; the basic information services and public information they provide in cyberspace have become necessities for transactions and interactions in modern society and bear the character of quasi-public goods; and the platforms in fact exercise the power to decide on public affairs and to allocate public resources. Interpreting public law norms restrictively as binding only public authorities allows social platforms to escape the public law system and to defeat users' public law claims, which in effect encourages their abuse of private power.
2.2 Misplaced private law regulation of social platforms' private power
The quasi-legislative act of social platforms in formulating platform guidelines is at the same time the process of concluding platform service agreements with users, which is governed by contract law. When evaluating the validity of a platform service agreement, if the court ignores the power control embedded in the formation of the agreement and places too much emphasis on the principles of freedom of contract and consideration, the problem of misplaced regulation arises.
First, with regard to the validity of exemption clauses in platform service agreements, United States case law reflects a high degree of deference to party autonomy and to the doctrine of consideration. In Lewis v. YouTube, LLC, YouTube suspended the plaintiff's account for improperly collecting personal information for commercial purposes, and after the account service was restored it did not retain information such as third-party user comments on Lewis's videos and video view counts. Lewis applied for restoration of the videos to their original state, since the platform's conduct had prevented Lewis from benefiting from the videos' traffic. The US court rejected the claim, noting that the service agreement did not oblige the platform to store video views and comments, and that the agreement's disclaimer provided that the platform would not be liable for errors and omissions in the management of user content. Although the disclaimer was concluded between a user and a platform with unequal bargaining power, the court held that it was not improper given that the platform's service was free of charge. In Song Fi, Inc. v. Google, Inc., Google's YouTube platform removed Song Fi's video on the ground that it violated the user agreement. Song Fi argued that, since the user had no bargaining power over either the procedural or the substantive terms of the agreement, the clause granting YouTube absolute authority to delete user content was unfair. The US court rejected the claim, stating that YouTube was not the only video distribution site available to Song Fi and that, since YouTube's service was free, users could not challenge its absolute authority to remove content.
In similar cases, Chinese people's courts have mainly relied on Articles 496, 497 and 506 of the Civil Code of the People's Republic of China (hereinafter the Civil Code) to review the validity of standard terms in platform service agreements in terms of form (whether the provider of the terms reasonably drew them to users' attention) and substance (whether the terms exempt the provider from liability, aggravate the other party's liability or exclude the other party's main rights, or exempt liability for personal injury or for property damage caused intentionally or by gross negligence). Owing to cognitive differences, different courts have evaluated the formal validity of the same platform standard terms very differently, and it is not uncommon for like cases to be decided differently or for the same outcome to rest on different grounds. Generally speaking, Chinese people's courts have focused on the formal validity of the terms of platform service agreements; their judgement on substantive validity depends on the nature and content of the terms, and insufficient attention has been paid to whether genuine consultation or negotiation took place in concluding them.
Second, with regard to disputes over a platform's designation of its own location as the forum for litigation in a service agreement, courts have been relatively cautious in interpreting jurisdiction clauses because they affect the law applicable to the substantive dispute and touch on the public policy of the forum. In 2017, in Douez v. Facebook, Inc., the Supreme Court of Canada analysed the private power implications behind the jurisdiction clause of the Facebook user service agreement. In that case, the agreement provided that the platform could use users' names in advertising pushes and designated the California courts of the United States as the forum for disputes between the platform and users. The plaintiff, Ms Douez, a resident of British Columbia, Canada, argued that Facebook's use of her name in advertising promotions infringed her right to privacy and brought a lawsuit in the British Columbia court, to which Facebook objected for lack of jurisdiction. The Supreme Court of Canada rejected Facebook's jurisdictional objection on four main grounds: (1) the user service agreement was concluded between consumers and a platform with unequal bargaining power; (2) the terms of the Facebook service agreement are used across different platforms without any possibility or opportunity for users to negotiate; (3) the jurisdiction clause, as a clause affecting judicial remedies, concerns a public good and is not suited to unilateral designation; and (4) the subject matter of the case was an infringement of the right to privacy, which is quasi-constitutional in nature. Justice Abella, who heard the case, emphasised that the jurisdiction clause of the Facebook service agreement conferred an unfair and overwhelming procedural and substantive advantage on Facebook and was in substance a contract of adhesion. The significant value of the decision lies in the court's awareness of the non-negotiable nature of the platform service agreement and of the fact that the agreement disposes of rights that are not purely private law rights. It is evident that the validity of the terms of a social platform service agreement should be judged by the burden they actually impose on the user, and not only by whether the terms are brought to the user's attention or by their content features.
It can be seen that the misreading by the civil law and other private law of social platform user service agreements lies mainly in the fact that the rules governing standard terms do not contemplate the compulsory intervention of private power in civil and commercial agreements. The formation of a social platform service agreement is itself a process of realising private power, and when private power and civil rights are intertwined, private law norms alone can hardly provide effective relief. (1) The service agreement of a social platform has the appearance of a horizontal agreement, but its conclusion is achieved through the platform's vertical control over individual members. Although users could in principle act collectively to counter the control of social platforms, collective action is rare in practice; the community ties formed by social platform service agreements are not a manifestation of the members' collective will and do not reflect the spirit of freedom of contract. (2) The bilateral relationship between social platforms and users and the free provision of services are superficial; behind them lies a multi-sided market relationship of "social platform - original users - advertisers - new users". Social platforms provide online services for users, users post content to attract and interact with new users, and social platforms sell user attention and data to attract advertisers to place ads. User interaction is a key factor in data generation and platform revenue growth. Focusing too much on the gratuitous nature of the social platform's services ignores its consumption of user data and attention. (3) Social platforms of a certain size deprive users not only of the ability to negotiate agreements but also of the freedom to exit them. As Douez v. Facebook, Inc. shows, Facebook has become a ubiquitous social service platform in Canada, and users who "opt out" of the service are cut off from society. Moreover, even where an alternative platform exists, users who "vote with their feet" incur high migration costs, such as the severing of relationship chains on the original platform, the loss of data and the economic value attached to it, and the time and effort required to rebuild customer relationships on the new platform.
3. EU Legislative Practice of the Public-Private Dualistic Regulation on Private Power of Social Platforms
The regulation of platforms' private power needs to break through the limitations of the traditional public-private law system and carry out structural reform of platform legislation in order to resolve the lack of constraints on their accountability mechanisms. The European Union, a global leader in platform legislation reform, has not been confined by the private subject status of social platforms: through the EU Digital Services Act it has developed an accountability system that constrains the exercise of their private power, and it adjusts the terms of user service agreements formed under the vertical control of platforms according to a legal principle of more substantive fairness.
3.1 Public law reform of the regulation on private power of social platforms
Public law regulation of the private power of social platforms must ensure that the exercise of that power is checked and balanced by corresponding responsibilities. The EU Digital Services Act, as a public law norm for platforms, regulates platforms' user and content management behaviour, and can directly constrain social platforms' exercise of private power in establishing service agreements, taking down user content and handling disputes, so as to prevent them from undermining the private commercial sphere and the public sphere by exploiting their size, network power and surveillance capabilities. The EU Digital Services Act breaks open the internalised character of user and content management on social platforms in a way that resembles the traditional public law approach of limiting and holding accountable the public power of governments.
First, the formulation of platform guidelines by social platforms is a manifestation of their quasi-legislative power, but it is often criticised as a "black-box" operation owing to the lack of user participation and the opacity of the process. To ensure the accountability and legitimacy of private norms and soft law-making, the EU Digital Services Act imposes strict obligations of information disclosure and compliance control on social platforms in response to the opacity of information and the absence of users' true will in the drafting of their service agreements, so that social platforms must refrain from abuse in exercising their private authority. Articles 14 and 24 of the EU Digital Services Act require platforms to disclose, in clear, plain, user-friendly and unambiguous language, all policies, procedures, measures and tools used for content moderation, as well as any significant updates to them, including algorithmic decision-making and human review. Traditionally, social platforms have treated content review algorithms and manuals as trade secrets and have not disclosed them to the public, so that the public and regulators had no way of knowing the logic of the platforms' actual operations and could not effectively monitor their publishing and blocking of content in pursuit of advertising revenue and user attention. The EU Digital Services Act reinforces the disclosure of the rules governing digital content management on social platforms, and the mandatory disclosure of algorithms and manual review guidelines compensates for the information deficit of users and regulators. To ensure that social platforms' disclosures are truthful and accurate, the EU Digital Services Act requires mega-platforms to appoint an internal compliance officer (Article 33) and to engage an external third-party auditor to audit compliance with their platform guidelines (Article 37).
Second, checks and balances on power are a safeguard against the arbitrary exercise of public power and the ballast of the rule of law and of democratic societies in ensuring the legitimacy of governance, and it is precisely the lack of due process control over social platforms' enforcement measures that makes their content management arbitrary. The suspension of Donald Trump's accounts, reviewed by the Facebook Oversight Board, exposed the rigidity of social platforms' content management tools, which are not commensurate with the harmful effects of user content. To address this issue, Articles 14 and 15 of the EU Digital Services Act require platforms to implement service restrictions in a diligent, objective and proportionate manner, with due regard to the rights and interests of those concerned, including users' fundamental rights protected under the EU Charter of Fundamental Rights; to notify the users concerned in a timely manner and give clear and specific reasons before removing user content, stating the basis of the decision and the channels through which users may challenge it and seek redress; and to publish an annual content moderation report disclosing the amount of illegal content removed and the measures taken, as well as the number of user complaints, their grounds and the decisions made on them. In addition, the EU Digital Services Act regulates the "notice-and-takedown" procedure: on the one hand, it establishes a system of trusted flaggers, requiring platforms to give priority to notices from trusted flaggers (Article 22) and encouraging NGOs to take an active part in monitoring user content on platforms; on the other hand, it regulates the content of third-party notices and grants users the right to challenge false complaints with counter-notices (Articles 16 and 17), guarding against abuse of the procedure by third parties and against arbitrariness in platforms' removal of user content. These obligations are all directed at the abuse of private power in social platforms' management of user content, and aim to improve information disclosure, third-party supervision and the protection of users' rights and interests in the censorship and removal of user content.
Third, because social platforms' exercise of quasi-judicial power is marked by the non-neutrality and non-transparency of the basis for adjudication, the preamble to the EU Digital Services Act emphasises the legislative purpose of guaranteeing users access to effective remedies in connection with the platform's services. Article 20 of the EU Digital Services Act therefore requires platforms to provide users with a free, easily accessible online complaint system available for at least six months after a dispute arises, and imposes requirements of promptness, diligence, non-discrimination and non-arbitrariness on the "quality" of internal dispute resolution; platforms must give sufficient reasons for their internal decisions and point users to external dispute resolution mechanisms and judicial remedies. To expand the use of platforms' internal dispute resolution mechanisms, measures affecting the visibility, interactivity and monetisation of user content are included in the scope of platform disputes, and platforms are prohibited from handling disputes solely through automated systems. The EU Digital Services Act also creates "private courts" for platform disputes, out-of-court dispute settlement bodies, as a bridge between platforms' internal complaint systems and the courts. Their creation not only breaks the internal cycle of platform dispute resolution and promotes competition between institutions, but also avoids spending judicial resources on a large number of micro-disputes. To encourage users to submit disputes to out-of-court dispute settlement bodies, Article 21 of the EU Digital Services Act requires platforms to reimburse users' fees and reasonable expenses of dispute resolution where users prevail. The quasi-judicial regulation of platforms under the EU Digital Services Act effectively responds to the practice of social platforms handling internal complaints through automated systems without giving reasons, and breaks the internal cycle of platform dispute resolution, while the establishment of out-of-court dispute settlement bodies enhances the neutrality of dispute handling.
Fourth, mega social platforms are currently especially prone to information manipulation and damage to users' rights and interests. With more than a quarter of the world's population active on social platforms such as Facebook and YouTube, negligence in risk prevention and allowing third parties to misuse their services will undermine not only the rights and interests of individual users but also the public interest. For example, Cambridge Analytica used user data obtained on Facebook to intervene in other countries' domestic affairs by manipulating voter turnout through targeted advertising on the platform. The EU Digital Services Act imposes particularly stringent risk prevention requirements on such platforms. In addition to the obligations of ordinary platforms, mega-platforms must diligently identify, analyse and assess the systemic risks of illegal content dissemination and infringement of the fundamental rights of EU residents within their service systems and take effective risk mitigation measures (Articles 34-35), and must also assume heightened disclosure obligations, appoint a compliance officer, accept third-party audits and share data with enforcement authorities (Articles 34, 37 and 40). These obligations respond to the scandals of recent years involving the improper exercise or abuse of private power by mega social platforms. To prevent these obligations from becoming a mere formality, the EU Digital Services Act establishes tough liability for non-compliance by mega-platforms: a mega-platform can be fined up to 1 per cent of its previous year's global turnover for breaching its disclosure obligations (Article 74), and up to 6 per cent for failing to comply with other mandatory obligations under the Act (Article 74). Even voluntary compliance commitments made by mega-platforms, once enshrined in a decision of the European Commission, are transformed into mandatory obligations, and failure to fulfil them amounts to a breach of a mandatory obligation. This strict liability regime reinforces the deterrent effect of the EU Digital Services Act and obliges social platforms to exercise their private power with care, while also reflecting a change in the identity of mega-platforms, which have become gatekeepers of digital content governance.
3.2 Private law norms regulating private power of social platforms
Online platform relationships consist of different categories of informal collaborative relationships that cannot all be captured by the radar of contract law and must be addressed flexibly. Judicial practice in the EU and its Member States shows that their contract, data and copyright laws limit the application of the principle of autonomy to social platform service agreements and correct overly broad exemption clauses.
First, the core of resolving service agreement disputes between social platforms and users lies in rebalancing the asymmetric relationship between them. Where a social platform uses its private control to draft overly broad exemption clauses in a service agreement, EU law will usually invalidate such clauses under the transparency and legal certainty requirements of the EU Unfair Contract Terms Directive. For example, Article 5.2 of Facebook's 2015 user service agreement provided that "we (the platform) may remove any content and information you post on Facebook that we believe violates our statements and policies". The court held that this exemption was too broadly worded, failed to meet the legal certainty requirements of Article 5 of the EU Unfair Contract Terms Directive, and was therefore invalid. The court also held that a standard term in a platform's user service agreement is invalid on the basis of the "significant imbalance" principle where, in the absence of negotiation, it creates an extreme imbalance in the parties' contractual rights and obligations. At the Member State level, Sections 305 and 307 of the German Civil Code (BGB) provide that terms lacking transparency, clarity and readability are invalid; under Section 307(2) of the BGB, a term that restricts essential rights or duties inherent in the nature of the contract to such an extent as to jeopardise the attainment of the purpose of the contract is an unfair term. Article L.212-1 of the French Consumer Code provides that contractual terms must be drafted and presented in a manner that is clear and comprehensible to the consumer, and Article R.212-1(6) provides that terms which remove or reduce the consumer's right to compensation where the operator fails to perform any of its obligations are unfair terms. These provisions can likewise counteract overly broad and vague platform exemption clauses or clauses derogating from users' rights when interpreting social platform user service agreements.
Second, clauses in social platforms' user service agreements that violate mandatory provisions of copyright law or personal data protection law will be held invalid by the courts. In Germany, a user sued Facebook over clause 10.1 of its 2010 user service agreement, which allowed the platform to use users' account information and images in conjunction with commercially sponsored content. The German court held that the provision did not specify the circumstances in which users' images would be used or the purposes for which they would be used, violating the principle of transparency under the German Federal Data Protection Act and the requirement that users give informed consent to the purposes of personal data processing. Similarly, a provision in a social platform's user service agreement providing for the free use and sublicensing of user content may be denied validity.
The private law norms of the European Union and its Member States adjust social platform user service agreements more robustly for several reasons. (1) The formal review of standard terms under the EU Unfair Contract Terms Directive includes requirements of certainty, clarity and readability in addition to the provider's duty to alert the user, and these principles are highly elastic, enabling courts, when applying the Directive, to interpret social platform user service agreements in users' favour on the basis of their deviation from genuine consent under private power control. (2) The private law of the EU and its Member States focuses on the substantive review of agreements and on paternalistic protection of vulnerable groups, and the "significant imbalance" principle is used to assess the fairness and proportionality of the distribution of rights and benefits between social platforms and users. The German Civil Code and the French Consumer Code likewise focus on relief for disadvantaged groups, so that in reviewing social platform service agreements more attention is paid to the intrinsic consensus and fairness of the contract, and the fairness of the terms is not interpreted leniently merely because the platform's service is free to users, as it is in the US courts. (3) In recent years the European Union has strengthened the regulation of digital platforms of all kinds: the EU General Data Protection Regulation, the EU Digital Markets Act and the EU Digital Services Act regulate platforms' abuse of private power or market monopoly from the respective angles of personal data protection, competition and digital content governance. This culture of stringent platform regulation permeates disputes between social platforms and users over user service agreements and can also lead to adjudication outcomes more favourable to users. Overall, the private law responses of the EU and its Member States to social platform user service agreements engage more closely with the substance of the private power intervention behind those agreements.
3.3 Public-private partnership in the regulation on private power of social platforms
One distinctive feature of EU platform legislation is that it breaks the internal character of platform governance by means of public-private partnership: the government does not intervene directly in the governance of the platform but legislates to regulate the platform's self-governance of user content, and through the coercive force of law transforms the platform's private conduct into a public-private hybrid of governmental public power and private power, thereby distinguishing itself from both purely high-powered governmental control and pure network self-regulation. This public-private co-governance aims to remedy the defects of "single-pipe" public intervention and the arbitrariness and disorder of platform self-governance, to break the absolute division between power and rights, and to enable users, with the support of public authorities, to organise collective action effectively against platforms' abuse of private power.
On the one hand, to prevent governmental over-regulation from reducing the efficiency of platform governance, the EU Digital Services Act does not directly interfere with the formulation of platform user service agreements and community guidelines or the use of platform management measures; it does not even directly negate the jurisdiction designations and exemptions from liability in the user service agreements of social platforms such as Facebook and YouTube. Articles 4 to 8 of the EU Digital Services Act maintain the liability immunity of network intermediaries, emphasising that platforms have no general obligation to review user content, that providers of mere conduit, caching and hosting services are not liable for infringing third-party content that they did not publish and of which they had no knowledge, and that good-faith voluntary content review does not cause them to lose their immunity. The respect for platform autonomy in the EU Digital Services Act preserves social platforms' capacity for service innovation and prevents them from being weighed down by heavy legal liability.
On the other hand, the EU Digital Services Act builds a broad structural participation mechanism among platforms, governments, citizens and non-governmental organisations (NGOs) through external monitoring, mitigating the profit-seeking and non-neutral defects of private enforcement and dissolving the absolute division between power and rights, and between public law and private law, in platform regulation. (1) Although mega social platforms enjoy autonomy in formulating and implementing platform guidelines, they are required to engage third-party organisations to audit their content moderation measures and risk assessment reports; ordinary social platforms are required to give priority to notices from trusted flaggers in the "notice-and-takedown" process and to accept supervision by NGOs. (2) The European Board for Digital Services and the European Commission may publish comprehensive reports on mega social platforms, identifying their significant systemic risks and publishing best practices to mitigate them. The European Commission, together with national digital services coordinators, may also issue risk-specific general guidelines (Article 35(3)) to guide platforms' self-inspection of risks. In the event of public security or public health incidents, the European Commission may draw up crisis response protocols and encourage mega-platforms to develop crisis response codes together with other platforms (Article 48). (3) The EU Digital Services Act specifically empowers regulators to investigate mega social platforms and grants them a variety of investigative tools. For example, the European Commission may engage auditors and external experts in its investigations, and Digital Services Coordinators may send independent investigators to examine mega social platforms' systems and programming interfaces, obtaining back-end data and lifting the veil on the platforms' internal algorithms and content moderation (Articles 69 and 40). (4) The EU Digital Services Act empowers the European Commission to transform the voluntary commitments of mega-platforms into mandatory obligations enshrined in an EU decision, making them binding within the EU. In this way, standards of corporate self-regulation are transformed into legal provisions and become a source of mandatory obligations. This transformation of mega-platforms' voluntary commitments into mandatory obligations blurs the line between public and private law.
The EU's public and private law regulation of social platforms' private power confronts the expansion of that power resulting from the absence of duties and procedural controls in its formation, as well as the unilateral character of platforms' user service agreements and users' lack of bargaining power. In particular, the EU Digital Services Act establishes ex ante obligations for social platforms, imposing duties akin to the accountability and limitation of public power, and thereby differs markedly from the American model of ex post regulation that largely leaves platforms to their own devices. In this way, the EU Digital Services Act translates the objectives of regulating platforms' private power into defined behavioural norms, guiding platforms to develop the motivation and habit of improving content governance and restraining abuse of private power in their day-to-day business, and it improves the effectiveness of supervision of social platforms by regulators, the public and third-party organisations through public-private co-governance. The value of the EU Digital Services Act's platform regulation lies not only in its specific norms but also in its deep understanding of the challenges that platforms' private power poses to established public and private law norms, and in its public-private approach to addressing the expansion of social platforms' private power and the erosion of users' rights and interests.
4. Legal Review and Institutional Improvement of Public-Private Dualistic Regulation on Private Power of Social Platforms in China
With the rapid development of the platform economy, platform regulation has been brought within the scope of China's legislation. In light of the development of China's digital economy and the goals of online content dissemination, exploring a public-private dualistic regulation scheme for social platforms suited to China's national conditions has become an inevitable step in improving China's social platform legislation.
4.1 Legal Review of Public-Private Dualistic Regulation on Private Power of Social Platforms in China
China's public law norms on the regulation of digital content on social platforms are scattered across the E-Commerce Law of the People's Republic of China (hereinafter the "E-Commerce Law"), the Measures for the Administration of Internet Information Services and other laws and regulations, and mainly concern social platforms' responsibility for content review, the protection of users' rights and interests, and the gatekeeper responsibility of mega social platforms; the private law norms are mainly embodied in the Civil Code's adjustment of the standard terms of social platform user agreements.
4.1.1 Public Law Norms
First, unlike European and American social platforms, which bear no general obligation of content review, Chinese law provides that all platforms, including social platforms, are responsible for reviewing and managing digital content, establishes a regulatory model of governing the network through the network, and imposes on platforms the obligations to stop the transmission of illegal content, eliminate the information, prevent its proliferation, keep records and report to the relevant authorities. For example, under Articles 14 to 16 of the Measures for the Administration of Internet Information Services, Internet information service providers engaged in news, publishing and electronic bulletin services must record the content of the information provided, the time of its release and the Internet address or domain name, must not release illegal information, and must immediately stop transmission, keep records and report to the relevant authorities when illegal information is found. Article 29 of the E-Commerce Law provides that where illegal conduct occurs within an e-commerce platform, the platform shall take the necessary disposal measures and report to the competent authorities; the same applies to e-commerce conduct on social platforms such as WeChat. The Provisions on the Administration of Public Account Information Services for Internet Users require all types of platforms to manage in real time interactive features such as messages, follows and comments on user accounts. In addition, Article 10 of the Administrative Measures for the Security Protection of the International Networking of Computer Information Networks, Articles 9 and 10 of the Provisions on the Ecological Governance of Network Information Content, and Article 12 of the Provisions on the Administration of Internet News and Information Services stipulate platforms' content management responsibilities and implementation mechanisms. These platform regulations concentrate on platforms' management of users and content and provide some guidance for the management behaviour of social platforms, but their drawbacks are also obvious. China's platform regulation focuses on preventing the dissemination of illegal content and establishes mandatory obligations for social platforms regarding content management and the reporting of illegal content. The vertical character of these public law norms is prominent and the governance rigid and high-powered, but they lack regulation of social platforms' private power and protection of users' rights and interests, which differs from the legislative purpose of the EU Digital Services Act of regulating and balancing platforms' private power and users' civil rights. Requiring platforms to bear content review responsibility without distinguishing their type and scale leads to a paradox of platform regulation: if platforms are held liable whenever review fails, the law is too harsh; if liability is not pursued, the content review obligation becomes a mere formality.
Second, in response to the abuse of private power by social platforms, Chinese law has created obligations of information disclosure, improvement of the notice-and-takedown procedure and proper handling of platform disputes with respect to platforms' review of digital content, but the provisions are not specific enough. First, Articles 28 and 30 of the Measures for the Supervision and Administration of Network Transactions do not impose objectivity, prudence and proportionality requirements on social platforms' content review measures, and only require platforms to disclose user service agreements and transaction rules and to publicise in a timely manner measures such as warnings and the suspension or termination of services imposed on operators. Article 9 of the Guidelines for Standardizing Format Terms of Contracts for Online Trading Platforms is relatively specific, requiring platforms to remind users in a prominent manner of standard terms that involve material interests and affect their rights, and not to place the terms behind inconvenient links, hide their contents, or fulfil the duty of reminder merely by technical means requiring further reading. Social platforms are not required to disclose the algorithms and manuals actually used to review content, and it is ambiguous whether measures that impair the visibility of users' content or deprive users of revenue rights need to be disclosed. Article 16 of the Provisions on the Ecological Governance of Network Information Content requires platforms to establish a convenient and effective complaint and reporting mechanism and to accept and handle user complaints and reports in a timely manner. This provision in effect restates the current practice of social platforms and does not impose due process and fair adjudication requirements on their quasi-judicial power, so it can hardly solve at the root the non-transparency and non-neutrality of platforms' dispute handling procedures. The underlying reason is that Chinese law pays insufficient attention to the characteristics of different platform services and the forms their abuse of power takes, making it difficult to regulate social platforms' digital content management in a targeted way; this differs markedly from the EU's classification-based governance approach, under which digital content regulation applies mainly to social platforms and competition regulation mainly to e-commerce and search engine platforms.
Third, China is currently contemplating legal regulation of the gatekeeper responsibility of mega-platforms for digital content. In 2021, the State Administration for Market Regulation issued the Guidelines for the Classification and Rating of Internet Platforms (Draft for Public Comments) and the Guidelines for the Implementation of the Main Responsibilities of Internet Platforms (Draft for Public Comments), which attempt to regulate mega-platforms' internal governance, risk assessment, risk prevention, and management of platform users and content. Articles 5 to 12 of the Guidelines for the Implementation of the Main Responsibilities of Internet Platforms (Draft for Public Comments) propose that mega-platforms carry out a risk assessment at least once a year, covering mainly the risks of disseminating illegal content, infringing consumers' lawful rights and interests, and harming public order and national security that their services may cause, and focusing on the content review system, the advertisement targeting and recommendation system, the content recommendation and distribution system, the platform security and stable operation system, and the content management system. Platforms are to implement the real-name system for network users in accordance with relevant laws and regulations, establish an effective system for managing user behaviour on the platform to ensure that it is lawful, compliant and consistent with social morality, and establish an effective content management system suited to the platform's own characteristics to prevent the dissemination of illegal and harmful information. Although these provisions draw on the relevant EU laws, they can hardly play a real role in regulating mega social platforms. First, the two guidelines have not yet entered into force, and as departmental guidance documents they are not legally enforceable. Second, the guidelines' regulation of digital content does not address the particular risk of third parties abusing mega social platforms' services, and they lack provisions protecting users' rights and interests, such as requirements that platforms not discriminate against users in their advertising and content recommendation systems and that platforms notify users in advance of the reasons and basis for enforcement measures against their content. Finally, most of the proposed provisions are stated only in principle and their operability is doubtful; the absence of liability for violations prevents the guidelines from genuinely constraining mega social platforms to respond prudently to systemic risks and to accept third-party supervision.
4.1.2 Private Law Norms
In terms of private law norms, the Civil Code stipulates the validity of the form terms of social platforms' user service agreements and users' right of counter-notification in the "notice-and-takedown" procedure.
Firstly, the numerous exemption clauses that social platforms write into user service agreements are regulated mainly by the Civil Code. As to the formal validity of user service agreements, Article 496 of the Civil Code requires the party supplying form terms to take reasonable steps to draw the other party's attention to clauses that exempt or limit its liability. As to substantive fairness, Articles 496 and 497 of the Civil Code provide that unreasonable clauses by which the supplier of form terms exempts or reduces its own liability, increases the liability of the other party, or restricts the other party's main rights are invalid. Judicial practice shows that the people's courts generally adopt a narrow interpretation of the substantive fairness of form contracts: exemptions from data storage obligations or agreements on litigation jurisdiction in platform user service agreements generally do not fall within the prohibition of Article 497. Overall, the contract part of the Civil Code focuses on the formal validity of social platform user service agreements and offers little clarity on the substantive validity of form terms or the particularities of such agreements. Given that most platforms already draw users' attention to disclaimers in bold, highlighted type, formal validity is not the core of disputes between platforms and users, and the Civil Code cannot answer the question: if users have no choice, or do not realize that they have the right to choose, how can one conclude that the platform user service agreement represents their will or autonomy?
Secondly, Article 1196 of the Civil Code establishes users' right of counter-notification from the perspective of civil rights and interests, preventing rights holders from abusing the "notice-and-takedown" procedure to infringe on platform users' freedom of expression, and requiring platforms to terminate their content management measures in a timely manner if the rights holder neither complains to the relevant authorities nor initiates legal proceedings within a reasonable period. However, this right is in fact a restatement of the current practice of social platforms, and there is no specific behavioral regulation of how social platforms should improve the "notice-and-takedown" procedure.
In the final analysis, China's legal regulation of social platforms has only just begun, and its main goal is to ensure the legality of disseminated content rather than to regulate platforms' private power. In the public law system, much ink is therefore devoted to platforms' authority to manage users and content, especially their obligation to report to the competent authorities, while behavioral norms constraining social platforms' content management authority and provisions on mechanisms for safeguarding users' rights and interests are lacking. There is likewise little consideration of the unilateral willfulness of social platforms' service agreements and the absence of genuine agreement behind them.
Social platform regulation expects to restrain the spread of illegal content as a result, but neglects the process of content management, and the path of public-private co-governance is neglected even more. Social platform governance thus faces the dilemmas of a bias toward content over process, ambiguity in the roles, rights and obligations of governance subjects, and a lack of supervision over platforms' governance of information behavior.
4.2 Institutional Improvement of Public-Private Dualistic Regulation of Private Power on Social Platforms in China
The EU platform legislation responds to the inadequacy of the traditional public-private dualistic system in regulating the private power of platforms: it breaks the stereotype of the private subject by formulating accountability mechanisms to constrain the exercise of that power, restricts platforms' abuse of freedom of contract in user service agreements, and compensates for the shortcomings of both platform self-governance and public supervision through a public-private co-governance path. The same regulatory ideas are applicable to China. It should also be noted, however, that the EU's legislative reform is shaped by its regional conception of human rights protection and by the aim of reversing the monopoly advantages of U.S.-funded data technology enterprises, whereas China at the current stage takes encouraging the development of the digital economy and supporting entrepreneurship and innovation by small and medium-sized enterprises (SMEs) and micro-merchants as its main goal. The EU Digital Services Act's overly stringent legal liabilities, such as a fine of up to 1% of global revenue for mega platforms' non-compliance with information disclosure and a fine of up to 6% of global revenue for violating mandatory obligations or voluntary commitments enshrined in an EU decision, are too severe for Chinese platform companies, and overly burdensome disclosure obligations may impose excessive operating costs on non-mega platform companies.
First, for social platform regulation to be effective, it must respond to the nature of private power, with responsibilities commensurate with its power, and restrictions based on principles such as due process and accountability.
First, the Provisions on the Ecological Governance of Network Information Content and the Measures for the Administration of Internet Information Services should address the characteristic ways in which private power is abused on different types of platforms, strengthen the regulation of improper digital content management on social platforms, and further standardize public participation and information disclosure in the formulation of platform guidelines, stipulating that when formulating or revising platform guidelines, platforms should disclose the text of the guidelines and the user service agreement in advance, solicit opinions from users and the public, and file them with the competent authorities for the record. Given that in practice the rules a platform announces and the rules it actually implements may differ considerably, algorithmic transparency mechanisms such as algorithm filing, algorithm evaluation and algorithm accountability have become an important path for regulating platform self-preferencing, and the information social platforms disclose should include the algorithmic systems and manuals affecting their content review. However, since such material usually constitutes the platform's internal trade secrets, it is appropriate to limit the subjects of disclosure to mega platforms, which are prone to algorithmic manipulation, and the recipients of disclosure may likewise be limited to their competent authorities.
Secondly, to restrain the abuse of private power in the management of social platforms, platform authorities may investigate the current state of improper exercise of private power by social platforms, and platform regulations such as the Provisions on the Ecological Governance of Network Information Content may lay down legal norms for platforms' content review behavior, requiring social platforms to take measures restricting user services and the release of digital content in a prudent, objective and proportionate manner. When choosing measures against users, platforms should reasonably weigh the relevant public interests and individual rights and interests, use measures such as permanent blocking and termination of account services cautiously, and ensure that the means and objectives of enforcement conform to the principle of proportionality. At the same time, notification of platform enforcement measures and their basis should be incorporated into regulations such as the Provisions on the Ecological Governance of Network Information Content and the Measures for the Administration of Internet Information Services, requiring the platform to notify the affected users before deciding to remove content or suspend service and to provide clear and specific reasons in the notification, including the basis of the platform's decision, the reasons why the content was deemed illegal, the duration and scope of the measures, and the ways in which users may challenge the decision and seek dispute resolution. To reduce the platform's cost of information disclosure, the Internet information department may provide a template or guidelines for notifications of platform content review measures.
Again, users' right to remedies when their rights and interests are impaired by social platforms should be further clarified. The Provisions on the Ecological Governance of Network Information Content and the Measures for the Administration of Internet Information Services should make clear that social platforms must provide users with an online complaint system that can be conveniently accessed within six months after their rights and interests have been impaired, and should define the scope of cases covered by the platform's internal dispute resolution mechanism, including disputes over deletion of user content or disconnection of links, suspension of all or part of the platform's services, weakening of the visibility of user content, suspension or termination of user accounts, and deprivation of the user's right to income. They should also require social platforms to make public the procedures and rules of their online complaint systems and to restore content or services without delay where the user has provided sufficient evidence that the content is lawful or does not violate the platform's service agreement. With regard to personality interests such as users' privacy and portrait rights, as well as consumer interests, it should be stipulated that social platforms may not raise users' cost of seeking redress by designating jurisdiction or limiting users' right to sue in their service agreements.
Finally, China's mega-platform legislation has not yet come into effect, and current platform law does not differentiate platforms by size or type, managing them in a one-size-fits-all manner. The following adjustments are recommended in future revisions of laws and regulations governing the digital content of social platforms. (1) At present, the Provisions on the Ecological Governance of Network Information Content, the Measures for the Administration of Internet Information Services and other regulations require platforms of all types and sizes to assume responsibility for reviewing their content. Strictly enforced, this would hinder innovation in platforms' digital content management and business models and raise the market entry threshold for small and start-up platforms. Platform legislation should therefore distinguish mega platforms from ordinary platforms and confine mandatory content review responsibility to mega platforms; this exploits mega social platforms' advantage in deploying automated content review systems while containing the scale effects of illegal content dissemination. (2) At this stage, China's social platform regulation still aims to promote platform economic innovation and encourage small and medium-sized enterprises (SMEs) to enter the market, so platform legislation can formulate exemption clauses for small and medium-sized or start-up enterprises, allowing platform enterprises to apply for exemption from disclosure and other obligations while their user base and assets remain small or during their start-up period. (3) Formal legislation on the regulation of mega platforms should be promoted as soon as possible, differentiating platform types with respect to digital content management and the risks of manipulation by dominant market power, improving the systemic risk assessment and mitigation obligations of mega social platforms in digital content regulations, focusing on preventing information manipulation in advertisement recommendation and content review systems and the attendant risks of harm to the public interest and users' rights and interests, and formulating anti-abuse measures for advertising and content recommendation systems. Mega social platforms should be prohibited from releasing targeted advertisements or content for the purpose of interfering with or manipulating users' decision-making. Mega platforms should retain and disclose, in their daily operations, information about advertisers, the content of advertisements, and the targets and duration of advertisements, and where advertisements are aimed at a specific group, they should disclose the criteria for selecting that group or the target group to which the advertisements are sent. Mega platforms should also establish stricter compliance systems than ordinary platforms, such as appointing a compliance officer and subjecting risk assessment and information disclosure to audit by a third-party organization.
(4) Legislation should clarify the supervisory powers of regulatory authorities over violations by mega platforms, authorizing the Internet information department to conduct on-site investigations of mega platforms suspected of violating the law, to question personnel, to obtain data, and to inspect on-site equipment and articles, and allowing regulators to engage independent investigators or external experts as needed to assess platforms' dissemination of illegal content, manipulation of data, and infringement of users' rights and interests. In the event of a major public opinion manipulation crisis, the Internet information department may conduct a special investigation of the mega platform concerned and order it to take the crisis management measures proposed by the regulator or a third-party organization; a platform that fails to cooperate with such investigations shall bear the corresponding administrative liability. Providing for the competent authorities' investigative powers and for mandatory liability will make up for the current lack of enforceability of mega social platforms' obligations.
Second, to address social platforms' abuse of private power in setting unfair terms in user service agreements, legal norms for platforms oriented toward the protection of users' rights should be established. The vertical control that social platforms, especially mega social platforms, exercise over users, together with network clustering effects, effectively blocks users' opportunity to exit the platform's services. This means that adjusting platforms' private power over civil agreements solely from the perspective of contract law will always have limited effect; it is appropriate to break through the division between public and private law and intervene directly in platform user service agreements. China's platform legislation, as public law, can refine the relevant provisions on platform user agreements and guidelines to the extent that they do not conflict with the Civil Code. For example, the relevant regulations can concretize, for the platform context, what it means for "the provider of form terms to exclude its own liability, aggravate the obligations of the other party, or exclude the main rights of the other party", differentiate between platforms' paid and unpaid services, prohibit social platforms from excluding or unreasonably limiting, without reasonable justification, obligations that constitute the fundamental attributes of the platform service, such as data storage, in the case of paid services, and require them to display any such exclusions fairly and prominently in the case of unpaid services. In addition, platform regulations should prohibit platforms from designating jurisdiction or limiting the right to sue in user service agreements for disputes involving privacy, personality rights, consumer rights and other such rights and interests.
Regarding disputes between platforms and users, scholars have explored whether rational consumers can comprehend the format terms of platform user service agreements and whether platforms' reminders are dynamically matched to recipients' capacity to understand them. The people's courts, for their part, have focused on examining the formal legitimacy of the format terms of user service agreements while insufficiently examining their substantive validity, thereby overlooking the impact of the platform's private power on the process by which the social platform's service agreement is reached with the user. When handling disputes over platform user service agreements, the people's courts should not only examine, in formal terms, whether the platform has clearly indicated exemption clauses, but should also consider, in substantive terms, the users' rights and interests implicated by the form terms: for example, whether there was any possibility of negotiation with the user and whether the platform's format terms were in fact negotiated; whether they deprive or restrict users' legitimate rights and interests under the Law of the People's Republic of China on the Protection of Personal Information and the Copyright Law of the People's Republic of China; whether they exempt the platform from liability for damage to users' rights and interests caused intentionally or by gross negligence; whether they limit users' right of action; and, in cases involving litigation jurisdiction, whether the cause of action involves the public interest or the user's personality interests and is therefore unsuitable for jurisdiction by agreement.
Third, the regulation of private power on social platforms in China should optimize the path of public-private co-governance. Public-private co-governance means that in regulating the private power of platforms, neither strong regulation by regulatory authorities nor private governance by the public and third-party organizations should be neglected. (1) Internet information departments at all levels can study the governance practices of social platforms of different sizes, solicit information on how platform services have compromised users' rights and interests, and organize social platforms, users and other stakeholders to jointly formulate best practices for digital content review and governance and recommend their adoption in practice. For clauses that significantly harm users' rights and interests, platform regulators can establish a "blacklist" system requiring social platforms not to include such clauses in their user service agreements or platform guidelines. (2) Advertising and content recommendation rules are the areas of social platforms where algorithmic discrimination and information manipulation are most serious. Platform authorities can formulate guidelines for advertisement release, convene different social platforms and stakeholders to jointly formulate advertising rules, and restrict social platforms from using control over information access to lock in or interfere with users' product and service preferences. (3) Internet information departments at all levels can also cooperate with the public and non-governmental organizations (NGOs) to set up complaint channels for platform users on their official websites, so that regulators obtain information about social platforms that abuse their private power to review content unlawfully and infringe users' rights and interests. A system of trusted complainants should also be introduced into the "notice-and-takedown" procedure, listing social groups such as consumer associations or NGOs as trusted complainants and requiring platforms to handle their notifications with priority. (4) Self-regulatory bodies of the platform industry should also formulate autonomous norms for social platforms, organize platforms to make voluntary commitments on content review, advertisement release, content recommendation and dispute resolution, and supervise the implementation of those commitments.
The regulation of the private power of social platforms is a systematic project; public law and private law are not clear-cut separate paths but interact with each other. To prevent social platforms from abusing their private power, regulatory means and measures may cross the boundary between public law and private law and optimize the path of public-private cooperation to achieve the best regulatory effect.
5. Conclusion
In the era of the digital economy, reconciling the conflict between private power and citizens' individual rights, and balancing the tension between governance effectiveness and rights protection, has become an important proposition for platform regulation. The user service agreements, content review policies and algorithms of social platforms have shaped the control ecology of the online sphere, not only establishing standards of user behavior and constraint in cyberspace but also changing basic legal cognition and concepts. The intersection and overlap of the exercise of power and the exercise of rights on social platforms means that the private power of platforms is difficult to regulate effectively under the current dichotomy between public and private law. In this regard, the latest EU platform legislation does not confine itself to the platform's status as a private subject: it obliges platforms to undertake public duties such as information disclosure and accountability for the exercise of power, restricts the application of the principle of autonomy to platform service agreements, and adopts public-private co-governance to strike a balance between platform autonomy and the supervision of power. The implementation of the EU Digital Services Act means that EU social platform regulation has shifted from network intermediary legislation based on liability exemptions to platform legislation centered on a duty of care, with ex ante platform obligations compensating for the deficiencies of the ex post regulatory mode. China's legal regulation of social platforms' private power currently focuses too heavily on the control of illegal online content and pays insufficient attention to the abuse of platform private power and the protection of users' rights and interests; it can draw on the allocation of platform liability in the EU Digital Services Act, the contract law adjustment of form terms in platform service agreements, and the public-private co-governance path for digital content governance.