The Substantial Distinction Between the Personal Information Right and Other Personality Rights
Abstract: As an independent specific personality right, the right to personal information should be distinguished from traditional personality rights such as the right to privacy, rather than being claimed alongside them, as frequently happens in current judicial adjudication. The root causes of the muddled application of the personal information right and traditional personality rights are as follows: the mistaken belief that their coexistence can be resolved solely by the doctrine of concurrence of rights; the neglect of the digital society as a necessary premise for applying the personal information right; and the failure to grasp the core problem and institutional concern of the personal information right. The substantive characteristic of personal information as an object of rights is that the person is identified or identifiable by algorithm. This characteristic determines the legal attributes of the personal information right and the particularity of its rules, and thus becomes the substantive factor distinguishing the personal information right from traditional personality rights. Accordingly, not all identified or identifiable information is the object of the personal information right, and not all disputes related to personal information should be resolved by applying it. The object of the personal information right should be limited to information identified or identifiable by algorithm, and only disputes involving algorithmic technology should be resolved by applying the personal information right.
Keywords: Personal Information Right; Personality Rights; Personally Identified or Identifiable by Algorithm; Personally Identified or Identifiable by Nature
Since the original General Principles of the Civil Law established the protection of personal information in Article 111, personal information disputes in China have gradually increased. A recurring phenomenon in these cases is that the right holder often claims that both the right to privacy or reputation and the right to personal information have been infringed, and the court routinely addresses in its reasoning whether the conduct at issue infringed both. This phenomenon raises an important question: if the right to personal information always overlaps with traditional personality rights such as privacy, why has it become an independent specific personality right? In short, can the frequent overlap between the personal information right and traditional personality rights be resolved through the concurrence of rights? If not, how can the two be effectively distinguished? More fundamentally, what is the theoretical basis for distinguishing them? The overlap between the personal information right and traditional personality rights creates both theoretical and practical difficulties: traditional personality rights such as the rights to name, reputation, honor, and privacy are in fact presented in the form of "information", and their infringement often manifests as the disclosure or improper use of that "information". This overlap leads to the "pan-personal-informatization" of personality rights protection in judicial practice, so that infringements of traditional personality rights are frequently adjudicated under the label of "personal information infringement".
At the same time, if the overlap between the two cannot be fundamentally resolved, it will be difficult to establish the independent value of the personal information right as a specific personality right, and basic issues such as the content of the right and the boundaries of its protection cannot be accurately delineated. The overlap will also erode the boundaries of traditional personality rights, throwing the values and internal logic of the personality rights system into disorder. This is why this article attempts to clarify, at the level of basic theory, the fundamental differences between the personal information right and traditional personality rights.
1 The Myth of the Coexistence of Personal Information Rights and Traditional Personality Rights
Since personal information gained legal recognition, infringement of personal information rights and infringement of traditional personality rights have repeatedly "coexisted" in the same case. The following cases illustrate this phenomenon.
1.1 Case of posting judgment in the community
Zhai and Xu were married. A property management company sued Zhai for unpaid property fees, and Xu participated in the litigation as Zhai's authorized representative. After judgment, the property management company posted the judgment at the entrance of the residential community and other locations. In the posted copy, the company had replaced the names of Zhai and Xu with "3-14-4", but their dates of birth, ID card numbers, and residential address were not redacted. Xu sued the property management company for infringement of personality rights. The first instance court held that the public may access legally effective judgments, that the facts stated in the judgment contained no private content unsuitable for disclosure, and that the posting would not lower Xu's social evaluation; the defendant therefore had not infringed the plaintiff's personality rights. The second instance court, however, held that under Article 111 of the original General Principles of the Civil Law, the personal information of natural persons is protected by law; because the defendant had not redacted the plaintiff's ID card number, date of birth, and address when posting the judgment, it infringed the plaintiff's legally protected personal information.
This case is a typical traditional personality rights dispute. Before the personal information right was provided for by law, posting others' information in offline spaces such as residential communities was a common source of disputes over privacy, reputation, and other personality rights. Under the constituent elements of traditional personality rights, if the posted content contains insults or defamation, infringement of the right to reputation can be claimed; if it contains private content, infringement of the right to privacy can be claimed; if neither exists, it is difficult to claim infringement at all. The person whose information is disclosed cannot demand liability from the actor merely because the information has been disclosed, since the flow of information is a social norm and one of the important interests the law must protect. The rules on the rights to reputation, privacy, and the like are the results of balances of interests gradually accumulated through the long practice of the law. To bypass the analysis of the constituent elements of traditional personality rights and claim infringement directly on the ground of unauthorized use of personal information is to ignore the institutional design of traditional personality rights and to overturn a series of value judgments embedded in the law of personality rights.
1.2 Online release of judgment case
The "Belta case" and the "Huifa Zhengxin case" are two cases that have attracted wide attention in the field of personal information. Their circumstances are similar: both defendants were companies providing information retrieval and similar services, which uploaded court judgments from the China Judgment Documents Network to their own websites for customers to search. The judgments on the defendants' websites included litigation documents between the plaintiff and a third party, involving information such as the plaintiff's name and the course of the dispute. The plaintiffs sued on grounds including infringement of personal information, the right to reputation, and the right to privacy, and the courts analyzed the personal information issues in detail. In the "Belta case", the first instance court held that Belta Company's conduct infringed the right to personal information, because reproducing the legal documents for profit without the consent of the parties constituted illegal use of others' personal information. The second instance court further held that after the plaintiff contacted Belta Company to request deletion of the document, the company's continued refusal to delete it, on the ground that the China Judgment Documents Network had already made the document public, constituted illegal public use of the plaintiff's personal information. Similarly, in the "Huifa Zhengxin case", the court analyzed in detail whether the defendant's conduct infringed the plaintiff's personal information rights, including whether it amounted to illegal use of personal information, but reached the opposite conclusion: the defendant's conduct did not infringe the plaintiff's personal information rights.
Hence the two cases are often regarded as a typical instance of "different judgments in similar cases".
The two judgments share one feature: both courts classified the cases as personal information disputes and analyzed them as such. But the problem is that disputes over publishing others' information online without permission are hardly new; traditional online personality rights disputes in China have long involved personal information. In the "Wang Fei case", known as China's first "human flesh search" case, the defendant disclosed the plaintiff's name, employer, extramarital affair, and other information on a website. For a long time, such cases could be resolved by applying traditional personality rights such as privacy and reputation, and personal information was often the decisive factor in determining whether those rights had been infringed. The court stated in that judgment that, in social life, citizens often voluntarily tell others personal information such as their name, employer, and home address for purposes of social interaction, and such information is sometimes learned and used by others through certain channels; whether its disclosure and use infringes the right to privacy should be determined comprehensively, taking into account how the actor obtained the information and the manner, scope, purpose, and consequences of its disclosure. If traditional personality rights such as privacy could simply be skipped and the personal information right applied directly, the traditional personality rights system would be rendered irrelevant. In fact, the online publication of personal information does not necessarily mean that the personal information right applies; the conditions under which it does apply are precisely what the following discussion addresses.
1.3 Pang's Information Disclosure Case
This case has been widely discussed in the field of personal information. The plaintiff, Pang, entrusted Lu to book a China Eastern Airlines ticket through the "Qunar" platform, using Lu's mobile phone number for the booking; two days later, Pang's own phone number received a fraudulent text message about the flight's cancellation. The plaintiff claimed that the defendants, Qunar and China Eastern Airlines, had leaked the plaintiff's private information (including name, phone number, and itinerary), and demanded that the two defendants bear joint and several liability to apologize and pay compensation for mental distress. The first instance court held that the plaintiff had failed to prove that the two defendants leaked the information and must bear the adverse consequences of that failure of proof, and it rejected the plaintiff's claims. The second instance court held that the plaintiff's evidence showed a high probability that the two defendants had leaked the plaintiff's private personal information, which the defendants failed to rebut; under the preponderance standard of factual determination, the two defendants should be found to have leaked the plaintiff's information. The names, phone numbers, and itinerary involved all constitute personal information under Article 111 of the original General Principles of the Civil Law. Although the plaintiff's name and mobile phone number, viewed in isolation, are not confidential, once combined with private information (the itinerary), the information as a whole becomes private because it contains private content. On this basis, the second instance court held that the two defendants had infringed the plaintiff's right to privacy.
The reasoning of the second instance judgment is confusing in several respects. Since the court ultimately found that the two defendants had infringed the plaintiff's right to privacy, it should have analyzed the constituent elements and legal consequences of privacy infringement, treating personal information such as the plaintiff's itinerary as a factor in determining whether that infringement was established. The court instead followed a seemingly opposite logic, reasoning that, given that the original General Principles of the Civil Law had established the protection of personal information, personal information could be effectively protected indirectly through the right to privacy. This reasoning, in which the guest overshadows the host, is truly puzzling. The right to privacy predates the right to personal information; when a dispute arises, the personal information right should serve as a gap filler where privacy cannot effectively resolve the matter. Why reverse the two and solve a personal information protection problem through the right to privacy? From the perspective of the relationship between the two sets of rules, privacy rules apply first (Article 1034(3) of the Civil Code), and there is no need to invoke personal information to argue for the protection of privacy.
Moreover, this argument also violates the logic of applying the two systems. For the same protected information, proof of privacy infringement can establish infringement of personal information; this is an a fortiori argument (reasoning from the greater to the lesser) and is valid in legal application. But it cannot be said that infringement of personal information establishes infringement of privacy, because privacy infringement imposes higher requirements. Thus, although this case is widely regarded as a typical personal information dispute, judging from the court's reasoning it could have been resolved by applying the right to privacy alone, with no need to discuss personal information. The prominent emphasis on personal information protection in this case in fact confused the order of application.
The muddled reasoning in this case is not an isolated phenomenon in China's judicial practice. In personal information cases in China, plaintiffs often assert both the personal information right and the right to privacy or reputation, and courts often analyze whether the accused conduct infringes each of the specific personality rights. Faced with such cases: if both a traditional personality right and the personal information right are established, applying personal information protection where the traditional right already protects the plaintiff risks duplicate protection; if the traditional personality right is not established but a broad personal information right is, the personal information right is suspected of substituting for the traditional right in striking the balance of interests; if the traditional personality right is established but the personal information right is not, this implies that the constituent elements of infringing the personal information right should be restricted, at least not treated as in current judgments, where unauthorized processing of personal information is generally found to infringe the personal information right. The question is how this phenomenon should be restricted and what the theoretical basis of the restricting rules is. To answer these questions, we must first clarify the root cause of the confusion between the personal information right and traditional personality rights.
2 The Theoretical Root of the Confusion in the Application of Personal Information Rights and Traditional Personality Rights
The practical difficulty of distinguishing infringement of personal information rights from infringement of traditional personality rights arises because the two overlap in how they are presented and how they are infringed: traditional personality rights such as privacy and reputation are presented in the form of "information", and the infringing act often manifests as the processing of "personal information". At the theoretical level, the immediate reason the academic community has neglected the coexistence of the two is the mistaken belief that the problem can be solved by the doctrine of concurrence of rights. The deeper reason is that theory has failed to fully reveal the digital environment in which the personal information right exists, failed to clarify its core problem and institutional concern, and failed to grasp the substantive element that distinguishes it from traditional personality rights.
2.1 The mistaken belief that the concurrence of rights can resolve the coexistence of personal information rights and traditional personality rights
At present, the academic community generally believes that the overlap between personal information rights and traditional personality rights can be resolved through the concurrence of rights. Personal information is very rich in content, including private information, portrait information, name information, and so on, and such content also falls within the objects of specific personality rights such as the rights to privacy, portrait, and name. Infringement of personal information under the Personal Information Protection Law therefore inevitably overlaps with infringement of the specific personality rights provided for in the Civil Code. However, it is not appropriate to apply the doctrine of elective concurrence to resolve this coexistence, because the result would be that traditional personality rights are hollowed out by the personal information right.
It is generally believed that the key to defining personal information is identifiability: any information that can, alone or in combination with other information, identify a specific natural person is personal information. Yet traditional personality rights are usually presented in the form of information and are inevitably associated with specific natural persons; otherwise there would be no personal interests in name, privacy, reputation, and so on to speak of. If the coexistence of the two were resolved through the concurrence of rights, the personal information right could cover almost the entire scope regulated by traditional spiritual personality rights. But there are significant differences between the rules of the personal information right as designed by law and those of traditional personality rights. For example, under Article 69 of the Personal Information Protection Law, infringement of personal information rights is subject to liability with a presumption of fault, whereas infringement of traditional personality rights is subject to ordinary fault liability. The protection of personal rights and interests is important, but values such as freedom of expression are also precious, and the attribution of liability reflects the law's weighing of these values and interests. Overall, China's personal information rights system is designed with a moderate tilt toward the information subject; the traditional personality rights system is not. If the doctrine of concurrence were adopted to resolve the coexistence of the two, one can reasonably predict that right holders would sue under the personal information right rather than traditional personality rights, and the hollowing out of traditional personality rights by the personal information right would become inevitable.
But this outcome is not what the law should expect. First, in terms of institutional design, applying the tilted protection of the personal information right to traditional personality rights disputes is an unsuitable value choice, because in private law the fundamental principle is to expand rather than restrict freedom of conduct. Second, both in comparative law and in the development of personality rights in China, the law has always taken a cautious, incremental approach to protecting personal interests; protecting every kind of "information" under the banner of personal information protection would abandon this traditional caution and extend broad, indiscriminate protection to all manner of "personal interests". Third, in terms of practical effect, replacing the traditional personality rights system with the personal information rights system faces serious challenges of justification. Typically, infringement of personal information rights is subject to a presumption of fault; generally applying that presumption to disputes governed by traditional personality rights would significantly readjust the balance of interests between the parties, an adjustment that is difficult to justify in theory. Fourth, such a replacement would also throw the theoretical system of personality rights into confusion.
Not only would the boundary between personal information rights and traditional personality rights be blurred; the balance of interests, refined institutional design, and theoretical accumulation built up by traditional personality rights over many years would also be supplanted by the still relatively rough personal information rights system.
The doctrine of elective concurrence therefore cannot resolve the coexistence of personal information rights and traditional personality rights. Of course, another possibility is to apply both simultaneously (aggregation of rights). In practice, victims, who are mostly ordinary people rather than legal professionals, commonly claim in litigation that both their personal information rights and their traditional personality rights have been infringed, hoping for more comprehensive legal protection. But judges cannot simply adjudicate on the basis of aggregation, or of elective concurrence, merely because the parties plead it. Upholding both claims simultaneously not only faces most of the difficulties encountered under elective concurrence, but may also give the victim duplicate protection. This shows that the root of the problem lies not in the parties' pleadings but in the courts' failure to clarify, at the theoretical level, the essential difference between infringement of personal information rights and infringement of traditional personality rights.
2.2 Neglecting that the digital society is a necessary prerequisite for the application of personal information rights
Since the coexistence of personal information rights and traditional personality rights cannot be resolved through concurrence or aggregation of rights, how to draw a reasonable boundary between the two becomes a theoretical question that must be answered. The boundary of the personal information right must be established on the basis of the normative purpose of the legal protection of personal information, and that purpose must be analyzed against the course of historical development. If personal information were defined solely by the core element of "identifiability", it would be as old as human history: even at the dawn of human society there was personal information distinguishing "you, me, and him", yet such information did not immediately enter the legal field. The law first recognized and protected material personal interests such as life, body, and health, and spiritual personal interests such as freedom and reputation; modern law recognized the need for specialized protection of personal information only with the arrival of the digital society. The third industrial revolution, marked by postwar information technology, gave birth to the digital society and accelerated its development, completely changing how personal information is recorded, disseminated, and used compared with the traditional offline society. In the traditional offline society, the "platform" for recording information was the human brain, transmission was mostly word of mouth, and use was mostly small in scale and hard to industrialize. Although paper, printing, cameras, broadcasting, and other technologies greatly expanded the platforms and channels for recording and disseminating information, the use of information still could not be separated from "manual" processing by the human brain.
Both the scale and the depth of information processing were limited by the natural attributes of the human brain, so personal information was difficult to collect and use on a large scale and could hardly become an important factor affecting the interests of its subject.
In the digital society, personal information is collected and used on a massive scale, drawing the law's attention both to concerns about personal dignity and to the enormous commercial value personal information presents. Data privacy law originated in the 1960s in response to the processing of personal data by computers: the United States enacted a series of statutes and gradually formed the basic principles for protecting data privacy in the digital age, the Fair Information Practice Principles (FIPPs). Subsequently, to ensure the cross-border flow of data, the United States began promoting its rules worldwide, and from the mid-1970s European countries, following the lead of the United States, also moved to protect privacy interests. From the outset, the international push for personal data protection carried a strong goal of unifying national data protection rules, so that countries acting unilaterally would not impede the cross-border flow of data and the development of the digital economy. Without cross-border data flows there would be no international issue of personal data protection, and cross-border data flows themselves arise from the digital society. In this sense, the digital society is the precondition for the emergence and existence of personal information law: without the digital society, personal information would not have entered the law's field of vision; without it, there would be no basis for legally granting a right to personal information; without it, the law would only need to protect personal information through traditional personality rights (including, in special circumstances, the general personality right). In short, without a digital society there would be no legal right to personal information.
2.3 Failure to clarify the core problem and institutional concern of personal information rights
Although the institutional foundation of the personal information right lies in the digital society, not all personal information disputes that arise in cyberspace should be governed by it. The right to personal information became a specific personality right not merely because the digital society arrived, but also because traditional personality rights could not respond to it effectively; otherwise it would have sufficed to extend the application of traditional personality rights into cyberspace. Historically, in the early days of the Internet, traditional personality rights could still effectively meet the demand for personal information protection online. The Internet became widespread in China around 2000, when the Supreme People's Court's Interpretation on Several Issues Concerning the Application of Law in the Trial of Cases Involving Computer Network Copyright Disputes (2000) was issued, which followed the traditional tort system in resolving online infringement. At that time, traditional personality rights such as the right to reputation protected personal information in the online environment without any sense of incongruity, so there was no need to establish the personal information right as a specific personality right.
Personal information protection attracted social attention and entered legislators' view only with the wide application of data collection and mining technologies such as big data and cloud computing. Big data is characterized by large volume, many types, and rapid updating; supported by processing technologies such as cloud computing, information technology has created many new possibilities, represented by artificial intelligence. Its impact on society has deepened from a simple transformation of space to a profound influence on humanity's material and spiritual life. At the material level, information technology has brought enormous wealth: personal information that traditionally had little or no property value is now automatically collected, processed, mined, matched, and pushed by computers, so that it objectively possesses property value. At the spiritual level, information technology poses enormous challenges. Free will is a value cherished by humans and protected by law, yet artificial intelligence technology constantly erodes it, typically manifested in people being deceived, shaped, controlled, and even enslaved by data; the Facebook Cambridge Analytica incident in the 2016 US presidential election and the practice of "big data killing" (algorithmic price discrimination against regular customers) in Chinese commerce are examples. Accordingly, one core concern of information privacy is the power that commercial entities and governments wield over personal autonomy and decision-making, and the constraints that rules allowing them access to personal information may impose on that autonomy.
Obviously, this series of issues lies far beyond what traditional personality rights such as the right to privacy can resolve.
The core task of the personal information rights system is to respond to the opportunities and challenges brought by data processing technologies represented by artificial intelligence. The opportunities lie mainly in realizing the property value of personal information; the challenges lie mainly in the threats that improper processing poses to personal information security and to human free will and dignity. The core concern of the right to personal information is to preserve individuals' independent control and decision-making over their personal information and to realize its property value. This differs significantly from traditional personality rights. If personal information is merely published in cyberspace without being processed by algorithmic technology, traditional personality rights should still apply; if it has been so processed, the right to personal information should apply. "Algorithmic recognition" is therefore the essential feature of personal information.
3 Algorithmic recognition is the essential element that distinguishes the right to personal information from traditional personality rights
The evolution of the right to personal information shows that it emerged precisely because of the arrival of the digital society, and in particular because of automated decision-making on personal information using algorithmic technology, which transformed personal information that was once scattered online, weakly connected to personality, and of little realizable property value into large-scale objects closely tied to personality and carrying enormous property value. The legal manifestation of this change is that the object attributes, the attributes of the right, and the specific design of the protective rules of personal information are all deeply imprinted with the "algorithm" of the digital age.
3.1 The characteristic of personal information as the object of a right is algorithmic recognition
Different rights may arise over objects with different attributes. To truly distinguish the right to personal information from traditional personality rights, it is therefore necessary first to clarify the differences between the two at the level of the object and the impact of those differences on each right. In traditional society, personal information could hardly become the object of personality rights: traditional law protected only certain personal information closely related to the subject's personal dignity, such as name, portrait, reputation, and privacy. In those cases the law was protecting not the personal information as such, but the values of personal freedom and dignity carried by it. The object of traditional personality rights is thus the social-evaluation interest that reflects personal dignity, and the traditional definition of the object of a personality right rests on the significance of a given interest to personal dignity. Personal information, as an objective fact, could hardly itself be the object of a personality right. The reason personal information has moved from "behind the scenes" to "center stage" and become an object of personality rights is precisely the arrival of the digital society, especially the application of algorithmic technology. The collection, storage, and use of information in the digital society have undergone a qualitative change, fundamentally altering the significance of personal information for human dignity, in at least the following three respects:
3.1.1 The way individuals are identified has shifted from natural recognition by the human brain to algorithmic recognition by computers. Identifiability is a necessary prerequisite for personal information to represent the personality attributes of its subject. In traditional offline society, personal information could be obtained and preserved through oral communication, symbols, text, pictures, and even photography and video, and its recognition relied primarily on the human brain. In the digital society, the electronic form is the principal form of personal information. Through the structured processing of electronic personal information, all information is reduced to basic units of measurement, thereby enabling its algorithmic recognition. For example, the human brain and an algorithm recognize facial information in vastly different ways: the brain recognizes a face through its visual features, such as the density of the hair or the size of the eyes; a facial-recognition algorithm instead computes the positions of and distances between features such as the hair and eyes within a coordinate system.
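To make the contrast concrete, the coordinate-based comparison described above can be sketched in a few lines of code. This is a minimal, purely illustrative example under simplifying assumptions, not any real face-recognition system; the landmark names, the chosen measurements, and the threshold are all hypothetical.

```python
import math

def feature_vector(landmarks):
    """Reduce facial landmarks (name -> (x, y) in a shared coordinate
    system) to a few scalar measurements: this is the 'reduction to
    basic units of measurement' the text describes."""
    def dist(a, b):
        ax, ay = landmarks[a]
        bx, by = landmarks[b]
        return math.hypot(ax - bx, ay - by)
    return [
        dist("left_eye", "right_eye"),    # inter-eye distance
        dist("nose_tip", "chin"),         # lower-face length
        dist("left_mouth", "right_mouth") # mouth width
    ]

def same_person(lm_a, lm_b, threshold=5.0):
    """Two faces 'match' when their measurement vectors are close;
    the algorithm compares numbers, not visual impressions."""
    va, vb = feature_vector(lm_a), feature_vector(lm_b)
    gap = math.sqrt(sum((x - y) ** 2 for x, y in zip(va, vb)))
    return gap < threshold
```

Whereas the human brain judges similarity holistically, the algorithm's entire "recognition" is a distance computation over coordinates.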
3.1.2 Social evaluation based on personal information has shifted from natural evaluation to algorithmic evaluation. In traditional offline society, not only does the identification of individuals rely on the human brain; so does the evaluation of individuals based on their information. When a company evaluates job seekers on the basis of their résumés or interview performance, for example, the examiner must judge according to his or her own experience, preferences, and similar factors. In the digital society, personal information is processed automatically by algorithmic technology: an individual's creditworthiness, whether the individual meets the relevant requirements, and what treatment the individual receives are all generated automatically by algorithms. If personality evaluation in traditional offline society is a "natural evaluation" based on the evaluator's own experience and the evaluated person's information, then personality evaluation in the digital society is an "algorithmic evaluation" based on algorithms and personal information, that is, a personality profile computed by algorithms. Algorithmic evaluation may or may not coincide with natural evaluation, and may even bias others' perception of a person. Moreover, the two kinds of evaluation affect individuals differently. In traditional offline society, a decline in social evaluation first causes mental suffering, followed possibly by a loss of the benefits that flow from social evaluation; in the digital society, the direct result of algorithmic evaluation is that individuals are treated differently in terms of benefits, such as automated recommendations, rather than mental suffering.
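The difference between "natural" and "algorithmic" evaluation can be illustrated with a deliberately simplified scoring rule. Everything here is hypothetical: the attribute names, the weights, and the tier cutoff are invented for illustration, and real evaluation systems are typically statistical models learned from large datasets rather than hand-written rules.

```python
# Hypothetical weights for a toy creditworthiness score.
WEIGHTS = {
    "on_time_payments": 2.0,
    "late_payments": -5.0,
    "account_age_years": 1.5,
}

def algorithmic_score(profile):
    """An 'algorithmic evaluation': the same inputs always yield the
    same score, with no examiner experience or preference involved."""
    return sum(WEIGHTS[k] * profile.get(k, 0) for k in WEIGHTS)

def tier(score):
    """The score directly determines differential treatment
    (e.g. which offers or recommendations a person receives)."""
    return "preferred" if score >= 50 else "standard"
```

The point of the sketch is the text's contrast: the output is not a mental impression but an automatically assigned tier that translates directly into different treatment.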
3.1.3 The property value of personal information has gone from unrealizable to realizable. In traditional offline society, the collection of personal information was typically sporadic, scattered, localized, and phased, and the processing of information likewise relied on the human brain. The scale of information the brain can process at one time is constrained by human physiology, and transmission is prone to error, especially as the chain of transmission lengthens, which made it difficult to truly realize the property value of offline personal information. In the digital society, information collection is continuous, comprehensive, permanent, large-scale, and even ubiquitous, enabling algorithmic recognition of and automated decision-making on massive amounts of information. Given the reliability of the technology, information dissemination in the digital society rarely introduces errors, and if blockchain technology is adopted, the information may even become tamper-resistant. The property value latent in personal information can therefore be discovered and genuinely realized in the digital society; the high market valuations of today's Internet companies affirm the asset value of personal information. Although that property value is mostly reflected in large-scale data, such data is composed of individual pieces of information, so each piece participating in it has its own independent value, however minuscule that value may be.
In summary, the significance of personal information for the individual has changed profoundly in the digital society. Traditionally, personal information could hardly become the object of personality rights; in the digital society, algorithmic technology has changed how personal information is recognized and evaluated and has enabled the realization of its property value, elevating personal information from an "insignificant" to a "pivotal" position with respect to personal dignity. As the German Federal Constitutional Court observed in the Census Case, "under the conditions of automated data processing, there is no longer unimportant data." Personal information as the object of the right to personal information should therefore be understood as personal information that is recognized and evaluated by algorithms, which is the personal information whose property value can be realized.
3.2 The legal attributes of the right to personal information are rooted in algorithmic recognition
The right to personal information, as an emerging personality right, is recognized by law precisely because of the arrival of the digital society. Without the application of algorithmic technology there could be no right to personal information, and the legal attributes of the right are closely bound up with algorithmic recognition.
3.2.1 The right to personal information is a specific personality right. Personal information indicates the characteristics of a specific natural person, reflects the evaluation and social identification of that individual, and carries spiritual personality attributes. Yet the fact that information objectively indicates a specific natural person does not mean that the law recognizes it as the object of a specific personality right; traditional law protects only important personality interests such as reputation and privacy. If a new form of personal information protection, distinct from traditional personality rights, is to be embedded in China's legal system, its unique difference from existing protection must be identified, namely the new personality protection triggered by computer applications, and the concept that unifies this new protection is algorithmic processing. The application of algorithmic technology has turned personal information that was once scattered, localized, and of little significance into a resource in the digital society that can be read, computed, and used for automated decision-making. Algorithmic technology thus laid the foundation for personal information to become the object of a specific personality right. The personality interests involved in personal information are not necessarily related to those protected by traditional personality rights such as privacy, although interests such as privacy are of course not excluded. In short, an algorithmic evaluation based on personal information does not necessarily lower the natural evaluation of the information subject, does not necessarily implicate the subject's privacy interests, and need not touch the interests protected by the traditional right to one's portrait.
Of course, in exceptional cases, such as when private personal information is obtained through algorithms, the right to personal information and the right to privacy may overlap.
3.2.2 The right to personal information is a personality right that inherently carries property value. As a rule, personality rights do not, and should not, carry property value; even where some personal interest could in fact be commodified, the law does not permit the realization of that value, lest personal dignity become a tradable commodity. But the property "gene" in personal information is an inherent attribute that is difficult to detach, and it arises from the application of algorithmic technology. In the digital society, algorithmic analysis of personal information can compute personalized services tailored to the information subject, thereby mitigating information asymmetry in commercial practice and generating enormous property value. In the logic of commercial practice, the inherent property-value attribute of personal information is more fundamental than its personality attribute: it is precisely because personal information carries property value that enterprises invest heavily in designing and optimizing algorithms to mine that value more fully. Although the personality-interest attribute of personal information takes precedence in legal protection, in terms of its genesis the property value embedded in personal information is the more fundamental driving force. Without the property gene in personal information, there would be no personality-interest attribute of personal information.
3.3 The particularity of the rules on the right to personal information stems from algorithmic recognition
China's Personal Information Protection Law lays down detailed rules on the right to personal information, in which the imprint of algorithmic recognition can be discerned. A few examples illustrate the point.
3.3.1 The notice-and-consent rule for processing personal information. The rule of informed consent is considered fundamental to personal information protection, yet it is the very opposite of the traditional legal rule and is suited only to the digital society. In offline society, information in principle flows freely unless its dissemination infringes privacy or other personality interests of others. The traditional information order in German law rests on freedom of information: in principle, people may freely access, process, and use the information of others, and privacy claims are merely "isolated islands of protection in the sea of free information exchange". Were traditional offline society to adopt an informed-consent rule, the flow of information would ossify and human society could not function. In the digital society, however, individuals must be given the right to control their own information, because algorithmic technology makes information processing omnipresent and continuous. If individuals were denied self-control over their information and processors were left free to process it, individuals would become transparent in the digital society, and their personal and property interests would face enormous risks of infringement. Accordingly, in disputes over traditional personality rights the victim bears the burden of proving the unlawfulness of the dissemination of the relevant information, whereas in disputes over the right to personal information the processor bears the burden of proving the legality, legitimacy, and necessity of its processing.
3.3.2 Infringement of the right to personal information is subject to liability based on presumed fault. Article 69(1) of the Personal Information Protection Law establishes presumed-fault liability for infringement of the right to personal information, in contrast to the fault-liability principle governing infringement of traditional personality rights. The presumption exists because personal information processing is highly professional and technical, making it difficult for individuals to understand, let alone prove, the processor's fault, while processors are closer to the evidence and possess strong expertise. Presuming fault strengthens the processor's burden of proof and affords victims effective relief. The data enterprise's use of algorithmic technology is thus the legitimate ground for presuming fault in cases of personal information infringement. Otherwise, as in the "Wang Fei case" discussed above, although the plaintiff's extramarital affair and other information were published online, the absence of algorithmic technology means that the ordinary fault-liability principle of traditional personality-rights infringement should still apply.
3.3.3 The general availability of damages for infringement of the right to personal information. Traditional remedies for infringement of spiritual personality rights center on the victim's spiritual interests, so the principal forms of liability are eliminating the effects, restoring reputation, and apologizing, with damages available only where serious mental harm has been caused. Damages for infringement of the right to personal information, by contrast, are not conditioned on serious mental harm but are generally available. The amount is determined by the loss suffered by the individual or the benefit obtained by the processor; where loss or benefit is difficult to determine, the amount is set according to the actual circumstances (Article 69 of the Personal Information Protection Law). In both its general availability and its method of calculation, compensation for infringement of the right to personal information thus exhibits the character of property compensation, precisely because the right to personal information is a personality right that inherently carries property interests.
4 The practical application of the distinction between the right to personal information and traditional personality rights
The digital society is the space in which the right to personal information exists, and algorithmic recognition is the essential element distinguishing it from traditional personality rights. In actual cases involving personal information, whether the right to personal information applies can therefore be determined by whether algorithmic technology has been applied.
4.1 Circumstances where personal information rights should not apply
There are two main types of cases where the right to personal information should not be applied.
4.1.1 Disputes occurring entirely in offline society should be governed by traditional personality rights, not by the right to personal information. In the aforementioned case of a judgment posted in a residential community, the second-instance court mistakenly assumed that the right to personal information applies whenever personal information is involved; in fact, the information in that case neither is nor should be protected by the right to personal information. Some scholars argue that although the right to personal information originated in the digital society, once it has arisen it is no longer confined to that society but may extend to offline society, so that offline disputes involving personal information may also be governed by it. This view is essentially the same as the earlier view that "the concurrence of rights can resolve the coexistence of the right to personal information and traditional personality rights". But since that coexistence should not be resolved through the concurrence of rights, the extension of the right to personal information to traditional offline society cannot stand. It is true that in a dual-layered digital society in which the virtual and the real are isomorphic, the natural person has been extended into an "information person", and even purely offline conduct involving personal information is affected by the arrival of the digital society. The impact of the digital society on traditional offline society, however, should be addressed by expanding the protective scope of traditional personality rights, not by applying the right to personal information offline. The conditions for applying the right to personal information are reflected chiefly in the "electronic" element of personal information.
Both Article 1034(2) of the Civil Code and Article 4(1) of the Personal Information Protection Law make recording "by electronic or other means" an element of the definition of personal information. "Other means" here should be interpreted as something similar to "electronic", that is, an information carrier that computers can recognize and compute, so as to reserve space for future technological development; it should not be read to include traditional offline recording methods such as paper.
When the Internet serves merely as a space for publishing personal information, the right to personal information should not apply. A typical scenario is publishing information about others in cyberspace in order to insult, defame, or attract social attention. The plaintiff in the "Wang Fei case" sued not only Zhang, who directly released the information about his extramarital affair, but also, separately, the platforms Tianya and Daqi that published it. In the latter two cases the platform merely served as a venue for publishing personal information and did not analyze the plaintiff's information with algorithmic technology, so traditional personality rights such as privacy should be applied. The court likewise reasoned from the perspective of traditional personality rights: "Daqi did not technically process the personal information and photographs, such as the names of the parties, in this report, which infringed Wang Fei's rights to privacy and reputation."
The conditions for applying the right to personal information are also reflected in Article 72(1) of the Personal Information Protection Law: "This Law does not apply where a natural person processes personal information for personal or family affairs." In private-law cases of infringement of the right to personal information, "personal or family affairs" in this paragraph should be interpreted broadly, to include an individual's publishing of others' personal information online for personal purposes such as insult or defamation, so as to exclude the application of the right to personal information in these situations. This is consistent with comparative law. Article 1 of the American Law Institute's Principles of the Law, Data Privacy sets out the normative purpose and scope of the data privacy principles; under it, spreading rumors or engaging in online harassment on social media does not trigger the data privacy principles but rather the privacy-infringement rules of the Restatement (Second) of Torts, because the data privacy principles are designed for professional and commercial relationships. Article 2(2)(c) of the EU GDPR contains a similar provision: the Regulation does not apply to processing "by a natural person in the course of a purely personal or household activity". According to Recital 18 of the GDPR, such activity includes correspondence and the holding of addresses, as well as social networking and online activity undertaken in that context; the GDPR does not apply where there is no connection to a professional or commercial activity.
It is worth noting that the relevant judgments of the Court of Justice of the European Union may be read differently. In the Lindqvist case, the defendant Lindqvist posted colleagues' personal information (names, job duties, etc.) on a website. The Court held that disclosing data on a website accessible to an indeterminate number of people could not invoke the "purely personal or household activity" exception. Similarly, in the Jehovah's Witnesses case, preachers recorded who did or did not speak to them during door-to-door preaching; the Court held that this went beyond the internal preaching of believers and therefore was not a "purely personal or household activity". In both cases, then, the Court construed the exception strictly, so that even non-professional, non-commercial conduct may fall under personal data law. These cases do not refute the view of this article, however, for neither was a civil dispute, and the Court's interpretation of "purely personal or household activity" served a normative purpose entirely different from the issue discussed here. In Lindqvist, the defendant was fined 4,000 Swedish kronor and prosecuted for failing to notify the Swedish data protection authority before publishing the information; the question before the Court concerned administrative penalties and criminal accountability under the data protection rules. Likewise, in the Jehovah's Witnesses case, it was the Finnish Data Protection Ombudsman who sought to prohibit the preachers' conduct, and the relevant decision was rendered by the Supreme Administrative Court of Finland, so the case concerned the administrative protection of personal data.
It must be emphasized that rules on personal information serve different normative purposes in criminal, administrative, and civil law, and related concepts and terms are understood accordingly. For example, the crime of infringing citizens' personal information under Article 253-1 of the Criminal Law is not limited to personal information capable of "algorithmic recognition": personal information provided merely as electronic documents, or even in traditional paper form, falls within the meaning of "personal information" in that article. Indeed, public law usually defines personal information more broadly than private law does. The CJEU's broad reading of the scope of personal data protection rules in criminal and administrative cases therefore does not mean that the same reading should govern civil cases. On the contrary, when deciding between the right to personal information and traditional personality rights, personal information should be construed in terms of algorithmic recognition, and "personal or family affairs" in the Personal Information Protection Law should likewise be understood broadly.
4.2 Typical situations in which the right to personal information applies
Personal information processed by algorithmic technology is the typical case for applying the right to personal information, as in the "Ling v. Douyin (TikTok)" and "WeChat Reading" cases. In the former, the plaintiff Ling registered and logged into the Douyin app, provided by the defendant, for the first time using her mobile phone number, at a time when her phone's address book contained no contacts. After logging in, the app's "people you may know" feature displayed thirty Douyin users, twenty of whom had social ties with the plaintiff (WeChat friends, QQ friends, and so on). The defendant could recommend these acquaintances so accurately because those twenty users had authorized it to collect their address books, which contained the plaintiff's phone number; by matching the plaintiff's number against those address books, the defendant recommended the twenty users to her. Because the defendant had obtained her name, phone number, and other information from her friends' address books without her consent, the plaintiff sued to require the defendant to stop collecting, storing, and using the name and phone number it had gathered before she registered her account, and to delete the personal information collected and stored without her express authorization, including her name, phone number, social relationships, and geographic location. Similarly, in the "WeChat Reading case", the plaintiff Huang logged into the WeChat Reading app through WeChat authorization and found that the app obtained their WeChat friend list, automatically followed WeChat friends who used the app, and by default made their reading information (reading time, bookshelf, reading materials, etc.) visible to those friends; even friends the plaintiff did not mutually follow within the app could still see this reading information through it. The plaintiff claimed that the defendant Tencent had infringed their rights to personal information and privacy, and asked the court to order the app to stop obtaining and using their WeChat friend information, to remove the automatically generated followers, and to stop displaying their reading information to their WeChat friends.
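The matching logic at issue in the Douyin case can be reduced to a small sketch: the platform holds address books uploaded by consenting users and checks a new registrant's phone number against each of them. This is an illustration of the mechanism only; the data structures and function name are hypothetical, and real systems typically match hashed rather than raw numbers.

```python
def recommend_contacts(new_user_phone, uploaded_books):
    """uploaded_books maps a user ID to the set of phone numbers that
    user authorized the platform to collect from their address book.
    Every user whose uploaded book contains the new registrant's number
    is surfaced as a 'person you may know' -- note that the registrant
    herself never consented to this use of her number."""
    return sorted(uid for uid, book in uploaded_books.items()
                  if new_user_phone in book)
```

The sketch makes the legal point visible: the plaintiff's number enters the system through third parties' authorizations, and the automated match is performed without any consent from the plaintiff.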
In both cases the defendants used algorithmic technology to collect and analyze the plaintiffs' personal information and, on that basis, made automated decisions such as matching and recommendation. This is precisely the intelligent digital environment that the rules on the right to personal information are meant to address, and thus a typical case for applying that right; the courts in both cases correctly framed the core dispute as whether the defendant's conduct infringed the plaintiff's right to personal information. Algorithmic processing of personal information covers not only information that has been processed by algorithmic technology but also information that will be so processed. As long as the processor is capable of processing personal information with algorithmic technology, the right to personal information should apply even where the dispute itself is not closely tied to that technology, such as a dispute arising solely from the collection of information. Here the right to personal information displays a pronounced "preventive" function, which accords with the German academic reading of the rules of the Federal Data Protection Act as preventive norms against the misuse of personal information.
4.3 Situations in which the applicability of the right to personal information is unclear, and their handling
In practice, two main situations remain in which the applicability of the right to personal information is unclear. In the first, the information processing technology employed is simple, such as merely compiling or categorizing personal information (for example, alphabetically), or personal information is not the direct object of the processing but only "incidental" to the processing of other objects, as in the "online judgment case". In the second, the processing technology employed is unclear and several possibilities exist; in the "Pang information disclosure case", for example, the specific circumstances of the acquisition, dissemination, and use of the personal information were difficult to ascertain, so whether the right to personal information applies depends on what the evidence proves.
4.3.1 In disputes arising from personal information contained in judgments, such as the "Belta case" and the "Huifa Zhengxin case", the applicability of the personal information right is closely tied to the technology the defendant employed. (1) If an enterprise crawls judgment documents publicly available online in China and posts them on its own website without further technical processing, or merely classifies and arranges them by case name, case number, and trial level, the personal information right should not apply. (2) If, in addition to providing the judgments, the enterprise also provides companies' business registration information, administrative penalty documents, the national list of dishonest judgment debtors, and the like, and aggregates this information on a single platform for its customers, the aggregation of different types of information may make the platform more attractive, but it does not change the way the personal information is processed, so traditional personality rights should still govern. (3) If an enterprise extracts party information from judgments, business registration records, administrative penalty documents, and the list of dishonest judgment debtors by technical means and links this information together, so that, for example, a user viewing a company's registration information can also see the judgments and administrative penalty documents concerning the company's shareholders, with shareholders on the list of dishonest judgment debtors specially highlighted, and the enterprise further provides value-added services such as corporate or individual credit rating and transaction risk warnings on the basis of data analysis, then the likelihood that the personal information right applies rises significantly, because the enterprise has processed personal information in an automated manner.
(4) If an enterprise collects party information from judgments, business registration records, administrative penalty documents, the list of dishonest judgment debtors, and similar sources, builds party profiles through algorithms, and uses these digital profiles to automatically recommend goods or services to the parties or to its customers, this constitutes a typical scenario for applying the personal information right. It follows that the personal information right does not apply to every dispute involving personal information; its applicability depends on the technology the enterprise adopts, the data processing methods it uses, and the products and services it provides. The core criterion is whether the enterprise has applied automated information processing technologies such as algorithms.
On the basis of the above analysis, a reasonable explanation can be given for the divergent judgments in the "Belta case" and the "Huifa Zhengxin case". Most scholars currently regard the two cases as "different judgments in like cases" and locate the core of the disagreement in the reasonable use of personal information. In fact, the reasonable use of personal information is not what distinguishes the two cases; the real key lies in their factual differences. In the "Belta case", the defendant provided services such as corporate credit reporting, corporate credit evaluation, credit management consulting, and corporate management consulting, and users could search its website for business registration information, litigation-related judgment documents, and the like. That is, the defendant linked enterprise (including shareholder) information with judgment documents, so that users viewing enterprise information could see and follow links to the relevant judgments; the facts of that case therefore fall under situation (3) above. By contrast, in the "Huifa Zhengxin case", the defendant ran a legal information website providing laws and regulations, judicial cases, contract texts, and information on law firms, lawyers, and judicial institutions, and it arranged the judicial cases by common criteria such as cause of action, document type, and trial institution; the facts of that case fall under situation (2) above. Accordingly, the personal information right could apply in the "Belta case" but should not apply in the "Huifa Zhengxin case". On this view, although the outcomes of both judgments are sound, the judges' reasoning has shortcomings.
In the "Huifa Zhengxin case", the court's reasoning focused mainly on whether the defendant's handling of the personal information was lawful, but that analysis was unnecessary. What the court really needed to make clear is that the personal information right should not apply in that case, and then to analyze whether the defendant had infringed the plaintiff's traditional personality rights, such as the right to reputation. In the "Belta case", the court's conclusion that the personal information right applied is acceptable, but its stated reason was merely that the information at issue related to the plaintiff; the court failed to recognize that the defendant's "value-added" service of linking enterprises with judicial documents was the decisive factor that made the personal information right applicable in that case.
4.3.2 The other situation is that the personal information right is difficult to apply because the facts of the case are unclear. In the "Pang Information Disclosure Case", there are numerous possible channels through which the plaintiff Pang's phone number and other information could have been leaked. It may have been leaked through Pang's own improper storage of the information; it may have been leaked by Lu, who booked the tickets on his behalf; or it may have been leaked by the booking platform Qunar or by China Eastern Airlines. Even assuming it was leaked by Qunar or China Eastern Airlines, the leak could have occurred in various ways: an employee may have seen the plaintiff's information and recorded it or passed it to others orally, in handwriting, by photograph, or by screenshot; the data may have leaked through a hacker intrusion; or the leak may have stemmed from data products the company generated through algorithmic processing. Among these many possibilities, in some the personal information right should not apply, while in others it can. Where the objective facts, such as how the plaintiff's information was leaked and used, cannot be determined, should such a case be resolved under traditional personality rights or under the personal information right?
From the perspective of protecting victims, applying the personal information right is more favorable to the victim, but overemphasizing victim protection may run counter to the basic principles of tort law. Civil law aims to safeguard human freedom and dignity, taking autonomy of will and freedom of conduct as its basic value choices. In tort law this is reflected in a preference for freedom of conduct when balancing it against victim protection, classically expressed in the maxim that "the loss lies where it falls". In personal information disputes, it is therefore inappropriate to presume that the enterprise was at "fault" merely because the channel and manner of the leak are unclear; treating the fact of the leak as itself establishing the enterprise's "fault" would in effect subject the enterprise to liability without fault. Until judicial decisions and academic argument have adequately shown that the personal information right should apply in such situations, resolving these disputes through traditional personality rights, in line with the basic concept of tort law, is the more stable solution. The personal information right therefore should not apply in the "Pang Information Disclosure Case". Although the two courts that heard the case reached different conclusions, both reasoned soundly on the basis of traditional infringement of personality rights. The Supreme People's Court, however, listed the case among the "first batch of typical Internet-related cases", which has misled theorists and practitioners into treating it as a typical case of personal information protection.
The frequent overlap between the personal information right and traditional personality rights in judicial decisions shows that neither practice nor scholarship has yet clarified the essential difference between the two. The underlying reason is a failure to understand why the law made the personal information right a specific personality right independent of traditional personality rights. The current emphasis on the "identifiability" of personal information not only fails to do the work of distinguishing the personal information right from traditional personality rights, but also easily leads to the mistaken application of the personal information right to offline, online, and algorithm-related personal information disputes alike. Only by recognizing that the digital society is the precondition for the existence of the personal information right, and that algorithmic identification is the technical foundation from which it arises, can the essential difference between the personal information right and traditional personality rights be truly understood. Not all "identifiable" information is personal information within the meaning of the personal information right, and not all disputes involving personal information can be resolved through it. Only "algorithmically identifiable" information is personal information in that sense, and only personal information disputes involving algorithmic technology can be resolved through the personal information right. "Algorithmic identification" is the essential element that distinguishes the personal information right from traditional personality rights.
Although defining personal information by the basic feature of "algorithmic identification" meets the theoretical and practical need to distinguish the personal information right from traditional personality rights in civil law, this conclusion may not hold in every scenario involving personal information, especially in contexts of public-law protection. "Personal information" has been used with different meanings for different regulatory purposes. China's Personal Information Protection Law and the EU's GDPR are both comprehensive personal information legislation, and it is appropriate for such laws to adopt a broad, identifiability-based definition of personal information. When specific personal information rules are applied, however, more precise definitions must still be drawn for the scenario at hand. This article has attempted to show that, at least for the purpose of distinguishing the personal information right from traditional personality rights within civil law, personal information should be limited to information that is "algorithmically identifiable". It is hoped that this conclusion can offer some inspiration and reference for understanding personal information, and for applying the relevant rules, in other scenarios.