Glacier Kwong Keynote 2019: A new framework for personal data in the age of surveillance (transcript)
Protests 1989 in Leipzig, Protests 2019 in Hong Kong
Good morning, it is a pleasure to be your keynote speaker. Thank you, OTMR and Spirit Legal, for inviting me to speak before all of you.
I am Glacier Kwong. I come from Hong Kong. I am an activist for political and digital rights. I am still learning German and can only speak a little, so I will give my speech in English. Thank you for your understanding.
Today I would love to talk about a few things regarding Hong Kong, digital rights, surveillance and technology, and maybe try to provide solutions to the problems. I am very privileged to be able to stand here, not in front of batons and tear gas, triad members and riot police, but in front of all of you, to share about digital rights.
Recently, my fellow activist friends and I have been saying “Hong Kong is the new Berlin, in the new Cold War”. I have been reading a lot about the Cold War, Berlin, and Germany. And it is very difficult for me to ignore all the similarities I see throughout the course of history.
In 1989, right here in Leipzig, citizens, following the traditional prayers for peace in the Nikolai church, marched first in their hundreds and later in their hundreds of thousands to the inner city ring - they were no longer satisfied with superficial changes. With their courage and their strong will they made history. The moving images from the Monday Demonstrations in Autumn 1989 reached every corner of the world. They depicted determined people within an unyielding society, who demanded the fundamentals of democracy peacefully. Through this, national history was made in Leipzig and the foundations of German Reunification were laid - a historic event without which today's European Union would not have come into existence. In the same year, the Berlin Wall fell, and German Reunification became part of the history we study. The fall of the wall was followed by the unification of Germany; freedom and democracy were regained. Since then, Germany and Berlin have been rooted in core values like human rights, fundamental freedoms, and democracy.
Germany has since been one of the major states in the world that embrace diversity, freedom, and democracy. It has set a good example of transitional justice for Hong Kong and the rest of the world, and Hong Kong has much to learn from how Germany successfully defended freedom and democracy. Yet Hong Kong is not that fortunate. In 1989, the June Fourth Massacre happened in Beijing, China. This was the first time Hongkongers realized that the Chinese regime is, at its core, a dictatorship.
Britain and the PRC signed the Sino-British Joint Declaration in 1984, scheduling the handover of Hong Kong for 1997. Hongkongers were promised "One Country, Two Systems" and a "High Degree of Autonomy" that would remain unchanged for 50 years. It was said that "Hong Kong people are to rule Hong Kong"; freedom and democracy seemed close. But "One Country, Two Systems" and "High Autonomy" turned out to be empty words, and freedom and democracy are further and further away. The PRC has rejected Hongkongers' urge for democracy and deprived them of their fundamental rights. It has been tightening its grip on Hong Kong: freedom of assembly, freedom of speech, and the right to be elected are gradually being taken away. Finding the PRC and the Hong Kong Special Administrative Region (HKSAR) Government illegitimate, Hongkongers have stood up to fight against them.
All means within the system were exhausted. Peaceful protests were ignored; requests to open a dialogue were rejected. In 2014, a massive civil disobedience protest, known to the world as the "Umbrella Revolution", took place. Yet Hongkongers did not succeed in their quest for universal suffrage, and fundamental rights were further eroded. Candidates for the legislature were barred from running for office because of their political views, and those who were elected were disqualified. In 2016, an act to protect street hawkers during Chinese New Year sparked off the incident known as the "Fishball Revolution". Three Hong Kong activists were forced into exile in Taiwan and Germany, while dozens were put into jail. In 2019, the anti-extradition bill protests have led to more Hongkongers going into exile and many being oppressed.
In Hong Kong, where "One Country, Two Systems" is supposed to be in place, we already see attempts to implement mass surveillance over Hongkongers. 50 multi-functional smart lampposts, equipped with cameras and other technologies, are being installed. In a recent protest, a few of them were pulled down. It was found that the chips that power the lampposts are manufactured by a Chinese company that is actively involved in the surveillance scheme in China.
In my experience, the key to battling state surveillance is education and raising awareness. To be honest, learning the protocols, or learning how to use PGP or 2FA, is not hard at all. You can easily find trustworthy tutorials online in a few clicks. What is hard is using it consistently, and making sure others use it as well. If there is no awareness of security in a community, you get situations where the one person who insists on using PGP has nobody to email, because nobody else uses it. Teaching citizens how to use the technology is easy, but persuading them to use it every day despite the trouble is the hardest thing.
Another thing that ought to protect us from state surveillance is laws and regulations. There are no regulations on data transmission from Hong Kong to China. Being capable of facial recognition, these lampposts are accused of secretly gathering personal data of Hongkongers and sending it to China without our knowledge or consent. The mere fear of this happening leads Hongkongers to change their behavior, like wearing masks and hiding their identity with black bloc to avoid trouble. Although there is no concrete proof of violations of privacy, China, a totalitarian regime, is known for its notorious human rights violations, so our worries and fear are understandable. This fear of surveillance has already created a chilling effect in Hong Kong and hinders our freedom of assembly and freedom of speech. This might happen to Germany as well.
China is not a country that plays by the rules. Despite being one of the parties to the Vienna Convention, China recently unilaterally declared the Sino-British Joint Declaration invalid, because the terms of this treaty promise freedom and democracy for Hong Kong and do not favor the Chinese regime's interests. "One Country, Two Systems" was promised at the time of the handover in 1997, yet China has broken its promise too many times for it to be trusted.
On 4th October, the Hong Kong Government invoked the colonial-era emergency law that had not been used in more than half a century. The government passed its first regulation under the emergency law, banning mask-wearing in public. The emergency law is similar to martial law in other places: it allows the Hong Kong Government to pass any laws it deems suitable without prior approval by the city's legislature. Now that it has been invoked, I believe more will follow; the government can shut down the internet, block the free flow of information, allow the police force to access citizens' devices without a warrant, and so on. This is a blatant infringement of fundamental human rights. I wish I were able to provide our youngsters a better society to live in, so that they can be as carefree as other youngsters in Europe or in Germany.
But still, Hong Kong is very lucky when we compare ourselves with those in Xinjiang or Tibet. With the aid of technology companies throughout the world, China makes use of big data and surveillance technologies to monitor its citizens' every move, and if something goes "wrong", they get arrested. Journalists have revealed that there are "re-education camps" and "detainment facilities" in China, holding at least 1.5 million members of ethnic minorities prisoner.
Siemens, the large German conglomerate, collaborates on advanced technologies in automation, digitalization and networking with China Electronics Technology Group Corporation, a state-owned military contractor that has developed a policing app used in Xinjiang that, according to Human Rights Watch, has led some people to be sent to the camps.
This app demonstrates that Chinese authorities consider certain peaceful religious activities suspicious, such as donating to mosques or preaching the Quran without authorization. But most of the other behavior the app considers problematic is ethnicity- and religion-neutral. The findings suggest the IJOP system surveils and collects data on everyone in Xinjiang. The system tracks the movement of people by monitoring the "trajectory" and location data of their phones, ID cards, and vehicles; it also monitors the use of electricity and of gas stations by everybody in the region. This is consistent with Xinjiang local government statements that emphasize that officials must collect data for the app system in a "comprehensive manner" from "everyone in every household."
The authorities have sought to justify mass surveillance in Xinjiang as a means to fight terrorism. While the app instructs officials to check for "terrorism" and "violent audio-visual content" when conducting phone and software checks, these terms are broadly defined under Chinese laws. It also instructs officials to watch out for "adherents of Wahhabism," a term suggesting an ultra-conservative form of Islamic belief, and "families of those...who detonated [devices] and killed themselves." But many - if not most - behaviors the app system pays special attention to have no clear relationship to terrorism or extremism. Gathering information to counter genuine terrorism or extremist violence does not appear to be a central goal of the system.
The intrusive, massive collection of personal information through the app helps explain reports by Turkic Muslims in Xinjiang that government officials have asked them or their family members a bewildering array of personal questions. When government agents conduct intrusive visits to Muslims' homes and offices, for example, they typically ask whether the residents own exercise equipment and how they communicate with families who live abroad; it appears that such officials are fulfilling requirements sent to them through apps such as this one. The app does not require government officials to inform the people whose daily lives are pored over and logged of the purpose of such intrusive data collection, or of how their information is being used or stored, much less to obtain consent for such data collection.
The Spanish telecommunications firm Telefónica has a joint venture with China Unicom that appears to use big data for tracking people. The company markets the software as a way to deliver location-based ads or monitor public transportation use, and while it says the data is anonymous, I reviewed an internal presentation that appears to have shown ID numbers unique to each cellphone user. It is easy to see how such software could be used by the authorities in Xinjiang to track minorities in real time, and it has already been deployed in the region. KfW, a German state-owned bank, provided 100 million euros ($111 million) in funding for the construction of a subway line that opened in 2018 in the regional capital, Urumqi, built with components from ABB, a Swiss engineering firm, and Airbus Defense and Space, the European aircraft manufacturer. Unilever and Nestlé both buy tomato products from a state-owned company in Xinjiang that could end up in the ketchup in kitchens across Europe. Neither company responded to questions about how products from Xinjiang are used.
As in other parts of China, the notorious social credit system is being put in place as well.
An elite primary school in Hangzhou, Zhejiang Province is making its students wear brainwave-reading headbands that can supposedly detect their attention levels in the classroom.
A university in eastern China has installed a facial recognition system at its entrance and in two classrooms to monitor the attendance and behavior of students, saying that it "can effectively solve the management difficulties and low efficiencies in a traditional attendance system, and make it easier for managers to track their students." This is a dystopian state - citizens being monitored everywhere.
Recently Germany signed an anti-spying agreement with China. But such an agreement is just paper and empty promises, like the "Treaty of Non-aggression between Germany and the Union of Soviet Socialist Republics" back in World War Two. Even under the WTO framework, China is always making use of loopholes and the excuse that it is still a developing country to get away with responsibilities it should bear as one of the major players in the world.
The fear of mass surveillance would already constitute an infringement of German citizens' privacy of behavior. Privacy of behavior refers to the liberty of individuals to behave without being under any type of observation (Finn & Wright, 2014). The AG Riesa ruled in a court case this year that, even without actual recording or monitoring, the feeling of being observed constitutes a violation of fundamental personal rights. Changing one's behavior to avoid being spied on would already constitute an infringement of privacy. Privacy is one of the core values that German citizens treasure, and it should be protected by all means.
Huawei holds a lot of patents and offers the lowest costs in building 5G networks; excluding it from participation would delay the roll-out of the 5G network. However, fundamental rights should always outweigh economic considerations. German and European citizens' freedom from fear and their fundamental right to privacy should always trump the cost in money and time. There is no remedy for violations of fundamental rights and mass surveillance: the damage they create cannot be undone, and the fear of surveillance will erode fundamental freedoms.
Hongkongers are fighting for freedom and human rights now, and it is an uphill battle. And there is a chance for the free world to be compromised as well, because China is very eager to expand its influence in other states' economies to gain more leverage in the global arena. But however large the economic benefits the PRC can offer, Germany and Europe should never sacrifice privacy and fundamental rights in pursuit of profit. Engaging with Huawei risks the freedoms and rights German citizens value and invites Germany to become subject to China's influence.
Another issue I see as a digital rights activist is that we don't actually know what we are trying to protect and how to protect it. In this day and age, it is weird to see individuals being very sensitive towards state surveillance but not noticing commercial surveillance. If you ask a random citizen, "Do you think the government should collect your behavioral patterns, analyze them and then profile you?", I think most of them will say "no, because it is an infringement of my privacy and my right to a private life". However, they willingly give up their data for personalized services. It is rather weird. The state is bound by the constitution, the social contract we tacitly gave our consent to. But economic entities are bound by nothing but laws and regulations that have proved to be filled with loopholes.
Education is key. Yes, but how do we tell people about their data?
When we say privacy, we think of data protection, but what actually is data? Or more precisely, what is personal data? We often refer to data as "the new oil". But personal data is far more useful and far more complex than just being the new fuel.
Recently, millions of photos posted on the hosting site Flickr under a Creative Commons License (CC License) were taken by a tech giant, IBM, to train its facial recognition AI (Metz, 2019), without requesting consent from those who are photographed. The CC License is a framework under which people can loosen restrictions on photos, text, video or other material that would otherwise be protected by copyright. The photos are, on the one hand, copyrighted material of the photographer and, on the other hand, personal information of those being photographed under the definition of the General Data Protection Regulation (GDPR). These photos are information relating to an identified or identifiable natural person.
In this instance, personal data is described in the ownership language of copyright law. The photos, which are also personal data, are treated as property that can be transferred. Thus IBM, following the terms and conditions of the CC License, was able to use the photos for the sake of training its facial recognition AI. At the same time, the usage of such photos falls under the "processing of personal data" under the GDPR: the act of using the photos to train the AI is an operation performed on personal data or on sets of personal data. Here arises a problem: those who are in the photos, the data subjects, never intended or consented to have their photos become data sets for training facial recognition technology. Creative Commons has argued that the matter is one of copyright, not privacy. This incident shows how personal data is seen as property or a commodity capable of being transferred, and it reflects the complexity of personal data and the lack of a framework able to describe it.
Legal dimension of a new framework for personal data
Data is often understood as representing the real world, or attributes or characteristics of it. Data are seen as neutral, objective and pre-analytic in nature, but they are in fact framed technically, economically, ethically, temporally and so on. Data cannot exist independently of the context within which they are generated and interpreted.
There are two dimensions of personal data: the syntactic dimension, where data are seen as codified information, and the semantic dimension, where the content or meaning is represented.
In the EU's GDPR, Article 4(1), personal data is any information which is related to an identified or identifiable natural person. Recital 26 of the GDPR stipulates that "account should be taken of all the means reasonably likely to be used either by the controller or by any other person to identify the individual directly or indirectly". In the US, personally identifiable information is any data that could potentially be used to identify a particular person.
The information or meaning is either an essential part of the data itself, or attributed by context as additional value added through interpretation. This definition has been used by the European Union since 1995. In the current legal system, the definition of what personal data really is remains blurry.
Personal data ownership
Personal data ownership is considered necessary so that the law would acknowledge the de facto commodification of personal data, which is the result of the market switching to behavioral marketing. Personal data has become the new oil of the industry. With behavioral micro-targeting being the latest business model, bringing tech giants billions in revenue, personal data plays a vital role in the business, making analysis and micro-targeted content possible. Personal data is the fuel of the behavioral micro-targeting businesses, and having personal data ownership would facilitate the business model and also acknowledge the commodification of personal data. Treating personal data as tangible items that can be owned is treating data like rival goods. It is easy to claim exclusionary property rights over tangible items and block others from exercising control over the object.
It is also believed that data ownership can give data subjects control over data pertaining to them. Granting an ownership right in personal data, allowing data subjects to own their personal data, enables them to exercise control over their data against others. This would allow informational self-determination, which refers to "the capacity of the individual to determine in principle the disclosure and the use of his/her personal data". With personal data ownership in place, the data subject can use the data fully: access, store, share, sell and amend it, or extract meaningful information out of it. Nowadays, de facto control over personal data already exists with the aid of technologies, the legal right to data portability and the requirement of consent before processing personal data. This makes it desirable to introduce ownership of personal data to unleash its full potential.
It is suggested that implementing personal data ownership rights similar to the English land law system would give data subjects the capacity to exercise control over the transfer of data, and control after the transfer takes place. This system would set duration and purpose limitations. The data subject would be able to broadcast his/her right and would have the right to transfer data for remuneration, but there would be no waiver of personal data protection guarantees. It benefits the information industry by protecting its investment in the collection of personal data, because the law would recognize its property rights. Because it only recognizes "lesser" property rights in personal data, individuals are not forced to give up total control over their personal information. The model also allows further transfers of personal data.
However, it is also argued that personal data is not a rival good: it can be used and reused without the content and its value being lost. The value of personal data comes from services and auctions, not from the trade or sale of the personal data itself. Because personal data fuels the algorithms and the micro-targeting business model, it is useful and valuable; personal data makes it possible for these technologies to be turned into services that bring revenue. The fact is that the same data can be collected through various means separately, and personal data is usually a by-product.
Personal data, as far as I can see, is a non-rival good: if it ever was a good, it can be used and reused without any loss of content. It can be extracted from multiple sources. It is impossible for one not to "own" one's data.
Personal data as intellectual property
Because personal data is not a tangible item, there are suggestions to treat personal data as intellectual property.
Intellectual property and related laws aim at protecting intellectual creations made by their author, not the information itself. If the intellectual property narrative is applicable to personal data, the question is: who is its author?
Personal data is usually collected, derived or implied by AI or machines, but is personal data a work created by AI or machines? Does the right go to the algorithm or its human designer? I believe that personal data is by no means intellectual property, as it is not an intellectual creation.
The Database Directive may be seen as giving a sui generis right to data. However, it does not advocate data ownership or intellectual property rights in data: Article 7(1) of the Database Directive states that it protects only a "substantial qualitative or quantitative investment in either the obtaining, verification or presentation of the contents", to prevent extraction and/or re-utilization of the whole or substantial parts of the content of that database. It does not protect the data in the database, but the investment in the database and the effort made to obtain the data.
Personal data as trade secrets
Definition of trade secrets: Art 2 Directive 2016/943:
a) it is a secret in the sense that it is not, as a body or in the precise configuration and assembly of its components, generally known among or readily accessible to persons within the circles that normally deal with the kind of information in question
b) it has commercial value because it is secret
c) it has been subject to reasonable steps under the circumstances by the person lawfully in control of the information, to keep it secret
- Personal data does not fall under the definition of trade secrets
- A single piece of data usually has no value in itself, but only in connection with other data
- And only if extraction by a data analytics tool is able to generate information
Personal data and consent
Under the current regime, the contract or consent model is the best for now. However, from a user perspective, it places a huge burden on the shoulders of ordinary users. May I ask how many of you really read through all the privacy statements that you consent to? Because I do not; I just scroll and scroll to the end and tap "I accept", usually.
Every day we receive numerous privacy statements to go through, and it is too much to ask of an individual, without prior knowledge of the regime, to give truly informed consent to each processing request. Those statements are, necessarily, long and, sad to say, boring in most cases.
And what is worse: the field of behavioral economics of privacy demonstrates that individual consent is prone to manipulation. Even if I do not agree with some terms in a privacy statement, I will accept it because I need that service. If my work requires access to Google, it is impossible for me to refrain from using it, because the cost might be my career. This power imbalance is one of the problems I see in our society.
Another point worth noting is that the effects of data processing following individual consent are often not limited to that individual only. An example is genetic data. My genetic data is also the genetic data of my parents and my sister: when I consent to giving out my genetic data for processing, my family members' genetic data is also being given out.
Trying to figure out what personal data is, and hence to build a workable legal framework around it, is hard work. I have not found any answers to these questions yet. But through studying the subject, I found that a successful framework should be able to do the following.
A successful framework ought to be able to deal with competition law. Control over data will affect the behavioral micro-targeting business model and other related markets, and may result in anti-competitive behavior. Germany's national competition regulator has ordered Facebook to stop combining user data from different sources without voluntary consent. The order applies to data collected by Facebook-owned platforms like WhatsApp and Instagram, but also to third-party sources that Facebook uses to flesh out its advertising profiles, including those of non-users. Although this approach is highly debated, it is an example that illustrates the relationship between personal data and competition law.
The framework ought to consider cases of refusal of access to data sets in downstream markets. Demand for personal data is driven by downstream productive activities that require it as an input. It is difficult to predict what can be extracted from data sets, and it is hard to prove indispensability for the downstream market or particular ownership. Personal data can be extracted from multiple sources, and derived and implied into other personal data; it can be used and reused in very different contexts for very different purposes without necessarily losing its value. Dominance in this case would also be hard to prove. Hence this framework will have effects on competition in secondary markets: the emergence of new behavioral products will be hindered if access to personal data is not granted.
Technological dimension of a new framework for personal data
This framework should also be compatible with current and future technological developments. But I doubt we are able to come up with such a framework now. The meaning "attached" to data by modern machines is beyond the grasp of the human mind; how AI and algorithms make their decisions is a total mystery to us, and even to their creators. But this is of vital importance if we want to create a system that can balance the business model and data protection.
Ethical dimension of a new framework for personal data
Personal data is an inalienable part of a human being. It is me in a very philosophical and poetic way.
It does not seem convincing for one to "not own" one's personal data in real life. If one can own one's data, then by the same token, one can also not own it, and legally speaking it is possible that one does not own one's data. However, whether the personal data of an individual is collected, derived or implied, it is still the individual's data, or at least data that describes the individual. The behavioral data of the individual, the preferences of the individual, the age and the medical record of the individual, are all an inalienable part of the individual. There is no way to make one's data irrelevant to oneself. It is, in real life, impossible to separate one from one's data.
The last thing I want to address is how corporations see personal data.
“Act in such a way that you treat humanity, whether in your own person or in the person of any other, never merely as a means to an end, but always at the same time as an end.” - Immanuel Kant
The problem I see in the current business environment is that users, or individuals, become the means themselves to serve the end of profit maximization. I am not trying to say it is capitalism's fault, but the mechanism behind it is hindering us as humans. We are being monitored and provided with what the algorithms think we want and what they want us to want. We are being manipulated to a certain extent. Yes, I know there are debates about how effective micro-targeting is. But I believe this model is inherently incompatible with liberal democracy's foundation: the ignorance of the good.
In liberal democracy, there is no shared, public knowledge of what the notion of the good is for everyone. There is not one notion of the good that is regarded as legitimate to force or persuade everyone to comply with, and there are no certain ways people should use their freedom rather than using it in other ways. Freedom in liberal democracy "has no transcendent good to realize and is entirely a matter of individual freedom" (Ci, 2015). Individual freedom allows one to decide for oneself what is good; put another way, the good that is recognized is now a function of the exercise of individual freedom within legal parameters agreed upon by all to ensure everyone equal freedom. Values are considered equal, and so are persons. Persons are free within the private domain: one looks into oneself for the good, rather than up to religion or a set notion of the good, or sideways to public opinion, as Tocqueville says.
Because behavioral micro-targeting knows our psychological triggers and a lot about us, it attempts to use these data and information to form persuasive techniques that make us comply with its notion of the good and do as we are persuaded. Its goal is to get us to click on that advertisement that was tailor-made to draw our attention, or to persuade us into feeling a specific emotion or having a certain thought. For Google, Facebook or Twitter, the goal is to persuade us into bringing them advertising revenue, while our goals may not be, and most likely are not, the same.
Behavioral micro-targeting is different from other technologies we have had; it is not like a telephone. The telephone does not constantly update itself with your preferences and lure you into making a phone call to your friend, but micro-targeting does. It lures you to perform acts and models your behavior not according to your choices or goals, but according to those of the people who designed it. Firms monitor individuals' posting habits, interactions and other internet activity to determine the mood of the individual, calculate their emotions, and arrange the most effective content to achieve their goal. Behavioral micro-targeting content pretends to be a mirror of oneself and changes the reflection one sees according to what one is feeling, has felt or will feel.
Micro-targeting, or at least the CEOs of the big companies behind it, claims to be able to understand what one's preferences and interests are and to decide what is best to provide, and thus claims knowledge of one's notion of the good. It is implausible that these technologies can comprehend anyone's notion of the good life. The algorithms behind them are unknown to those subject to their influence. Often they are designed with the specific purpose of altering one's preferences or decision-making, while claiming that this is what the database suggests and so must be for the sake of users. That micro-targeting technologies claim to know what one wants and what one's own notion of the good is, and set out to alter one's preferences, is incompatible with liberal democracy's ignorance of the notion of the good.
Our personal behavioral data and micro-targeting technologies may record what decisions we have made and how we have behaved online, but they do not understand how those decisions are made or why we behave in a certain way, and, more importantly, they do not understand what it is like to be us; they therefore cannot claim to understand our notion of the good. Micro-targeting technology is restricted to the resources of its database, and those resources are inadequate for understanding any individual's notion of the good.
"For if the facts of experience -- facts about what it is like for the experiencing organism -- are accessible only from one point of view, then it is a mystery how the true character of experiences could be revealed in the physical operation of that organism. The latter is a domain of objective facts par excellence -- the kind that can be observed and understood from many points of view and by individuals with differing perceptual systems. There are no comparable imaginative obstacles to the acquisition of knowledge about bat neurophysiology by human scientists, and intelligent bats or Martians might learn more about the human brain than we ever will." - Thomas Nagel
The less a description of an object or a phenomenon depends on a specifically human viewpoint, the better and the more objective that description is. To describe what a bat is, one can rely on the physical characteristics of bats: they are mammals, they have wings, they use sound waves to detect their surroundings. But we can never understand what it is like to be a bat. By the same token, to understand what a human is or how humans make decisions, we can observe the physical mechanisms of the human body and brain and study behavioral patterns. However, the empirical data tells us nothing about human experience -- what is it like to be human? What is it like to be a specific individual?
Human experience does not fit into this pattern. We are unlikely to come closer to someone's notion of the good by leaving the human point of view behind and describing that person and their notion of the good from outside, without being able to understand what it is like to be that particular individual.
Being able to observe what triggers a user to hit the "like" button must be distinguished from what it feels like to be triggered to hit the "like" button, and from why something is able to trigger someone into hitting it. Only experiencing and understanding the latter two is useful in comprehending what the notion of the good is for someone. There is a gap between the mental and the physical terms that might refer to the same thing, and no analogy with theoretical identification can supply that linkage.
Therefore it is implausible that any other decision-making mechanism, such as the algorithms behind micro-targeting, can know why we do something. It is true that they can make predictions based on my past behavior and the preferences I have indicated online, but they can never understand what it feels like to be "me". And the feeling of being "me" makes up a great part of how I determine what the good is and how I pursue my notion of the good life. Unable to understand that, the algorithms of micro-targeting cannot claim to understand what I feel or what my notion of the good is, and hence are not adequate to make decisions for me.
In liberal democratic doctrine, citizens shape the discourse and are shaped by it at the same time. Behavioral micro-targeting technology, however, aims to shape citizens' notion of the good into a capitalist good and leaves no room for citizens to shape the discourse; they can only be shaped. Micro-targeting technology is thus inherently incompatible with liberal democracy's foundation - its ignorance of the good.
Thank you very much for your attention.
 Direct Quote from Judgement:
"With increasing flight altitude, and thus decreasing quality of the recordings, as well as mere use for safe navigation, the interference with privacy is likely to be correspondingly smaller. It should be noted, however, that even without any recording, the person being filmed may come to feel observed. The AG Potsdam accordingly found a violation of the general right of personality in an overflight without a camera (Schmid, jurisPR-ITR 9/2016, Anm. 2)."
 Resident was allowed to shoot down drone with air rifle: https://www.lto.de/recht/nachrichten/n/ag-riesa-freispruch-abschuss-drohne-gerechtfertigt-verletzung-personlichkeitsrecht/
 Metz, R. (2019, April 19). If your image is online, it might be training facial-recognition AI. Retrieved July 28, 2019, from https://edition.cnn.com/2019/04/19/tech/ai-facial-recognition/index.html
 Merkley, Ryan. “Use and Fair Use: Statement on Shared Images in Facial Recognition AI.” Creative Commons, 13 Mar. 2019, creativecommons.org/2019/03/13/statement-on-shared-images-in-facial-recognition-ai/.
 2 CFR § 200.79 - Personally Identifiable Information (PII).
Bennett, C. (2016). Voter databases, micro-targeting, and data protection law: can political parties campaign in Europe as they do in North America?. International Data Privacy Law , 6 (4), 261-275. doi: 10.1093/idpl/ipw021
Bodó, B., Helberger, N., & Vreese, C. (2017). Political micro-targeting: a Manchurian candidate or just a dark horse?. Internet Policy Review, 6 (4). doi: 10.14763/2017.4.776
Chester, J., & Montgomery, K. (2017). The role of digital marketing in political campaigns. Internet Policy Review, 6 (4). doi: 10.14763/2017.4.773
Cohen, J.E. (2013).What Privacy is For. Harvard Law Review, 126 (7), 1904-1933. Retrieved from https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2175406
Downes, L. (2013). A rational response to the privacy "crisis" (Cato Institute Policy Analysis No. 716), 26-31. Retrieved from http://www.cato.org/sites/cato.org/files/pubs/pdf/pa716.pdf
Thomson, J. J. (1984). The right to privacy. In F. D. Schoeman (Ed.), Philosophical dimensions of privacy: An anthology (pp. 272, 279-281).
European Commission. (2018). New Deal for Consumers: Commission strengthens EU consumer rights and enforcement . Retrieved from http://europa.eu/rapid/press-release_IP-18-3041_en.htm
European Commission. (2019). Behavioural study on transparency in online platforms - 2018. Retrieved from https://ec.europa.eu/info/publications/behavioural-study-transparency-online-platforms-2018_en
EU General Data Protection Regulation (GDPR): Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ 2016 L 119/1.
Facebook. (2019). A Conversation with Mark Zuckerberg and Yuval Noah Harari [Image]. Retrieved from https://newsroom.fb.com/news/2019/04/marks-challenge-yuval-noah-harari/
Ferrari, A. (2019). Significant changes to consumer law are on the way: what companies need to know – IPT Italy. Retrieved from https://blogs.dlapiper.com/iptitaly/?p=58132
Floridi, L. (2006). Informational privacy and its ontological interpretation. ACM SIGCAS Computers and Society , 36(3), pp.37-40.
Floridi, L., Cowls, J., Beltrametti, M., Chatila, R., Chazerand, P., Dignum, V., Luetge, C., Madelin, R., Pagallo, U., Rossi, F., Schafer, B., Valcke, P., & Vayena, E. (2018). AI4People—An ethical framework for a good AI society: Opportunities, risks, principles, and recommendations. Minds and Machines, 28(4), 689-707.
Gillespie, T. (2018). Custodians of the internet. Yale University Press.
Harari, Y. (2018). Homo Deus: A Brief History of Tomorrow . Harper Paperbacks.
Hess, A. (2017). How privacy became a commodity for the rich and powerful. The New York Times Magazine. Retrieved from https://www.nytimes.com/2017/05/09/magazine/how-privacy-became-a-commodity-for-the-rich-and-powerful.html
DAMA International. (2017). DAMA-DMBOK: Data management body of knowledge (2nd ed.). Technics Publications.
Janeček, V. (2018). Ownership of personal data in the Internet of Things. Computer Law & Security Review, 34(5), 1039-1052. Retrieved from https://ssrn.com/abstract=3111047 or http://dx.doi.org/10.2139/ssrn.3111047
Livingston, D., Hoffmann, J., Stuntz, W., & Allen, R. (2016). Criminal Procedure: Investigation and Right to Counsel. Wolters Kluwer Law & Business.
Mayer-Schönberger, V., & Cukier, K. (2013). Big Data: The Essential Guide to Work, Life and Learning in the Age of Insight . Hachette UK.
Merkley, R. (2019, March 13). Use and fair use: Statement on shared images in facial recognition AI. Creative Commons. Retrieved from https://creativecommons.org/2019/03/13/statement-on-shared-images-in-facial-recognition-ai/
Metz, R. (2019, April 19). If your image is online, it might be training facial-recognition AI. Retrieved July 28, 2019, from https://edition.cnn.com/2019/04/19/tech/ai-facial-recognition/index.html
Nagel, T. (1974). What is it like to be a bat? The Philosophical Review, 83(4), 435-450. Retrieved from http://www.jstor.org/stable/2183914
Purtova, N. (2017). Does Property in Personal Data (Still) Make Sense?.
Purtova, N. (2010). Property in personal data: Second life of an old idea in the age of cloud computing, chain informatisation, and ambient intelligence (TILT Law & Technology Working Paper No. 2010/017). In S. Gutwirth et al. (Eds.), Computers, privacy and data protection: An element of choice (Springer 2011). Retrieved from https://ssrn.com/abstract=1641027 or http://dx.doi.org/10.2139/ssrn.1641027
Rawls, J., & Kelly, E. (2003). Justice as fairness. Cambridge, Mass.: Harvard University Press.
in 't Veld, S. (2017). On democracy. Internet Policy Review, 6. Retrieved July 11, 2019, from https://policyreview.info/articles/analysis/democracy
Hargreaves, S., & Tsui, L. (2017). IP addresses as personal data under Hong Kong's privacy law: An introduction to the Access My Info HK project. Journal of Law, Information & Science, 25(2). Retrieved from https://ssrn.com/abstract=3074243
Scassa, T. (2018, September). Data ownership (CIGI Papers No. 187). Centre for International Governance Innovation.
The Royal Society. (2018). Data management and use: Governance in the 21st century: Discussions at a British Academy, Royal Society and techUK seminar on 3 October 2018. The Royal Society. Retrieved from https://www.thebritishacademy.ac.uk/sites/default/files/Data%20management%20and%20use%20-%20Governance%20in%20the%2021st%20century.pdf
The Royal Society. (2018). Data ownership, rights and controls: Reaching a common understanding: A joint report by the British Academy and the Royal Society. The Royal Society. Retrieved from https://royalsociety.org/-/media/policy/projects/data-governance/data-ownership-rights-and-controls-October-2018.pdf
Trute, H. (2017). Industry 4.0 in Germany and the EU: Data-Driven Economy between Property and Access .
Wagner, B., Kettemann, M., & Vieth, K. (2019). Research handbook on human rights and digital technology. Edward Elgar Publishing.
Warren, S., & Brandeis, L. (1890). The Right to Privacy. Harvard Law Review , 4 (5), 193. doi: 10.2307/1321160
Wesche, T. (2013). The concept of property in Rawls's property-owning democracy. Analyse & Kritik, 35(1), 99-112. Retrieved 11 Jul. 2019, from doi:10.1515/auk-2013-0109
Zuboff, S. (2019). The age of surveillance capitalism. New York: PublicAffairs.