WEBVTT 1 00:00:00.570 --> 00:00:01.110 Gabriela Zanfir-Fortuna: good job of. 2 00:00:03.929 --> 00:00:04.170 Bruno Bioni, Data Privacy Brazil: It. 3 00:01:07.380 --> 00:01:09.750 Clarisse Girot (FPF): Do I have like 10 microseconds? 4 00:01:10.620 --> 00:01:14.490 Clarisse Girot (FPF): Yes, to tell my husband and son to stop making noise next door. 5 00:02:01.950 --> 00:02:14.340 Limor Shmerling Magazanik, Israel Tech Policy Institute: So we're almost, we're already live, we've got to wait a minute or two to allow the audience to connect and be with us. 6 00:03:24.810 --> 00:03:44.640 Limor Shmerling Magazanik, Israel Tech Policy Institute: Okay, so I'm going to start with a short introduction and some opening comments before we move to our speakers, and, at least in our local jurisdiction, Israel, tardiness is very well documented and accepted, so I don't know about the other jurisdictions we have today. 7 00:03:45.930 --> 00:03:46.500 Limor Shmerling Magazanik, Israel Tech Policy Institute: Please 8 00:03:47.670 --> 00:03:55.740 Limor Shmerling Magazanik, Israel Tech Policy Institute: excuse the lateness. So welcome everyone, and I'm happy to say good morning, good afternoon and good evening, 9 00:03:56.310 --> 00:04:07.500 Limor Shmerling Magazanik, Israel Tech Policy Institute: because we have both speakers and audience from different time zones, different jurisdictions, different continents, and it's marvelous. We also have 10 00:04:08.310 --> 00:04:21.570 Limor Shmerling Magazanik, Israel Tech Policy Institute: an audience from different professions, legal experts, non-legal professionals; we have various stakeholders from industry, academia, civil society and government. 11 00:04:22.140 --> 00:04:32.850 Limor Shmerling Magazanik, Israel Tech Policy Institute: So it promises to be a great makeup of both audience and speakers. I'm Limor Shmerling Magazanik, the Managing Director of the Israel Tech Policy Institute. 12 00:04:33.420 --> 00:04:47.220 Limor Shmerling Magazanik, Israel Tech Policy Institute: We are a think tank focusing on issues of ethics, human rights, regulation and social norms in emerging technologies, and we are part of the Future of Privacy Forum, 13 00:04:48.240 --> 00:05:00.540 Limor Shmerling Magazanik, Israel Tech Policy Institute: based in Washington DC, and I have with me my friends and colleagues from the Future of Privacy Forum, whom you will get to meet and hear from soon, and also our friends from 14 00:05:01.530 --> 00:05:12.240 Limor Shmerling Magazanik, Israel Tech Policy Institute: Brazil and Israel. The thought behind the series is to accompany the Israeli process of amending its privacy law, 15 00:05:12.780 --> 00:05:27.120 Limor Shmerling Magazanik, Israel Tech Policy Institute: which is going to start, I believe, next week in Parliament, and accompany that with some background on international models, on various aspects of privacy principles and how they were implemented in different 16 00:05:27.810 --> 00:05:43.080 Limor Shmerling Magazanik, Israel Tech Policy Institute: locales. And just a few words of background for those in the audience who may not be from Israel and familiar with Israeli law, or maybe not even from the legal profession; so this is going to be very basic for some of you, but it's going to be very short. 17 00:05:44.220 --> 00:05:57.450 Limor Shmerling Magazanik, Israel Tech Policy Institute: So for private entities, and that is natural persons but also private companies, NGOs, everything is allowed unless it's forbidden by law.
18 00:05:58.020 --> 00:06:10.020 Limor Shmerling Magazanik, Israel Tech Policy Institute: For public bodies, everything is forbidden unless it is allowed by law. Both public bodies and private entities are potential users of personal data, of personal information, 19 00:06:10.620 --> 00:06:20.520 Limor Shmerling Magazanik, Israel Tech Policy Institute: and in Israel, and you will hear soon about other jurisdictions, but in Israel, the right to personal privacy, to the privacy of my personal information, my personal data, 20 00:06:20.820 --> 00:06:30.450 Limor Shmerling Magazanik, Israel Tech Policy Institute: has been acknowledged by the Basic Law and it has constitutional status, and additionally we have a privacy law that states 21 00:06:31.080 --> 00:06:42.030 Limor Shmerling Magazanik, Israel Tech Policy Institute: the do's and don'ts and the accountability required of organizations who are using data, and that applies to both public bodies, government ministries, and private entities. 22 00:06:43.410 --> 00:06:53.280 Limor Shmerling Magazanik, Israel Tech Policy Institute: The default of the Israeli privacy law is that using personal data is not allowed, it is an infringement of my personal privacy, unless 23 00:06:54.060 --> 00:07:07.680 Limor Shmerling Magazanik, Israel Tech Policy Institute: the person or the entity using the data obtains my informed consent, and that default is very substantial and critical in the structuring of this whole mechanism. 24 00:07:08.430 --> 00:07:15.870 Limor Shmerling Magazanik, Israel Tech Policy Institute: And while individual informed consent is the highway to personal data use in Israel, 25 00:07:16.170 --> 00:07:27.780 Limor Shmerling Magazanik, Israel Tech Policy Institute: in other jurisdictions there are other legal bases for processing data, and we thought that this is the perfect time to discuss these types of legal bases and 26 00:07:28.290 --> 00:07:39.570 Limor Shmerling Magazanik, Israel Tech Policy Institute: hear from our experts about lessons learned in other places, and this would be, I think, very beneficial to our Israeli 27 00:07:40.050 --> 00:07:48.120 Limor Shmerling Magazanik, Israel Tech Policy Institute: privacy process. So I'm going to turn first to my friend and dear colleague Dr. Gabriela Zanfir-Fortuna, 28 00:07:48.630 --> 00:07:57.690 Limor Shmerling Magazanik, Israel Tech Policy Institute: who is the VP for Global Privacy at the Future of Privacy Forum and an active researcher and voice in the data protection community worldwide. 29 00:07:58.350 --> 00:08:08.130 Limor Shmerling Magazanik, Israel Tech Policy Institute: Gabriela previously worked at the EDPS and with the WP29. She will take us through the legal bases for data processing in the GDPR 30 00:08:08.520 --> 00:08:21.570 Limor Shmerling Magazanik, Israel Tech Policy Institute: and will shed some light on guidelines around legitimate interest and the whole discussion around that legal basis for data processing. So Gabriela, over to you, and thanks. 31 00:08:22.440 --> 00:08:30.090 Gabriela Zanfir-Fortuna: Thank you so much, Limor, for such a kind introduction, and thank you for setting the scene for our discussion today; I'm very much looking forward to it. 32 00:08:30.570 --> 00:08:43.500 Gabriela Zanfir-Fortuna: And I do hope it's going to be informative for the process that's currently under way in Israel, when looking at how the data protection, the privacy law,
33 00:08:45.540 --> 00:09:01.230 Gabriela Zanfir-Fortuna: should be amended, modernized and so on. So I'm happy to start with an overview of how the GDPR in the EU is doing this thing about lawful grounds for processing. 34 00:09:02.670 --> 00:09:09.810 Gabriela Zanfir-Fortuna: But I think before diving into this particular thing, it's important to take a step back and think about 35 00:09:10.860 --> 00:09:15.510 Gabriela Zanfir-Fortuna: data protection and what its purpose is, right, and think a bit about 36 00:09:16.740 --> 00:09:30.870 Gabriela Zanfir-Fortuna: how data protection constructed its safeguards to in fact allow for processing of personal data, not to forbid it; that wasn't the purpose of data protection as a branch of law. 37 00:09:32.220 --> 00:09:46.830 Gabriela Zanfir-Fortuna: But it was to in fact set out safeguards so that personal data is being used in a way that is respectful of individual rights and freedoms. 38 00:09:47.730 --> 00:09:56.490 Gabriela Zanfir-Fortuna: One of my favorite go-to quotes about that is actually from Peter Hustinx, the former European Data Protection Supervisor, and actually the first 39 00:09:57.330 --> 00:10:08.310 Gabriela Zanfir-Fortuna: European Data Protection Supervisor, who said, and I'm quoting now, that data protection was not designed to prevent the processing of such information, so of personal data, 40 00:10:08.610 --> 00:10:20.070 Gabriela Zanfir-Fortuna: or to limit the use of information technology per se; instead, it was designed to provide safeguards whenever information technology would be used for processing information relating to individuals. 41 00:10:20.700 --> 00:10:30.180 Gabriela Zanfir-Fortuna: And this was based on the early conviction that extensive use of information technology for this purpose would have far-reaching effects for the rights and interests of individuals. 42 00:10:30.450 --> 00:10:41.670 Gabriela Zanfir-Fortuna: So it is very important to understand this distinction, which is at the heart of data protection: it's not meant to forbid the processing of personal data, it's meant to allow it in 43 00:10:42.420 --> 00:10:55.920 Gabriela Zanfir-Fortuna: a respectful way towards the rights of individuals, because we have to acknowledge that in the modern world, in the modern day, personal data is truly ubiquitous, and 44 00:10:56.970 --> 00:11:07.500 Gabriela Zanfir-Fortuna: it's not realistic to believe that we will live in a kind of a cave somewhere and no information about us is going to be made available 45 00:11:07.530 --> 00:11:16.200 Gabriela Zanfir-Fortuna: anywhere. So, having this big framework in mind for the purpose of data protection as a branch of law: 46 00:11:18.510 --> 00:11:39.630 Gabriela Zanfir-Fortuna: one of the key principles of data protection law as it was conceived in the European Union is the principle of lawfulness, and this is a principle that is specifically provided under Article 5 GDPR and that manifests in different ways, but the key way in which it manifests is actually 47 00:11:40.890 --> 00:11:49.650 Gabriela Zanfir-Fortuna: providing for the need to have a justification whenever personal data is being processed, 48 00:11:50.760 --> 00:11:57.000 Gabriela Zanfir-Fortuna: which is expressed in Article 6 of the GDPR, and this justification for being, 49 00:11:58.200 --> 00:11:59.640 Gabriela Zanfir-Fortuna: for being
50 00:12:01.530 --> 00:12:14.070 Gabriela Zanfir-Fortuna: allowed, okay, to process personal data is one of the six lawful grounds that the GDPR provides under Article 6. And 51 00:12:16.080 --> 00:12:29.070 Gabriela Zanfir-Fortuna: because the underlying purpose of data protection is to allow for processing of personal data in a respectful way, the GDPR actually provides several avenues for 52 00:12:29.400 --> 00:12:37.140 Gabriela Zanfir-Fortuna: having a justification for processing personal data, so this is how we ended up having six lawful grounds in the GDPR. 53 00:12:37.800 --> 00:12:51.360 Gabriela Zanfir-Fortuna: We will see that other general comprehensive data protection laws around the world have even more lawful grounds, and I am very much looking forward to hearing from Bruno on this front. 54 00:12:51.960 --> 00:13:02.580 Gabriela Zanfir-Fortuna: But going back to the GDPR, we have these six lawful grounds; all of them are equally important, so we don't have a hierarchy among the lawful grounds. 55 00:13:03.150 --> 00:13:15.570 Gabriela Zanfir-Fortuna: And we have consent; we have the necessity to enter a contract, so whenever processing of personal data is necessary for the performance of a contract 56 00:13:16.530 --> 00:13:26.760 Gabriela Zanfir-Fortuna: to which the individual is party, or in order to take steps at the request of the data subject prior to entering into a contract, so it's a bit broader there. 57 00:13:27.810 --> 00:13:35.340 Gabriela Zanfir-Fortuna: Of course, when there is a legal obligation, this is another one of the six lawful grounds, where there is a law requiring 58 00:13:35.970 --> 00:13:46.410 Gabriela Zanfir-Fortuna: a controller or an organization to process personal data; of course that is a justification, but then there are some conditions 59 00:13:47.100 --> 00:14:02.250 Gabriela Zanfir-Fortuna: that stem from fundamental rights on how that law needs to be set out and what type of safeguards it needs to have in place as well. The vital interests of the data subject are also to be considered as a lawful ground. 60 00:14:03.300 --> 00:14:10.020 Gabriela Zanfir-Fortuna: Then we have a task in the public interest, that's another lawful ground, and that 61 00:14:11.190 --> 00:14:19.590 Gabriela Zanfir-Fortuna: speaks of processing that's necessary for the performance of a task carried out in the public interest or in the exercise of official authority 62 00:14:19.980 --> 00:14:35.250 Gabriela Zanfir-Fortuna: that's vested in the controller. So this is a lawful ground that's usually used by public administration, public authorities, but also quasi-public organizations, so it might be private organizations, but ones that have 63 00:14:36.360 --> 00:14:41.490 Gabriela Zanfir-Fortuna: an important task in the public interest that they need to perform. 64 00:14:42.870 --> 00:14:51.570 Gabriela Zanfir-Fortuna: And then, lastly, we have the legitimate interests lawful ground, which is by far the most fascinating under the GDPR, because 65 00:14:52.260 --> 00:14:58.200 Gabriela Zanfir-Fortuna: it comes with complexity, with nuance. The provision itself says that 66 00:14:59.100 --> 00:15:07.140 Gabriela Zanfir-Fortuna: legitimate interest allows processing of personal data when this is necessary for the purposes of the legitimate interests pursued by the controller
67 00:15:07.620 --> 00:15:23.460 Gabriela Zanfir-Fortuna: or by a third party, so even the legitimate interests of a third party are to be considered, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject, of the individual whose personal data is being processed. 68 00:15:24.900 --> 00:15:38.880 Gabriela Zanfir-Fortuna: And in particular where the data subject is a child, so there is an additional protection there in the GDPR for children under this provision as well. So we have this complex concept of 69 00:15:39.600 --> 00:15:57.720 Gabriela Zanfir-Fortuna: allowing personal data to be processed when the controller, or even someone else, has a legitimate interest, without actually asking the consent of the individual, right; this is kind of the binary there, if we think about it, so without involving the data subject and 70 00:15:59.040 --> 00:16:00.480 Gabriela Zanfir-Fortuna: asking their consent. 71 00:16:01.560 --> 00:16:03.810 Gabriela Zanfir-Fortuna: However, the conditions are quite 72 00:16:05.550 --> 00:16:07.950 Gabriela Zanfir-Fortuna: complex, because we have 73 00:16:09.030 --> 00:16:26.010 Gabriela Zanfir-Fortuna: a test that has been developed through case law and through guidelines by data protection authorities, and that clearly states that just having a legitimate interest is not enough to be able to use this lawful ground. This is very important. 74 00:16:27.300 --> 00:16:40.830 Gabriela Zanfir-Fortuna: You will see, when you're looking at the recitals of the GDPR and also when you're looking at the guidelines, we have guidelines from the Article 29 Working Party, already for some years now, which are still valid, 75 00:16:42.330 --> 00:16:56.040 Gabriela Zanfir-Fortuna: still used by data protection authorities, that say that the legitimate interest itself can be anything, literally anything that's lawful and that can be articulated in a way 76 00:16:56.430 --> 00:17:09.810 Gabriela Zanfir-Fortuna: that makes sense and allows for purpose limitation. So just to the extent it's not something that encompasses an overbroad concept, and to the extent that that interest 77 00:17:10.260 --> 00:17:22.830 Gabriela Zanfir-Fortuna: does not breach the law, or literally can be justifiable and legal, that is a legitimate interest. However, the complicated part comes now, because 78 00:17:23.850 --> 00:17:29.280 Gabriela Zanfir-Fortuna: it's very important that the personal data being considered to be processed, 79 00:17:30.210 --> 00:17:44.490 Gabriela Zanfir-Fortuna: justified by legitimate interest, needs to meet the test of necessity; so it needs to be necessary for that legitimate interest, and this is why it is very important to actually narrow down how you describe the legitimate interest, 80 00:17:45.780 --> 00:18:01.380 Gabriela Zanfir-Fortuna: and the personal data needs to be necessary to achieve that legitimate interest. But then, very importantly, comes the balancing test, and this is where I think 81 00:18:02.550 --> 00:18:04.230 Gabriela Zanfir-Fortuna: the beauty of data protection 82 00:18:05.370 --> 00:18:19.650 Gabriela Zanfir-Fortuna: lies, because it is a perfect expression of how data protection is there to balance the rights and freedoms of individuals and the legitimate interests of organizations to use the data.
83 00:18:20.040 --> 00:18:38.610 Gabriela Zanfir-Fortuna: Well, this balancing test is actually a way for organizations to think about whether the rights and freedoms of individuals are overriding the legitimate interests the company or the third party or the other organization has, 84 00:18:40.770 --> 00:18:51.150 Gabriela Zanfir-Fortuna: and then just walk through the process of thinking how intrusive the processing is towards the individuals, and 85 00:18:52.500 --> 00:19:02.970 Gabriela Zanfir-Fortuna: whether the intrusiveness of this processing, and the weight that it puts on the rights of individuals, can or cannot be mitigated 86 00:19:04.260 --> 00:19:14.220 Gabriela Zanfir-Fortuna: by measures that can be introduced into the processing in general. So I think I will leave it at this for my 87 00:19:14.640 --> 00:19:31.920 Gabriela Zanfir-Fortuna: first intervention, and I'm happy to discuss further down the line maybe some use cases and see how this has been evolving in practice, but I think this is a good framing for where we stand with the lawful grounds. 88 00:19:32.130 --> 00:19:40.530 Limor Shmerling Magazanik, Israel Tech Policy Institute: Yeah, thank you, Gabriela, that's excellent, but I did have an urge to request an example, maybe just one, for our audience. 89 00:19:41.310 --> 00:19:55.440 Limor Shmerling Magazanik, Israel Tech Policy Institute: If you can pick just one example of the balancing test, maybe something from the UK's law, maybe something from the EDPB guidance, can you share just one with us? 90 00:20:00.240 --> 00:20:01.320 Gabriela Zanfir-Fortuna: So sorry about that. 91 00:20:02.430 --> 00:20:09.990 Gabriela Zanfir-Fortuna: Can I come back to this in a second? I just want to make sure that I use a proper example 92 00:20:10.080 --> 00:20:11.820 Gabriela Zanfir-Fortuna: that actually makes sense, not 93 00:20:11.850 --> 00:20:14.460 Gabriela Zanfir-Fortuna: just immediately come up with it now, and 94 00:20:15.690 --> 00:20:21.300 Gabriela Zanfir-Fortuna: I'll make sure to just jump in when the presentations are done. 95 00:20:21.600 --> 00:20:25.950 Limor Shmerling Magazanik, Israel Tech Policy Institute: Perfect, perfect, that's the benefit of having a chat with friends. 96 00:20:27.480 --> 00:20:40.770 Limor Shmerling Magazanik, Israel Tech Policy Institute: So, moving to Dr. Clarisse Girot, who is also a colleague from the Future of Privacy Forum and a longtime friend, and the Managing Director of our 97 00:20:41.430 --> 00:20:55.770 Limor Shmerling Magazanik, Israel Tech Policy Institute: Asia Pacific office, based in Singapore. Clarisse also served for about 12 years, if I'm not mistaken, as a senior member of the CNIL, the French data protection authority, and has 98 00:20:57.510 --> 00:21:10.170 Limor Shmerling Magazanik, Israel Tech Policy Institute: academic experience in research and publication, the same as Gabriela, and she will take us through how countries that don't necessarily have privacy as a human right 99 00:21:10.650 --> 00:21:21.330 Limor Shmerling Magazanik, Israel Tech Policy Institute: approach data protection law, which exists nonetheless, and thoughts around legal bases in those jurisdictions. Thanks, Clarisse. 100 00:21:23.700 --> 00:21:31.080 Clarisse Girot (FPF): Thank you, Limor, and hello everyone, greetings from Singapore. Nowadays it's very customary to celebrate Chinese New Year, so I want to
101 00:21:31.080 --> 00:21:40.410 Clarisse Girot (FPF): wish you health and prosperity in the new lunar year of the Tiger; gong xi fa cai, that's what everybody says nowadays. So here we go. 102 00:21:41.310 --> 00:21:48.330 Clarisse Girot (FPF): Indeed, I have been based in Singapore for seven years now, and over the past years I've learned to sort of 103 00:21:48.750 --> 00:21:59.760 Clarisse Girot (FPF): navigate the complexity of data protection laws as they are developing in this incredible region, extremely diverse, huge, of course the most populated region of the world. 104 00:22:00.240 --> 00:22:05.820 Clarisse Girot (FPF): And a lot of countries are currently working on the adoption of new data protection frameworks. 105 00:22:06.090 --> 00:22:16.920 Clarisse Girot (FPF): An even bigger number of countries, jurisdictions rather, are actually working on revamping, reviewing very fundamentally their old data protection frameworks, and others 106 00:22:17.400 --> 00:22:25.830 Clarisse Girot (FPF): are sort of, you know, hesitant about where to go next and whether a data protection law should be a priority. So we have an extremely 107 00:22:26.100 --> 00:22:33.720 Clarisse Girot (FPF): interesting landscape; we call it a fragmented landscape to avoid saying, since it's a chat among friends, that it is a complete mess. But, honestly, 108 00:22:34.080 --> 00:22:41.820 Clarisse Girot (FPF): when you are a company trying to do business across many borders, it is extremely challenging to comply with very different 109 00:22:42.330 --> 00:22:50.340 Clarisse Girot (FPF): data protection laws, and one of the key challenges that companies face, I'm speaking about companies specifically because my work is really into comparative law, 110 00:22:50.670 --> 00:22:56.370 Clarisse Girot (FPF): and it's not really governments that have to apply multiple laws in accumulation, it's really businesses who do that, 111 00:22:56.670 --> 00:23:10.290 Clarisse Girot (FPF): but it will take me to comment on the importance of consent and how we're all trying to wind it back, and that is very relevant for the current Israeli law reform process. So, basically, 112 00:23:11.520 --> 00:23:20.460 Clarisse Girot (FPF): we've become familiar, at FPF and before that when I was working at the Asian Business Law Institute, with this question, because we launched a project 113 00:23:20.790 --> 00:23:26.160 Clarisse Girot (FPF): in a spirit of promoting convergence and interoperability between legal systems in Asia Pacific, 114 00:23:26.730 --> 00:23:35.280 Clarisse Girot (FPF): but also, I mean, the work has a wider reach, and therefore that's how we can probably be relevant for Israel as well. And originally our idea was 115 00:23:35.850 --> 00:23:42.480 Clarisse Girot (FPF): to see to what extent some recent amendments introduced into the Singapore Personal Data Protection Act, the PDPA, 116 00:23:42.900 --> 00:23:50.010 Clarisse Girot (FPF): and specifically the introduction of the concept of legitimate interest, inspired by the GDPR which Gabriela introduced so well, 117 00:23:50.520 --> 00:23:58.110 Clarisse Girot (FPF): could find an echo in the legal systems of the zone, and to explore the possible synergies between jurisdictions on this subject. 118 00:23:58.470 --> 00:24:05.520 Clarisse Girot (FPF): And the introduction of legitimate interests in the PDPA in Singapore is interesting because it's indicative of a need to provide
119 00:24:05.880 --> 00:24:14.760 Clarisse Girot (FPF): a meaningful alternative to the requirement to obtain consent, which is still very much the rule by default in Singapore and in many countries in the region, 120 00:24:15.360 --> 00:24:22.050 Clarisse Girot (FPF): but then balanced with appropriate safeguards, and that's a really interesting point that we can discuss further down the line with Gabriela. 121 00:24:22.470 --> 00:24:32.040 Clarisse Girot (FPF): However, as we were looking at this, quite quickly we realized that the field of our study had to be quite significantly expanded, because in a comparative perspective, 122 00:24:33.150 --> 00:24:39.390 Clarisse Girot (FPF): introducing legitimate interest in the law is in fact one piece of a much larger puzzle, 123 00:24:39.960 --> 00:24:50.610 Clarisse Girot (FPF): which is that of the conditions under which the collection and the processing of personal data can be considered legitimate under multiple Asian data protection laws, and that includes 124 00:24:50.940 --> 00:25:02.280 Clarisse Girot (FPF): how consent requirements, and notice and consent, and exceptions to consent operate in Asia Pacific, what are the conditions for valid consent, and again that plays a very significant role, 125 00:25:02.820 --> 00:25:10.110 Clarisse Girot (FPF): and how equivalent provisions operate in legal systems which neither are consent-based, and there are many of them, 126 00:25:10.620 --> 00:25:23.130 Clarisse Girot (FPF): nor have a concept of legal basis or lawfulness of processing, including legitimate interest, like the GDPR; and I understand that this is very much the case in Israel. And so, I mean, we 127 00:25:23.820 --> 00:25:29.820 Clarisse Girot (FPF): were looking at this and also at how the implementation of comparable provisions works in practice, 128 00:25:30.510 --> 00:25:38.070 Clarisse Girot (FPF): to see if there would be sufficient commonalities, and of course we also had to look at how other legal bases, like research and public interest and so on, can play a role. 129 00:25:38.850 --> 00:25:49.320 Clarisse Girot (FPF): What is really interesting when you do this comparative analysis, and not in camera, we've really done it with practitioners and industry representatives and regulators to have very practical feedback, 130 00:25:49.770 --> 00:26:01.620 Clarisse Girot (FPF): and this feedback has taught us, across the 14 jurisdictions covered in our project, that it is impossible to deal with one piece of that big puzzle without looking at the other pieces, because they are all interdependent. 131 00:26:02.490 --> 00:26:13.470 Clarisse Girot (FPF): And another challenge is that even where consent is not necessary or justifiable, and a concept like legitimate interest could apply, in practice consent remains 132 00:26:13.890 --> 00:26:27.840 Clarisse Girot (FPF): a key element for data protection compliance in Asia, over and above all other provisions, and even if there would be a preferred option like legitimate interest, which businesses very much ask for in a majority of jurisdictions. 133 00:26:28.380 --> 00:26:34.500 Clarisse Girot (FPF): And so the reason for that, and that's really interesting, I believe, in the current reform process that you're in, 134 00:26:34.920 --> 00:26:45.120 Clarisse Girot (FPF): is because consent is basically a cornerstone in virtually every jurisdiction; it has a different role, it plays different parts in different jurisdictions, but it's always there.
135 00:26:45.570 --> 00:26:54.840 Clarisse Girot (FPF): So it is the one thing that companies can hang on to to develop a regional compliance program and data protection program; at least there's consent somewhere. 136 00:26:55.620 --> 00:27:03.390 Clarisse Girot (FPF): And the problem is that APAC, Asia Pacific, we all call it APAC, presents a very complex and dynamic patchwork of requirements, 137 00:27:04.080 --> 00:27:14.400 Clarisse Girot (FPF): with some jurisdictions that require unbundled, revocable consent for different types of processing, such as sensitive personal data or direct marketing; 138 00:27:15.120 --> 00:27:24.390 Clarisse Girot (FPF): most permit bundled consent for general processing, and some would even allow deemed consent, or implicit, implied consent, or simple notification. 139 00:27:25.080 --> 00:27:33.990 Clarisse Girot (FPF): There are also all sorts of exemptions, like, you know, for some types of processing like health research or public interest, law enforcement, 140 00:27:34.440 --> 00:27:40.680 Clarisse Girot (FPF): but they still vary by context and by jurisdiction, and they do not provide much relief in respect of typical 141 00:27:41.130 --> 00:27:53.280 Clarisse Girot (FPF): commercial processing or transfers of personal data by private sector organizations. So the result is that at present the most practical solution for organizations that seek 142 00:27:53.910 --> 00:28:00.780 Clarisse Girot (FPF): a regional data protection solution is to aim for a sort of high watermark of consent, GDPR-light kind of style, 143 00:28:01.680 --> 00:28:11.550 Clarisse Girot (FPF): that meets most requirements and, in some respects, actually over-complies in jurisdictions where just notification would be sufficient, or, again, where other more appropriate legal bases, like 144 00:28:11.850 --> 00:28:17.010 Clarisse Girot (FPF): legitimate interest or other purpose principles, could apply. So, in other words, to reformulate: 145 00:28:17.370 --> 00:28:27.690 Clarisse Girot (FPF): because of legal fragmentation, companies keep relying on consent by default because, for all its flaws, it is still perceived as a common denominator and the sort of easiest and safest way to comply 146 00:28:28.050 --> 00:28:34.320 Clarisse Girot (FPF): across borders in APAC. However, this is truly not satisfactory, and I believe that, you know, 147 00:28:34.890 --> 00:28:42.330 Clarisse Girot (FPF): you will all agree, particularly Gabriela, and I won't put words in the mouths of the other speakers, but I mean, this is not satisfactory, because it enshrines 148 00:28:42.720 --> 00:28:46.350 Clarisse Girot (FPF): a strictly compliant, tick-the-box approach to data protection, 149 00:28:47.010 --> 00:28:55.650 Clarisse Girot (FPF): although we all know that in the majority of situations consent is not free, in the sense that the GDPR, for instance, requires consent to be free, and in any case 150 00:28:56.040 --> 00:29:02.490 Clarisse Girot (FPF): it usually does not offer any meaningful protection at all, so this is just not the right way to go. And on top of that, 151 00:29:02.910 --> 00:29:16.200 Clarisse Girot (FPF): the use of bundled consent, vague open-ended privacy policies, collection through unidentified third parties and ineffective opt-outs, we all know those, right, they make it all the more confusing for consumers to understand
152 00:29:16.530 --> 00:29:26.940 Clarisse Girot (FPF): the full implications of consent, and they're not useful either for companies who want to do things well and face their data protection responsibilities squarely. So 153 00:29:27.480 --> 00:29:37.740 Clarisse Girot (FPF): another source of concern in Asia is that some measures which have been envisaged to improve consent in some jurisdictions, 154 00:29:38.190 --> 00:29:43.650 Clarisse Girot (FPF): by making it revocable, which seems logical because consent has to be free, so you're free to revoke it as well, 155 00:29:44.130 --> 00:29:51.660 Clarisse Girot (FPF): but also by requiring even longer notices, like, you stuff even more things, you know, into the privacy policy, and by law, 156 00:29:52.320 --> 00:29:59.460 Clarisse Girot (FPF): most often, honestly, these measures are not effective when they're taken in isolation; in fact, they might even aggravate 157 00:30:00.450 --> 00:30:06.870 Clarisse Girot (FPF): the problem further. So the feedback that we've received, and I hope this is really useful 158 00:30:07.470 --> 00:30:14.430 Clarisse Girot (FPF): as a reflection for you in Israel, is that this culture of compliance with consent is so deeply ingrained 159 00:30:14.970 --> 00:30:24.180 Clarisse Girot (FPF): in Asia, and beyond, that no change will occur in practice unless, among others, a critical mass of policymakers steps in and takes resolute action to that effect. 160 00:30:24.450 --> 00:30:32.730 Clarisse Girot (FPF): And the first movement in APAC, we see it actually in Australia, New Zealand, Singapore; there are many signs that show that this approach is about to change. 161 00:30:33.000 --> 00:30:39.360 Clarisse Girot (FPF): So regulators now are, you know, enforcing a bit more aggressively; that doesn't sound very polite, but they really do. 162 00:30:40.080 --> 00:30:49.890 Clarisse Girot (FPF): So, I mean, very interesting movements ahead. So at FPF we've tried to seriously look into how we can contribute to improving the situation, 163 00:30:50.730 --> 00:31:04.470 Clarisse Girot (FPF): and for that we've done a very thorough comparative analysis of 14 jurisdictions to see, you know, where we could find sufficient commonalities. And anyway, to make it very short, because it's taking too long already, 164 00:31:05.850 --> 00:31:06.810 Clarisse Girot (FPF): it looks like 165 00:31:08.010 --> 00:31:10.560 Clarisse Girot (FPF): there is a common path, which is: 166 00:31:11.100 --> 00:31:22.680 Clarisse Girot (FPF): we all have to, depending on the legal framework we operate in, and they all come from different backgrounds, whether the aim is rather to protect fundamental rights or, you know, promote the digital economy, and sometimes even, 167 00:31:22.920 --> 00:31:29.640 Clarisse Girot (FPF): you know, rather national security and digital sovereignty, there is still some common ground there, which is, first, 168 00:31:30.090 --> 00:31:36.210 Clarisse Girot (FPF): to make use of concepts of legitimate interest, compatible uses and equivalent notions, 169 00:31:36.510 --> 00:31:48.150 Clarisse Girot (FPF): provided regulators clarify how this should be implemented, because there is an apprehension as to the vague nature of the concept and the legal uncertainty that it entails; and, for us individuals,
170 00:31:48.570 --> 00:31:53.850 Clarisse Girot (FPF): we're also a bit worried, frankly, that this just gives, you know, more leeway to big tech, sort of. 171 00:31:54.630 --> 00:32:03.570 Clarisse Girot (FPF): But it also includes making consent meaningful again. My concern there, it's not, we hear that all the time, that consent is dead, consent is overrated; 172 00:32:04.230 --> 00:32:13.200 Clarisse Girot (FPF): the truth is rather that consent has been overused, and we should, you know, reposition consent so that it is truly respected as a fundamental, 173 00:32:13.800 --> 00:32:25.200 Clarisse Girot (FPF): as a sort of fundamental right somehow, and that goes through winding back the range of circumstances in which consent is sought or required, so that consent is only likely to be given 174 00:32:25.800 --> 00:32:28.830 Clarisse Girot (FPF): thoughtfully, sparingly and with understanding. 175 00:32:29.550 --> 00:32:39.600 Clarisse Girot (FPF): And also, and that's complex because it's not only a legal thing, you can't just legislate it like that, but we should support enhanced transparency and consent 176 00:32:39.900 --> 00:32:55.320 Clarisse Girot (FPF): through UX and UI design, with due attention brought to the different needs and literacy levels of users. So this is something which I find extremely interesting in APAC, and I'm happy to dig deeper; I can talk forever, so 177 00:32:56.190 --> 00:32:56.970 Clarisse Girot (FPF): I'll stop here. 178 00:32:57.750 --> 00:32:58.170 Limor Shmerling Magazanik, Israel Tech Policy Institute: And it's a 179 00:32:58.950 --> 00:33:02.850 Clarisse Girot (FPF): real pleasure to be in on this, and I'm really grateful to have the opportunity to share. 180 00:33:03.720 --> 00:33:12.360 Limor Shmerling Magazanik, Israel Tech Policy Institute: Thank you so much, that is fantastic, and I can vouch for you being able to discuss this for hours at a stretch. 181 00:33:13.020 --> 00:33:23.730 Limor Shmerling Magazanik, Israel Tech Policy Institute: But we're going to make a short stop and welcome Dr. Yaacov Lozowick, who joined us, and I'm very thankful to you for joining. And 182 00:33:24.480 --> 00:33:30.960 Limor Shmerling Magazanik, Israel Tech Policy Institute: just to recap what we've heard before from Gabriela and Clarisse: 183 00:33:31.920 --> 00:33:44.460 Limor Shmerling Magazanik, Israel Tech Policy Institute: we heard that the idea behind data protection regulation is to facilitate the use of personal data for lawful reasons and with safeguards that respect 184 00:33:44.850 --> 00:34:02.700 Limor Shmerling Magazanik, Israel Tech Policy Institute: people's rights, and we discussed the various bases in the GDPR, pinpointing a little bit on legitimate interest and the balancing test that is required there, and Clarisse shared with us the fragmentation and the consent-based 185 00:34:03.300 --> 00:34:12.840 Limor Shmerling Magazanik, Israel Tech Policy Institute: legislation or practices in the APAC region, which remind us of Israel's consent-based approach at the moment. And 186 00:34:13.560 --> 00:34:31.050 Limor Shmerling Magazanik, Israel Tech Policy Institute: one of the other legal bases that is available to users, mostly to public bodies, is the public interest, or research, and for this I am looking forward to hearing from Yaacov, who is, 187 00:34:32.250 --> 00:34:33.540 Limor Shmerling Magazanik, Israel Tech Policy Institute: correct me if I
188 00:34:34.650 --> 00:34:41.220 Limor Shmerling Magazanik, Israel Tech Policy Institute: didn't remember and didn't point out everything, but you're a historian, a teacher, a researcher, a writer, 189 00:34:42.150 --> 00:34:48.030 Limor Shmerling Magazanik, Israel Tech Policy Institute: director of the Yad Vashem Archives and previously director of the Israel State Archives. 190 00:34:48.420 --> 00:34:56.970 Limor Shmerling Magazanik, Israel Tech Policy Institute: And I invite you to share your thoughts and experience on the importance of data sharing for historical and research purposes in the public interest, 191 00:34:57.870 --> 00:35:13.050 Limor Shmerling Magazanik, Israel Tech Policy Institute: including personal data, which is something that data protection professionals are very used to protecting and safeguarding. I think your insights can give us a different perspective, and we welcome your remarks. 192 00:35:14.910 --> 00:35:20.340 Yaacov Lozowick: Thank you for the invitation. There was a bit of a screw-up on the way in, but so be it, you know. 193 00:35:22.950 --> 00:35:30.750 Yaacov Lozowick: I have to say I did not manage to listen to the two previous speakers; indeed, I'm coming at this from a totally different direction 194 00:35:31.350 --> 00:35:41.370 Yaacov Lozowick: than the others. But let me first start by quickly saying: Israel this week is in the middle of an absolutely gigantic scandal, whereby it seems that 195 00:35:42.900 --> 00:35:51.570 Yaacov Lozowick: the police were spying on whomever they felt like by using extremely powerful spyware and breaking into our smartphones and 196 00:35:52.110 --> 00:36:03.240 Yaacov Lozowick: doing whatever they wanted, and we're all in an uproar. And I saw one of our specialists who deals with data protection and privacy laws was absolutely exuberant the other day, because he said, because of the political 197 00:36:04.320 --> 00:36:16.380 Yaacov Lozowick: situation, this is the first time ever that the Left and the Right and the Center, everybody agrees this is a very important subject; normally it can be tagged politically, but now everybody agrees that this needs to be dealt with. 198 00:36:17.130 --> 00:36:23.880 Yaacov Lozowick: But what I'm talking about has nothing to do with that, and actually I think what all of us are talking about today has very little to do with it, 199 00:36:24.600 --> 00:36:36.750 Yaacov Lozowick: because the scandal has to do with the police actively breaking into what is privately ours, and what I'm discussing here is not breaking into anything; it's using data which is there, and is legally there. 200 00:36:37.170 --> 00:36:42.450 Yaacov Lozowick: And so it's a very different situation. The second thing is, of course, that I am talking, 201 00:36:42.960 --> 00:36:52.470 Yaacov Lozowick: in most cases, not about data which was created yesterday; I'm talking about old data. Although, interestingly, the other day I was listening to a podcast, 202 00:36:53.310 --> 00:37:05.790 Yaacov Lozowick: an American podcast, where they were talking about the different policies of policing and different models of policing, and whether police should be more aggressive or less aggressive and so on and so forth; this is a burning issue in the United States this year.
203 00:37:06.960 --> 00:37:17.250 Yaacov Lozowick: And one of the experts was making the point that all of these policies, the left-wing policies that say defund the police and the right-wing policies that say more police, whatever it is, 204 00:37:17.700 --> 00:37:22.320 Yaacov Lozowick: all of them would benefit greatly from what they generally don't have, and that is good data. 205 00:37:23.040 --> 00:37:31.530 Yaacov Lozowick: Anonymized data, of course, but good data: how many people get arrested every month, how many people who get arrested had been arrested in the past, how many people get arrested for what, 206 00:37:31.860 --> 00:37:46.290 Yaacov Lozowick: for what kinds of offenses, how many people who are wanted actually come to court, if they ever come to court, how many people are incarcerated, and so on and so forth. And he was bringing examples; he was taking these live 207 00:37:47.310 --> 00:37:55.230 Yaacov Lozowick: political discussions and he would say, here is data which affects your position, if you're interested in data; if you're just interested in ideology, of course not. 208 00:37:55.440 --> 00:38:03.660 Yaacov Lozowick: But if you're interested in solving problems, and not just being ideological, then this data will of course be very useful, I would say even essential. 209 00:38:04.110 --> 00:38:15.600 Yaacov Lozowick: Except that it's not really accessible in many places. The United States is a very big country, so in some places it is accessible, but it's not in most places, certainly not in a systematic and user-friendly way. 210 00:38:16.920 --> 00:38:27.180 Yaacov Lozowick: So, having said that about contemporary data: it just so happens that this week, I'm running a research project, and as part of this research project I came across 211 00:38:27.600 --> 00:38:30.630 Yaacov Lozowick: a file from the 1980s, so 40 years ago, 212 00:38:31.020 --> 00:38:39.570 Yaacov Lozowick: which is an archival file from the Israel State Archives from 40 years ago, a file dealing with people who were living in settlements beyond the Green Line 213 00:38:39.810 --> 00:38:49.710 Yaacov Lozowick: who, in the face of rising Palestinian violence, wanted to get out; they wanted to move back inside the Green Line, and they needed government support in order to be able to do so, because they didn't have the money to. 214 00:38:52.680 --> 00:39:03.360 Yaacov Lozowick: And the archive had blacked out and redacted almost the entire file, though you could see what it was about. So, a couple of weeks ago, I wrote to them that this is ridiculous, this is not the way we should do it, 215 00:39:03.990 --> 00:39:13.650 Yaacov Lozowick: and two or three days ago they agreed with me and they redid the file and they sent me a file which is now less fully redacted, 216 00:39:14.190 --> 00:39:27.990 Yaacov Lozowick: but it was still pretty badly redacted. The fact that they were redacting the names of the individuals involved and their ID numbers makes perfect sense, because these are issues of their social and financial
217 00:39:29.130 --> 00:39:43.560 Yaacov Lozowick: existence in general, and the general public doesn't need to see it. But beyond anonymizing the data so that we don't know who these people were, they went on and anonymized additional data, such as the make of the car they owned, 218 00:39:44.730 --> 00:39:51.330 Yaacov Lozowick: the date they were born, the country they were born in, the date when they were married, if they were married, and all sorts of other 219 00:39:51.690 --> 00:39:55.890 Yaacov Lozowick: information which, for sociological research, can actually be very interesting: 220 00:39:56.130 --> 00:40:03.990 Yaacov Lozowick: who are the people who are getting up and leaving, and who are the people who are not getting up and leaving, and what are the parameters that are affecting this. And this was all blacked out, and I sent this back to the 221 00:40:05.460 --> 00:40:09.630 Yaacov Lozowick: archive, and we'll see what happens. So this is an example of 222 00:40:10.080 --> 00:40:23.010 Yaacov Lozowick: research where the research really doesn't need to know the names of the individuals, but it is data from 40 years ago; whether they gave it with the intent that I see it or not, they didn't, they never thought about that when they 223 00:40:23.220 --> 00:40:32.730 Yaacov Lozowick: sent these letters to the government, they didn't think about research, but it's there, and by law it should be open. Okay, let's look at another example. 224 00:40:33.690 --> 00:40:41.640 Yaacov Lozowick: And this is the census from 1948. In 1948, a few months after the State had been founded, the 225 00:40:42.090 --> 00:40:49.320 Yaacov Lozowick: State was shut down for two days, everybody had to stay at home, and census takers went from house to house and registered 226 00:40:49.860 --> 00:40:56.310 Yaacov Lozowick: everybody in the country, to see how many people we had. And when I was the head of the State Archives, which was until 2018, 227 00:40:57.240 --> 00:41:08.100 Yaacov Lozowick: I left in 2018, and 2018 was the 70th anniversary of the State of Israel, we thought it would be a good idea, a cool idea, to scan and type in all of these 228 00:41:08.820 --> 00:41:21.480 Yaacov Lozowick: questionnaires from seventy years earlier and put them online, which countries do; this was not a novelty, it wasn't a unique idea, countries put their censuses online, the old censuses, not really the new censuses. 229 00:41:21.870 --> 00:41:24.810 Yaacov Lozowick: Seventy years, that's a good enough, long enough wait. 230 00:41:25.230 --> 00:41:34.440 Yaacov Lozowick: And we went through all the motions and we had a long, two-year, protracted, painful discussion with the legal types, who eventually blocked it and did not allow this to happen. And they said, because 231 00:41:34.770 --> 00:41:40.950 Yaacov Lozowick: there is information in there, seventy years old, which is protected by the law of privacy, you cannot open it. 232 00:41:41.250 --> 00:41:47.730 Yaacov Lozowick: And we said, okay, we will not open their ID numbers, but everything else, what can be in there? 233 00:41:48.060 --> 00:41:58.740 Yaacov Lozowick: And, of course, when you're talking about a seventy-year-old census you don't want the names anonymized, you want them to be open, so people can find their grandfather, so people can see what
234 00:41:59.190 --> 00:42:03.450 Yaacov Lozowick: their grandmother was, where she was born, whether she was married, or who knows what. 235 00:42:04.920 --> 00:42:12.570 Yaacov Lozowick: As I said, ultimately it was blocked; they did not open the material, which is still closed now. And in that context, 236 00:42:12.840 --> 00:42:24.120 Yaacov Lozowick: one of the fields that they were really worried about was occupation, and the other one was level of education, and they said, level of education, that's very private, people wouldn't want others to know what level of 237 00:42:24.540 --> 00:42:27.570 Yaacov Lozowick: education they have. And it just so happened that, at the same time this was happening, 238 00:42:28.710 --> 00:42:37.740 Yaacov Lozowick: an individual came to me, and he said to me, I need to see my grandparents' files, and by law we could give him his grandparents' files. 239 00:42:38.010 --> 00:42:44.250 Yaacov Lozowick: We didn't put them online where the whole world can see them, but he could see them; I sent him the files of his 240 00:42:44.760 --> 00:42:52.950 Yaacov Lozowick: grandparents, and then I asked him afterwards, and he was really excited, he said, I was on the phone, we had a family reunion, we all looked at it. 241 00:42:53.310 --> 00:42:58.020 Yaacov Lozowick: And I said, well, it says that all four of your grandparents were illiterate, 242 00:42:58.650 --> 00:43:07.650 Yaacov Lozowick: and the lawyers have been telling me that that is really a shameful thing and you wouldn't want to put that online. So what do you think, if we were to put that online, what would you think about that? 243 00:43:08.040 --> 00:43:17.070 Yaacov Lozowick: And he said, yeah, you guys are all backwards. We grew up with parents telling us that we had to do our homework because their parents had been illiterate. 244 00:43:17.580 --> 00:43:25.290 Yaacov Lozowick: And my generation, we are all lawyers and engineers, and we are all that because our parents told us, you can't be like our parents, you have to be educated. 245 00:43:25.710 --> 00:43:36.600 Yaacov Lozowick: And finally, he said, two generations later, you have given me documented proof that indeed our grandparents were illiterate and that what our parents were telling us was true. 246 00:43:37.050 --> 00:43:43.560 Yaacov Lozowick: You don't know how exciting this is for us. So that's another example of a different perspective; 247 00:43:43.980 --> 00:43:51.780 Yaacov Lozowick: the law today is irrelevant for that. And the last example that I want to give you is from even earlier, 248 00:43:52.410 --> 00:44:04.470 Yaacov Lozowick: from before the time of the State. I'm not going to go into the history, but there were different groups, some were more activist, more militant, and there were different groups dealing with the British authorities. 249 00:44:07.290 --> 00:44:12.120 Yaacov Lozowick: And there was a moment, in the time before the state was created, where some of these 250 00:44:13.500 --> 00:44:23.670 Yaacov Lozowick: semi-legal organizations rounded up and sent over to the British, to be arrested and incarcerated, members of other organizations that they felt were
251 00:44:24.480 --> 00:44:36.750 Yaacov Lozowick: too given to violence. Okay, so it was the edge of a civil war, if you will, between Jewish organizations who were actively fighting against the British government and others who were 252 00:44:37.230 --> 00:44:45.270 Yaacov Lozowick: not fighting against the government. And it turns out that there's a large pile of these files from before the creation of the State, 253 00:44:45.570 --> 00:44:55.380 Yaacov Lozowick: where the Haganah members, who were the people doing the arresting, the Haganah members were arresting people they thought were members of these terrorist organizations, and they were 254 00:44:55.710 --> 00:45:00.300 Yaacov Lozowick: bringing them in, interrogating them and then handing them on, and these files, you can see them. 255 00:45:00.660 --> 00:45:08.910 Yaacov Lozowick: And a request came in from somebody who felt that his grandparents might have been on one side or the other of that particular event, and he wanted to see the files. 256 00:45:09.450 --> 00:45:15.810 Yaacov Lozowick: And the archivists and the legal types said, we cannot open it, even though by now, 257 00:45:16.710 --> 00:45:24.030 Yaacov Lozowick: this is 90 years ago, right, we can open this material. Because, I remember, one of the legal people said to me, 258 00:45:24.420 --> 00:45:34.140 Yaacov Lozowick: in one of these meetings, and everybody else nodded except for me, he said, I wouldn't want, if my grandparents were on one side or the other side of that, I wouldn't want that to be public. 259 00:45:35.550 --> 00:45:40.380 Yaacov Lozowick: And I said, actually, I would find it fascinating if I was able to know about that sort of thing. 260 00:45:40.980 --> 00:45:45.150 Yaacov Lozowick: So the bottom line of what I wanted to say, my time is running out, is 261 00:45:45.660 --> 00:45:57.780 Yaacov Lozowick: that although the issues that you're discussing are of crucial importance when we're talking about who holds on to contemporary data from today, 262 00:45:58.290 --> 00:46:05.040 Yaacov Lozowick: and the question is, can it be open today, and who can see it, and what uses can be made of it, and that's all legitimate and all important, 263 00:46:06.000 --> 00:46:10.890 Yaacov Lozowick: I do request that, at least in the case of large organizations and in the case of countries, 264 00:46:11.250 --> 00:46:27.990 Yaacov Lozowick: you keep in mind that eventually this data will become what I would call archival data; it's no longer contemporary, it's no longer active, it less and less accurately reflects reality, less and less, if at all; it reflects a past. 265 00:46:28.440 --> 00:46:39.840 Yaacov Lozowick: But the ability to use that material for research, academic, personal, family reunions, all sorts of other reasons, rises the older that material gets. 266 00:46:41.550 --> 00:46:55.830 Yaacov Lozowick: And it is absolutely unacceptable, and this is the case in Israel right now, it is absolutely unacceptable that the rules and the laws that dictate how contemporary data is protected also be applied to archives. Okay. 267 00:46:56.880 --> 00:47:10.860 Limor Shmerling Magazanik, Israel Tech Policy Institute: Thank you, thank you very much, I thought that was fascinating. And before we move on to the next speaker, just to check whether any of our other panelists want to make some comments on Yaacov's presentation.
268 00:47:14.430 --> 00:47:15.300 Clarisse Girot (FPF): Don't tempt me. 269 00:47:18.180 --> 00:47:21.690 Clarisse Girot (FPF): I'll go ahead. No, I mean, I'm absolutely fascinated 270 00:47:22.170 --> 00:47:31.170 Clarisse Girot (FPF): by what you just said, because of course we come from, we operate in, completely different backgrounds and, you know, completely different settings, and the work you do is so 271 00:47:31.530 --> 00:47:42.720 Clarisse Girot (FPF): fundamental, and even more so in Israel, where archives are, you know, incredibly precious for a relatively young state, right, compared with many others. 272 00:47:43.320 --> 00:47:54.780 Clarisse Girot (FPF): I think it's fascinating because we are all passionate about what we do, and this is the beauty of the regulation of data: today, virtually every single sector, 273 00:47:55.260 --> 00:48:02.700 Clarisse Girot (FPF): you know, in our daily life, but also going back, it has an impact on, you know, everything we do and all the jobs we have. 274 00:48:03.690 --> 00:48:15.930 Clarisse Girot (FPF): We're all passionate, and at the same time we have to work together, because, as Gabriela was saying a bit earlier, it's all about striking the right balance. And I remember very well, working for the French data protection authority, 275 00:48:16.710 --> 00:48:21.210 Clarisse Girot (FPF): you know, very difficult discussions, but at the end of the day, between smart people, you always find a way 276 00:48:22.440 --> 00:48:29.670 Clarisse Girot (FPF): to manage your way out and satisfy everybody's interests, even though sometimes, you know, we have to make compromises. 277 00:48:30.030 --> 00:48:35.790 Clarisse Girot (FPF): But I do recall very vividly that the discussions on, you know, how to draw the line between 278 00:48:36.390 --> 00:48:45.420 Clarisse Girot (FPF): freedom of information, or freedom of expression, and data protection, you know, were fairly heated, right; it's not easy, but 279 00:48:45.720 --> 00:48:54.030 Clarisse Girot (FPF): there is nothing in the law that prevents these conversations from happening, and that's, I think, super important to mention. And 280 00:48:54.540 --> 00:49:07.440 Clarisse Girot (FPF): I really regret when I hear that data protection and privacy are used as a shield when actually, you know, the objective that you pursue is perfectly legitimate and in fact extremely important for society. So it's probably not, 281 00:49:08.070 --> 00:49:21.240 Clarisse Girot (FPF): sorry, I mean, it's late for me and I'm very passionate about these things, but I find it amazing to have the opportunity to strike the balance between such different, you know, objectives, and it can be done, the data protection laws allow for that, yeah. 282 00:49:22.410 --> 00:49:23.670 Limor Shmerling Magazanik, Israel Tech Policy Institute: Gabriela, everyone, 283 00:49:24.990 --> 00:49:40.260 Limor Shmerling Magazanik, Israel Tech Policy Institute: and I'm actually really curious if any of you know of any articles, wording, in different jurisdictions that are more specific. I know that the GDPR has a legal basis that allows for 284 00:49:40.830 --> 00:49:55.110 Limor Shmerling Magazanik, Israel Tech Policy Institute: processing data for research purposes, historical research, archiving, but are you familiar with somewhere where these articles are more elaborate, or are there guidelines that can be learned from?
285 00:49:57.720 --> 00:50:05.040 Gabriela Zanfir-Fortuna: Not that I'm aware of. I don't know how the LGPD in Brazil deals with this, but the GDPR indeed has 286 00:50:05.700 --> 00:50:23.490 Gabriela Zanfir-Fortuna: a tailored regime, I would say, for processing of personal data for research purposes, and for archiving purposes as well, in the public interest. And, I mean, they are recognized as legitimate interests, first of all. 287 00:50:25.140 --> 00:50:27.060 Gabriela Zanfir-Fortuna: There is also — 288 00:50:28.890 --> 00:50:31.830 Gabriela Zanfir-Fortuna: there is a bit of room there for 289 00:50:32.970 --> 00:50:38.670 Gabriela Zanfir-Fortuna: compatible purposes; they're considered actually to be compatible purposes. So 290 00:50:39.780 --> 00:50:54.630 Gabriela Zanfir-Fortuna: there is a presumption that they are actually almost always compatible with your initial purposes when you process data for these types of purposes, including archiving in the public interest. 291 00:50:56.490 --> 00:51:03.990 Gabriela Zanfir-Fortuna: And we will hear soon from the European Data Protection Board on this, because it has been working on guidelines 292 00:51:04.680 --> 00:51:18.150 Gabriela Zanfir-Fortuna: on how this compatibility of purposes functions, on guidelines broadly on processing of personal data for research purposes and for archiving and historical purposes in the public interest. 293 00:51:19.980 --> 00:51:42.090 Gabriela Zanfir-Fortuna: As part of that we will learn a bit more about how this happens in practice, but it's very important to highlight that, indeed, the GDPR took the importance of this into account, and there are specific places in the regulation that tailor the data protection regime to it. 294 00:51:42.990 --> 00:51:51.780 Limor Shmerling Magazanik, Israel Tech Policy Institute: So thank you, Gabriela, that's really good news and worth following, to see what the EDPB comes out with when we look at the Israeli 295 00:51:53.160 --> 00:52:00.990 Limor Shmerling Magazanik, Israel Tech Policy Institute: legislation. And Bruno, I'm going to maybe introduce you, and then you will have the floor, both for your comments and for your presentation. So, 296 00:52:01.410 --> 00:52:15.090 Limor Shmerling Magazanik, Israel Tech Policy Institute: just so everyone is aware, Dr. Bruno Bioni is the founding director of Data Privacy Brasil, a member of the National Council for Data Protection, a consultant, and one of the authors of the report that all of you 297 00:52:15.720 --> 00:52:30.060 Limor Shmerling Magazanik, Israel Tech Policy Institute: in the audience have received previously. He will present insights from Brazil, from the recent data protection legislation process, where the issues of legal bases and striking a balance between 298 00:52:30.420 --> 00:52:39.090 Limor Shmerling Magazanik, Israel Tech Policy Institute: different interests in the legislation were discussed thoroughly. So Bruno, the floor is yours, and I'm just — sorry, 299 00:52:39.900 --> 00:53:00.270 Limor Shmerling Magazanik, Israel Tech Policy Institute: all of the participants are invited to enter questions in the Q&A if you want to deep dive into any of the legal bases that were mentioned, for either jurisdiction. We will have time for Q&A at the end, so we welcome your questions in the Q&A function. Bruno.
300 00:53:01.110 --> 00:53:10.110 Bruno Bioni, Data Privacy Brazil: Thank you, Limor. So I'd like to thank our friends from FPF for organizing this really worthwhile and informative 301 00:53:10.890 --> 00:53:19.260 Bruno Bioni, Data Privacy Brazil: event. So I'm really looking forward to sharing some impressions and information about these global trends, let's say, 302 00:53:19.650 --> 00:53:30.000 Bruno Bioni, Data Privacy Brazil: and how we can be informative and cooperative with each other. So I'd like also to thank you for having me as part of this event, with 303 00:53:30.510 --> 00:53:45.690 Bruno Bioni, Data Privacy Brazil: Yaacov and also Gabriela, one of our greatest global thinkers about privacy. And I'd like to start by introducing my organization — I think it's quite important, because I can say what we do and what we 304 00:53:46.260 --> 00:54:03.510 Bruno Bioni, Data Privacy Brazil: don't do as well. So I will share my presentation, I have some slides here, so let's see. Yes. So, I'm executive director of Data Privacy Brasil, and Data Privacy Brasil is 305 00:54:04.770 --> 00:54:18.150 Bruno Bioni, Data Privacy Brazil: a space of articulation between two organizations. The first one is the Data Privacy Brasil school, or academy, where we are delivering professional courses about privacy, data protection and governance of emerging technologies. 306 00:54:18.510 --> 00:54:27.300 Bruno Bioni, Data Privacy Brazil: And also the Data Privacy Brasil Research Association, a nonprofit organization that is conducting doctrinal and also empirical research 307 00:54:27.570 --> 00:54:33.780 Bruno Bioni, Data Privacy Brazil: about privacy and data protection. And today I'd like to share some thoughts and the results 308 00:54:34.350 --> 00:54:44.790 Bruno Bioni, Data Privacy Brazil: of one of our research projects, which was precisely about legal grounds in the LGPD, the Brazilian comprehensive data protection law — specifically 309 00:54:45.120 --> 00:55:01.110 Bruno Bioni, Data Privacy Brazil: about legitimate interests. So here in Brazil, I can already say, we opted to have legitimate interest on an equal footing with consent and the other eight legal bases for non-sensitive data. 310 00:55:01.560 --> 00:55:17.280 Bruno Bioni, Data Privacy Brazil: And so we decided to move on. As far as I understand, this is quite different from what you have right now in Israel, and from the modernization process that you are trying to enter. So 311 00:55:18.030 --> 00:55:27.090 Bruno Bioni, Data Privacy Brazil: what we have here in Brazil: we developed our comprehensive data protection law, and it was a long journey, 312 00:55:27.630 --> 00:55:42.930 Bruno Bioni, Data Privacy Brazil: almost a decade, so ten years of discussion and consultations on our LGPD. We have a lot of policymaking and normative learnings that I'd like to share, and our 313 00:55:43.470 --> 00:55:58.530 Bruno Bioni, Data Privacy Brazil: report was translated into English with the support of our friends from FPF, so I think you have ready access to the report. My presentation will be divided in two parts. So first: 314 00:55:59.460 --> 00:56:10.230 Bruno Bioni, Data Privacy Brazil: what was the path to crafting the legal grounds in the Brazilian comprehensive data protection law — what was 315 00:56:10.920 --> 00:56:21.900 Bruno Bioni, Data Privacy Brazil: our policymaking learning; and afterwards, in the second part, I'd like to share with you some controversial interpretations of our law
316 00:56:22.260 --> 00:56:26.640 Bruno Bioni, Data Privacy Brazil: right now, with regard to legitimate interests and other legal grounds. So, 317 00:56:27.180 --> 00:56:39.030 Bruno Bioni, Data Privacy Brazil: beyond the effort to pass, to approve, a law, there is also a second wave, in order to better interpret and calibrate the meaning of the law. So I would like to share 318 00:56:39.540 --> 00:56:53.100 Bruno Bioni, Data Privacy Brazil: those two parts of our current situation here in Brazil. So, with regard to the path to crafting legitimate interest as a legal ground, and how it shaped the normative design 319 00:56:53.520 --> 00:57:04.650 Bruno Bioni, Data Privacy Brazil: of our Brazilian comprehensive data protection law: there is a really worthwhile and informative timeline here in Brazil, because, as I just said, we had like 320 00:57:05.070 --> 00:57:20.520 Bruno Bioni, Data Privacy Brazil: ten years, almost ten years, of public discussions, several draft texts that were put into open public consultation, and since the beginning we could see how the text evolved. 321 00:57:21.090 --> 00:57:34.650 Bruno Bioni, Data Privacy Brazil: So until 2016, between the first public consultation and the second consultation, consent was the main hypothesis as a legal ground, like the rule. 322 00:57:35.400 --> 00:57:49.350 Bruno Bioni, Data Privacy Brazil: And then Brazilian civil society decided to evolve the text, along with those who were working at the government at that time, who were creating and crafting the text of the draft bill. 323 00:57:50.010 --> 00:57:59.190 Bruno Bioni, Data Privacy Brazil: After these two rounds of consultations, we decided to include legitimate interest as another, as an additional, 324 00:57:59.520 --> 00:58:10.050 Bruno Bioni, Data Privacy Brazil: legal basis. We had here a long journey and several disputes that happened, and I think it is really worth sharing with you. 325 00:58:10.830 --> 00:58:21.840 Bruno Bioni, Data Privacy Brazil: First of all, I think here in Brazil we could learn from our regulatory delay, also by doing some research about 326 00:58:22.410 --> 00:58:32.550 Bruno Bioni, Data Privacy Brazil: what works and what is the state of the art of legal grounds in other jurisdictions, and one of them, of course, is the European one. 327 00:58:33.300 --> 00:58:42.540 Bruno Bioni, Data Privacy Brazil: As Gabriela said, legitimate interest is not new; it was crafted in the Directive, so almost 20 years ago. 328 00:58:42.960 --> 00:58:56.340 Bruno Bioni, Data Privacy Brazil: And since the beginning of the Directive, because of the fragmentation of the enforcement of the law, we could see that there was a lack of a more consistent and harmonized approach in Europe. 329 00:58:56.970 --> 00:59:08.580 Bruno Bioni, Data Privacy Brazil: For instance, France interpreted legitimate interest in one direction, Germany interpreted it in another, in a completely different direction. That's why 330 00:59:09.000 --> 00:59:24.120 Bruno Bioni, Data Privacy Brazil: the Article 29 Working Party decided to issue those guidelines, in order to better calibrate the interpretation of legitimate interest. That's the whole idea behind what is called the legitimate interest assessment, 331 00:59:24.930 --> 00:59:37.110 Bruno Bioni, Data Privacy Brazil: which is a kind of test that sets parameters in order to have predictability on how to interpret, and how to use, legitimate interest as a legal ground.
332 00:59:37.470 --> 00:59:50.610 Bruno Bioni, Data Privacy Brazil: So, since the beginning, we could see that legitimate interest is a really open legal concept that would probably have several interpretations, and then 333 00:59:51.210 --> 01:00:06.750 Bruno Bioni, Data Privacy Brazil: here in Brazil we could see how Brazilian civil society read those guidelines, but also did lobbying in the Parliament to better 334 01:00:07.350 --> 01:00:17.790 Bruno Bioni, Data Privacy Brazil: craft legitimate interest as a legal ground. So what was the result, the balance, of this process? On one side, the private sector was lobbying for 335 01:00:18.180 --> 01:00:31.110 Bruno Bioni, Data Privacy Brazil: legitimate interest without any kind of parameters; on the other side, NGOs were fighting against having legitimate interest as a legal ground, because they argued it would be 336 01:00:31.620 --> 01:00:44.640 Bruno Bioni, Data Privacy Brazil: a blank check that would be abused, and so on. At the center of this pendular movement there were the academics: it's important, because you have to strike a balance between 337 01:00:45.630 --> 01:00:58.860 Bruno Bioni, Data Privacy Brazil: the protection of fundamental rights and liberties, but also the economic use of data, which is quite important for research, for innovation and so on. So why not 338 01:00:59.640 --> 01:01:04.200 Bruno Bioni, Data Privacy Brazil: internalize, absorb legitimate interest as one of the legal grounds, but 339 01:01:04.860 --> 01:01:16.950 Bruno Bioni, Data Privacy Brazil: put in the text, in the hard text of the law — not in guidelines, not in recitals — and create a specific article to set the parameters, in order to have more 340 01:01:17.340 --> 01:01:29.010 Bruno Bioni, Data Privacy Brazil: predictability on how to use and interpret legitimate interest as a legal ground. And recently we published an article, an essay, in the 341 01:01:29.640 --> 01:01:47.760 Bruno Bioni, Data Privacy Brazil: book of the CPDP conference that tells this history and this lobbying movement around not only legitimate interest but the law itself, so I think it's also worthwhile if you could take a read afterwards. So 342 01:01:49.350 --> 01:01:59.100 Bruno Bioni, Data Privacy Brazil: the result of this process, I think, is quite worthwhile, because then we have here in Brazil what we call the normative equation of the LGPD. 343 01:01:59.520 --> 01:02:12.390 Bruno Bioni, Data Privacy Brazil: So it's a little bit different in comparison to the GDPR: we not only have legitimate interest as a legal ground, but we have a dedicated, specific article that sets the parameters of 344 01:02:12.720 --> 01:02:37.470 Bruno Bioni, Data Privacy Brazil: its application, and also we have a specific article that reinforces the record-keeping process, which we think — and we believe, we are advocating for this kind of interpretation — means that when you are using legitimate interest you have to have a specific 345 01:02:38.640 --> 01:02:46.200 Bruno Bioni, Data Privacy Brazil: record, and this specific record is the legitimate interest assessment. So if you are going to use 346 01:02:46.830 --> 01:03:01.800 Bruno Bioni, Data Privacy Brazil: legitimate interest, which is a kind of bonus and gives a lot of discretion to the controllers and so on, you have to render and demonstrate this kind of account about it. So I think
347 01:03:02.280 --> 01:03:18.630 Bruno Bioni, Data Privacy Brazil: we had this normative balance, which, to put it in a more simple way: when you are using the other nine legal grounds, you just have to have a simple 348 01:03:19.260 --> 01:03:38.310 Bruno Bioni, Data Privacy Brazil: ROPA, or record of processing activities, but when you are using legitimate interest as a legal ground, you have to have a different kind of accountability mechanism in order to better show how the organization is striking this delicate balance between 349 01:03:39.450 --> 01:03:44.430 Bruno Bioni, Data Privacy Brazil: liberties, fundamental rights and the economic use of data. And 350 01:03:45.600 --> 01:03:54.570 Bruno Bioni, Data Privacy Brazil: here in Brazil we have this specific article, which is Article 10, so we have like four phases of this test. 351 01:03:55.080 --> 01:04:06.930 Bruno Bioni, Data Privacy Brazil: And this test strikes the balance between economic uses and also liberties and fundamental rights. And here we are waiting 352 01:04:07.230 --> 01:04:22.740 Bruno Bioni, Data Privacy Brazil: for the Brazilian data protection authority to decide whether this is a mandatory way to render our accountability mechanisms when legitimate interest is the legal ground or not. So there is this kind of 353 01:04:23.430 --> 01:04:39.990 Bruno Bioni, Data Privacy Brazil: disputed, controversial interpretative point of the law right now. And then we have another kind of controversial issue, which is that some controllers 354 01:04:40.590 --> 01:04:56.430 Bruno Bioni, Data Privacy Brazil: are interpreting the law with an even higher cost of compliance than just having the legitimate interest assessment, because when we passed the law, the text, there was a kind of mistake. 355 01:04:56.940 --> 01:05:11.490 Bruno Bioni, Data Privacy Brazil: And then here some players are always using data protection impact assessments to handle and account for legitimate interest, which I personally think 356 01:05:11.910 --> 01:05:26.220 Bruno Bioni, Data Privacy Brazil: is wrong, because it's not the legal ground that defines whether the data operation, the data flow, is riskier or not. Okay, you can have 357 01:05:26.970 --> 01:05:42.420 Bruno Bioni, Data Privacy Brazil: data flows which are based on consent and are even riskier than when you are using legitimate interest as a legal ground. So we are also waiting for the Brazilian data protection authority to assess this 358 01:05:43.290 --> 01:05:53.430 Bruno Bioni, Data Privacy Brazil: controversial question: what kind of accountability mechanism is necessary when organizations are relying on legitimate interest as a legal ground? 359 01:05:54.600 --> 01:06:07.140 Bruno Bioni, Data Privacy Brazil: So we were able to craft a specific article to set the parameters, but right now we are waiting for a definition, and for our legal, let's say, certainty, 360 01:06:07.500 --> 01:06:19.980 Bruno Bioni, Data Privacy Brazil: in order to be able to be precise about what kind of accountability mechanism is necessary or not. And then the second controversial issue here in Brazil 361 01:06:20.670 --> 01:06:40.170 Bruno Bioni, Data Privacy Brazil: is about the limits of legitimate interest, because our law does not say anything about the right to opt out — in other words, when the data subject can not only revoke
362 01:06:40.740 --> 01:06:56.790 Bruno Bioni, Data Privacy Brazil: her or his consent, but maybe he or she can also impose some limits or stops with regard to some specific uses of their data when there is another legal ground for it. 363 01:06:57.210 --> 01:07:04.110 Bruno Bioni, Data Privacy Brazil: For instance, in the GDPR, or rather in the European Union, there is this kind of absolute right of 364 01:07:04.740 --> 01:07:22.380 Bruno Bioni, Data Privacy Brazil: opt-out with regard to marketing purposes for data processing. In Brazil we don't have a similar provision, so here we are waiting for this clarification of how to interpret the law. So, 365 01:07:22.980 --> 01:07:35.910 Bruno Bioni, Data Privacy Brazil: putting it in really few words, and then I can stop here in the interest of time: I think here in Brazil there was a really informative 366 01:07:36.390 --> 01:07:55.980 Bruno Bioni, Data Privacy Brazil: journey about how to evolve the law in order to strike this balance, and this is quite necessary when you are crafting the menu of options with regard to legal grounds. And consent — consent is, and this is no different here, 367 01:07:57.060 --> 01:08:11.820 Bruno Bioni, Data Privacy Brazil: let's say, the pillar of the legislation, but there has to be a clearer way about this interplay between consent and the other legal grounds. Brazil has adopted 368 01:08:12.840 --> 01:08:20.670 Bruno Bioni, Data Privacy Brazil: a, let's say, creative way to not only absorb legitimate interest as one of the legal grounds but also 369 01:08:21.300 --> 01:08:35.850 Bruno Bioni, Data Privacy Brazil: to dedicate a specific article to set the parameters, which is good, I think, not only for data subjects but also for businesses, the controllers, because then you set a more predictable way 370 01:08:36.630 --> 01:08:47.070 Bruno Bioni, Data Privacy Brazil: of how to use and how to enforce legitimate interest in the future, which is, at the end of the day, about 371 01:08:47.700 --> 01:09:04.590 Bruno Bioni, Data Privacy Brazil: legal certainty, predictability and so on. So I'd like to thank you again for having me. I hope that this presentation was informative for this really exciting process of modernization of the Israeli law. 372 01:09:05.520 --> 01:09:13.830 Limor Shmerling Magazanik, Israel Tech Policy Institute: Thank you so much, Bruno, it certainly was. And I have a few questions, I think, in follow-up to all of you privacy 373 01:09:13.830 --> 01:09:28.140 Limor Shmerling Magazanik, Israel Tech Policy Institute: experts, in line with what you just said, focusing on legitimate interest and on the fact that one of my takeaways is that it needs to be well defined 374 01:09:29.280 --> 01:09:43.200 Limor Shmerling Magazanik, Israel Tech Policy Institute: and accompanied by guidelines, or a specific articulation of what is required in order to establish proof of a legitimate interest. And one of the things that was 375 01:09:44.640 --> 01:10:03.930 Limor Shmerling Magazanik, Israel Tech Policy Institute: popping up for me is — if you can tell us a little bit about how you work out the right to object to processing of data in instances where the lawful ground is legitimate interest. How does the person know, and can they object? How do 376 01:10:03.990 --> 01:10:06.090 Limor Shmerling Magazanik, Israel Tech Policy Institute: those two pieces come together?
377 01:10:06.300 --> 01:10:09.180 Limor Shmerling Magazanik, Israel Tech Policy Institute: Feel free to jump in. 378 01:10:09.540 --> 01:10:11.070 Gabriela Zanfir-Fortuna: I will jump in here first, 379 01:10:12.270 --> 01:10:20.550 Gabriela Zanfir-Fortuna: because I wanted to make a point that I forgot to make in my first presentation, and then I know Bruno will double down on that. 380 01:10:21.210 --> 01:10:30.090 Gabriela Zanfir-Fortuna: It's very important to know that even when personal data is being processed on the basis of legitimate interest — so therefore you are not asking for the consent of the individual — 381 01:10:31.980 --> 01:10:44.910 Gabriela Zanfir-Fortuna: all the other safeguards and rights still apply. And, for example, you still have to give notice to data subjects and find a way to 382 01:10:46.290 --> 01:10:56.670 Gabriela Zanfir-Fortuna: reach them with reasonable means, of course, and when it's reasonable, to find a way to give them notice and have the right to information thus complied with. 383 01:10:58.230 --> 01:11:12.930 Gabriela Zanfir-Fortuna: Most of the rights of the data subject continue to apply just as well for legitimate interest under the GDPR, though, for example, data portability is not a right that's available right now 384 01:11:13.620 --> 01:11:28.950 Gabriela Zanfir-Fortuna: for individuals whose data is processed on the basis of legitimate interest, but all of the other rights are. So you still have the right to access your own data and receive a copy of it, you can object to the processing. 385 01:11:30.210 --> 01:11:43.470 Gabriela Zanfir-Fortuna: More than that, the other principles still apply: purpose limitation, data security and confidentiality, and the principles in Article 5 still apply. So it's really not a safe-harbor type of thing. 386 01:11:44.460 --> 01:11:59.640 Gabriela Zanfir-Fortuna: It's quite thoughtful and complex to be able to justify it, as Bruno was pointing out, and then, once you achieve that, you still have to go through the information and notice-giving process and allow data subjects to 387 01:12:00.690 --> 01:12:02.730 Gabriela Zanfir-Fortuna: exercise their rights. And I'll stop here. 388 01:12:03.960 --> 01:12:04.290 Gabriela Zanfir-Fortuna: Okay. 389 01:12:04.650 --> 01:12:11.970 Limor Shmerling Magazanik, Israel Tech Policy Institute: And in our last few minutes, maybe another question that I was meaning to ask all of you: 390 01:12:13.140 --> 01:12:34.650 Limor Shmerling Magazanik, Israel Tech Policy Institute: how does legitimate interest work with third parties? So when we have a legitimate interest held directly by the controller, that is maybe easier to understand; can the controller work out a legitimate interest of a third party, a fourth party? How does that work? 391 01:12:39.600 --> 01:12:41.850 Gabriela Zanfir-Fortuna: Well, in theory, yes. 392 01:12:42.990 --> 01:12:50.460 Gabriela Zanfir-Fortuna: Right, because this is how the provisions are crafted, and the legitimate interest of a third party is being considered. 393 01:12:51.300 --> 01:13:02.010 Gabriela Zanfir-Fortuna: I would even say that even the legitimate interest of society — like a societal, so a societal legitimate interest, a bit broader, of a community — is also considered
394 01:13:02.460 --> 01:13:16.830 Gabriela Zanfir-Fortuna: for some of this processing under legitimate interest. So the third party can actually be a whole community, right, if you can properly justify it. So, I see that Bruno wants to intervene, and Clarisse too. 395 01:13:18.090 --> 01:13:23.550 Bruno Bioni, Data Privacy Brazil: Yeah, so with regard to your first question, Limor, 396 01:13:26.160 --> 01:13:39.000 Bruno Bioni, Data Privacy Brazil: as I just said, I would point out that this is one of the biggest controversial issues right now here in Brazil; we are waiting for the Brazilian DPA to issue some guidelines or anything, and 397 01:13:40.080 --> 01:13:41.280 Bruno Bioni, Data Privacy Brazil: what I can say 398 01:13:42.510 --> 01:13:57.630 Bruno Bioni, Data Privacy Brazil: definitely is that the right to opt out is not an absolute right. So it has to be struck in a balance; in some cases it's not possible to exercise it, for instance when we are dealing with 399 01:13:58.560 --> 01:14:12.480 Bruno Bioni, Data Privacy Brazil: fraud prevention — the right to opt out doesn't make any sense here — but in other cases it makes perfect sense, for instance with regard to marketing purposes. 400 01:14:13.830 --> 01:14:28.170 Bruno Bioni, Data Privacy Brazil: That's why the GDPR established that, in that specific case, it would be an absolute right. And then here in Brazil there is this WhatsApp case 401 01:14:28.980 --> 01:14:48.810 Bruno Bioni, Data Privacy Brazil: that is being conducted not only by the Brazilian data protection authority but also by the consumer watchdog, the antitrust authority, and also by the federal prosecutors' office, and they decided to issue a guideline about the importance 402 01:14:50.010 --> 01:15:03.810 Bruno Bioni, Data Privacy Brazil: of transparency and safeguards, because, okay, if you are relying not on consent but on legitimate interest for this new kind, this new type of use, 403 01:15:04.170 --> 01:15:17.970 Bruno Bioni, Data Privacy Brazil: with regard to that specific platform, at least the data subject should have the right to refuse, to say: I'd like to keep using, for instance, WhatsApp as 404 01:15:19.170 --> 01:15:29.820 Bruno Bioni, Data Privacy Brazil: when the current business model was completely different. That is, for instance, the message behind the scenes. And the second point: 405 01:15:30.390 --> 01:15:47.670 Bruno Bioni, Data Privacy Brazil: here in Brazil there is also this provision about the legitimate interest of third parties, and I completely agree with Gabriela, I think it's something quite useful for society. So think about open data policies: I can 406 01:15:48.780 --> 01:16:02.340 Bruno Bioni, Data Privacy Brazil: download a whole database and then try to innovate, create new uses. I have no way to go to the data subjects, one million users, and ask for consent, but maybe 407 01:16:02.700 --> 01:16:14.070 Bruno Bioni, Data Privacy Brazil: I have a proper legitimate interest to do it. So I think that is a quite interesting issue, because then we can think about data as a common good, 408 01:16:15.900 --> 01:16:28.110 Bruno Bioni, Data Privacy Brazil: and then society can also have the proper mechanisms to innovate with it. And that's why what Gabriela said is so important
409 01:16:28.680 --> 01:16:50.490 Bruno Bioni, Data Privacy Brazil: in the first part of her talk: data protection is like a twofold mechanism, not only about protecting liberties and fundamental rights, but also about the free flow of information and the public interest of society as a whole. 410 01:16:50.520 --> 01:16:53.940 Limor Shmerling Magazanik, Israel Tech Policy Institute: Thank you, thank you very much for that, that's marvelous. And Clarisse, 411 01:16:55.050 --> 01:16:57.600 Limor Shmerling Magazanik, Israel Tech Policy Institute: do you want to give your last sentence and the 412 01:16:57.600 --> 01:16:58.320 Limor Shmerling Magazanik, Israel Tech Policy Institute: idea of where — 413 01:16:58.440 --> 01:17:00.030 Limor Shmerling Magazanik, Israel Tech Policy Institute: you know, in the last ten minutes. 414 01:17:00.150 --> 01:17:09.030 Clarisse Girot (FPF): A very short time, but — first of all, yes, in Asia as well, in the jurisdictions that implement legitimate interest, it's very clear 415 01:17:09.390 --> 01:17:19.680 Clarisse Girot (FPF): that the wider societal interest can constitute a legitimate interest, so it goes beyond the interest of the organization that collects the data; in any case, in Singapore it's been made extremely clear. 416 01:17:20.370 --> 01:17:26.460 Clarisse Girot (FPF): But there will always be variations between countries on how the legitimate interest will be assessed, 417 01:17:26.760 --> 01:17:36.120 Clarisse Girot (FPF): in the same way that there will be differences, you know, in assessing the frontier between data protection and freedom of information in archiving law. I mean, it's always a national 418 01:17:36.480 --> 01:17:44.730 Clarisse Girot (FPF): balancing test, right, but it is definitely possible. And I just wanted — I'll hijack the last question to say to your colleagues as well that 419 01:17:45.240 --> 01:17:50.820 Clarisse Girot (FPF): beyond the GDPR there are also provisions at the national level that, you know, implement 420 01:17:51.390 --> 01:18:06.090 Clarisse Girot (FPF): the regulation further, and archiving purposes definitely are specified in national legislation that provides a lot of detail on what is communicable and what is not. So if there's interest, I'm happy to share, at least for French law, and I know where I can find out. 421 01:18:07.980 --> 01:18:08.700 Limor Shmerling Magazanik, Israel Tech Policy Institute: And Yaacov, 422 01:18:10.230 --> 01:18:11.430 Clarisse Girot (FPF): last comments. 423 01:18:12.870 --> 01:18:22.110 Yaacov Lozowick: I just want to reiterate what I said before — notwithstanding that everything that has been said here is compelling and all of it true — 424 01:18:23.490 --> 01:18:32.280 Yaacov Lozowick: time plays a crucial role. As I used to say when I was in these arguments: in the 23rd century, 100 percent of what's in the archives from our days 425 01:18:32.910 --> 01:18:40.500 Yaacov Lozowick: will be fully open. Now the question is what happens between the beginning of the 21st century and sometime in the 23rd century. 426 01:18:41.220 --> 01:18:48.270 Yaacov Lozowick: But the principle is that eventually — and as time passes this becomes more and more compelling — 427 01:18:48.930 --> 01:19:00.930 Yaacov Lozowick: everything, whether the individuals wanted it to be open or not, whether their relatives, their living relatives, wanted it to be open or not, essentially everything is open, totally and fully.
428 01:19:01.650 --> 01:19:07.170 Yaacov Lozowick: Except what was never collected, of course; that's a different issue. 429 01:19:07.980 --> 01:19:21.930 Yaacov Lozowick: And so any legal system which is dealing with these crucial, important issues of protecting our data today must take into account the scale of advancing time, 430 01:19:22.410 --> 01:19:33.030 Yaacov Lozowick: because what is true today is less true 20 years from now, and what is true 20 years from now is less so 50 years from now, 100 years from now. That scale has to be brought into account. 431 01:19:34.950 --> 01:19:36.660 Limor Shmerling Magazanik, Israel Tech Policy Institute: Thank you, thank you so much. So I want to 432 01:19:38.250 --> 01:19:50.100 Limor Shmerling Magazanik, Israel Tech Policy Institute: thank all of the speakers — Bruno, Clarisse, Gabriela, Yaacov — thank you so much for your valuable contributions; it was fascinating for me, hopefully also for our audience. 433 01:19:50.700 --> 01:20:13.380 Limor Shmerling Magazanik, Israel Tech Policy Institute: We are having our next session a week from now, next Wednesday. Please note that it's an hour later — it's going to be three Israel time and not two. Looking forward to seeing all of you. Thank you to all the speakers, and have a wonderful rest of your day, evening or night. Bye bye. 434 01:20:14.340 --> 01:20:15.270 Gabriela Zanfir-Fortuna: Thank you so much. 435 01:20:17.940 --> 01:20:18.750 Bruno Bioni, Data Privacy Brazil: Bye bye, thank you. 436 01:20:20.130 --> 01:20:21.060 Clarisse Girot (FPF): Thank you so much. 437 01:20:23.610 --> 01:20:24.030 Clarisse Girot (FPF): Bye.