10_Data_Privacy_and_User_Trust

9:48PM Mar 1, 2024

Speakers:

Judith Hellerstein

Benjamin Akinmoyej

Jonathan Zuck

Víctor Rodríguez Reyes

Avri Doria

Keywords:

data

human rights

privacy

benjamin

information

laws

give

collect

protect

person

government

sharing

data protection

platforms

trust

regulations

rights

innovation

systems

victor

Yes, thanks so much. I'm sorry I can't be there in person; I wasn't up to it. But we just had a fascinating discussion on trust, and we want to continue that discussion on data privacy and trust, because data privacy is one of the major challenges surrounding personal data. We're collecting all this data; data drives digital transformation, it's what we rely on, and we generate massive amounts of it. Governments collect and use this data on us to make better decisions and deliver improved services and applications in health and education. They want to build better systems and use our data smartly, so we don't have to keep re-entering our information every time we request a service. But with all these benefits come big problems. Governments and the private sector are currently collecting massive amounts of data, and there are no settled laws that protect it. We have new approaches to gathering and analyzing data, and big data, and this expansion poses many challenges: how do we protect consumers' data? So we have an interesting panel here, trying to foster a better understanding of data privacy challenges. We have Víctor Rodríguez, who has an intellectual property and health law firm; we have Avri Doria, who is an independent researcher; and we have Benjamin Akinmoyej here in person. We're going to discuss these issues. Each panelist will have about 10 minutes. We'll go first to Víctor, who is going to talk about the laws and other dimensions of data privacy, and then we'll move to Avri and Benjamin. Then we'll have a Q&A discussion, a little bit of buzz among the panelists, and then we'll open up to general Q&A. So, Víctor, take it away.

Thank you. Yeah, hi to everybody. I don't want to go too far back into the history of privacy regulation, but I think we can start in 1972, when Californians passed a change to their state constitution and decided to incorporate the right of privacy as an inalienable right of the citizens of California. In that proposition, they recognized that some troubling things were already happening in 1972: they talked about the collection of personal information, the fact that that information could be used without any control, and also the fact that citizens had no right to know what information was being collected, how it was used, and whether that information was correct. Then we jump to 1995, when Europe passed the Data Protection Directive, and from there, I think, to what is closest to where we are now, which is the entrance of the GDPR, the General Data Protection Regulation, in Europe. The GDPR essentially defines the principles it understands to be required for what it calls the processing of citizens' data. It gives data subjects the right, first of all, to object to the collection of data. There are certain exceptions to that, of course, such as when you're doing a business transaction with somebody, and a few others, like crime-related organizations that have a right to investigate, and so on. It also includes the right to object to the sharing or selling of your information, the right to verify that information and make sure of its accuracy, and the right to request that the information be deleted, to the extent that it is not absolutely necessary or that a certain amount of time has elapsed.
And there is the portability aspect, which is that you can take away the information that somebody has collected from you. Just after that, in 2019, the California Consumer Privacy Act, the CCPA, was approved, and it established principles very similar to what the GDPR was trying to do. So that takes us very close to where we are now. Currently, in the United States, at least 13 states have comprehensive privacy regulation. Here in Puerto Rico, in fact, there is a bill that is almost ready; it's very close to being approved. Whether that happens, we don't know. It's quite similar to the CCPA, so that would be another part of the United States that adopts comprehensive data privacy regulation. There have also been several attempts in Congress to approve federal data regulation. Of course, given what has been happening in government in recent years, I don't think we should hold our breath for that, but there are some elements that both political parties seem to agree on. For example, and we can talk a little bit about this later, this week the Supreme Court heard arguments regarding the social media laws passed by the State of Texas, with the state trying to regulate what actions social media platforms can take regarding the content that is uploaded to them. In that area there seems to be some potential agreement between Democrats and Republicans. I think it's going to be pretty hard, and it's not going to happen in the next two years, but there is some potential for regulation coming at the federal level. With that, I think this can serve as a framework for where we're at in legal terms, and I'll pass it over to Avri.

Thanks so much for setting the stage here. We'll go on to Avri.

Hi. Yeah, Avri speaking. I'm sorry I couldn't actually be there in person; I really like going to Puerto Rico, so I really regret that I'm not there, but I'm not. In terms of the subjects on the list here, I'm probably going to touch on three of them, though I may touch on more. One is the first one in the description, which talks about balancing data-driven innovation and privacy rights. I have problems with that, because you do not balance with human rights. The second you bring in the notion of balancing against human rights, you've already put yourself in an untenable situation. Human rights are inalienable; they are binding on all governments that have signed the treaties, everywhere. You cannot balance human rights, nor can you balance human rights against activities. Human rights, rather, constrain what we may or may not do. Now, in most cases the response is, oh, but that's the government's business: let the government make laws that protect human rights and then we're all fine; we have the GDPR, we have the other instruments, why worry?

Did I drop? Is Avri here?

So sorry, I have had my Zoom on pretty much for the last 12 hours, because I was in a meeting in Riyadh online until just before this started, and this interruption happens every once in a while. I apologize. So, to go back...

[inaudible]

Laws are almost always one-size-fits-all; there's very little particularity in them. And laws create new opportunities for people who want to break the law; it's just the way it happens that new laws and new regulations are new opportunities for problems. Whereas, as you know, all of us are supposed to protect, promote, and respect the human rights that we have. So one of the things that happens, when we look at human rights and platforms, at human rights and the need for data-driven innovation, is that we basically have to do an impact assessment. One thing that hasn't come up much this morning, and yet is a big part of all this: I very much agree with what everyone said earlier about transparency and education, and my word in that word mix, by the way, was, I think, multiple sources, that one always needs multiple sources of information. So, going there: a human rights impact assessment is something that looks at what a platform, a company, a person, or whoever is about to do, and looks at it against human rights, against all 30 of the human rights listed in the UDHR, and then all of the many other covenants, all of the things established by the ten treaty bodies that govern human rights, to the extent they can govern them. You look at those to do an analysis, in full complexity, of the product we want to put out, the research we want to do, the innovation we want to take on, and the human rights, and how do we interact with them. And then, when you have that full mix, you can look at it and say, okay, how do we make sure that what we're doing does not impact any of these things negatively? Obviously we have to obey the law; that's not an exception. But we have to do more, as part of our responsibility to protect, promote, and respect.
I always get those three in different orders, but those are the three, and that isn't just a duty for governments. Okay, looking at the last two, and I haven't been that good at watching my time: can technology help? That was one of the questions I looked at. One of the things that I've worked on for years, less so in the last couple, is the whole notion of human rights considerations in protocols. How can we build a base layer in the protocols that we have? Not that they can necessarily protect privacy; remember, anything you build to protect something...

Avri seems to be a little bit frozen, so maybe we'll move to Benjamin, who is in person. He's going to talk about what role responsible data sharing plays in balancing data-driven innovation. So I'm going to go to Benjamin.

Okay. Thank you. Thank you, Judy. I'm sorry about Avri; she really wanted to share some interesting insights there. So I'm taking this perspective: I am a strong advocate for privacy too, because that's something the people I work with believe in a lot of the time. But I also believe that you have data in the first place for a particular reason: you collected it for a purpose. So I'm always asking: why don't we first talk about the issue of data sharing in a responsible way? If we can address that and provide arrangements for it, then we can see what violates this proper avenue of giving data out in a respectful, empowering manner. What I've seen from privacy advocates, the data protection noise I hear a lot of the time, is just privacy, privacy, protect, protect. We collect the data, invest all of these costs, or whatever it takes to collect the data, and then we suddenly find out that, oh, this data has a lot of economic value, and everybody wants a piece of the pie. But nobody's putting the user at the center. The user doesn't have any power, doesn't even know he's being deprived of all the innovation that this data can bring to him or her. And all everybody's talking about is, let's protect the data, let's create privacy, such that even as an individual I sometimes get locked out of things that are supposed to serve me, because of all the privacy hoops I have to jump through. So some of the initial things I've been advocating for are, first of all, responsible data sharing, if the data is already collected and it can improve my life. The second thing is about creating the laws. I don't want to blame the GDPR, but it has inspired a lot of laws all across the places where I live, Nigeria being one of them. Now, suddenly, lawyers are jumping everywhere.
I mean, we don't even have money to collect the data properly, and now you're telling us about compliance with the data protection law? You don't even ask whether we use the data for anything useful; we have yet to build the capacity to use the data to make any informed decision, and now you're asking us about compliance. So I totally believe that the individuals and people whose data this is should be part and parcel of creating these guidelines; we should be involved in the process. I would say the communities that data is collected from should have some say in the rules and regulations that are set around data protection and privacy norms. After that, there should be an established way to create data sharing principles, with the user at the center, empowered to benefit from whatever comes out of the economic value of that data. If you're collecting data from me, tell me how much it's worth, because I generate this data every day. So tell me how much I get. It sounds funny, but that's what I seriously believe in. I cry for it every time: tell me how much I get for all of the benefits that this data brings. It may look ridiculous to start with, but I believe that if enough momentum and energy is put towards this, we'll start having some kind of framework and guideline taking shape going forward. In the end, I know that you're using my data anyway; but if I'm also getting some value from it, and you're doing it responsibly, then there's no reason for so much noise or agitation around people advocating for my privacy and protection who don't really care so much about me, instead enforcing guidelines, laws, and all of that, which bring no value to me and even prevent me from doing anything meaningful when I finally develop the skills to work with this data.
So my advocacy is always around this: as we advocate for privacy, data protection, and human rights, and I really appreciate what Avri said, there's no way we can place a monetary value or standard on human rights, because they have enormous impact on different persons and individuals depending on where you stand at a given time. What I advocate for is some level of responsibility when you are sharing the data. When you consider human beings and human rights, and even things like cultural norms and cultural values, it's difficult to place an economic value on those. I don't even know what value we can place on humans, but respecting them puts people at peace, and it establishes their value. That takes me quickly to ethics, but maybe we'll get to that later. If data sharing becomes a thing and is responsibly done, then the data protection argument wouldn't have to come in so much in terms of trying to balance it; it will naturally go the way it should go. If there are bad players who don't follow the standard practice of responsible data sharing, we'll easily be able to see them and pick them out. That's where I stand. I know I'm not a lawyer, and I don't speak for lawyers, but I don't align so well with all of the guardrails and the laws and the privacy rules, because they make it difficult for innovation to thrive, especially for new communities that are trying to leverage the realities of their environment. I believe that the limitations of an environment are a critical ingredient of innovation. But suddenly you have been exposed to California privacy laws, from advanced, developed societies, while you're in an environment where you're trying to innovate within the limitations of your own environment.
And suddenly you have to contend with EU laws and all of those things. Give me the years to get there too, but let me innovate within this environment, within these limitations, without bringing these laws down on innovators who are already being told the GDPR won't let you do this, California won't let you do this. Am I ever going to innovate there? Now we're talking about a balance that can never be balanced when it comes to issues of human rights, and I wanted Avri to really finish on that, because you can't place a value on the things that affect human dignity, human rights, culture, and all of that; you just need to respect the nuances and be flexible enough to accommodate them. So for me, and I'll stop here, the point is to allow for proper data sharing that is responsible and also culturally respectful, that brings the value systems of these various environments into play. It might be difficult; people will say, how can we address different people all over the world? But then, comparable to what I've seen of human rights impact assessments, there might be cultural values impact assessments, bringing all of those other norms into the data protection and data sharing value systems that I'm advocating for. So I'll keep quiet here. Thank you.

Thank you so much, Benjamin, for your passionate reply. One of the major problems that both Benjamin and Víctor have just talked about is how laws get passed. When laws and policies are passed, there is supposed to be a discussion with the community, a discussion with all the stakeholders: with industry, with the nonprofits and NGOs, with academia. Everyone should be part of the discussion, and the laws should also be, as Benjamin was saying, more culturally sensitive. Part of what aligns with what Benjamin is saying is that in a lot of cases in developing countries, they're adopting laws that don't really reflect the realities in those countries, and they're not subscribing to data sharing principles. And when he talks about innovation: innovation is great. The idea of data sharing is that you can walk into a health clinic in a different province without carrying all your records, and that doctor will know, oh, you can have this medicine, oh, you're allergic to that. That is what the innovation brings: they'll have information on you. The idea is that by sharing data we could help the consumer and provide better services, all types of services, if done correctly. There's a nonprofit group that has been working on data sharing principles in this area. What I tell other countries is, you need to work with everyone, all the stakeholders, to get to agreed principles that are human rights centered. But we want to explore these issues more. So I want to ask Víctor: how can we achieve these balances? What is it like in Puerto Rico? Have there been discussions with the community on this law? Has there been involvement?
That's the key to everything that's going on: how do we get all the stakeholders involved? Can you talk about that, Víctor?

Sure. I think that's a great question, and the answer is quite easy: there have been no stakeholders involved in this process for the law here. This would be like the CCPA, a law directed primarily at protecting consumers, so information obtained by businesses, with exceptions for very small businesses. This is mostly directed at the collection of information by businesses, and I think most businesses have no idea that this law is knocking at the doorstep of being approved. So that would be the short answer. Before pressing on that topic, I want to say something about what Benjamin was saying, and also Judith. I think it's important to look at potentially different frameworks for how we think about data. Certainly, I think the GDPR, the CCPA, and similar laws have tried to use a model, even if they fail at it, of looking at data as an extension of the personality of the person, calling it a right of privacy of the individual. However, the issue with that sort of framework is that you can renounce your rights; you can give your rights away, and once you do, you have lost them. Essentially, this is what these regulations allow you to do, for certain limited purposes. And this goes to what Benjamin was saying regarding sharing and responsible sharing. These regulations, particularly the GDPR, try to address the issue of how the information is being shared, and they impose a lot of requirements on the sharing of that information. But we have to remember that the law is not always the only way, and sometimes not even the best way, to achieve certain outcomes, because, as we know, approving laws that establish requirements only benefits those that have the strongest economic power and the means to comply with those requirements.
It creates barriers to entry for other people, other businesses, that do not have the ability to comply with those regulations. So I think another interesting way is to think about data not only as an extension of the individual, as a right, but as a property right. Think about intellectual property. If I create some music, and somebody wants to use that music, there are ways in which that can be achieved: by providing a license, which is nothing else than a permission to use that right, and that usually comes with an economic benefit for the creator, even if the economic benefit can be really limited. In a sense, that property right allows for a system of incentives that lets that extension of, let's call it, the personality of the creator be used by others, to create on top of that original creation, while at the same time recognizing the value of the original creator and creating economic incentives for that. So I think we should not overlook the economic aspect and the property aspect of data, even though that's certainly not where we are right now. An individual's data is not an individual's property right. It is a right in certain jurisdictions, but it's not a property right.

Thanks so much. Avri, do you want to comment on this before we move on to the next question?

Actually, I just want to comment on it briefly, and it's one of the questions that comes up: whether people can give up their rights. I have an issue with that. It's very much a legal understanding that has gone on, to say yes, you can sign a contract to give up your rights, and yet I've never quite understood how that works in terms of inalienable rights. How do I give up something that is inalienable? And then you start looking at rights and saying, well, my right to life: I can't really give that up, and such. So I worry about the facility with which we give up rights, where we say that, sure, by signing a name on a dotted line, or pressing a big green button, I have given away all these rights that I don't even understand. It's something I really think the law, and the theoreticians of law, need to take a step back and look at. And I think the idea of using property rights to protect my privacy is worth looking into, especially since the product of my own labor, that is, one of the rights that is protected is the right to the products of your own labor. I think that is a very important thing to look into. So I agree with that. Thanks.

Thanks so much, Avri. Yeah, this panel, and today's focus overall, is on trust. We have had a lot of issues recently with how we protect a person's data. People do the Facebook quizzes, people do all this stuff where they're giving out their own personal data, and they are risking losing parts of their identity or having their identity stolen. I'm cognizant that the next panel is on cybersecurity, and data security is one of the goals of data protection laws: we are protecting a person's privacy, a person's personal information, what's called PII, personally identifiable information, which is what authenticates you as you go on to use all these digital services the government is providing. One of the things these services are based on is how you authenticate yourself. If you are logging on to a service, you need to be able to authenticate yourself, whether to get your health records, or to get a new license, or to get other things that make the world so much easier to operate in. We don't have to go in person to different offices; we can do it online and get our documents. But this goes back to authentication. If we don't have enough protection of privacy rights, we lose that basic right to be who we are, which is what digital identity is and does. I know this is close to the heart of Avri and others, that we lose this right if we are not careful in protecting our digital identity. Benjamin, I'll go to you first because we haven't come to you with this question: how is that being protected? As you advocate on privacy issues, how do we make sure that our privacy is being protected in the laws? Then we'll go back to Víctor, who will talk about what the law says about protecting it.

Me? Okay, so, well, in terms of trust, I personally think technology has resolved a lot of that for us; there are various ways we can validate who I am and determine who gets access. The trouble is this never-ending extraction of data that makes people collect what they don't even need. So, especially for the services governments are forcing us to use: in the country I live in right now, everybody is forced to register their SIM, and then you have to go and give your biometrics. They collect that from almost everyone; I think the biometrics are not quite compulsory, but you're forced to attach your name to your SIM. Now, you don't want that database to be exposed, because somebody might want to use it for various other reasons, whether ethnic reasons or even political purposes. So the sensitive nature of that data is really, really important, and I'll advocate to the highest degree for it to be properly protected, especially when a government insists that if you don't get this done, your bank account will be closed, or you won't get access to other services. That is the level of authoritarianism that is involved. So once you do such a thing, the duty is on you to provide the same level of protection, to make sure that what these citizens gave you, not voluntarily but compulsorily, is protected and preserved, and we don't want any incidents. So for me, this is a totally different category.
So when government insists on things like that: for every other thing, people look for different ways around it. I've interviewed students in my research, and I see that people create pseudonyms to access different platforms, as is convenient for them. But when it gets to government services, my national ID number, my international passport, my SIM card, my bank account, I have no choice; they tell me, if you don't have this done, you can't access this service. At that level, maximum trust must be there. What I have logged must be protected; I can't afford to have my identity stolen. But when it comes to other platforms, it depends on the degree of importance of that platform to me. For my bank, yes, there must be maximum trust; what I sign in with must be true, my bank account details must all be true. But if it's Facebook or TikTok, if I choose to use them, it depends on the level to which those platforms serve my needs. So those are the levels at which I frame some of these arguments. In terms of things that have become mandatory, where they tell you, you must do this or you don't get service: I'm even advocating right now that individuals should have the opportunity to say, I want to be offline, I don't want to be on this digital platform. But it seems we don't have that choice. We must go online; they tell you, you have to go online and do this, this is only done online. And at that point, what I signed in with must be what I see, and I think governments and agencies that have been tasked with this responsibility need to ensure maximum protection. The highest standards of privacy must come into play, and those platforms cannot afford to fail in any way. So that's where I stand with these essential platforms we have been forced to start using.
I don't know, maybe in the future there will be opportunities or movements to say we can opt out of these platforms, but at this point you can't, as far as the country I stay in right now. You must register your SIM, you must validate your details, and they give you deadlines: you must do it before this particular time. So that is how it's done. Yeah.

Thanks so much. Yeah, that was a good discussion of authentication, and that is also the danger of Facebook and some other social media platforms: when you give out too much information, it can be collected by others for nefarious purposes, to falsify your identity so that they can break into your bank account, take your money away, and do all that. So I'll let our legal expert here talk about some of the pitfalls there. You can go on.

I want to talk about two examples, of course looking at it from the United States point of view, but I think they really resonate with what Benjamin has been saying. We're talking about trust, but I think there's trust, and I agree a lot with what Benjamin was saying, and there's risk, and those are two different things. We accept risk in a lot of our data transactions, for example uploading our information to social media and so on, without necessarily trusting that that information is going to be used properly. Most people, I would guess, don't think about it the same way we think about providing our financial information. And I'll take it another step, with an example. In the United States there has been a push to create, as I think Judith was saying, a national electronic health record database, so that wherever you go in the United States, your physician, or the hospital if you're in an emergency, whoever it is, will be able to have access to your medical records, anywhere you are in the United States. Of course, there is no national network for that; you need to build local networks, state networks, that then exchange information and create that national network. Certainly, I think there are a lot of arguments in favor of that; it could mean much better health care for people. But I think it requires an even higher level of confidence in the way the information is being managed and shared, because there might be very, very personal information there that you are not willing to entrust to anybody. And I can talk from personal experience, having had some conversations with stakeholders here on the island regarding the creation of the local EHR network that would be connected to the national network.
For example, stakeholders who represent patients with conditions they consider very important to keep private are very worried that the state has not provided enough information for them to be certain, not only that such systems comply with HIPAA, but that the information of the people they care for will not be accessed for political reasons, as Benjamin was saying, or for other nefarious reasons, or even by accident, where the technical measures are not met and information is leaked or hacked. There are cybersecurity laws here in Puerto Rico, and there are cybersecurity laws in all 50 states in the United States, but those laws essentially try to create standards that businesses have to comply with; they create notice requirements and impose penalties. We all know those systems are not perfect. So I think that type of information requires a level of trust that is very different from the information you upload to social media. The other topic I mentioned is authentication. Recently FinCEN, the Financial Crimes Enforcement Network, has created certain requirements in the banking industry for cross-border banking. It is becoming more common for financial institutions here in the United States to allow money to be transferred out of or into the United States, something that a lot of people with family outside the United States use to send money to their parents living abroad.
So it is important, and even the financial agencies in the United States have recognized it as important, not only to the people who need those services but also for the United States economy, particularly for the continued viability of the dollar. There have to be ways for these financial institutions to comply with money-laundering regulations while not shutting out people they cannot physically identify, because those people live outside the jurisdiction of the United States, and because the institutions may not trust the government identification schemes in those jurisdictions: how secure are those identifications? So we need a technological solution to that problem. That means using not only whatever identity documents there might be; you need to use IP addresses, you need to use geolocation, you need to use certain financial information about those individuals. What country are they in? You can check that with the IP address, making sure they are not using VPNs, and so on. That increases the amount of information being collected about those individuals, and it certainly increases the potential risk, because it goes beyond identifying information into financial information: how many transfers are you going to make in a month? How much do you earn? How much money are you going to be transferring or receiving? What type of business are you in? These are all questions FinCEN gets into, to provide a framework that allows these financial institutions to identify, with a certain level of granularity, who these persons are, lowering the risk that they might be using the system in an illegal way.
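The layered verification Victor describes, combining identity documents, IP geolocation, VPN detection, and the declared transaction profile, can be sketched as a simple rule-based risk score. This is an illustrative sketch only; the weights, thresholds, and all class and function names are my own hypothetical choices and do not correspond to any actual FinCEN rule or vendor API.

```python
from dataclasses import dataclass

@dataclass
class TransferApplicant:
    """Hypothetical profile for a cross-border transfer applicant."""
    has_valid_id_document: bool   # government ID verified
    ip_country: str               # country inferred from the IP address
    declared_country: str         # country the applicant claims to be in
    uses_vpn: bool                # VPN/proxy detected on the connection
    monthly_transfers: int        # declared number of transfers per month
    monthly_volume_usd: float     # declared transfer volume per month

def risk_score(a: TransferApplicant) -> int:
    """Combine independent signals into a 0-100 risk score (higher = riskier).

    The weights are arbitrary placeholders for illustration,
    not a calibrated compliance model."""
    score = 0
    if not a.has_valid_id_document:
        score += 40
    if a.ip_country != a.declared_country:
        score += 25   # geolocation does not match the declared location
    if a.uses_vpn:
        score += 20   # the location signal cannot be trusted
    if a.monthly_transfers > 20 or a.monthly_volume_usd > 50_000:
        score += 15   # unusually heavy declared activity
    return min(score, 100)

def decision(a: TransferApplicant, threshold: int = 50) -> str:
    """Approve automatically below the threshold, else flag for manual review."""
    return "approve" if risk_score(a) < threshold else "manual_review"
```

For example, an applicant with a verified document whose IP geolocation matches their declared country would pass, while a VPN user with no verified document and heavy declared activity would be flagged for manual review. A real system would of course use many more signals and calibrated models rather than fixed weights.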

Thanks so much, Victor. We could go on for a long time, but we promised to take audience questions. Kathleen will help me field any audience questions, as I am not there in person and cannot see them. Do we have any questions from the audience? Yes, great. We'll take this first one.

Hello, Pedro Lerner, for the record. I want to ask about a specific part of the presentation: the feasibility of the proposal of an economic return to the owners of personal data rights. I would like to build on the comparison that was made with intellectual property, because it is similar in some respects, but personal data is much closer in nature to moral rights, to personality rights, than to patrimonial rights. Even though they are commercialized like patrimonial rights, they are, as you pointed out, more related to personality rights, which are inalienable and not open to commercialization. And this makes sense considering that the works, the intellectual creations, are not made by the data subjects; they are made by the companies who collect and organize the data to make these useful assets. More important than that, it seems like a solution for a specific platform, but it becomes much more problematic when we are talking about a multi-platform or regulatory framework. How would we make something like that possible, say a unified governmental platform that data firms would pay into so that it can return some amount to each user? It all seems very complicated to me, and maybe, because data protection is not one of my areas of interest, I am just being a little ignorant about it. But I would like to know how that would be feasible in a more concrete manner. I know some proposals have been made for specific platforms, but they were not success cases. So I just want to know a little more about that, if there is anything you have heard of that would be a good bet for a proposal working in this way. Thanks.

Let's take another question, because we don't have much time. We'll gather another question if there is one out there, so the panelists can answer both together.

Thank you for the nice presentations; I was following parts of them, including Benjamin's presentation. My name is Sabe Abraham, and my question is: who watches the watchmen when it comes to data protection and privacy rights? Data protection commissions are funded by government, and the regulations and policies sit under government. So if government breaches individual people's data, because government is the funder, sometimes they do not even come out and tell us what exactly happened. That is the case in so many countries; even in my country, the government is taking data for SIM card registration, collecting biometric information. We do not know where the transparency and trust is, because we do not know where the data center sits, we do not know who even manages that data, or how they share that data across borders with other countries; government does not tell us. Yet in a compliance roadmap under this kind of regional data protection, you must set out your guidelines: we share data with these companies, we host our data here. So at the government level, we think the trust aspect of data and privacy rights is no longer transparent. It is not like the way we trust a bank: I have my savings in the bank, so at any time I can go to the bank and withdraw my cash, and my bank will not run away with it. Data is not like that, because the policies and regulations are handled by government, and we cannot sanction government if a government agency breaches the privacy of its users or customers. How do we deal with that? Can we have an independent international body that can also oversee those kinds of regulations and policies? Thank you very much.

Thanks so much. Victor is probably best able to deal with the legal question. I could answer it, but I'm not an attorney; he's the legal expert here.

Yes, just very briefly, I agree pretty much with the gentleman's comment; the comparison might certainly be more apt, and this goes to what Avri was saying. If this is an inalienable right, then we shouldn't be able to give it up, at least not that easily. Certainly the GDPR and the CCPA do establish that even if you allow the sharing of information, certain things still have to happen, and under the GDPR certain information, like health information, cannot be shared at all except for very specific purposes. So I agree with that. I don't know of any specific initiative or example I can give, except thinking of something similar to what we have seen recently with smart contracts. Think about data as a sort of token that can be passed on and still traces back to its origin. There might be ways; I certainly don't have a perfect idea of how this would be possible, and if I did, I would have my startup already running. But I think about it like that: a token that can be passed on, and, as with NFTs, for example, the creator, the artist, gets paid whenever that token gets transferred from one person to another. It's just thinking about other ways in which it might be easier for the originator, the data subject, to have more control. Sometimes we think that because you can part with property, a property framing is not useful; but that same transmissibility gives property characteristics that make it useful. It's just an idea, and I cannot give any specific implementations of it.
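Victor's smart-contract analogy, a piece of data as a token that always traces back to its originator and pays them on each transfer, can be sketched as a toy model. This is an illustrative sketch under my own assumptions, not an actual smart-contract or NFT implementation; the class, field, and variable names are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DataToken:
    """Toy model of a transferable data record that remembers its origin."""
    originator: str   # the data subject the record traces back to
    holder: str       # current holder of the record
    royalty: float    # fee owed to the originator on each transfer
    history: list = field(default_factory=list)  # provenance chain

    def transfer(self, new_holder: str, ledger: dict) -> None:
        """Pass the token on, logging provenance and crediting the originator."""
        self.history.append((self.holder, new_holder))
        ledger[self.originator] = ledger.get(self.originator, 0.0) + self.royalty
        self.holder = new_holder

# The originator keeps earning as the record changes hands,
# and the history always traces back to them.
ledger: dict = {}
token = DataToken(originator="alice", holder="alice", royalty=0.10)
token.transfer("platform_a", ledger)
token.transfer("broker_b", ledger)
```

The design choice mirrors the NFT royalty idea Victor mentions: the record is transmissible like property, but the payment obligation and the provenance chain stay attached to it rather than to any particular holder.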

I'll try. There have been some approaches; I think there is a group in Europe with a whole movement around MyData, with a big conference and things like that. They talk about things like consent management: whenever a particular person's data is going to be in play, you get a prompt, "Your data will be used here." If you agree to it, they go ahead and use it, and maybe later they come back and compensate you with a token or something like that. However, some of these conversations, even the data protection arrangements we all hold on to, oversimplify the value of data. When we say "data is the new oil," it is not that simple, because data is contextual. If my data is just my data, it doesn't mean much; but put it in Puerto Rico, add distance, let it start intersecting with other data, and it begins to have value, and data gains more value as the context changes. My data alone is not valuable until you add your own data to it, and all of that makes it really interesting. What we are saying is: every time my data comes into play, can I be notified, or can I be involved in the process and the decisions that happen to my data? To some extent, the reason why, as Victor says, anyone who knew the solution would have a startup already running, is that we are not giving this the necessary attention, and that is why we are not making progress in that direction. The simplified definition of data, as if you could just take it and make of it whatever you want, stops us where we are right now. But if it becomes really important and the user becomes the centerpiece, then the argument might change and new frameworks might come up beyond what we have already seen. What I want to say is that because we don't have any standard solution at this moment, we shouldn't give up. We should amplify that conversation, and then we will see the innovation that comes thereafter. That's where I'll stop. Thanks
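The consent-management pattern described here, notify the data subject whenever their data comes into play and proceed only with their agreement, reduces to a small gatekeeping check. A minimal sketch, assuming a simple in-memory consent registry; the class and method names are illustrative and are not taken from the MyData specifications.

```python
class ConsentRegistry:
    """Tracks, per user, which purposes they have agreed to,
    and logs every time their data is actually used."""

    def __init__(self):
        self._grants = {}    # user -> set of approved purposes
        self.audit_log = []  # (user, purpose, allowed) events

    def grant(self, user: str, purpose: str) -> None:
        """Record that the user agreed to this purpose."""
        self._grants.setdefault(user, set()).add(purpose)

    def revoke(self, user: str, purpose: str) -> None:
        """Let the user withdraw consent at any time."""
        self._grants.get(user, set()).discard(purpose)

    def use_data(self, user: str, purpose: str) -> bool:
        """Gate every use of the data on recorded consent, and keep an
        audit trail the user could later inspect."""
        allowed = purpose in self._grants.get(user, set())
        self.audit_log.append((user, purpose, allowed))
        return allowed
```

The audit log is the piece that answers "can I be notified, can I be involved": every attempted use, allowed or not, leaves a record the data subject can review.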

so much. We are rapidly running out of time; Avri, would you like to make a last thirty-second input?

Well, thank you. And thank you for giving me a last chance to speak. I really want to make three quick points. One, Victor mentioned the connection between risk and trust: I think they are inextricably linked, in that basically all trust is, in a sense, a risk analysis and a determination of one's risk appetite. Two, in terms of answering the question about who does oversight over the overseers: within the UN human rights system you have the Universal Periodic Reviews that are done. But third, and more importantly, and this goes back to something that Judith said, there is a need, whether you call it multistakeholder governance or participatory democracy or whatever, for bottom-up oversight over the overseers, because that is the only kind of mechanism that will give you a degree of transparency, a degree of looking into it, and a way of taking responsibility for the oversight into our own hands. I wish I had time to say more, but it was a pleasure to be here.

Yes, thanks so much. I also wish I could have been there in person, because then I would have been able to continue these discussions afterward with the rest of the team, because there are a lot of issues here, especially around data privacy, data protection, and trust. Trust is essential, because without trust we won't have the other aspects. So, without taking too much time, we can hand it back. I know there's a break coming up right now, and then we'll go on to our next session, on cybersecurity. But thank you so much.

Thank you. Let's all give a round of applause for the panel: Judith, Avri, Benjamin, and Victor.