Sex, Big Data, and User Autonomy

10:51PM Jul 31, 2020


Hello from Kansas City. HOPE has been like a bright light in a pandemic summer. Coming around every two years, it always leaves a trail in my mind. During quarantine I've been working from home, and for some reason I started sketching this year. I've also been missing my makerspace, Hammerspace Community Workshop. And of course I've been looking forward to HOPE. I hope to see you all in two years.

Hi everyone, how y'all doing? Philip here from Portugal, watching HOPE live for the first time. I always wanted to attend but never quite managed to make it across the great pond. Great to be here with you all, talking in the chat and watching a lot of great talks, a lot of great content so far, with still a lot more to go. A little bit about me: I run a netlabel called Enough Records (go to enoughrecords.scene.org, we have tons of releases free for download). I'm a big advocate of free culture and copyright reform, and I'm an active member of the demoscene, doing some programming, code for graphics, and digital art of different sorts, noise music and that kind of stuff. So yeah, nice to meet you all, and hope you have a great HOPE.

Hello, and welcome back to HOPE 2020. I want to give a big thank you to all the attendees, presenters, and volunteers here for the show. Our next session is with Keegan Rankin, an advocate for free software and platform cooperativism. Keegan's talk is about sex, big data, and user autonomy. Remember, the Matrix chat is available for your questions, and when the presentation is over, we'll be right back with your questions and Keegan live. Take it away.

Hello, everyone here at HOPE 2020. My name is Keegan Rankin, and today I am presenting on sex, big data, and user autonomy. A little background about myself: I completed a code camp in the spring of 2019, and over the past year or so since then I have been collaborating with the Agaric tech cooperative. A lot of my work has been giving lectures and workshops on free software and digital self-protection, helping activists learn how to communicate safely, and teaching about our current state of surveillance capitalism and the dangers of big data and predictive analytics. As for some of the work Agaric has been doing over the past couple of months: they've set up an instance of the Canvas learning management system free of proprietary software. They've been working to give students and teachers a learning management system that is not spying on the students while still providing teachers with a great tool, including video conferencing with BigBlueButton, and on that end moving teachers and students away from Zoom, yet another piece of proprietary software.

To start off this conversation: a big part of what I'll be talking about through all of this, and generally my main focus, is big data. So I'd like to first take a moment to define what's different about big data compared to more traditional forms of data processing. The main thing about big data is that it is a data processing paradigm in which unstructured data can be analyzed. That allows developers, entrepreneurs, anyone using these analytics tools, to perform analytics on much greater sets of data, because they are not relying on a standardized, structured data set. One of the main consequences is that within the big data paradigm, people want to collect as much data as possible, because data is power. Just about everything we do now is being surveilled as our technologies increasingly find their way into our personal lives. And one of the major threats of big data systems entering our lives in this manner is that they are not only improving the predictive capabilities of our technology, but also acting in ways that make humans more predictable. I'll get into some examples of that later in this presentation.

The quote I have here is an excerpt from DialSource.com (you can see the reference at the bottom), from an article called "Predictive Analytics to Boost Sales." This was something I searched up to see how analytics is being used, and I pulled out this line: "With predictive tools, your marketing and sales teams can better determine exactly which customers are most likely to convert and even their potential lifetime value." This is pretty typical business language these days, where customers are in a way being turned into a commodity and valued in terms of their likelihood of purchasing, or their capacity to make purchases. That is one of the ways in which users are being categorized, and as a result normalized: they are being placed on a value curve based on their propensity to purchase certain products.
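To make that concrete, here is a minimal sketch of the kind of "propensity to convert" scoring the quote describes. Everything here is synthetic and invented for illustration; real pipelines ingest far more, and far more invasive, behavioral features.

```python
# Toy "conversion propensity" scoring. All data is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical behavioral features per user:
# [site visits last month, marketing email opens, minutes in app, past purchases]
X = rng.poisson(lam=[5, 3, 40, 1], size=(1000, 4)).astype(float)

# Synthetic ground truth: more engagement means more "conversions".
logits = 0.15 * X[:, 0] + 0.3 * X[:, 1] + 0.02 * X[:, 2] + 0.8 * X[:, 3] - 4
y = rng.random(1000) < 1 / (1 + np.exp(-logits))

model = LogisticRegression().fit(X, y)

# Score a new user: their position on the "value curve" a sales team
# would sort customers along.
new_user = np.array([[8.0, 5.0, 120.0, 2.0]])
print(f"conversion propensity: {model.predict_proba(new_user)[0, 1]:.2f}")
```

The model itself is ordinary statistics; the autonomy question is what gets fed into it, and what gets done to the people it ranks.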
So why frame this conversation around sex and its relationship with big data? There are a few reasons. First, to consider how far big data really reaches into our personal lives. Sex is obviously something extremely personal, and if big data is operating in a way in which our sexual information is being passed through it, that is, I think, grounds for major concern. Second, to consider the implications of big data for sexual development: the ways in which big data acts to normalize individuals according to different categories, but also its capacity to be persuasive. In what ways does it affect the sexual development of the end users of the various platforms I'll be talking about, and potentially remove their autonomy to self-determine their sexual orientation, preferences, or interests? Lastly, to consider the roles of sex and big data in advertisement and consumer culture.

Throughout this talk I'll be giving examples of four mechanisms by which the end user's autonomy is being degraded. The first mechanism is access to informational self-determination, as well as the end user's ability to consent to the collection and usage of their information. By informational self-determination, I mean an individual's ability to control what happens with information collected about them. The questions I'll attempt to discuss here: how much control do users have over the collection, sharing, and usage of their data, and are privacy policies sufficiently transparent?

To discuss any of these mechanisms, but especially this one, I first want to describe the shape of a lot of commercial platforms today and how they are built. A lot of platforms are built on top of software development kits, or SDKs, typically provided by Facebook and/or Google. In exchange for the service, developers agree to share the information collected on their app, and that is something many developers might not be fully aware of, because the agreement is not always fully transparent. To give you an idea of what an SDK is used for, taking Facebook as the example: it lets app developers integrate certain Facebook features. The primary one is "Log in with Facebook," so users don't have to create an entirely new account; another typical feature built into a lot of apps is the Facebook Like button. But it's important to note that the Facebook SDK is typically used as a way for apps to understand user behavior, and essentially to sell targeted advertising on those platforms as effectively as possible.

One of the big arguments for allowing apps that collect so much information to exist is that we could successfully anonymize individuals and thereby protect their privacy. The problem with that idea is that a lot of the information collected through these apps and platforms exists precisely for the purpose of identifying individuals and making inferences about them. That is an integral aspect of the entire business model of big data.
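A toy illustration of why anonymization tends to fail: remove the names, and an "anonymized" dataset can still be joined with a public record on quasi-identifiers. Latanya Sweeney famously showed that ZIP code, birth date, and gender alone uniquely identify most Americans. All records below are invented.

```python
# "Anonymized" app data (names stripped) linked back to identities
# via quasi-identifiers. Every record here is fictional.
anonymized_usage = [
    {"zip": "66101", "birth": "1991-04-02", "gender": "M",
     "app": "dating", "orientation_inferred": "gay"},
    {"zip": "66213", "birth": "1988-11-30", "gender": "F",
     "app": "cycle-tracker", "orientation_inferred": None},
]

public_voter_roll = [
    {"name": "J. Doe", "zip": "66101", "birth": "1991-04-02", "gender": "M"},
    {"name": "A. Roe", "zip": "66213", "birth": "1988-11-30", "gender": "F"},
]

def reidentify(anon_rows, public_rows):
    """Link rows whose quasi-identifiers (zip, birth, gender) match."""
    keys = ("zip", "birth", "gender")
    index = {tuple(p[k] for k in keys): p["name"] for p in public_rows}
    for row in anon_rows:
        name = index.get(tuple(row[k] for k in keys))
        if name:
            yield name, row["app"], row["orientation_inferred"]

for match in reidentify(anonymized_usage, public_voter_roll):
    print(match)  # ('J. Doe', 'dating', 'gay') -- no longer anonymous
```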
As far as legislation in the United States goes, there is the California Consumer Privacy Act of 2018. It may be the most effective privacy law in the United States as far as protecting user data goes, but one of its problems is that it only restricts the sale of data; it doesn't actually prevent the sharing of data, and it does nothing to prevent the collection of sensitive data. We're certainly in need of stricter legislation. So throughout this talk, I'll be giving examples regarding several different types of technologies, and several ways in which many of them act to degrade user autonomy.

On pornography sites, for example, it's typical for data to leak to Facebook and/or Google. I had seen a statistic at one point that at least 93% of porn sites leak user metadata, and part of that data is the URL being visited. That URL typically contains information about an individual's sexual orientation or sexual interests, or at least that information can be inferred from it.

Another big one, which I'll only touch on here, is Wi-Fi-enabled sex toys. At one point there was a Wi-Fi-enabled vibrator that collected biometric data, including temperature and vibration intensity, and was surreptitiously leaking that information to the company that manufactured it.
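To make the URL leak concrete: when a page embeds a third-party analytics script or a Like button, the browser's request to that third party typically carries the full page URL, for example in the Referer header, and adult-site URLs are descriptive. This sketch shows what a tracker's server could trivially infer from its logs; the URLs and term list are invented for illustration.

```python
# What a third-party tracker can infer from leaked page URLs alone.
from urllib.parse import urlparse

# Hypothetical Referer values, as a tracker's server might log them.
logged_referers = [
    "https://example-adult-site.com/video/12345/gay-amateur-couple",
    "https://example-adult-site.com/category/lesbian?page=3",
]

SENSITIVE_TERMS = {"gay", "lesbian", "bi", "trans"}

for url in logged_referers:
    path = urlparse(url).path.lower()
    inferred = SENSITIVE_TERMS.intersection(path.replace("/", "-").split("-"))
    print(f"{url} -> inferred interests: {inferred or 'none'}")
```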

The final example I'd like to bring up is the reported use of Grindr by police in Egypt. Keep in mind that Grindr is an app specifically for gay men, and in Egypt homosexuality is criminalized. The app would not display a user's exact location, but it would display the distance to a user. The way police reportedly exploited this vulnerability: three officers would each find their own distance from one user, and then the three of them together could use that information to triangulate and pinpoint the user's actual location.
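Here is a minimal sketch of that math (strictly speaking, trilateration): three known positions plus three distance readings pin down a point. Coordinates are in meters on a local flat plane.

```python
# Recover a position from three (position, distance) observations.
import numpy as np

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Solve for the point at distance d_i from each known point p_i.
    Subtracting the circle equations pairwise yields a linear system."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    A = 2 * np.array([p2 - p1, p3 - p1], dtype=float)
    b = np.array([
        d1**2 - d2**2 + p2 @ p2 - p1 @ p1,
        d1**2 - d3**2 + p3 @ p3 - p1 @ p1,
    ])
    return np.linalg.solve(A, b)

# Three observers at known spots, each reading the app's distance display.
target = trilaterate((0, 0), (1000, 0), (0, 1000), 707.1, 707.1, 707.1)
print(target)  # ~ [500. 500.]
```

The point is how little is needed: no exploit, no hacking, just the distance feature working as designed.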

There was a whole thing around that: Grindr changed some of its default settings so that, at least in countries where homosexuality is illegal, the distance to a user is not shared unless the user turns it on. But regardless of whether that report was true, it represents a real vulnerability in big data. The mere collection and storage of this information can represent, at times quite literally, a physical safety threat.

So many of these platforms have very poor privacy policies, with very vague language about the type of data being collected, what's being done with it, and who it's being shared with; that information is not always transparent. And if users are not capable of giving meaningful consent regarding what's done with their data, there are arguments that we should really consider that a form of digital sexual assault, which I think is a strong point. People should have complete control over who has access to that information, as opposed to being coerced into using platforms they don't fully understand.

The second mechanism I'll discuss is the exploitation of addiction and mental illness. Addiction itself is a removal of an individual's capacity to control their actions: if someone is addicted to something, they struggle to conduct themselves in the ways they might like to. The questions I'll attempt to answer here: how are apps designed to hold our attention? Why do developers create addictive apps? And beyond how the apps are built, is there addictive content, and are analytics used to reinforce addiction?

It's important to note that user behavior is tracked very granularly on sites that employ analytics. That provides space for an iterative design process in which apps can essentially perform a kind of psychological research, not to optimize but to maximize the time individuals spend on the app. And this whole notion of maximizing time deserves scrutiny: while more time spent might let developers collect more data, it is not in every end user's interest, and probably not in most end users' interests, especially given the state of our challenging economy.

Before getting into sex apps in particular, I want to point out that Facebook's founding president, Sean Parker, has admitted that Facebook was designed with the intention of exploiting vulnerabilities in human psychology, and that they understood what they were doing.

Looking into the development of addictive apps, one of the first results in my DuckDuckGo search was an article, "Eight Secrets to Designing an Addictive App." The author references Nir Eyal, who has written books on building habit-forming products and, interestingly, on regaining control over one's attention. He created the so-called Hook Model, which consists of four key points. The first is triggers: what triggers an individual to stay hooked on a product? There are internal and external triggers, and the internal examples in this article were boredom and loneliness. Having an app depend on an individual's loneliness is, in my opinion, kind of evil. The next point is action: accomplishing something with the app within a limited number of taps. The third is variable reward, where the stated goal is to "satisfy their need, and leave them wanting more." (The fourth point in Eyal's model is investment.) The problem I have with the reward point is the idea that these apps are actually satisfying a need. I don't think they are; the necessity is being manufactured, and the wanting is what's really being fed. In my opinion it sets users up never to be fully satisfied and always to want more, which plays right back into that feeling of loneliness.
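On the analytics side, here is a toy illustration of the iterative experiment loop I described: an epsilon-greedy bandit that A/B-tests interface variants and drifts toward whichever maximizes session time. This is a generic textbook algorithm with invented effect sizes, not any company's actual system.

```python
# Engagement-maximizing A/B loop (epsilon-greedy bandit). Illustrative only.
import random

variants = ["infinite_scroll", "pull_to_refresh", "autoplay_next"]
pulls = {v: 0 for v in variants}
total_minutes = {v: 0.0 for v in variants}

def observe_session_minutes(variant):
    # Stand-in for real user telemetry (hypothetical effect sizes).
    base = {"infinite_scroll": 14, "pull_to_refresh": 9, "autoplay_next": 17}
    return random.gauss(base[variant], 3)

EPSILON = 0.1  # fraction of users still shown a random variant
for _ in range(10_000):
    if random.random() < EPSILON or 0 in pulls.values():
        v = random.choice(variants)
    else:
        v = max(variants, key=lambda x: total_minutes[x] / pulls[x])
    pulls[v] += 1
    total_minutes[v] += observe_session_minutes(v)

# The platform converges on whatever holds users longest.
print({v: round(total_minutes[v] / pulls[v], 1) for v in variants})
```

Notice the objective: minutes spent, not user wellbeing. Nothing in the loop asks whether the time was worth the user's while.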

I was taking a look at the Tinder privacy policy, and there is a page under it about profiling and automated decision-making. That caught my attention: these ideas, and especially the words "automated decision making," represent for me yet another very explicit removal of autonomy. I know there are good arguments for the functionality and practicality of these tools, but just because something is convenient doesn't mean the user is maintaining complete control; those things are always in trade-off.

Another thing I find interesting about Tinder (I'll bring this slide up again later) is this set of statistics about Tinder usage. You'll notice that many of them concern the likelihood of a profile being liked based on what constitutes the profile pictures: for example, whether there's a dog in the image, whether it's a full-body picture, whether the person is inside or outside. To even have this data, computer vision has to be in use, and that makes sense, because Tinder is exactly the place where decisions are most automatable. Individuals are swiping left and right really quickly; some people pay more attention than others, but the general stigma around Tinder is that it makes it easy to make quick decisions based on appearance. In that way the app becomes addictive, a sort of "hot or not" game people can play without necessarily being interested in actually meeting anyone.

What interests me here is the type of information that might be inferred about an individual. These statistics point to things like a dog in the picture making a profile more liked, but I would be willing to assume that Tinder is likely also using computer vision to keep track of which physical attributes a user is attracted to: facial structure, body structure, race, hair color, haircut. I don't know some of these things for sure, but I think a lot of this information is certainly valuable in the world of advertisement, and also within Tinder itself, where it could be used to surface individuals a user is more likely to be attracted to, because big data gives the platform the ability to understand very specifically what a user is attracted to. And when we're talking about other areas of the internet, or targeted advertisement, it seems like a pretty valuable marketing tool to know an individual's sexual preferences, including the things they are physically and visually attracted to.
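To show how cheaply such tagging can be done, here's a sketch using an off-the-shelf image classifier. To be clear, I have no knowledge of Tinder's actual pipeline; this only demonstrates that "is there a dog in this photo" is now a commodity capability (ImageNet class indices 151 through 268 are dog breeds).

```python
# Commodity photo tagging with a pretrained classifier. Illustrative only;
# not a claim about any dating app's real system.
import torch
from torchvision.models import resnet18, ResNet18_Weights
from PIL import Image

weights = ResNet18_Weights.DEFAULT
model = resnet18(weights=weights).eval()
preprocess = weights.transforms()

DOG_CLASS_RANGE = range(151, 269)  # ImageNet-1k dog breed classes

def photo_has_dog(path, threshold=0.5):
    """True if dog-breed classes collectively dominate the prediction."""
    img = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        probs = torch.softmax(model(img)[0], dim=0)
    return probs[list(DOG_CLASS_RANGE)].sum().item() > threshold

# Hypothetical usage: tag every profile photo, then correlate the tag
# with swipe outcomes to produce statistics like the ones on the slide.
# print(photo_has_dog("profile_photo.jpg"))
```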

Of course, all of this is in regards to inferences being made. There's also the matter of the content itself: there is generally sexual content on free-to-use websites; the internet is laden with sexual material. And I think that poses an unfairness to users who are on the internet but aren't interested in engaging with a platform in a sexual manner. I find it problematic that someone might have sexual material put in front of them at a time when they aren't looking for it.

So that brings us to the next mechanism: control over content exposure. How much control do we really have? Some of the questions here: does big data make us see more than we really want to see? And are there ways in which we could have more control over the content that we come into contact with?
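As one hypothetical answer to that last question, here is a minimal sketch of a "filter-in" model, where nothing tagged as sensitive reaches a user unless they have explicitly opted in to that tag. I'll come back to this idea when I talk about rating systems below. Real platforms mostly do the inverse: push everything, with opt-out at best.

```python
# Opt-in ("filter-in") content exposure. Purely illustrative.
from dataclasses import dataclass, field

@dataclass
class Item:
    title: str
    tags: set

@dataclass
class UserPrefs:
    opted_in_tags: set = field(default_factory=set)  # empty by default

def visible_feed(items, prefs, sensitive_tags=frozenset({"violent", "explicit"})):
    """Show an item only if the user opted in to all its sensitive tags."""
    for item in items:
        needs = item.tags & sensitive_tags
        if needs <= prefs.opted_in_tags:
            yield item.title

catalog = [
    Item("gentle documentary", {"educational"}),
    Item("rough scene", {"explicit", "violent"}),
    Item("romantic scene", {"explicit"}),
]

print(list(visible_feed(catalog, UserPrefs())))              # safe default
print(list(visible_feed(catalog, UserPrefs({"explicit"}))))  # adds one item
```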

I think pornography is a really interesting technology to talk about in this regard, given the "recommended for you" targeted content, as well as the targeted ads that are all over a site like Pornhub. At the top of this slide I put "the porn maze," a play on a corn maze, because I feel that a user attempting to navigate pornography can very easily take wrong turns and come across content they were not looking for. There's a big debate within porn studies around whether pornographic content should ever be considered harmful, whether there is such a thing as harmful content at all. I think it's a really interesting discussion, because on the one hand, people feel strongly that violent content should be considered harmful, but on the other hand, there are views that by labeling content harmful we might be shaming individuals. I won't take one stance or the other here. My own opinion is that any pornographic content has the capacity to be harmful if it's put in front of someone at the wrong time, which I think is a really good reason to find ways of giving individuals more control over the content they're exposed to. Unfortunately, when someone starts looking for sexual content, the first place they're typically brought to is Pornhub, and once inside Pornhub a significant amount of the material is in fact violent; it's actually relatively difficult to find material that isn't. As for the debate around whether violent material should exist on the internet at all, that's a censorship question I don't want to get into right now. It's an important discussion, but a very morally charged area. My thought is that, at the very least, we could have better rating systems. Another alternative might be more of a filter-in process for searching through content, as in the sketch above, as opposed to having things simply put in your face.

The final mechanism of user autonomy degradation I'll discuss today is dataism. Dataism is trust in the objectivity of data. But the entering of data, the algorithms that collect and process it, and the inferences made from it are all things made by humans, so it's dangerous to assume that just because data was collected, it is entirely objective. That's simply not true. Some of the questions I'd like to look into on this topic: do people trust statistics enough to alter their behavior toward a norm? Do people trust apps that suggest actions in our lives based on statistics? And if so, does this make us, the users, more predictable?
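As a toy illustration of behavior drifting toward a statistical norm, imagine someone who treats published like-rate statistics as gospel and "optimizes" their profile accordingly. Every multiplier below is invented for illustration; the Tinder statistics on the slide do not supply these numbers.

```python
# Hypothetical "profile optimizer" built on trusted-as-objective statistics.
LIFT = {  # invented like-rate multipliers per photo feature
    "has_dog": 1.30,
    "full_body_shot": 1.15,
    "outdoors": 1.10,
    "sunglasses": 0.85,
}

def expected_like_rate(base_rate, features):
    rate = base_rate
    for f in features:
        rate *= LIFT.get(f, 1.0)
    return rate

me = {"sunglasses", "outdoors"}
optimized = {"has_dog", "full_body_shot", "outdoors"}
print(f"current:   {expected_like_rate(0.04, me):.3f}")
print(f"optimized: {expected_like_rate(0.04, optimized):.3f}")
# Everyone who follows the same numbers converges on the same profile.
```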

Coming back to the page we looked at earlier, these Tinder statistics: this is pretty typical, actually. There are several threads online about strategizing what your profile picture should look like, or what you should put in your bio, to optimize your chances of getting a match. For me this represents a way in which whatever charm there is in using a dating app can be lost: the process becomes a game of optimizing for success, more scientific than artistic and self-expressive. As we move deeper into the world of data, that is something we're drifting toward, a glorification of science and data, a move away from self-expression, and potentially, one worry is, away from more natural, organic forms of intimate relations with each other.

Another example, related to the connected sex toys I mentioned earlier, is sex-tracking apps, where a user wears a device that connects to a phone app, and the app provides suggestions during sex, such as whether they should speed up or slow down, as well as rating the quality of the sex.
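How might such a rating be computed? Purely as a hypothetical sketch, not any vendor's actual algorithm, here is a score built from the kinds of signals these apps reportedly collect. The point is how arbitrary it is: a weighted sum of whatever the sensors happen to measure, dressed up as "quality."

```python
# Hypothetical session "quality" score from collectable-but-meaningless signals.
def session_score(thrusts_per_min, duration_min, peak_decibels):
    """Returns 0-100 from arbitrary sensor readings."""
    tempo_pts = min(thrusts_per_min / 120, 1.0) * 40   # why 120? no reason
    length_pts = min(duration_min / 30, 1.0) * 40      # why 30? no reason
    volume_pts = min(peak_decibels / 90, 1.0) * 20     # louder = "better"?
    return round(tempo_pts + length_pts + volume_pts)

print(session_score(thrusts_per_min=90, duration_min=12, peak_decibels=70))
# -> a confident-looking number that says nothing about intimacy
```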

But that rating is based on database values that I don't think have anything to do with the quality of an intimate relationship or experience. It's just the data that happens to be collectable, and it turns into a game of attempting to quantify something that's entirely qualitative: the quality of an intimate relationship, the quality of our intimate experiences. The things being measured are the speed of thrusting, how long the sexual encounter lasts, and the decibel level in the room. I don't think those data points have anything to do with the quality of our intimate experiences, and I think it's actually an affront to suppose that they do.

The last example I'll talk about is the usage of period-tracking apps. It's important to note that some of these apps come with mirror apps, so that another individual, such as the partner of the woman using the tracker, can also keep track of her cycle. And the way this gets used: the app will remind a woman on her most fertile days to wear nice underwear, while simultaneously notifying her partner to bring home flowers. That's fine and all; there is a practicality to it, of course. When people are using it to try to get pregnant, or to avoid getting pregnant, it's a genuinely useful and convenient tool. But going as far as suggesting exact actions to take? Some people might take that as a joke, but keep in mind that intimate relations might have some propensity to become routine, especially if, as I'm supposing, data can be persuasive in regards to how we conduct ourselves.

The very last example is the gamification of intimate relationships. When you introduce gamification into a relationship, it really makes you question why individuals are engaging in the relationship at all. Are they doing it because they want to win a game on an app? Could that facilitate tit-for-tat behavior, as opposed to allowing yourself to be spontaneous, to love your partner when you feel like loving your partner? After all, I think it's typical that individuals give and receive love in different ways and in different capacities. In what ways can gamification of our intimate relationships take away from our capacity to be authentic and to show our love when we want to?

Thanks for listening to me talk about the ways in which big data might be degrading user autonomy. I'm excited to discuss this further with anyone, and I'm leaving my contact information here so that anyone interested can reach me. Thanks so much, everyone. I really hope this has sparked some thought and helped to illuminate some of the problems that I see with big data.

Thanks. Thank you very much, Keegan. That was an excellent discussion about sex, big data, and user autonomy. Very interesting. Would you like to add something to your discussion? Yeah, sure.

That'd be great, thanks. Go ahead. All right.

So, one thing I didn't really get to mention enough: what I would like to see moving forward, and what I would actually like to get in touch with people about, if anyone collected my email address from the slides, is how we move forward in our education systems. I think a lot about having conversations with teachers, educators, and parents about how to protect children from the systems of big data, so that they have the most access to self-determining themselves and their sexuality.

When I think about this, I have to imagine how much technology gets integrated into our lives these days, and how much people might be affected sexually by big data in many ways. Many people are saying this, for different reasons, but sex education is something that needs to happen at a younger and younger age, especially now that we have so much access to technology. There's pornography, there are so many sexual advertisements, things that almost get put in front of us all the time, because there are screens in front of us just about all the time. Sexual information is always right at our fingertips. You never know when a young child is going to find their way somewhere; maybe they heard a word and figured out how to type it. You never know when someone is really going to come into contact with sexual information. So we need to be teaching people about it at a much younger age, especially when we consider just how invasive big data can be in actually affecting our behavior and our sexual propensities, interests, and desires, in so many different ways.

So that's the conversation I want to be having with people, if anyone's interested: how can we integrate the topic of big data and technology into sex education? I'm supposing it's a really important conversation that needs to happen on a larger scale.

I can't hear you right now.

I agree with you, it's a very important conversation to have right now, with the topics you spoke about. Are you ready for some questions? Yeah, I'd love to take some questions.

So the first question is: can you point us to any resources or groups to learn more, or to get started in advocacy about this? And as a follow-up, there's a related question about the data collected with respect to pregnancy tests and the like. Actually, let me go back to the first question: how do we build more awareness about privacy issues with period and fertility tracking apps, which are necessary health tools, but whose collected data is being anonymized and resold for advertising and market research? How do we get more awareness about those privacy issues?

Okay. I mean, that's a really challenging question, and one that I'm also really interested in trying to figure out the answer to. Frankly, giving this talk is my first attempt at a solution, trying to bring some of these things to light. I don't really know exactly what to do beyond just starting to have conversations about it, and, as I was saying before, talking to educators and parents. What's really needed is a way to start talking about sexuality and intimate data, and about what's done with data in general; this conversation isn't just about sex, it's really about what's being done with data. Sex just happens to be an interesting topic, because I think a lot of people recognize the importance of privacy around sex, and that's the reason I chose this whole topic as an example. So yeah, I don't know.

Yeah, that ties it together very nicely. We have about nine minutes, so let's go on to the next question. Where do you feel the line is between legitimate usage of data for product improvement, feedback, and diagnostics, and stepping over the line into consumer harm?

Right.

So, I'm going to say that I probably draw the line in a very different place than many people do. Of course there are apps where data is collected almost by the very nature of the app itself and the way it's used; apps and platforms require some user input in order to function. In terms of optimizing apps, it's really about asking: how do we optimize the usage of the app or the platform for the user's interests, and maybe stop trying to maximize the amount of time individuals spend on applications? Basically, building platforms that are centered around the health of the individual using them. I'm sorry, I'm going a little bit off track. Can I get the question one more time?

I think you did a very good job with that question, actually, so let's go on to the next one. Is that okay? Sure. Okay. While dating apps have generally added many more gender and sexuality options, it seems like both porn and big data have avoided higher-fidelity genders. Could you comment on the relationship between non-cis genders and data? Does it seem like anyone is including a broader understanding of gender to enable deeper statistics?

So, I actually haven't found any particular studies on this, and I'm not completely aware of how different companies are formulating ideas around gender. What I would say about big data is that in some ways it is actually quite compatible with conceptualizing many different genders, and more nuanced understandings of gender, because big data is a system that operates by formulating categories. I imagine that as big data progresses alongside users, as different individuals talk online about their different ideas of gender, those categories will get constructed, because the people utilizing big data for marketing want to know as much about you as possible; they want to understand how you identify.

One thing that worries me in this regard: personally, when I think of gender, I don't like the idea of identifying myself in any particular way. I know that's not necessarily the status quo right now in terms of where feminism is leading. But my fear with big data in relationship to genders, and the ways they can evolve together, is that big data has the capacity to take advantage of the ways individuals self-identify, and the ways we formulate language for identifying each other and ourselves on social media. I'm a little afraid of the capacity for big data itself to play a role in the development of nuanced genders, and not just in their development, but in the way they get perceived: the creation of several different categories of being that become normalized to some extent. If you're identifying in a certain way, you're putting yourself into a category where you're comparing yourself to a norm, maybe not a mainstream norm, but some counterculture norm. So that is a really interesting area of study, and I would love to look more deeply into it, but those are just my thoughts on big data and nuanced gender definitions.

All right, thank you very much, that was a great answer. Let's go on to the next question; we've got a few more minutes left. How does this work for not just the person who signed up for and downloaded an app, but their partner, who may not even have been in a relationship with them when the app was signed up for?

I'm assuming this is in reference to the period-tracking apps with the mirror apps. I'm not entirely sure. I assume that in order to use the mirror applications, the partner also has to download and install the app. Personally, I haven't used them, and I haven't looked at their interface; I've only read about them. But basically, I imagine that in order to use them, you do have to download and install the app yourself. I'm not sure how to answer the question beyond that, or whether there's more to the question that I'm missing.

Well, thank you very much. And on behalf of all the HOPE 2020 attendees, presenters, and volunteers: thank you very much, Keegan, for sharing your project with us today.

Thank you.