Principles of Digital Autonomy
7:54PM Aug 1, 2020
Welcome back to HOPE 2020; it's time for the next talk. But first, as a reminder, we are raising funds for the Electronic Frontier Foundation. Please donate to the EFF at the links above; we want to help out our friends defend internet freedom. Also, later today we are going to have Hackers Got Talent. We have no idea what's gonna come at us. It's gonna be interesting. Our next talk is Principles of Digital Autonomy, by Karen Sandler and Molly de Blanc. We have rights with respect to our technology. These rights are imperative to ensuring our digital autonomy, our right to be in control of our own destinies. As the border between the physical and digital breaks down, it is increasingly necessary to re-examine what we consider to be rights that protect our digital autonomy. I'd like to welcome Karen Sandler, a cyborg lawyer and an advocate for rights related to software on medical devices, and Molly de Blanc, a digital rights activist who draws on her experience of using technology while bipolar. We're going to begin with a video presentation, and after that, your questions; please ask them now in the Matrix chat.
I'm Karen. Hi, I'm Molly. And this is Principles of Digital Autonomy, at HOPE 2020.
My name is Karen Sandler. I'm the executive director of the Software Freedom Conservancy. I got really interested in tech ethics when I got a pacemaker-defibrillator. I have a heart condition, and I need a pacemaker-defibrillator implanted in my body. And because I have an engineering background, I started immediately asking about the safety of those devices. That whole experience caused me to be passionate about making sure that we have access to the source code in the important technology that we rely on. Having my defibrillator made me realize that a lot of our tech may not be made for our situations in particular, and being able to modify that software and have control over it is ultimately going to be extremely important in all the situations where we rely on it.

And I'll add the best thing as well: she's a cyborg lawyer, and honestly, who else but Karen would have the audacity to ask the manufacturer for the source code in her defibrillator?
I'm Molly de Blanc. I work with the GNOME Foundation, and I have a bunch of other affiliations within Debian and free software. I basically consider myself a free software and digital rights activist, and I care a lot about these conversations from this high-level, philosophical-rights perspective. Years ago, after talking with Karen, actually, I was able to kind of realize all this for myself, as someone who is bipolar and who uses technology very integrally in everything that I do, which is something I'm fully proud of and excited about. We're here for this project that we worked on together. It's definitely a representation of our own ideas and does not speak for anyone else.
So, why does digital autonomy matter?
Well, the technology paradigm has changed, Molly. The paradigm is really different than it was before, and it's more than just software freedom. When I started thinking about this, it was around the time I needed a new defibrillator, and it was almost impossible to get one that wasn't broadcasting wirelessly, with no encryption, by default, all the time. I became obsessed with this idea of not broadcasting, and it led me down this path of: well, it's not just about not broadcasting, it's about having control over the technology that you're relying on, whether it's in your body or somewhere else. So I moved from thinking about the right to not broadcast to thinking about digital autonomy.
Yeah, I'd like to add to that: I personally, and I hope many people, I think many people, would agree that bodily autonomy really matters. This was an idea I got into my head a long time ago. And when I began thinking about digital spaces and the way we exist and live in them, it felt obvious to me, or at least it felt necessary and important, to acknowledge and create ways to talk about how bodily autonomy and those concepts extend into these digital spaces, and how all the rights that we have in one also exist in the other. So we think it's time to make a declaration of digital autonomy.
Yeah, the tools that we have are only one piece of the puzzle. We've been a community of activists that have been working on a lot of different causes, all of which are important, but we've been working in our niches. It's time to tip the balance of power and stand up for these important rights in a coordinated way.
So, to say a little bit about the principles of digital autonomy, which is a document we wrote that is available for you to read online: it's a framework. It's not what I would call a complete, prescriptive list of things. It's a way to look at technology, and at the world in general, and to analyze and align what people are producing with this set of ideals.
So we cannot underscore enough the fact that this is an ongoing process; it's a work in progress. You'll see, if you take a look at the draft, that it is in fact an early draft, and it comes from our discussions over the years with a lot of software freedom and other tech ethics activists. I think that while we examine the principles and how they should be refined, we'll establish a community where we work together and really bring our focus to how best we can empower technologists and everyone else to stand up for the tech ethics that we should be demanding.
So we're gonna talk a little bit about some background, which largely means we're going to be defining some terms that we use a lot. Some of them are terms we like a lot, and some of them are terms we don't like but use a lot anyway, largely out of habit. I just want to make sure that we're all on the same page and you understand what we mean when we say these things.
Molly spoke very eloquently about the connection between bodily autonomy and her thoughts on digital autonomy. Basically, it's very simple: we are self-governing actors, we are in control of ourselves, and we should have that autonomy, that control.
And when we talk about autonomy, we largely talk about personal autonomy, meaning you yourself as an individual, but we also think it's really important to think about collective autonomy and what it means for us as communities and groups of people.
And then the digital: basically, that's the interface that we have with the digital world, the part of us that interacts in cyberspace. The information about us exists everywhere, and even when we are not actively interacting with it, the impact of us remains, and the consequences of that digital impact continue after we cease to interact with it.
So digital autonomy is this idea of taking these two parts and kind of smashing them together. It's the way that we think about our autonomy, our rights, our existence as something that is self-governing: something that both has the right to be self-governing and also, by its very nature, is self-governing, in digital spaces and with respect to the way our digital selves exist.
Digital autonomy is shifting the power away from those companies and organizations and others that control the impact of technology on an individual's life. So it's all about this idea of empowering individuals, empowering developers, and examining how the power structures that currently exist can be shifted to more reasonable ones.
So rights: that's a term that we've used before and will use again. The thing about rights is that they're things to which we're entitled because we exist. So when we say we have the right to something, it's really: this is just something you deserve, something that belongs to you, because you're alive.
Technology is simply much more pervasive than it ever has been before. We are using technology to touch parts of our lives that we haven't in the past. Where, for example, you would never before have considered your toothbrush to be technology in the way that we mean, a toothbrush can now easily be connected to a network, have a camera, and do all those things. So we generally mean technology to be quite expansive.
And then users. User is a term that I don't love, but I definitely use a lot out of habit. The reason I don't love it is because I think it creates a binary dichotomy between people who make things and people who interact with those things, and it kind of puts the people on the user end in a situation where they don't have a lot of power. That being said, it's also a really useful term, because it specifically references the person interacting with a piece of technology. So when I say user, I want you to think about a person who can be empowered and who is collaborating with the things that they're interacting with. But also, it's just: you are interacting with a piece of technology. You're a user of a traffic light when you are walking on the street.
So what we're gonna do now is talk about the principles, and then, to provide some context for what they mean, we're going to analyze Zoom using that framework. Just to make sure we're all on the same page here: Zoom is a piece of technology that is used for video conferencing. You might be using it today, or might have used it today. Some important notes: it was created by a company called Zoom, so when we talk about Zoom, we might be talking about the company or the product itself. It's very popular and has a really robust set of features, some of which we'll be talking about.
I believe that HOPE is using Zoom for this presentation; it's a little unclear from the materials. We're recording this in advance using BigBlueButton. I want to also be clear that we generally try to avoid using Zoom, and we'll talk about some of the reasons why. But Zoom is a very good technology for us to use as an example. The principles that we elaborate are only as good as how they work out when you examine them against real-world cases of technology. So I think it's very helpful, when we talk about what we think are the most important principles of digital autonomy, that we do so in the context of looking at a real piece of technology.
It's also important to note that we're giving some examples. This is not a comprehensive analysis of every aspect of Zoom in relation to the principles; it's just a few examples that relate to a given thing.
So, principle one.
So the first principle is that technology should be in service of the people who use it, which is exactly how it sounds. A lot of technology is created by companies, as in the case of Zoom, and often the technology is created with various agendas. So we thought that this was the most important first principle to have, because it underlies who we're focusing on benefiting with this technology.
The people using it, in this case, is a pretty big category. We focus largely on the end users, but the people who use it also include the individuals who are working on it. It includes the people in the background of the person who's on a Zoom chat. And it includes, to an extent, the power structures and the organizations: in the case of people using Zoom in education, for example, there are also these powerful structures, which we'll talk about more later on, that make the decisions about what to use.
Zoom is super useful; it's really great. It has a very robust feature set.
I use it to talk to my doctors and therapists right now, or therapist: one therapist, multiple doctors. And, you know, to some extent it's HIPAA compliant, though Karen has some comments on that.

We don't have the time to get into it deeply, but I think that some of the HIPAA compliance is overstated.
I agree with you entirely.
We cannot underscore enough the usability of Zoom and the fact that it is so pervasive now. Not just because you're probably watching this via Zoom right now, but also because it's everywhere. It's a really successful piece of software.
Oh no, we got... okay. So, as I was saying, it connects people, which is a really powerful thing. And this is one of the ways in which the things that we don't necessarily like about Zoom, that we really don't like about Zoom, are just a piece of the puzzle. Because when we talk about this first principle of being in service to the people who use it, Zoom is really connecting people who otherwise would be really isolated. During this pandemic, a lot of people are spending a lot of time by themselves, and for them Zoom has been a little lifeline. I think it is easy, especially for technologists, like most of the audience here at HOPE, people who are technical, to understand that there are other options. But for folks who are not as familiar and not as comfortable, having access to Zoom is huge. It's the way that they became familiar with video chat, and it's changed their lives. I think that is a massive plus for Zoom, and the reason why it is, in fact, in service to the people who use it, even though in many ways it fails other parts of this test.
I think usability in general is a really good point to think about when analyzing whether a piece of technology is in service to people: does it actually work? That being said, the design of Zoom is not by you. They take feedback from people, but the company has its own agenda, its own roadmaps, and its own plans for features based on what they think. So the service it provides is a prescriptive service: it's someone else deciding what you need, and constructing their vision of what they want users to have.
And again, this is just an overview of some of the ways in which we could look at Zoom in the context of the principles, but certainly not everything. So, principle two is informed consent: we must be able to understand the technology we rely on, and it must be real consent, not just default consent. That means, first of all, you have to really be able to understand the terms that you're agreeing to. Studies have shown that the average person would have to do nothing but read terms of service for three months in order to get through the terms of service that the average person agrees to. And often those terms of service are in legalese; it's not possible for people to actually understand those terms of service and consent to them. Furthermore, when you really need the device, you have no choice. For example, if you need a life-saving medical device, the question is: do you consent to having the device, when the alternative is that you can't have it? It's the same with any other technology. If there's no other real choice, then it can't really be consent, because it's coercive. So it needs to be consent in a way that doesn't sacrifice our autonomy. Yeah.
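To make that reading-time claim concrete, here's a back-of-the-envelope sketch. Every input below is an illustrative assumption we're supplying, not a figure from the talk or from any particular study, so the output is about order of magnitude rather than precision:

```python
# Back-of-the-envelope: time to read every terms-of-service document an
# average person accepts in a year. All inputs are illustrative assumptions.
AVG_TOS_WORDS = 2500      # assumed average length of one agreement, in words
POLICIES_PER_YEAR = 1400  # assumed number of agreements encountered per year
READING_WPM = 250         # typical adult reading speed, words per minute

total_minutes = AVG_TOS_WORDS * POLICIES_PER_YEAR / READING_WPM
total_hours = total_minutes / 60
workdays = total_hours / 8  # expressed as 8-hour days of nothing but reading

print(f"about {total_hours:.0f} hours a year, or {workdays:.0f} full workdays")
```

Even with these fairly conservative inputs, the total lands at hundreds of hours a year; plug in longer documents or slower reading and you quickly reach the months-long figures cited in studies of this question.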
Sorry, were you going to...

I'll film myself the next time I read one. I read a set of those recently, which I do sometimes. Not because I'm using something, but because I'm thinking about that piece of technology. Maybe I should do it when I'm using things. Anyway.
Ah, so I found them really understandable, actually. And I will couch that by saying I have been a professional in this space for a long time. I have read many pieces of legal information. My undergraduate degree is basically in reading things I don't understand and trying to understand them.
I come with a certain amount of training and privilege, for someone who is not a lawyer and not a developer, and not a politician or a policy expert, from like an accreditation standpoint. So I actually found them pretty comprehensive and pretty readable, and I thought some of them weren't that bad. Their data retention policy talks about, for example, how they'll store information only as long as they consider necessary. There's a thing about, I'm totally blank on it now, doing things only as long as necessary. Oh, like, they actually don't store recordings that you make on their servers either. I think they automatically put them to Dropbox, or suggest you put them in Dropbox, or they'll store them until you take them away. But they don't keep everything forever.
Okay. Yeah, I mean, I think there's a lot of judgment there that they have, you know, like what they think is necessary, or what they determine is not more than necessary. And that's a problem with a centralized service. I think the other problem with Zoom, when looked at through this lens, is that the realistic way most people encounter Zoom is a click-through, so it's the problem with click-through terms of service. When most of us get access to software for the first time and have to click through, we're only doing it because we have a need, right? And with Zoom in particular, especially recently, that has meant classes. You know, I teach a class at a law school and am mandated to teach that class via Zoom, so the students have no choice. Public school systems have been using Zoom. And there's a power dynamic involved: often there's someone who decided to use Zoom, and then there's the user. When users click through those terms, they're just doing it because they have to talk to this person. They don't necessarily even have time in that moment to read it, let alone think about it and understand it. And often there's no opportunity to suggest an alternative. So that's a real problem. I think this is endemic to any kind of click-through policy; they are deeply problematic, and I feel like they can never be okay. It's not just
that there isn't time to suggest an alternative, which is true, but the power structures involved in that conversation might not make it possible. Right? Like, when I was a freshman in college, and even through most of my university career, if my professor had said we're using this piece of technology, I would have just been like, oh, well. I wouldn't have wanted to argue about it, even if I didn't want to use it. Very few people have that power to argue with authority, to bring issues to authority. Yeah, we should all be Karens. We can all strive to be Karens in the right situations.
Not the bad kind of Karens, but yeah. So I completely agree. And a part of this is that because the software, in Zoom's case, is closed, proprietary, there's a limit to how much you can understand. When a solution like, for example, BigBlueButton or Jitsi is deployed instead, there's some ability, especially collectively, maybe not every individual on their own, especially those without a technical background, but there's an ability collectively for us to understand how it works. We don't have that with proprietary software; the answer will always be that it's opaque, and we have to believe the companies that the software is doing what they say it will do. And it often requires a scandal to make the company change, even when it's known that there's a problem or that it's doing something it shouldn't do. This is just a basic problem with proprietary software.
And this is not just about the code. It's in part about the code, but also about systems architecture and design and workflows.
I don't code, but I do understand a lot about how many computing technologies work. I can't build them, but I can explain to you how they work. I did a lot of research, and I could not find adequate documentation around Zoom to understand what's processed on the server versus what's processed on my computer. And that's actually a really big thing. So it's hard to understand how it works. Yeah. And on the flip side,
just because something is free and open doesn't necessarily mean that it's understandable. Just because the code is available, that's not necessarily sufficient.
So we got excited, and we talked about how social politics are complicated already. And the next slide, right: empowering individual and collective digital action. That's our third principle.
We think this is really important for a few reasons, one of which is that we're not isolated, right? Especially since we're using this technology to connect with one another and work together and collaborate in these new, amazing, heretofore unthought-of ways.
We need to think about what it means to empower the individual, but also to empower us as groups, whether those collectives are geographically based, like local communities, or people rallying around a project or a mission or an effort. Some of us are involved in online communities that do things ranging from being social to building software that goes into space. So there's a pretty broad range of wonderful, amazing things we do, individually and collectively. To talk about Zoom a little bit: one of the conversations that exists in the space of technology and collective action is who owns servers, right? And who controls technology. Zoom is owned by Zoom, and the servers, I actually don't know whether they're run by Zoom or whether they use Akamai or AWS or whoever, but the point is that we don't own our Zoom server and we can't run a Zoom server. That's right,
the service is centralized. So that means we rely on the Zoom company as a gatekeeper, and that's deeply problematic in a variety of ways.
It's not just, you know, where the information is being stored. It's that you can't move it anywhere else, and you don't have the ability to change the functionality or do anything else about that. And because it's a corporation, there's no way for individuals to influence that outcome. We have another slide about this. Yeah.
So this reminds me: in setting up to record today on BigBlueButton, we had a few issues. And because we're using a server hosted by some people we know, it was really easy, again, very lucky, very privileged here, to say, hey, can you fix this, and get something handled right away, which was pretty nice. When you at least have the option of hosting things on your own server, you have the option of controlling access to it, but also of fixing things yourself, if that's the thing you want to do.
Right. And if we're analyzing where Zoom is within the principles, and whether we can have both informed consent and also empower individual and collective action, having viable alternatives is an important part of that puzzle. It really speaks to whether people can both agree to that software and also have the right kind of action: can they vote with their feet? Can they go somewhere else? Can they take their data? Can they have these conversations elsewhere? There are quite a lot of video chat alternatives available, which is nice. It means that you evaluate Zoom differently in a rich ecosystem where there are both proprietary and free and open source software solutions that can be used. I also note that one of the things that is good about Zoom is that, I think by default, there's an opportunity to participate in a Zoom meeting simply by phoning in; there's a regular dial-in number. That means you can bypass much of the proprietary software, and much of the surveillance of running something on your own computer. And so having that is, I think, a very real plus.
Mm hmm. I realize that this slide is kind of ambiguous, because it could mean you can just use your phone instead, or you can phone into Zoom. Oh, go for it.
Like, a phone conversation is an alternative to Zoom entirely, or you can use the phone as one option for using Zoom. Having those different levels of ways you can use a technology, with different levels of centralization and connectivity, is an important part of balancing the needs of corporate interests versus the needs of individuals as we exist. Being able to have those options means you can have the appropriate tools that you need. Zoom is one of the few pieces of technology that isn't all or nothing, so I actually applaud that.
Um, something that I think is super important about collective action in general is that it's necessary to provide and have safe spaces where we can develop ideas as individuals and groups. Whether that is like me just saying dumb stuff when I'm 13 and figuring out who I am between the ages of 13 and my current age, which has been a very long, ongoing process, but also for communities to do the same thing. And I think that providing alternatives is one of these things, but providing safe spaces is also a big part of this, and a big part of collective action. So I kind of give Zoom two points for this, because it does connect people, and it provides these opportunities, even when people are physically isolated, to create a space to have a conversation. But as we'll talk about soon, how safe that space is is unknown; someone else is controlling that space.
And I want to reiterate that we avoid using Zoom. I don't want to make it sound like we approve of Zoom; we're just trying to evaluate it with an impartial eye toward what we think technology should be about.
We should, at some point, talk about free software we love, too, because we have bad things to say about that as well. Um, yeah, so the basic functions of Zoom really do empower users and empower collectives, because it creates spaces for people.
Excellent. So our fourth principle: protect people's privacy and other rights by design. So basically, gather and record the minimum amount of data needed to provide the service, and make sure that regular deletion of inessential data is planned from the outset, not tacked on afterwards. I'm going to skip the Zoom analysis on this, it's a great example, so I think you're going to give it. But basically, our technology at this point needs to have these issues in mind from the outset. It has to be planned, it has to be baked in. It can't be something that's just patched on.
And this has a few sides. There's what comes from the policies that the creators make, and there's the technology itself, the very crunchy bits: how are they managing security issues? What kind of encryption are they using? How do they do authentication? Are they using two-factor authentication? Are they using your cell phone?

Or anything. Yeah, are they using phone-based two-factor? Which I have opinions about that I won't get into right now.
And then there are design issues. Right, so can you use a given piece of software with the camera off? The ways that we think about privacy and other rights, and the ways that these things are implemented, there are a lot of different parts to that. So to talk about Zoom, we're going to talk about encryption. I think it's really horrible that it happened, but I also kind of want to thank the company for providing this beautiful, rich example.
Zoom wasn't end-to-end encrypting like they said. They said, we have end-to-end encryption on our video chats, and they didn't.
And that came out; it was a big scandal. And then they said, okay, we're going to add it, but only for paid accounts. People commented on that for a while and were really upset, and then: okay, fine. Last I checked, anyway, everyone can have it, whenever that gets integrated. One of the problems with analyzing and talking about technology that you don't use is that you don't necessarily know what's happening with it unless you do your regular checkups, and there are a lot of things to regularly check up on, so I'm a bit behind on this one particular piece of Zoom policy. But yeah. So the way encryption was approached was an acknowledgment that it was an important part of the process, both from a policy standpoint and from a technological standpoint, but there was a breakdown in that being implemented and delivered to the people using it.
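To make concrete what end-to-end encryption is supposed to buy you: the relay server in the middle should only ever see bytes it cannot read, because the keys live solely at the endpoints. Here's a toy Python sketch of that property. The XOR one-time pad below is a stand-in for a real cipher and has nothing to do with Zoom's actual protocol:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """One-time-pad style XOR; encrypting and decrypting are the same op."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # shared only between the endpoints

# What a relay server in the middle sees and forwards: ciphertext only.
ciphertext = xor_cipher(message, key)

# The receiving endpoint, which holds the key, recovers the plaintext.
assert xor_cipher(ciphertext, key) == message
```

With transport encryption alone, which is roughly what Zoom actually offered at the time of the scandal, the server itself holds keys and can see the plaintext; end to end means only the endpoints can perform that last step.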
Yeah, I mean, there's no two ways about it: Zoom is a surveillance machine, and it fails our tests in fundamental ways, up and down all of the principles. And there's the relationship between privacy and other rights and security, and the very fundamental surveillance component of business models that are being established with the hope of pivoting them to profitability over time.
I know we don't have time for this, but one of the things about Zoom and privacy and free speech is that organizers can see the private communication, like the private text chats between people in their meeting. So you don't even have private communication; you have the illusion of private communication. I bring this up to illustrate that we talk a lot about privacy, and the idea that this is a right you have, but there are other rights we can also talk about, and Zoom does affect those too.
I think we covered this. Yeah. But just to say that in Zoom's data policies, they say they don't sell your data. But they do share data with people they define as partners, and that data is used for things like marketing, providing services, and advertising. So while they might not be selling your information to data brokers, it is being used to profit off of you anyway.
And as long as data is being collected, it's also an attack vector. Yeah.
So that's basically what we have. We have this cool website up that Karen made happen. It's 1993 all over again, everybody.
But the idea is to get a draft up there, to have some thought and discussion about it. So take a look; let us know if you have any thoughts. I think we have some time for questions. I hope this is recorded, so we'll be checking in on a different day. But yeah, I think we have one more slide with our email addresses and some more information on it. I just like the color of that fish.
So these are our email addresses and our Twitter accounts. There are other social medias, too, that exist.
Yeah, all right. Oh, I should use Mastodon. I don't; I feel shame for that.
Uh, yeah. So, techautonomy.org. You can email support at techautonomy.org. You can tweet at us. Please only tweet nice things about how much you liked this, or thoughtful critical things about ideas you have.
We look forward to hearing from you and talking with you more during the Q&A time.
The Zoom gods are not smiling on us.
Welcome back with our speakers, Karen Sandler and Molly de Blanc.
And for a little while there, a cat for Caturday. Now, to the first question that I'm sure everyone is thinking: how do you feel about the fact that right now you're in a Zoom meeting? Did you bring up concerns about this to the organizers of HOPE?
I'll answer the first part of that myself. I have to tell you, it makes me really angry to be in a Zoom meeting, and a little bit helpless. I try to avoid it wherever I can; I make exceptions where I feel like it's the only way we can effectively carry the message. Sometimes you have to meet people where they are, so sometimes you have to use the tool that people are using. And in this situation, I know the HOPE organizers are working hard to not use Zoom, and I know that all of you would much, much, much rather not be using it. Putting up a big conference without a huge amount of planning, and making it stable, online, and available for everyone: I understand that you've had to use a hybrid of technologies, and I appreciate the free ones that you've used. Molly, did you want to add to that?
Yeah. So where I work, at the GNOME Foundation, we just finished our conference, and we had a lot of very in-depth conversations about which technologies we were going to use, because you do need to use a hybrid of things. We used BigBlueButton, and that worked for us. And I assumed HOPE was going to be working with such a capacity of people that certain things weren't going to be possible. I know there was a lot of concern; you sent us lots of emails that were really great and demonstrated that you were thinking about these issues. So I assume you came to this conclusion very conscientiously.
Sorry, just one more thing: generally, you have to assess where you are in the power structure. If you are the one choosing the meeting and choosing the way you're talking to people, you should always choose the free tool. If you're not, you should use it as an opportunity for advocacy. But if it's going to frustrate the underlying purpose and mean that you can't effectively coordinate with other people, then you're going to lose more than you're going to gain. That's just part of it.
Yeah, that feeds into a question I saw floating through the chat: how can one work to suggest alternatives to proprietary solutions? One example an attendee mentioned is that they got in trouble recently when they attempted to convince a local board president to switch from Zoom to Jitsi, and then people ran into problems. I'm certain that a lot of attendees have had similar experiences; trying to get the general public to use a free and open solution can sometimes be like pulling teeth.
So I'm going to start by saying that at my previous job, one of the things we did was talk with people about how to use better technologies. The answer I came up with is that there is no one way. You have to think about who you're talking with and what their needs are. One approach is to focus on ethics. You can talk about your preferences and what makes you comfortable. You can talk about the practicalities. Software freedom has had problems as a cause and a movement because we've done a really bad job of using empathy to connect with the people we need to connect with to make better choices. The way we've demanded different solutions and stormed out of the room when people weren't willing to work with us was part of why we lost traction. So finding ways to help people, and forgiving them when you understand that they live in a world where it's not easy to avoid proprietary software, is, I think, the only way. And then demonstrate that good behavior yourself whenever you can: choose those better solutions to show other people that they work and that they're great.
Yeah. Now, one thing that has been asked for is a brief rundown of the concerns about Zoom. One that I noticed early on was that Citizen Lab, an internet watchdog group based at the University of Toronto, found that Zoom, at least back in April, used non-industry-standard encryption techniques with identifiable weaknesses. Besides something like that, what other concerns are there about Zoom?
So one of the things we did was analyze technologies in the context of the principles, as part of the process of writing them. One thing you can do is email dots@techautonomy.org or molly@techautonomy.org, and I can send you that document, which is a few pages of analysis. I'm trying to rapidly think of some things to highlight quickly, but we don't have a lot of time left, so I'll just say: email me and I'll send you something.
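To make the earlier Citizen Lab point concrete: the weakness they flagged involved ECB mode, where each block is encrypted independently, so identical plaintext blocks always produce identical ciphertext blocks and patterns in the input leak into the output. The sketch below illustrates that property only; it uses a keyed hash as a hypothetical stand-in for a real block cipher, and is not Zoom's actual implementation.

```python
# Toy illustration of why ECB mode leaks patterns: identical plaintext
# blocks encrypt to identical ciphertext blocks, even without the key.
import hashlib

BLOCK = 16  # block size in bytes (matches AES)

def toy_encrypt_block(key: bytes, block: bytes) -> bytes:
    # Hypothetical stand-in for a real block cipher such as AES.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # ECB: every block is encrypted independently, with no IV or chaining.
    out = b""
    for i in range(0, len(plaintext), BLOCK):
        out += toy_encrypt_block(key, plaintext[i:i + BLOCK])
    return out

key = b"sixteen byte key"
msg = b"ATTACK AT DAWN!!" * 2  # two identical 16-byte blocks
ct = ecb_encrypt(key, msg)

# An eavesdropper can see that the two plaintext blocks were identical,
# without ever learning the key.
print(ct[:BLOCK] == ct[BLOCK:2 * BLOCK])  # True
```

Modes such as CBC or GCM avoid this by mixing in an IV or a counter, which is why ECB is considered non-industry-standard for this purpose.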
Okay. Are the other challenges to more widespread adoption of free software tools for teleconferencing and collaboration mainly technical, commercial, or practical issues?
I think it's a combination of factors. I would just say that never has the opportunity been so great. Folks that I introduced to Jitsi at the beginning of the pandemic, many of them... that's Molly's cat, Bash, who is adorable, I would add, and Bash is a great name for a cat. But when people hear about these free solutions at a time when they need them and it solves a problem, it will change their technology use for the foreseeable future. Once they're comfortable using that tool, they'll recommend it to other people. I have elderly family members who are evangelical about Jitsi now, because they're like, "No, no, we should just Jitsi it." I had never heard anybody use Jitsi as a verb, but then I did. So I think there's a huge opportunity. There are also huge challenges: we need more stable instances available to the public, better education, and better ways to get the word out that these solutions exist.
Companies like Zoom do have marketing teams. I haven't interacted with the Zoom sales team, but I have interacted with other sales teams; they'll come to companies, reach out to you, offer deals, and that kind of marketing makes a really big difference.
Right, so we've got time for one last question. If HOPE attendees want to advocate for free software, do you think they should start with something like Jitsi or Linux, or with the concepts of free software, or with something smaller and simpler?
You can start, Karen, and then I'll finish.
I mean, I think it's a combination.
So I think you have to start with the big concept and then something small that people can use right away: highlighting the problems with proprietary software and surveillance technology in a holistic way, and then drawing it down to examples of what they're using at the time, something specific they can use right then. Jitsi is a good kind of thing people can use, or Linux, but also something more like LibreOffice or Firefox, things they can adopt right away without disrupting their setup. And I tell you, it is always a good time to talk about software freedom. In normal times, or maybe the before times or after times, who knows, if you're on a plane, that's a perfect time. Just go big and small. What do you want to say, Molly?
I wanted to first say that I really like talking about free software, and I want people to walk away knowing that this isn't just about free software; free software is part of the conversation.
And then I'll just add a plus one to: look at what situations people are in, and offer alternatives or suggest those things when you're doing the planning. So when you plan your next hangout, I suggest everyone use Jitsi, because you don't have to log in or create an account.
Right. Well, we're out of time. We would like to invite you both to join us in the Matrix chat for our attendees who have more questions. Karen Sandler, Molly de Blanc, thank you very much for being here at HOPE 2020.