Identifying and Dismantling Tech's Deep Systems of Bias
Yashad Kulkarni · Mar 3, 2021 at 8:06 pm · 32 min
00:04 Devin Coldewey
Hey, everyone, welcome. Thanks for joining us here today. This panel is about the systems of bias that are deeply embedded in tech, the kind we can't just patch out, as it were. But I'd like to start by allowing Haben Girma, our guest today, to introduce herself and her means of communicating. And while you're speaking, maybe you can also tell us a little more about the benefits of the latest technology compared with the barriers it introduces for people with disabilities.
00:40 Haben Girma
Hello, everyone, this is Haben speaking. I'm a Disability Justice advocate. One of the reasons I came to this work is that I'm Deafblind, and the world is mostly designed for people who can see and hear. So most of the technology that's built was not imagined for disabled people, which is frustrating, and also absolutely ridiculous. Tech has so much potential to exist in visual forms, in auditory forms, in tactile forms, and even smell and taste. It's up to the designers to create tools that everyone can use. And I want to describe one of the tools I'm using right now. To speak, I'm using my own voice. But to know what the other panelists are saying, I'm using a Braille computer. I'm holding it up right now; along the bottom are Braille dots. As people speak, I have an assistant typing what they're saying, and the words pop up in Braille. So I'll be reading the words and then responding by voice, and you might notice a delay between when someone speaks and when I respond. Back to you, Devin.
02:09 Devin Coldewey
Thank you very much, Haben. So the whole idea that technologies, as you've mentioned, can and do harm or benefit different groups specifically is very much at the heart of your work, Mutale. Have you found that the bias there is primarily in the technology itself? Or is it in the people that are using it? Or is that distinction not really meaningful anymore?
02:33 Mutale Nkonde
Thanks for that. Great question. So I came to this work really through policy. And in the policy world, we're thinking about impact, right? So we're not necessarily thinking about technical infrastructure, or HR, but one of the things that I've found throughout my career, both in industry and in research, is that the two are linked. So there is a problem of technologies which are inherently racist, or sexist, or ableist, as Haben so beautifully pointed out. But there is another part, which Haben really spoke to as well, which is an imaginary for technologies that could actually serve all people. And if the scientists who are creating those technologies don't have experience outside of their own experiences, or we're sitting in a moment where Google AI has gotten rid of Margaret Mitchell and Timnit Gebru, both of whom were technologists and researchers from minoritized communities who were thinking about new and different ways that tools could be designed, then you may not see them coming into products. I'd say that the two are definitely married. And I'm really looking forward to the rest of this conversation to go much deeper into that.
04:05 Devin Coldewey
Absolutely, thank you for that. And Safiya, your work has found bias that's both more obvious and more subtle, specifically in, among other things, search engines. What would you say is the main failure by companies like Google and others who have attempted to organize and present information from the internet?