Hello and welcome to the Digital Alchemy podcast, brought to you by the International Communication Association Podcast Network. My name is Moya Bailey. I'm an assistant professor in the Department of Communication Studies at Northwestern University and the founder of the Digital Apothecary Lab. I've invited Timnit Gebru to discuss the Distributed AI Research Institute, or DAIR, an interdisciplinary and globally distributed AI research institute centering people and communities over technology. Timnit is the founder and executive director of DAIR and former co-lead of the Google Ethical AI Research team. It is such a pleasure and honor to be able to speak with you today. I've been so moved by your work - just the way you show up in the world. Maybe we can start there. Can you say a little bit about how you came to your work and how you started to become politicized? Or do you feel like you always were?
I was always politicized. I hadn't connected politics to technology until way later. I was politicized in the sense that I had to leave my country at 15. My sisters had to do that. It was like guerrilla warfare. Many people in my family were involved in revolutions, going to prison, or fighting and dying in actual wars. In that sense, yeah, I was. And then coming to the States and being like, "What is going on? What?" So that was always a part of me. I very much understood the systemic discrimination going on in, like, my high school, at work, or in university. But I just did not connect the tech that is built, who builds it, and how that's connected to this. That is kind of like the work of Simone Browne or Safiya Noble. That came way later. About 2015 is when I started to see, "Oh, these are the people who build the stuff that I'm around." And then, "This is the stuff that's been built? Oh, okay." And then having more and more knowledge around that. That is the part I would say happened later.
And would you say that working at Google was part of that realization?
When I was asked to join Google, I was skeptical, because a whole bunch of people sat me down and talked to me about harassment that they experienced and how it was handled. I definitely knew it was not going to be a good place for Black people. I had some experiences during my negotiation process. And I was like, "I don't know if I want to." But then Meg Mitchell asked me if I could co-lead her team and I'm like, "I have one person that I can work with." When I went to Google, I went with eyes wide open. I was super exhausted. I was trying to make things better. I was trying to make the environment better. I was building our team, just raising issues, whatever I saw. The realization that I had after Google is that even the kinds of proposals that we would come up with, that seem very straightforward and that we really didn't have any opposition to - so for example, we came up with frameworks. Two of them are called "model cards" and "datasheets". And this is saying, before you put stuff out there, have lots of testing and documentation. The idea didn't get a lot of opposition at all because it's like a basic engineering thing. But in retrospect, they kind of don't want to do the implementation, because the translation is: per thing that you want to put out, put that thing out way later - not tomorrow, but like a year or two years from now - because you've got to do all these tests, and then also hire a lot more people per thing. So I'm translating it to, "Yeah, make a lot less money than you are right now." And nothing else is telling you to do that. There's no regulation. So the realization that came after this: nobody's going to do any of this stuff, even the most banal stuff that you're talking about, without being forced to do it. Now you see this race: this company comes out with this thing, this other company comes out with it faster. Car companies don't get to do that - I bet they would want to - because they're going to be held liable.
I was just like, "There's just no way." Nothing's going to happen without some sort of regulation that makes all of them have to behave the same way.
And so that takes me into your founding of DAIR. Can you say more about DAIR and what DAIR stands for? And also, what made you feel like that's the move?
So DAIR stands for the Distributed AI Research Institute. After I got fired from Google, for a few months my full-time job was dealing with that. I just couldn't really think about what I was going to do next. Working at large companies was just not an option, for many reasons. One is obviously, they wouldn't hire me. And two, at that point, I had worked at three. It was clear I wouldn't be able to do that again, even mentally. And so I had been thinking about a distributed research institute for a long time - having an institute that can hire people, but they don't have to uproot themselves, because that contributes to some sort of brain drain. If they're living somewhere, they're embedded in that community. And then they all have to move to a place like Silicon Valley. There's just so many issues, because what you want is to not have that be the center anymore. It's to have various people in various places be able to impact the future of technology. So that idea was really important to me. And I saw during my time at Google, our team was very distributed. My co-lead was very good at handling that and most people were remote. Many of us didn't see each other in person very often, and I saw how important that was. There were people we hired, for instance, Mahdi, who is Moroccan, who was seeing all of these issues with YouTube and was a victim of it. And nobody else saw that; it wasn't a priority for anybody else. People are like, "Well, we're gonna deal with the US election." That's fine, but YouTube is also causing havoc in Morocco. If you're going to operate there, you can't do this stuff to people and not deal with it. I realized how important it is, to even understand what the issues are, to have people like that. So the distributed part was the first thing that came to my mind. The second one was definitely to have an interdisciplinary team: whether it is sociology, computer science, engineering, or history - we even need tech historians.
So that interdisciplinarity was important. And the third one is interdisciplinarity not only in discipline but also in the way you learn - activism and organizing in our institute. For instance, Meron, she's a refugee advocate. She's had such a huge impact, but she didn't go get a PhD or anything like that. I don't even know if she went to college. And then Adrienne is a labor organizer and she was a delivery worker at Amazon. So she saw first-hand what those issues are. It was really important for me to have people who are able to be lifted up in the process of doing research as well, and able to speak on it because they know the topic. They might not know exactly how to write things in a way that gets them the prestige, but they have the most important ingredients. That was what I wanted to do, and honestly it was the only thing I could imagine being excited about - the only thing I could imagine trying to get done. If I hadn't been fired, I would have probably built it up super slowly first on the side, but because the whole thing blew up, I started full-blown. So yeah, that's DAIR.
And we are so grateful that you started it full-blown. I mean, just what you're talking about, particularly the distributed nature of it makes me think about how I'm thinking through alchemy. As I say, in my work, alchemy is the science of turning regular metals into gold. When I'm thinking about digital alchemy, I'm thinking of the ways that Black women and women of color transform everyday digital media into valuable social justice media magic, and it feels like we're turning something that people don't pay attention to into something precious. And so I'm thinking about your work as being really connected to digital alchemy. Does that language resonate with you?
Yeah, I absolutely love that. We definitely turn things into gold. And then people didn't know that this thing was gonna be gold. And once it is gold, they want in. 100%.
In terms of the people that you've brought together, have you seen that come to be part of how DAIR is moving forward? The different voices that you're lifting up are so important. Do you think that there's something about those communities coming together that isn't possible through other forums?
Yeah, for example, Meron and Adrienne have a connection because they do similar things but in completely different contexts. And so Meron is in Uganda, Adrienne is in Richmond, California. There's an 11-hour difference, but they want to start a podcast called "Coffee and Cocktails" because when it's coffee time for one of them it's cocktail time for the other. I can't imagine how they would have gotten together otherwise. If either of them had to be uprooted and somewhere else, the knowledge that they have and the roots that they have would also be uprooted. And for many people, that's not even possible. I like the fact that people can still be where they are and do their work, embedded in that community. For instance, Meron works a lot with the East African refugee community. And Adrienne worked at Amazon as a delivery driver, and she knows the Richmond area very well, she knows the Bay Area very well. And so she has a lot of connections there. Each of them has very local knowledge and work, and also this transnational solidarity across oceans, right? I think both of those things are important, because they are both learning about the issues that affect the other, and they're very related in many ways. And they can both work together while also making an impact wherever they are. I think that is the magic that you're talking about - the gold that you're talking about. Many times they have this knowledge that is taken by others - research is generally a very exploitative process. That's one thing that's really important to us - to not do that: to have a non-exploitative, non-extractive research process where we are doing community-rooted research, but we're not extracting from people and not giving anything back. This is the kind of stuff that I love. When I peek into the regular tech demographic, I'm shocked now. My everyday was these kinds of people.
And then I would have dinner with my former colleague or something like that and I would be near Google or near all of these tech hubs. And like, "Oh, I forgot what it was like." This is the space that we're usually put into. It's more energizing to be around these groups of people.
I love that. And your energy is something that just comes through. So in terms of just how you're talking and how I've experienced you in other spaces, I wonder what keeps you energized. I am always thinking about the apocalypse that's unfolding in our world, always, all the time. Given that we have perhaps not that much longer as humans, given the way we've been behaving, I wonder what gives you hope - keeps you feeling encouraged at this moment?
I think there are multiple things. One is, anger can be both catalyzing and draining. There's just the right amount of anger and fear that's like, "Well, I gotta do something," because it's not gonna be fine if we don't. That's one. But then I feel like seeing the people I work with, being in those spaces, seeing what they're doing, and what they were able to accomplish themselves with very few resources. And just seeing people send me so many emails. They're so sweet, like high school kids or other people. It is more energizing to see that even the little thing that you're doing is maybe having some impact, because it can be super demoralizing. You do all the stuff and then, in one swoop, it's set back decades. Because you're right, it's a scary time right now, the way things are going in many directions. To have that balance of the right amount of, "Oh, we got to do something," and also give yourself time to imagine an affirmative future that you want to have and execute on it - it's a very difficult balance. And actually, it's hard for me to do the latter. I'm trying to figure out how to move - force myself and our Institute to also be in that kind of space as well. The people I get to talk to and the people I get to work with really are energizing.
Oh, I love that you've talked so much about people as you're talking about AI, because people tend to think of AI as totally just a computer space, but I love that you've really flipped it.
I was just thinking about what you were talking about. And I wrote a paper recently. It's called "Eugenics and the Promise of Utopia Through Artificial General Intelligence". Basically, I was trying to figure out where this concept of artificial general intelligence comes from. Because what they want is literally a being that's a machine, whatever. And we traced it back to 20th-century eugenics. Then there's the transhumanist movement that was founded by the chair of the British Eugenics Society. And then there's the modern wave of transhumanists, which brings us back to this AGI, Artificial General Intelligence. They're fanatics, and what they want is to basically have a merger with machines and transcend humanity. It makes sense that everything human is bad for them, like, "We have AI art because we don't want human art." And for us, it's not. What we want is something that helps humans, and if that can be built with very specific, well-defined methodologies, like maybe AI, fine. If that cannot, no. It's not like this thing has to exist. The way in which it's been conceptualized is like that. I don't want this humanless world full of machines. Why would I want to live in that world?
Yes. And I'm so glad that we're in alignment about staying where the humans are and that humans are important. I want to say thank you. That was lovely and perfect. Do you have anything else that you want to make sure that you say or wish that I had asked you?
I would like to thank you for coining the term misogynoir. I was thinking the other day about how words have been so important in my career. There was a before and after. Before I had those words - even just "microaggression" - half the battle was fighting your own gaslighting. Having these words really means a lot.
Oh, thank you so much. So, for all those listening, thanks for checking in for another episode of Digital Alchemy. This has been Moya Bailey. I was joined by Timnit Gebru, and it was such a pleasure to chat with you.
It was great talking to you too.
Digital Alchemy is a production of the International Communication Association Podcast Network. This series is sponsored by the School of Communication at Northwestern University. Our producer is Dominic Bonelli. Our executive producer is DeVante Brown. The theme music is by Matt Oakley. Please check the show notes in the episode description to learn more about me, my guests, and digital alchemy overall. For more information about our participants on this episode, as well as our sponsor, be sure to check the episode description. Thanks so much for listening!