All right. Well, it's five after according to my clock, so I think we'll go ahead and slowly get started. Good morning, good evening, good afternoon.
Thank you so much for being here today. Today's session is available in both English and French, so please take a moment right now to pick the appropriate channel. As the screenshot indicates, you should see at the bottom of your screen where interpretation is, and there you will want to select either English or French.
We'll take a moment for people to pick the appropriate channel. Welcome, and thank you to Scott, who is doing our French interpretation today.
So here's a little bit of the agenda for how we'll be spending our time together. We'll begin with the welcome and housekeeping, then review our brave, friendly space agreement. Then we'll welcome Isabel to present and share more about her research. We'll also have time for a Q&A today, to have a discussion about any questions, thoughts, or reflections you have about the research, and then we'll wrap up from there.
I am putting the link for the data and internet connectivity reimbursement into the chat right now. As you probably saw during registration for community hours, for attendees who reside in regions where expenses for private internet or data costs are prohibitively expensive, Art+Feminism is able to offer a $5 stipend to offset that cost. So if that applies to you, please fill out this form, and payments will be processed within 10 business days. You can send any questions to accounting@artandfeminism.org; I'll put that email in the chat as well for any questions you may have.

As we like to start all of our sessions, here is our brave, friendly spaces agreement. The goal of this session is to create an encouraging space for collective learning. This requires intentional behavior, wherein participants are conscious of and accountable for the effects of their statements and actions on others. Respect our experiences and the experiences of others, and recognize that we can't do this work without one another. We agree to hold each other accountable to foster a brave and friendly space.

Now I want to give notice that we are going to record this session, so anyone who wishes not to be in the recording, please consider turning off your camera and changing your participant name to something anonymous. I'll pause for a moment for anyone to make the adjustments they need, and then we will begin recording. Bear with me, because I'm going to record on two different devices so we can make sure to have the French one as well.
Oh, there we have it, the recording has started. Now we'll go ahead and get into it. Isabel is a fifth-year PhD candidate at the Annenberg School for Communication at the University of Pennsylvania, where she researches inequality, technology, and information. Her dissertation project investigates the prevalence of digital gender gaps, especially within open technologies like Wikipedia and open source software, and activist attempts to solve them. Prior to graduate school, she worked for Girls Who Code, a US-based nonprofit teaching girls and young women computer science. She received her BA from the University of Chicago.
So with that, I'd love to hand it over to Isabel.
Thanks, Kara. I'm so pleased to be here. I'm going to take a minute to share my screen here. Can everyone see that? Great. So yeah, I'm really excited to be here. I've been doing this research for several years now, and I mostly talk to other academics about it, so it's absolutely delightful to speak to a much wider Wikipedia community invested in the same issues and solutions that I have been looking at. I'll review some of my research here, but I want to start by diving into the state-of-the-art research on Wikipedia's content gender divide. Then I'll quickly position my focus, or how I approach studying Wikipedia, before getting into the meat of our discussion, which is a recent article I published with my advisor on bridging the gender gap in Wikipedia. I have a couple of implications for editing that arise from this research, and hopefully that will be of interest to you, as well as a nice launchpad for some discussion at the end.

As many of you I'm sure are familiar, Wikipedia has a large content gender divide. There are about 1.8 million biographical articles on the English version of Wikipedia; however, only 20% are about women and nonbinary people. So this gap is more than a million articles large: we would need to add over a million articles to even start to approach an equity benchmark. However, this missing content is just one small part of the content gender divide on Wikipedia. Fewer women have biographies on Wikipedia than comparable men, and this is based on research that looks at coverage in other encyclopedic projects, particularly, I believe, Britannica. So women are less likely to be covered in Wikipedia than they are in Britannica. Other studies have looked at more specific disciplinary focuses.
Comparing sociology academics, women are less likely to be covered than men, even holding constant things like position, citation levels, and type of work. So we do see that
even women who maybe should have a biography on Wikipedia are not covered. Women's biographies are also more likely to be nominated for deletion, so even if someone writes a biography, it might not last. We know this from cases like the pretty famous case of Donna Strickland, who didn't have a Wikipedia page until after she won the Nobel Prize in Physics in 2018, I believe. Someone had tried to create a page for her, in 2014 I believe, that was then deleted. And then even if a woman does have a biography, and it passes any sort of deletion debate so that it stays on the encyclopedia, women's content has less coverage than men's content: pages are more likely to be shorter. This is true, I think, for biographies as well as for content that's more stereotypically feminine. So we see this missing content exists on several different levels. Continuing to look at pages that do exist, we can also see evidence of what's called linguistic bias, which means that women's pages are more likely than men's pages to include information about family relationships and things that specifically mention gender. You can think about how this reinforces certain stereotypes that exist. And then finally, we also see a gender divide in the relational knowledge network of Wikipedia. If you think of how Wikipedia is really built as a series of links connecting different articles, we see a gender divide here too: women are on the edges of Wikipedia; they become marginalized, or less likely to be found, than men. And while women link to men, men don't link back. This is what I call asymmetrical linking, and it works to make women less central than men within this knowledge network of Wikipedia. Given these different content gender divides, we can see several different consequences.
The first of these consequences is the quality of information search, and we can obviously see it in Wikipedia. If you're searching Wikipedia for someone and they don't have a Wikipedia page, you might assume that person is not important because they don't have an entry. Or, if they're not linked, you might not even know that they're missing; you might not even know that they exist. But this Wikipedia information also feeds back into the wider information environment of the internet. Wikipedia often populates the Google Knowledge Graph, which pops up on the right-hand side when you do a Google search. Wikipedia is also a lot of the data that Siri, Alexa, and other AI virtual assistants draw from when they're asked a question. So the same kind of missing information, where you either think something is unimportant because there's not enough information about it, or you're not even sure what is missing, can reach out into wider domains just because it's missing on Wikipedia. The other issue is the quality of data. A lot of projects use Wikipedia data as the basis of their work. This happens a lot in research: people will use Wikipedia to train different models or to produce a certain set of results, but don't really think about how those results might be impacted by the gender gap on Wikipedia. GPT-3, one of the new large language models you might have seen or heard news of recently, was recently released, and about 3% of its training data comes from Wikipedia. So all of the issues and biases we see on Wikipedia are also being fed into this model, which is then used in a whole range of projects. So again, while our focus is on Wikipedia, its content gap has a much larger impact within the larger information environment.
Hey, Isabel. Yeah, I'm sorry, could you slow down just a little bit? Yes, yes,
I will. I will do my best.
Um, thank you.
So, my focus in studying Wikipedia: I try to focus on the work being done, not the lack of it. What this means is that a lot of the previous research has focused on mapping the gaps, identifying what's not on Wikipedia or who's not editing. But organizations like Art+Feminism, among others, are actually doing a lot of the work to improve Wikipedia, and studying this type of knowledge activism, I think, is a really exciting way to begin to understand how we can improve our information environments. The other thing I try to prioritize in my research questions is the content gap over the editor gap. These things are obviously very related: who edits Wikipedia impacts what content is on Wikipedia. However, thinking about this wider information environment, we really want to focus on improving the content and the different measures, or the different tactics, we can use to really improve the content. So that explains my focus here on content. There's a lot of research as well on editing, which is of course very important, but I do want to draw our attention specifically to the content gap and the way it impacts wider digital information availability. Now I'm going to move into the results from this article, which I published with my advisor; it was officially published this past spring, I believe, in the Journal of Communication. Here we asked two main questions. The first was looking at how feminist movements define success for themselves, looking at their missions and goals. We identified a through line of these groups in trying to tell women's stories, and our results suggest that the interventions are indeed telling women's stories, as we operationalize it: producing longer and more viewed articles than we would otherwise expect to see. We'll get into how we came to this answer.
Our second question looked beyond just the mission of these organizations and movements. We also wanted to see if the inequalities that have been identified within the structural features, the more specific content gaps from a couple of slides ago, are also being addressed by the movements, and we see that these continue to persist despite the success at the first question, despite the success at telling women's stories. To do this, we looked at Art+Feminism, which you are all very familiar with, as well as 500 Women Scientists, which has a similar professional focus but specifically on scientists; both organizations also work through edit-a-thons, which makes them an important comparison. We then built a dataset that allowed two different comparisons. We identified groups of women from two other professional categories that weren't associated specifically with a feminist edit-a-thon: these are the athletes and politicians here. And then, for all four of our professional categories, we grabbed a sample of comparable men's pages that we could measure our intervention work against. So we have about 11,000 biographical articles; a little over half of them are about women, and about a quarter of them are from our intervention work. This covers the 2018 interventions and everything listed on the dashboards for each, which again, I'm pretty sure you are all very familiar with.
To return to the first question, success in telling women's stories, we operationalized this idea in two ways. First was article length, thinking that the longer the Wikipedia entry, the more stories can be told, or the more complete a story can be told. The second way was looking at views; this would be the hearing of the telling. Are people actually reading these? Are people looking at them? Is there interest? Are these experiences actually being shared? We also add controls, because you can imagine that length is really associated with the age of the Wikipedia page, so we control for the age as well as the number of editors; we use that to help build a more robust comparison. We built statistical models to test the relationships between these groups. When you look at these charts, the dotted line here is our men's benchmark, so this is where the standard men's page sits, and then we're comparing women's pages, as well as specifically our intervention pages, against it. Here we can see that typically, women's biographies on Wikipedia are shorter, or negative, when compared to men's pages. What the interventions do is flip that relationship, so that the intervention pages are actually longer in comparison to the same men's pages. And we see that same relationship persist when we look at views as well. So this is a very exciting, I hope exciting to you, result that suggests that both movements are working towards their mission of telling women's stories. We really do see more content, and content that is receiving more views. We also looked at quality, which you'll notice is our third figure here, and we don't see any significant change.
Because this line, the confidence interval, is overlapping with our male benchmark line, there's no significant difference between our groups. However, I do think this is mostly because of the lack of granularity within Wikipedia's quality indicator; most of the articles all fall within one group anyway, and we couldn't get a more fine-tuned look into this. So that's all this is saying. But hopefully the length and views really speak to the success of the movements here. Turning now to the structural features we examined: these are the more specific content gaps we're looking at. I'm sure you're all familiar with the infobox that exists on the right-hand side of many Wikipedia pages. We looked at two things related to this: one was whether it existed at all, since many articles did not have one, and then how robust it was, so how many labels it included. We did do some exploratory analysis looking at whether there were any specifics to the type of label, so whether women's pages included husbands, or family members more generally, more often, and didn't find anything that surprising; most of it had to do with professions. That was probably an artifact of our data, or the way we set up these comparisons. What you will see is that both intervention groups here have fewer infoboxes than our other comparison professions.
Again, this is probably because of the way we collected the data, where athletes and politicians have more specific job roles to include in an infobox; it's easier, I think, to identify those short summaries, like team or elected position, than it is for artists or scientists, which can be much more variable. So my results here aren't necessarily conclusive, but they lead us to some interesting questions that we can continue to study. I did want to present them to you just the same, so you could get an idea of what's happening here. Again, we do find the same sort of flipped relationship with the presence of the infobox, although I do believe that's, again, an artifact of our data, where most of the men artists don't have an infobox to begin with. I'm not sure why that's the case, so I wouldn't read too much into that. And then we see no change with the quantity of infobox labels. So this could be an area for impact, but I think it really suggests we have to have a better understanding of how these work. The other one, which I think is very interesting, goes back to this idea of Wikipedia as a large linked knowledge database. Here we have Dr. Bacon's Wikipedia page; she was one of the first women to earn a PhD in math from Johns Hopkins. She links to her colleague, who was also a woman mathematician, who then links back, so that's symmetrical linking: they both point to each other, which can drive views and make each other visible across the encyclopedia. However, Dr. Bacon links to her advisor, but he doesn't link back to her. And we might think that he should, because I think it's interesting that he was the advisor for one of the first women PhDs, but yet there is no returning link. So this serves to make her less central.
You can get to his page from hers, but you can't return from his. So if you've ever played the link game, or followed any sort of link trail on Wikipedia, you're much less likely to get to a woman's biography than you are a man's. And we see this confirmed in our dataset as well. If we think of the ratio of incoming to outgoing links as a very simple metric of how central, or how connected, a page is within this knowledge network: most men receive about four links in for every five links they send out, whereas women may receive as few as two back for every five. So this really does serve to push women to the periphery and make them less visible across the encyclopedia. Here's a schematic representation of what that looks like. For men's pages, you can see they're getting a lot of attention back in; it's easier to stumble upon a man's page because there are four different avenues to it, whereas for women there are only two. But I think it's important to note that women's pages are still pretty much holding the same on the other side: they still link out just as well, if not better, so they're still an important connective link within the encyclopedia. And again, this is confirmed within our models; we see real inequality here. What's interesting is that this really seems to be an overlooked area for both interventions. Women's articles are remaining at the periphery: women generally have fewer incoming links, and for the two intervention groups, it's even fewer.
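The incoming-to-outgoing ratio described here can be sketched as a tiny function. To be clear, the link sets below are hypothetical, chosen only to mirror the pattern reported in the talk (roughly four links in per five out for men's pages, as few as two in per five out for women's); the function itself is just the simple ratio metric, not the paper's statistical model.

```python
def link_ratio(incoming, outgoing):
    """Ratio of incoming to outgoing links for one biography.

    A value near 1.0 means other articles link back roughly as often
    as this article links out; low values indicate the article sits
    at the periphery of the link network.
    """
    if not outgoing:
        # No outgoing links: the ratio is undefined; treat a page
        # with incoming links as maximally central, otherwise 0.
        return float("inf") if incoming else 0.0
    return len(incoming) / len(outgoing)

# Hypothetical link sets illustrating the pattern described above.
mens_page = link_ratio(incoming={"A", "B", "C", "D"},
                       outgoing={"P", "Q", "R", "S", "T"})
womens_page = link_ratio(incoming={"A", "B"},
                         outgoing={"P", "Q", "R", "S", "T"})

print(mens_page)    # 0.8
print(womens_page)  # 0.4
```

On real data, the incoming set would come from "What links here" and the outgoing set from the links on the page itself.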
And then this one's a little bit different: where women generally have slightly more outgoing links, the interventions have slightly fewer, so these pages are really not as connected as we might want them to be. So that's the last big part of the research; I'm happy to answer questions about it in the Q&A. But I just wanted to surface a couple of implications for editing that come to mind for me. One is that the interventions are having success in changing Wikipedia; this kind of knowledge activism is definitely building something exciting, and I think we can see that in this research. I also want to identify that while new content is vital for closing the content gap in Wikipedia, we can't overlook structure, especially as it's an important way that people use Wikipedia and a way for information to journey outward from Wikipedia. And then, thinking specifically about the links, and I know there's a wider training being built around this: identify pages to edit that are around the target article. You can use the "What links here" tool to identify any asymmetrical links and then make them symmetrical. Often someone's biography will point to an advisor or an influence, which is a really easy thing to make symmetrical by having that article point back. So thank you so much. That's all I have. Please stay in touch on Twitter, and I'm looking forward to your questions and any discussion.
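As a rough sketch of the "find asymmetrical links and make them symmetrical" workflow just described, the function below flags outgoing links that are not reciprocated. On live Wikipedia the link sets would be fetched through the MediaWiki API (the `prop=links` module for a page's outgoing links, and `prop=linkshere`, which backs "What links here", for its incoming ones); here the network is a hypothetical toy mapping echoing the Dr. Bacon example, so the titles are placeholders, not real articles.

```python
def asymmetric_links(page, outlinks):
    """Return the articles that `page` links to which do not link back.

    `outlinks` maps each article title to the set of titles it links
    out to. Any unreciprocated target is a candidate for a symmetry
    fix: edit the target article so it links back to `page`.
    """
    return {target for target in outlinks.get(page, set())
            if page not in outlinks.get(target, set())}

# Toy network: Dr. Bacon and a colleague link to each other
# (symmetric), but the advisor never links back (asymmetric).
toy = {
    "Dr. Bacon": {"Colleague", "Advisor"},
    "Colleague": {"Dr. Bacon"},
    "Advisor": set(),
}

print(asymmetric_links("Dr. Bacon", toy))  # {'Advisor'}
```

The output is exactly the edit list for this workflow: each returned title is a page where adding one link back would make the pair symmetrical.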
Thank you as well. So yes, now we will open up for discussion. If you want to use the raise hand function at the bottom of your screen, you'll see something that says Reactions, and then you can raise your hand. There's probably another way to do it too that I don't know. But yeah, go ahead and raise your hand. You can also use the chat if that is more comfortable for you to ask a question. We're getting lots of thanks in the chat.
Yeah, go ahead, Sage. Sure. Thank you so much, this was wonderful. Are there any specific policies or practices within the Wikipedia community that jumped out to you as the most salient for the gender gap?
For the reason why it exists, or how we could address it? Yeah, that's
right, for the reason why it exists. I mean, you know, there's sort of a balance between activism that uses Wikipedia's current structure, culture, and rules to make progress on the gender gap, and then also trying to change the things that are inherently structurally problematic about the way we do things on Wikipedia. This was a great overview of the effectiveness of the former, of using Wikipedia's systems to try to make progress here. But I wondered if there were things you hit upon in your research where, on the other side of it, this work would be easier if Wikipedia changed in certain ways.
Yeah, that's a great question, something I think about a lot. I think the two biggest are, one, the technological issues, the access issues we can think of, where people don't know they can edit or don't know how to edit. And I do think Wikimedia is making a pretty big push; you know, they launched the VisualEditor a couple of years ago to make it easier for people to start editing, and the Wiki Education group is doing a lot of that work as well. So that's one area that can always use work, but I see promise in it. And then the other issue, which I think gets more at your idea, is the notability requirements. That's something I'm looking at in my dissertation, continuing to understand how notability is operationalized or understood, and how the structural features of Wikipedia play into our ideas of notability; how a page looks maybe determines whether it's assigned notable. Francesca Tripodi, who I think is at the University of North Carolina, recently had a piece about notability as well, where she finds that women are more likely to be wrapped up in these deletion discussions, nominated for deletion around their supposed lack of notability, really driving a lot of editors' focus to defending their edits instead of making new ones. So I think that is one area of policy that needs a lot of improvement, and a lot more advocacy to try to improve that process.
Thanks, Isabel. Ngozi?
Okay. Yeah. So thank you for this presentation; it's a wonderful one. I never looked at it from this angle: the structure, and the gap created by the structure, through the infoboxes and the links. I'd say I just heard about it from this presentation. Now, I have a question. Is there a way of creating more awareness that, apart from adding content, we need to concentrate also on these structures that continue to create a gap in Wikipedia?
Yeah, I think that's one of the things I'm hoping to do here with you all: to take it back and, you know, feed it into your editing communities. As for other ways of raising awareness, I mean, I mostly do research, but I think it's definitely something that's needed and hopefully will happen. One of the big promising things is that, working with some local edit-a-thons here in Philadelphia, I've had a lot of success teaching people how to edit links, and they actually find it really fun and really easy. It's one of the easier things, I think, for new editors especially, and a lot of university students are already thinking in networked terms, having grown up on social media, so I think that helps them think around this issue a little bit. The infobox one is a little bit harder, I think. But definitely with the links, at least for me, framing it as a fun and easy way to edit Wikipedia, especially for beginners, has been really successful. I would love to hear your ideas, though, or to see more attention drawn to this in the future. Thank you.
Yeah, thanks for that question, Ngozi. Actually, as Isabel alluded to, Art+Feminism is working on putting together a new training specifically around wiki interlinking, really informed by Isabel's research. We'll have a community hours about that in the future, going through that workshop, so hopefully folks can take that new training back to their communities to use. Thank you for that. Over to you, Chris.
Thanks, Kara. Sorry, the lower hand button was escaping me for a second. Isabel, thanks so much for your presentation and the overview of your research. It was really neat to see the impact of the intervention with respect to some of the factors you were evaluating, and I think it paints a really good picture of what benefits these kinds of interventions can have for these types of articles. I did want to ask a quick question around the quality assessment; there are two somewhat related questions there. One: it looked like, from the graphs you were showing, that before the intervention, the quality of articles about women was actually above the baseline for men, and I wondered if you had any suppositions about why that might be, or what might be informing that outcome. And secondly, I know you mentioned you had some concerns around the granularity of quality information on Wikipedia articles. I wondered what tools or information were being used to gather that, whether that was the editor article assessments, like article grades, or whether other measures were being used. So those are the two questions I wanted to gauge your thoughts on. Thanks, Isabel.
Um, yeah, for quality we do use the grades; I think they call it the quality assessment. It's the Wikipedia community assessment, and all sorts of factors go into it, but it comes in several broad buckets. Most articles are, I think, B or C class, and then there are a couple a little bit above, and a lot of stubs. So it's hard to tell exactly the variation given this kind of grouping, which I think is part of the issue. I know quality assessment has been a big thing in Wikipedia research more generally, like, how do you know if an article is reliable or good? I'm not totally up to date on what's happening there, and whether they're launching a new way to assess or not. I will also say that in the graph I showed, there's no statistical difference between the intervention and the women's articles compared to the men's sample.
I see. Okay, I misunderstood that. Okay.
Yeah. So the arms reaching out from the circles are the confidence intervals.
Compared to the baseline, so that makes sense. Okay, yeah, thanks for clarifying that; that was my misunderstanding.
Yeah, no worries at all. So yeah, they overlap, so it's possible that women's articles are of greater quality. I think that could be related to the discussion of notability: women have to be more notable to get on the encyclopedia to begin with, which could lead to a better quality page. You would have better references, you would maybe have more reasons to have a longer page, things like that, which could lead to a higher quality ranking. That's my hypothesis of why that would be, if that relationship persists or were to become statistically significant. But yeah, great question. Quality on Wikipedia is an exciting world unto itself, and really gets at this idea of how you create good information.
Yeah, there is one other tool I'll just briefly mention that I've seen used by a few organizations in the movement. There's a tool called ORES, which gets used for quite a number of things, but some have used it to look at what's called the structural completeness of articles as an approach to looking at quality. It's certainly not conclusive on its own, maybe, but it's an approach that some organizations have found effective, based on the way their own programs approach quality or support the development of articles. So this might be another tool that you or others could consider using around quality, if the article assessments tool is lacking or incomplete and not providing a good picture on
its own. Yeah, definitely.
It's a great resource.
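For anyone curious about trying ORES, a score request is just an HTTP GET against its scoring endpoint. The sketch below only builds the request URL (no network call is made), following the ORES v3 API layout as I understand it; the revision IDs are placeholders, not real article revisions.

```python
def ores_url(wiki, rev_ids, model="articlequality"):
    """Build an ORES v3 scoring request URL for the given revisions.

    `wiki` is a database name like "enwiki"; `rev_ids` are the
    revision IDs of the article versions to score. The
    articlequality model returns probabilities over the article
    assessment classes (Stub through FA).
    """
    revids = "|".join(str(r) for r in rev_ids)
    return (f"https://ores.wikimedia.org/v3/scores/{wiki}"
            f"?models={model}&revids={revids}")

# Placeholder revision IDs for illustration only.
print(ores_url("enwiki", [1234, 5678]))
# https://ores.wikimedia.org/v3/scores/enwiki?models=articlequality&revids=1234|5678
```

Fetching that URL with any HTTP client would return JSON with a predicted class and per-class probabilities for each revision, which is what makes it usable as a finer-grained quality signal than the broad assessment buckets.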
Thanks, Chris. Thanks, Isabel. Sophia.
Hi, Isabel, thank you so much for your presentation. I actually read your paper and was thrilled, because I'm one of the ambassadors for Art+Feminism, but I'm also a grad student and I research AI. I've become really fascinated by this overlap between the bias that everyone talks about in AI training, and how it's so problematic, and then learning that they are using wiki articles. And as you mentioned, when we ask our virtual assistants a question, the answer we get is being pulled from Wikipedia. So I was so excited about your paper, because you mentioned that too, and there aren't a lot of people talking about it. I've been looking for more evidence of the outcomes of this, so I'm curious to know if that was something you've looked into more: have you found evidence of biases that exist on Wikipedia being overtly carried over into algorithmic outputs?
Um, I personally haven't; I think it's a vital question to start to understand better. What I do know is, and this isn't algorithmic, I will grant you, but there's an emerging trend of research looking at how people use Wikipedia as a beginning point for some question and then follow the references. If you look at who gets cited on Wikipedia in an article about something, the people who get those references get a ton more citations in the academic literature. So this is one point where I think inequality in Wikipedia is driving inequality in other domains. If you don't have a balanced citation list on the Wikipedia article for some given topic, say some sort of chemistry mechanism or whatever, you're likely to see that same inequality play out, or get exacerbated, within the citation patterns of the academic literature. Now, that removes the AI from the question, but I think we can imagine how that could become even worse, and maybe even more invisible, through computational means. It's pretty easy to trace the citation issue because there's a human aspect of someone looking at the page and following the link, and I think it gets harder when it's masked by some computing process. I think the audit studies of algorithms, which I feel you're probably more familiar with than I am, might be a useful way to start to approach this question of how bias in Wikipedia plays out elsewhere. I think the problem with the really big machine learning models is that they're not just Wikipedia, right? It's Wikipedia plus all sorts of other content from elsewhere on the web, so I think it would be very hard to identify what sort of bias exists just because of the Wikipedia data versus this other data.
But I think your question's great. I don't have any answers, but I can definitely confirm your intuition that this is being manifested in some way through the algorithms. It will take some creativity to design research that can test that, but yeah, definitely stay in touch. Maybe there's a cool project to be done.
Yeah, absolutely. I feel like I'm probably going to want to email you tons of questions. And what you're saying, when you were answering the earlier question about that disappearing linking, or that lack of feedback, really seems like a parallel to the disappearing citations of women in academia. So again, we see this echo effect where Wikipedia reproduces biases that exist in the world and then reinforces them, because they get shuttled back and forth. Totally.
So I don't see any other hands raised right now, but I do see the comment in the chat. I don't know if you want to unmute and talk more about the comment that you added there.
I raised my hand.
Oh, I see it now.
Go ahead, Ngozi.
Okay, I want to ask, out of curiosity: do you feel that the gender gap has not been closed or reduced? And I want to know, how did you get into this research?
Oh, well, I've long been interested in gender gaps within the technology space, and at the same time in collaborative forms of knowledge building, you could say. So Wikipedia was an excellent place to combine those interests in both collaborating and creating knowledge, as well as this gender gap. My first year of grad school was the year Donna Strickland won the Nobel Prize, and then the whole episode with her missing Wikipedia page happened. That was my entry into this issue, and then something I really wanted to explore more. That's probably the most straightforward origin, but I'm sure there are a lot of other factors alongside it that built this research agenda.
Thank you very much. I have gained a lot from this.
Thank you for coming.
Yeah. Go ahead, Sage.
I have a bunch of questions, but I'll try to just ask one or two of them. Do you have any kind of overview of how edit-a-thon participants differ from the general population of Wikipedia editors, and how that interplays with the intervention that you're looking at?
I don't. I mean, part of the editing gap is tricky to study because of the anonymity on Wikipedia. Most people do surveys, is what I've seen, to look at it. And again, Francesca Tripodi's piece, she doesn't have any numbers, I don't think, but she might go a little way toward describing the editors at the edit-a-thons she attends. But yeah, it's not something I can really speak to. For me, I know that this org has a feminist activism orientation toward information, which I take as the base point. So I think of all the participants here as participating in some sort of feminist activism around information, regardless of who they might be or the different identities that they might have. That was my way around not necessarily knowing, because I didn't have that data.
Thanks. I asked because I remember reading somewhere, though I'm not sure where, that edit-a-thon participants were more likely to be experienced editors, or editors with large edit counts who edit across a lot of different areas, as opposed to the typical editor who creates a random biography about a man, who's more likely to be someone who has only really edited in a small niche. And that this was one component of the structural differences in how different sets of articles look: infoboxes, links, these kinds of things. But I didn't remember it well enough, so I thought maybe you would. Anyway, the other question I have is a self-interested one: how could the dashboard tool, which you used to get data about these edit-a-thons, have been better for you as a researcher?
Oh, I guess this is yours?
Yeah, Sage made the dashboard.
Yeah. Um, I guess it works very well. To get the data from it, I mean, yeah, I had to do some data cleaning, but that was pretty good. I think completeness is probably the biggest issue: not everyone associates their edits with it, or knows how, or things like that. All of this data is limited to the people who say that they are editing with Art+Feminism. On one hand, I think that makes it a pretty conservative estimate of your impact, because it's only those people, so that's kind of good; but on the other hand, it's not necessarily every edit the org makes. So anything to make it more complete, or more reliable, is probably the best improvement. But I will think about this, and I'll let you know if I have more thoughts.
Thank you. And then we'll go to Abraham for our last question.
Good evening, everyone. My question is about what the Wikimedia Foundation is doing to support gender-equality work and create opportunities in contexts like Nigeria.
Sorry, can you repeat that? The audio was cutting in and out.
Yeah, I can hear you better now. Thank you
In Nigeria, gender inequality is very, very high. To raise awareness, our local user group has been organizing and enlightening people. What has the Wikimedia Foundation been trying to do to support such movements, giving opportunities to all people, including the LGBT community, to enhance inclusivity in line with the movement strategy?
So, if I hear you correctly, and again, sorry if I don't: what is the Wikimedia Foundation doing to provide support, particularly in Nigerian or African contexts, or for other language versions and other social issues? My research is mostly focused on the English language version, so I'm not sure if this is exactly what you're asking. But across other versions of Wikipedia, the gender gap varies pretty widely, especially in content. There is a paper that looks at those different versions; I'll try to find it, and maybe Kara can send it around later. Then, for the larger activism issues, I think it's orgs like this one, doing training and outreach to communities who aren't as represented on Wikipedia, who are hopefully building that support, especially through the edit-a-thon model. What I found here is that the edit-a-thon model is definitely having an impact on the information that's represented. So hopefully that gives us support for efforts to expand those events, to make them more robust, and to provide more resources for contributors. I hope that starts to get at your question.
What I'm trying to say is that the Foundation should give full, long-term support to bridge the gap, because in African countries like Nigeria, the inequality is very disturbing. It should support all projects that entail anything around gender equality. We currently applied for a grant on gender-equality awareness, and we want the Foundation to give long-term support, because this gap is often neglected, especially in Nigeria, and we are trying as much as possible to address it.
Yes, thanks, Abraham, for joining us and for raising this. Isabel isn't with the Foundation, so she doesn't really have any say over the funding there, but it's great to hear about the work that you're doing, and if there are ways that Art+Feminism can support your work, please feel free to reach out to us so we can talk more about that. But we are slightly over time, so I do want to make sure that we end somewhere close to on time. I want to thank you all for joining us today, and to thank Isabel so much for taking the time to share more about her research. I see Miranda's question in the chat about sharing slides: we will share the slides as well as the recording later this week on our website. We'll also send a follow-up email to everyone who registered, so you'll be sure to get the links to the recording as well as the slide deck. I do want to highlight that Art+Feminism is celebrating 10 years with this upcoming campaign, so please be on the lookout for ways that you can help us celebrate. Thank you all again for joining us today, and we will see you next time. If anybody has outstanding questions, please feel free to email us.
Okay, my hand is up.
Okay, it's not a question. You were just talking about the Art+Feminism 10th anniversary. I have been checking, and I have not seen the call for events or contributions.
Yes, that's coming. It's forthcoming; it's not out yet. So just be on the lookout for it. Okay? All right, thanks so much. Yeah, thank you. All right, well, thank you so much, everyone. We'll see you soon.