BEBS Podcast #3: Evaluation Can Have an Equity Center
5:52PM Dec 4, 2023
Speakers:
Caitlin K. Martin
Linda
LaKesha (she/her)
Keywords:
data
evaluation
people
caitlin
questions
work
collecting
feel
qualitative
library
participatory
project
distributes
lakeisha
numbers
equity
themes
partner
takes
summer
Hi, everyone, welcome to our BEBS podcast, Building Equity-Based Summers. This is our third episode, pretty exciting, three months in. And as a reminder, we publish these on the first Friday of every month. I'm Linda Brown, and I'm one of your co-hosts. And I'm here with
Hi, everyone. I'm LaKesha Kimbrough, and I am a healing-centered equity consultant and coach working with the BEBS project.
And we are very fortunate to have a guest with us today. Our guest is Caitlin Martin, who is our evaluator for the Building Equity-Based Summers, BEBS, project. And Caitlin, give us an introduction to yourself.
So exciting to be here, my first podcast contribution ever, very exciting. I am an independent consultant in research, design, and evaluation. My background is in the learning sciences, so understanding how people learn, primarily outside of school time, so at home and in community centers and libraries.

We're really excited to have you here. Caitlin, I will speak for LaKesha, and I think it's true for each of us, both of us, that working with you as an evaluator is an amazing experience. LaKesha, would you say that it's an experience different than with any evaluator you've ever worked with?
Caitlin's amazing.
So, all right, we're just gonna jump in. Caitlin, just give us a quick overview of what evaluation and assessment mean to you.
I really do think about evaluation and assessment as there being no one way to do it. People talk about the difference between research and evaluation. Research can be very open-ended questions, maybe impossible-to-measure questions, like how do these things happen, or how do these things unfold over time, kind of the how questions. And evaluation questions are often much more narrow, like did this work in the ways in which it was intended? So things that can be more quantified, more yes or no. And the way I think about what's valuable is, you know, neither research nor evaluation is useful unless the people asking the questions are the same people looking at the data and able to use the data. And so that's sort of how I go about it: what are the questions? If I'm doing a project about leadership or effective summer services, it's not for me to say what effective means or what success means. It's for the people who are leading the project, the people who are stakeholders in a project, anybody who might be influenced or impacted. What are their measures of success? And so I'm, in a way, blending research and evaluation, and my background is in research. And so I really think about research, evaluation, and assessment as something that is in place for improving the work that you're doing, to find out how you are doing and to do it better. And to do that, you really need to ask those questions about, well, what does success look like? How do we know? Are these indicators of the right things? You need to ask those questions at every level of the system that you're working within. And so often I feel like in this project, and in nonprofits especially, evaluation and assessment can be tied to somebody else asking questions and somebody else's measures of success.
And that's where there can be this disconnect, where people are like, oh, assessment is terrible. It's like, I don't have time. The data doesn't mean anything to me. I'm collecting data to give it to somebody else, because they have questions and they have measures, but it's not necessarily informing my project or my work or our work. So I'm really interested in that disconnect between who's asking the questions and who's using the data. The people in the work, the stakeholders, how are they involved? It's often very disconnected. So I think there's a lot of room for new ways of thinking about how to use data and why to use data, and new practices that put anybody who's influenced by the project or the services at the center of asking the questions.
Caitlin, the way you talk about evaluation, and talk about data, and talk about how to make it relevant for all the vested folks involved, right? So, yes, there are these tiers, and how is this relevant for everyone, from the frontline person who's collecting the data all the way to the person who is looking at it and saying, yeah, we're gonna keep funding this, or yay, that program is doing so well, how do we replicate something similar, or whatever the purpose is. Because people have various vested interests across the groups, it can be challenging to do that. And I think the way you help form questions, and talk about things in the data, and how to analyze it, and what we're looking for, and what it tells us, I think it excites people in ways that provide a sense of ownership. And then it's actionable. It's not just collecting data that's going to go somewhere else, that doesn't mean anything to me, like you mentioned earlier. And I think in my mind, that's part of what helps it become equitable, right? That is one of the ways in which evaluation really can have an equity center, because it helps us do those things. And it makes me think about it as transformative data work and liberatory data work.
Yeah, I agree. And I think, again, it's prioritizing different voices in saying what the questions are and what the measures should be. Because one thing that we hear again and again and again, right, is that there's a lot of collection of numbers. But if it's just the numbers, that doesn't necessarily mean quality. It doesn't even necessarily mean people, right? If the same person or families are coming every day in the summer, that number looks really big, but it could just be the same set of 100 families. So there's a lot of questioning of assumptions by the people who really know what the data mean that's not really heard. So I think more conversation between different vested partners is super critical, and expanding what the questions are, really doing some collective work to understand and set the questions for a year, and then revisiting it every year: are these the right questions? And I agree, ideally, the data that's collected and the stories that are told can be appreciated at all the levels. And if you have that collective understanding about your questions for the year, what are those questions you really want to dig into deeper, then there's also a recognition of what different partners care about, too, instead of feeling like, I just collect the data, I don't use the data, it's for somebody else, or feeling like, I know what success is, I am funding this project, here's what matters to me, and not really hearing about the families or the library staff.
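Caitlin's point about attendance numbers can be made concrete with a small sketch. This is an illustration only; the sign-in log and family names are invented, not drawn from any actual BEBS data:

```python
# Hypothetical daily sign-in log for one week of a summer program.
# Each entry records which family attended on a given day.
sign_ins = [
    "Family A", "Family B", "Family C",  # day 1
    "Family A", "Family B",              # day 2
    "Family A", "Family B", "Family C",  # day 3
]

total_visits = len(sign_ins)          # what a raw attendance count reports
unique_families = len(set(sign_ins))  # how many distinct families were reached

print(f"Total visits: {total_visits}")        # 8
print(f"Unique families: {unique_families}")  # 3
```

The raw count (8 visits) looks much bigger than the reach (3 families), which is exactly the assumption Caitlin suggests questioning before treating attendance totals as evidence of impact.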
Yeah, okay. Caitlin, as you were talking, I was thinking, and Linda, I'd love to hear your thoughts, and Caitlin, your thoughts as well. You reminded me of how closely data and evaluation should be, or can be, connected to our why, and how that can make it so much more meaningful. Why are we collecting this data? Why are we asking this question? We talk a lot about that in the BEBS work, understanding our why, and I think having that be applicable to all the things we do: why this question, why this number of questions, why this question asked this way? You just made me think about that, and how tying those together can really help folks feel part of it at the different stages. So if I am a frontline staff person and I understand why we're collecting this, the way I bring it to the community to get that information may be very different than if I'm disconnected and it's just one more thing I'm asked to do. Right? Yeah.
And I can see that you bring it back to the why: why this question, why whatever. And also, how does this help us find out if we're reaching our why, right? So does the question, or questions, actually connect to our why? And I think, yeah, because surveying is such a big part of what library staff do, it's, I don't want to say quick and easy, it's convenient. It does seem quick and easy, even though it might not be. And I think that if library staff started asking, well, do these survey questions help us find out if we're reaching our why, that could be a really useful tool. And LaKesha, as you were talking about that, I was also thinking about our first few podcasts and about time. The other thing that's really difficult for library staff, Caitlin, with what you're talking about, is that it takes time to have the conversations about what you're looking for and why you're looking for it. And then also, so many times I talk to library staff, and they collect data, and then they don't do anything with it because of time. And so I think that's a really huge barrier to this work.
I totally agree. And I absolutely agree that doing meaningful development of methods, and not only collecting data but then analyzing it, not just reporting your percentages but really analyzing, well, what is this? If we have these percentages, how does this metric correspond with that one? And what does that mean, or what can that tell us? That takes a long time. And then if you're doing qualitative data collection, really doing interviews or focus groups, it takes even longer to analyze qualitative data. And I think, because the questions are not coming internally, there's maybe an idea that if we're doing an evaluation of our summer services, it has to be all of our summer services and everything, and the numbers that are collected are everything, right? Like, how many young people, how many people in this age bracket, how many services. And that's completely overwhelming and daunting. That takes hundreds of thousands of dollars and a research team to do really meaningful evaluation of a summer. So I would really suggest thinking small. If you have a why question, what do you have the capacity to really explore? Could you explore one particular lens through one program, one event, and really go deep and do that qualitative exploration? Setting questions that are really big is great, but then do an evaluation that's very small and say, well, we really want to dig into this question, what is one angle we can take this summer? And instead of collecting everything from everybody, do a mini investigation of one week, you know, you can bound it by time, or one event, and then get as many perspectives as you can. What did the people designing that program say about it?
What did the participants in that event say about it? You know, trying to get all the angles you can on this one event to inform your question. That smaller, bounded evaluation can help to get at those bigger whys but can seem much more approachable. You can do a snapshot evaluation, or like a mini investigation, that then has all this data generated to synthesize with your team, to synthesize with participants. You can do participatory analysis and interpretation of the data, like show those numbers, show the themes coming up, and have a conversation about it. That's evaluation. But it doesn't have to be everything. People get overwhelmed and exhausted, and then it's not fun or interesting. So think about capacity, think about bounding something, a little mini investigation. If your questions are really big, what can you really do deeply and meaningfully, so you're like, this data is really interesting, this data really does tell an authentic story, it's not meaningless numbers, I really want to dig in here, and then invite more people to dig in with you in interpreting what that means.
I love that. And you're making me think that's also a way to build relationships with community, right? So if you invite people from the community, and perhaps those who are systemically marginalized, to join in, that could be really exciting.

Yes. I love that kind of community participatory interpretation of data. It can be a party, like it can literally be, let's look at some maps, let's look at some data, and let's try to answer the questions we came in with. And what more questions do you have? What should we ask next summer? You know, it could be something that's exploratory, that's rolling, as the data is coming in. And then that experience, right, of the meeting or the participatory interpretation can be another dataset you have, you know, that can be part of making sure voices are part of this interpretation and answering the questions, and coming up with other questions.
As you were talking, and then Linda, your question, I feel like I heard a collective sigh of relief from listeners, right? Like, oh my gosh, yes, we can do that. Sometimes we don't remember how we can lean into some permissions. And sometimes I think there may be that, ah, I feel seen, I feel heard, in all that Caitlin just shared, right? And, oh, maybe that's something that I can take back to my library, to our team, to my director, to whomever is helping to make those decisions about what we collect, how we collect it, and then what we do with it. So I'm like, oh, I think I just heard and felt this relief for folks. And as you were talking, Caitlin and Linda, I was thinking what an amazing way to build trust, when it's participatory and when you can come back to community, or come back to whichever communities, right, the library community, the library staff community, the communities in the neighborhoods that we're working to reach and work with. To come back and say, remember when we did this survey, or we had this focus group, or whatever those themes were, to come back and say, this is what we learned, this is what we think we can do with what we learned, what do you think we can do with what we learned, right? That builds that trust, that, oh, they actually do want to partner with me. And so it helps build and strengthen relationships, or has the potential to, in some really powerful ways.
Maybe it also gets away from the idea that data is collected in order to tell a success story, right, in order to report a success story. Instead, it's like, data is messy. The story is messy. The interpretation of the data might be different depending on who is looking at it. And so better to work on that together. And the messiness brings up new questions and new ideas, without people having to feel like, I need to synthesize this toward a success story, and if I don't have that, there's a failure. Because instead of continuous improvement, there's this message of, oh, we didn't do a good job, so I can't share this data. So maybe even what evaluation is used for, and who uses it, can really be expanded.
That just was beautiful. When we do come together and say, what did we learn from this, maybe we learned something completely different than we thought was going to come up. And so it may feel like, or seem like on paper, we were not successful, but it was actually super successful, because look at what we learned from this and how that will help inform other things. And as you were saying that, it brought back the equity piece for me, because, for one, the various voices, and bringing in those lenses, right. And also, when we do that, we're helping to de-center so many things and really bring in multiple ways of looking at the questions, multiple ways of interpreting, multiple ways of seeing. And that really helps people feel like they belong in the process, helps people feel included in the process, right? And so it brings in those diverse viewpoints, diverse lived experiences, and all the diversity, the inclusion, the belonging, all the things we need for equity to really take place and for liberation and transformation to really happen. And it can happen partly through data and evaluation, right?
And what you do with that, going back to what you were saying, Caitlin, is, if the data shows you it didn't go as you expected, and we're seeing that right now with BEBS, I think it's important for people to understand, as they learn about using data or as they learn about equity, that they may learn they have to learn more, right? Like, your confidence in how you do things might go down, or you might realize, we're not really doing this the way we should be doing it, let's think differently. And when you're working in participatory ways, you have to really be open to those other voices and not do it superficially, like, yeah, I want to hear from people, but not really.

You know, it can also distribute. LaKesha, earlier you were saying, ideally, data reveals things that are actionable, right? And so it distributes the responsibility of whose action it is, because depending on who you are looking at the data, you can see different things that might spark or ignite or catalyze an idea for what to do next. And it doesn't have to be, and I think BEBS is all about this, it's not just the library staff who planned a specific event. Interpreting data together, even when a question comes up, when something went other than expected, other people might see an opportunity to collaborate, or to partner, or to do something differently, that one person would never see alone, because maybe they are too close to it. It distributes that idea. It's not a one-to-one, it's not a failure for one person or one institution. It's a collective how-do-we-do-this-better, what-are-we-trying-to-do conversation, and data is a really nice way to spark that.
We often are encouraging folks to partner with community, folks in community. And everyone is doing some amount of data collecting, you know, trying to understand how did this go, for various reasons. And what I'm wondering is, sometimes maybe the library is having a specific program, and we're partnering with an organization to help reach folks with marginalized experiences or something. And so they're probably going to do some type of evaluation, and then we're probably going to do some type of evaluation. So how do we maybe find one or two questions that we both want to ask? Or, you know, how do we also deepen our partnerships that way, and maybe start to think about, oh, our why and your why are very similar, and what that can do for transformation of things. I was also going to ask Caitlin a question. When we were talking about the qualitative pieces, which are so beautiful, folks often will say, those are just the stories, they don't give the numbers, right. And I have been in some spaces where that has come up, and then my question has been, of those stories, what percentage of the folks did this or that, those kinds of things. So I'm just wondering if I'm on the right track with that, or if you want to share anything about that, about actually pulling numbers from qualitative data.

Yeah, so, one, it can. If you do a survey or something like that and you see a pattern, the qualitative can be called on, like, hey, I saw this pattern, like, you know, 40% of teens wanted more intergenerational. If you have some pattern coming up like that, you can then look at your qualitative data and see where people kind of unpack things. So your qualitative data are used as examples to give more detail about some quantitative pattern you found. Or if you saw that caregivers of young people were coming more than moms or dads, you can pull out any qualitative ideas you had from that particular user group. So they can be used as examples to unpack survey data. But then they also can be used on their own. Say you have 250 entries about what people liked about a specific program. You can cluster them: oh, there are three big themes, and I can identify a theme for each of these 250. Some people say the temperature or climate, some people say the facilitator, some people say, you know, my friends are in the program, or something like that. And then you can actually quantify and say, of the 250 responses, these three things came up, and 75% of them were this one theme. And so all of a sudden your qualitative becomes quantitative in a way that surveys could never do, because you didn't know what those themes were going into it. So you can kind of backwards-map themes, or let emergent themes come up, and quantify those. Quantifying qualitative data is something that's really interesting, and it makes some particular vested partners take note, because all of a sudden it turns into numbers. But it starts with real, authentic conversation, language, open-ended responses that can be more meaningful.
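Caitlin's last step, turning clustered open-ended responses into counts and percentages, can be sketched in a few lines. This is an illustrative sketch only: it assumes each response has already been hand-coded with one emergent theme, and the themes and counts below are invented, not actual program data:

```python
from collections import Counter

# Hypothetical coded responses: each open-ended survey entry has already
# been tagged with one emergent theme during qualitative analysis.
coded_responses = (
    ["space/climate"] * 125
    + ["facilitator"] * 75
    + ["friends in program"] * 50
)

counts = Counter(coded_responses)   # tally responses per theme
total = len(coded_responses)        # 250 entries in this sketch

# Report each theme as a count and a share of all responses.
for theme, n in counts.most_common():
    print(f"{theme}: {n} of {total} ({n / total:.0%})")
```

The open-ended responses stay the source of meaning; the tally simply makes the emergent themes legible to partners who respond to numbers, which is the point Caitlin makes about vested partners taking note.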
Talking about something that's meaningful seems like a great place to end this podcast. Thank you so much for being with us this time around, Caitlin. I have a feeling we will have you back in the future. Thanks, everybody. Bye, thank you.