All right, okay, well, welcome everybody as people start funneling in, as per the usual. Let's get started. I want to ask you a question: what's in your coffee cup this morning? Just out of curiosity, put that into the chat for me. What's in your coffee cup this morning? And you know what, it's Labor Day weekend, and if you have wine in your coffee cup, I ain't telling. If you've got a mimosa in your coffee cup, your secret is safe with me, at least until the podcast is published, and then everybody will know. No, we'll edit that out. What have you got going on today? Hazelnut — coffee with hazelnut, love it. Chai, love it. Strong coffee, yes please. Black all the way. Vanilla latte, yum. Oh, and iced coffee. I like a good iced coffee; it's not my usual, you know. But decaf? Douglas, seriously, decaf? That surprises me, man.
It shouldn't. You don't want to see me caffeinated if you think I'm too much already. This is un-caffeinated, non-alcohol-driven me. Add those other layers —
Then we all have trouble. Then we all have trouble. Let's see — Baileys. Thank you for being honest with your Baileys; I have some of that downstairs. I probably should have thrown a shot of it into my own coffee. Let's see what we've got. Connie, yours is decaf. Connie, you didn't strike me as a decaf person either. Key lime water — oh, that sounds so good.
I'm not a decaf person, but my body says I am a decaf person now.
Lemon water — tell me, I want that too. Hot coffee this morning, cold brew this afternoon — yes. It's nice and hot and humid here; I know Elena was saying it was hot and humid where she's at as well. All right, I think we've got a nice crowd in today, and we're going to go ahead and get this party started. So let's do the officials, shall we? All right, welcome everybody to another Learning Rebels Coffee Chat. I am Shannon Tipton, the owner of Learning Rebels, and today, super excited, we have Alana Schlatt — no, is that right?
Practice? We did practice.
Schlachta — I know, we did practice, and it still got away from me. We went through it three times: it's Elena Schlachta. It sounds easier than it looks, I know,
but I'm really excited. I met Elena a couple of years ago — two or three years ago, I think, at ATD — and she is a joyful person to talk to and to share ideas with. I mean, super inquisitive. Totally fits with the Learning Rebels vibe, as far as being inquisitive and innovative and curiosity-driven — all of the things that we look for as part of the Learning Rebels community. And when she contacted me and told me that she's got a book coming out, I went, yes, please, I want to hear all about that. So we decided to have her come in and do a special AMA, an ask-me-anything type of discussion, if you will, around measurement, learning impact, performance impact, etc., because I know that she's got some different ways of looking at it, and ways that certainly make it more practical for everyone. Because when we think about measuring learning and what impact looks like, it becomes kind of overwhelming for everybody, right? I'm really looking forward to this conversation. But before we turn it over, I would also like to know in the chat if you are new to our coffee chat discussions. Please let me know in the chat so we can give you the nice, warm welcome that you deserve. If it's been a while, or if you have never been here before, that's okay. Shauna, you're new — thank you. Andrew, thank you. Andrea — thank you, a couple of Andreas. Tanya. Jamie. Yeah, look at these new people. Deanna, Amanda — well, Elena, we've got some people that are interested in impact, apparently. Yes, welcome everybody. That's right.
Marketing's paying off there, Shannon.
Well, it's all about the guest, I think. You know, everybody was like, hmm, we've got to talk to her, and we've got to talk about this topic. Yes, welcome. Hi, Jerry.
I do recognize some names. I do recognize some names, and it's nice, actually, to see some faces. A lot of the stuff I do on LinkedIn is all behind the scenes, you know, voice only, so seeing the faces of people I only know by name is awesome.
Right. And speaking of faces, thank you for that nice reminder. For those of you who are new to the coffee chat, this is a very informal, loosely structured conversation. If you want to have your video off, that's fine. But if you have your video off because you think your background isn't appropriate, or you're not having a good hair day, or maybe you're even eating lunch — honestly, we don't care. If you're in your jammies, we don't care. If you've got a cat running around, we don't care. We do want to see your faces, but if you feel more comfortable not showing your face, that's okay too. All right, so let's get to this, shall we? Elena, why don't you go ahead and formally introduce yourself to the group?
Well, welcome everyone, and thank you, Shannon, for having me. I am Dr. Elena Schlachta, and everyone mispronounces my name, so Shannon, don't you worry about it — I respond to most things. If anyone in the group has a tricky name, you know what it's like, and I've just come to terms with it. I am a data nerd. If you have checked out any of my stuff on LinkedIn, you 100% will recognize the nerdiness and the geekiness that I bring to the world of learning and measurement. I'm on a mission to make the education sector more data literate. I truly, honestly believe that data literacy is that domino: if we knock it over in our skill set and mindset, everything else gets easier. Promotions happen. Learning is more effective. Businesses are more successful. And all of that stems from being more data literate and getting over our fear of math. We will dive into more of that today. But I fell in love with teaching many, many years ago, when I finished my bachelor's degree and was like, now what do I do with my life? How many of you — was that true for you? Yep. So I went to Mexico — I had never left the country; I'm from the US — and I decided to teach English as a foreign language, which is not at all what I got my bachelor's degree in, by the way. I fell in love with teaching, and that really launched me into getting a doctorate in education and teaching educational research methods to graduate students — their least favorite class. So I got really good at helping people fall in love with something that they absolutely hate. And a lot of what I wrote the book on are those small, simple strategies, using story and case studies, to help people see: hey, it's not that hard to measure, but start somewhere. Where do we begin? Answering those questions is really important so that we don't become overwhelmed. And that's what I'm up to.
Fascinating. I love that you went to Mexico to teach English. How long did you do that?
I was only there three months. I wanted to stay longer, but alas, I ran out of money in my savings account — they needed six months of volunteering before I could become a staff member. You know, when you're 20 years old, you don't have a big savings account, so I did the best I could. Every penny was well spent in that experience, for sure.
That is fun, that's fun. And I love that — we all have these different experiences that round out who we are. I think from an L&D perspective, in particular, that's true, because a lot of us have come from different backgrounds and have accomplished different things in our lives, and that is really what drives a lot of us forward, you know — keeping that interest in different things alive. And one of my questions, then, is: how did you find yourself in this area of data literacy? And can you help me, and help the group, really understand what you mean by data literacy?
Yeah. So my career, like many of ours — if I sat down with every one of you one-on-one, you'd probably have the same thread in your story, which is: if I could look back 20 years from where I am today, I would never have imagined that I would end up here. Many of you probably have a story like that. I'm that way too. But the through line in my career is that I had the luxury — and I can say, honest to goodness, it was a privilege and a luxury — of the kinds of experiences I had. In my very first job, when I got back from Mexico, I worked with a small business that was data driven. They were an adopter of Salesforce, and every single staff member had to do a data training, because the quality of our data and how we used it as a whole organization was critical. If something was not spelled correctly, or if there was a period instead of a dash, the data was messy and it wasn't useful. That was my very first job, at 21 years old. I worked for this incredible CEO for three years, and she just kicked my butt into professional gear — but it was all about using data to make informed decisions. Then I went and got my master's degree and doctorate, and where I was being trained as a researcher, everything was about data. So really, my experience has been that without data, how can we possibly make informed decisions? That's the context of where I come from. And when I went to the corporate world, which was about six years ago, it was different — learning and development in the corporate sector wasn't using data like the other places I had been. So I was like, okay, how can I help? And that launched me into my practice today. But your question, Shannon, is: when you say data literacy, what do you mean? And I simply mean using data to help us make informed decisions. Data is this conduit — it's visibility that helps us say: this is where I should invest my time; this is where I should not invest my time. I can see relationships between key performance indicators driving business revenue. I can see indicators of what is helping our people perform or what's getting in the way. If we don't have any evidence or data or insight, we're not going to be able to make really smart choices. So data literacy is all about making smart choices with good, clean data.
I like it — it makes sense, right? And I think we all struggle with that when we think about data. You know what I would ask the group: what questions do you have about the use of data and how we might interweave it more into what we do? Not necessarily about proving impact, but about making data part of our workflow, right? So I'll have you ponder on that. But my follow-up question to you is: when you think about impact, and you think about data and impact — let's pull that phrase apart. We've talked about what data means. What do you mean, then, by the impact part? How do we put these things together? Because I think a lot of organizations have different views of what impact can mean for them. They're like, okay, well, butts in seats — who attended? Well, that's impact, but it's not really impact. So when we think about that word in general, how are you using it in your vocabulary?
So my answer — and I'm curious about other folks' answers — is: did we do what we said we were going to do? That's impact. And do we have the evidence to prove it? What that requires, then, is: what promises are we making? What goals are we setting out to accomplish? And that means having targets. One of the most important things I think we could do as an industry is to start setting targets. The reason being, if we're hitting our targets, we don't need to invest any more resources there until something changes, and then all of a sudden we do. But we don't have targets, and so we don't have the indicators — are we hitting our target or not? — to help us figure out where to invest our precious time. We don't have enough time, we never have enough money, we don't ever have enough buy-in from our stakeholders. So we have to be equipped with the right data to make those smart choices, so that our stakeholders believe in what we're doing and our learners are getting what they expect out of our investments in learning. All of that, I think, has to do with setting really clear targets, but we can't set targets unless we know where we're going and what it is we are trying to do. And to Shannon's point, maybe the target — and I think it's an important one — is that we have 80% engagement and completion in our programming, because if you think about it, in a chain of evidence toward longer-term impact, if nobody's engaging in our programming, they're not showing up, they're not doing the work, well then nothing else is possible. So we do need targets at all of those Kirkpatrick levels, if you will. Those are all really important, but we don't want to stop there. Getting really clear on where we are going and what we are doing for the organization, and having that information and vision up front, allows us to set targets and then collect the right data — which we can dive into — around how we know whether we've met our target or not. And with that, we can make smart choices, which is really what it's all about in the end. What do we do when our program is over? How do we keep developing people when the business's KPIs change? How do we sunset certain programs and know to build others? We have to have visibility into what's working and what's not, and what's aligned with what matters to the business. And it starts with knowing where we're going with all of our programming, documenting that, setting targets, and then monitoring all the time.
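A minimal sketch of the target-checking idea described above. The only figure taken from the conversation is the 80% completion target; the second metric and all observed values are hypothetical.

```python
# Minimal sketch: compare observed program metrics against pre-set targets.
# Only the 80% completion target comes from the conversation; everything else
# here is a hypothetical example.

targets = {
    "completion_rate": 0.80,            # target discussed above
    "manager_follow_up_rate": 0.60,     # hypothetical additional target
}

observed = {
    "completion_rate": 0.74,            # hypothetical observed value
    "manager_follow_up_rate": 0.66,     # hypothetical observed value
}

for metric, target in targets.items():
    value = observed[metric]
    status = "on target" if value >= target else "below target, investigate"
    print(f"{metric}: {value:.0%} vs target {target:.0%} -> {status}")
```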
I love that. Yeah — know where you're going, know where you're going, and know how you're going to get there. Douglas?

Well, you know, the second part of that sounds like taking what we should be doing with instructional design and applying it not just to a specific course but to the entire learning impact, right? What is my end result? Where am I going? What do I want them to walk away with? Where do I want our organization to go? But my original question was: all these LMSs are talking about, oh, we've got better data and analytics, and you look at the splash page, or the analytics page, and it's a level of gamification, right? You get a badge for this, you get something for that, and oh, look how many training hours we've done — everybody's smiling. That's the equivalent of, hey, was the instructor on time? It's a smile sheet, in a way, versus some actual analytics. I don't know if you've got other avenues that would give you better results, but of all of those, the one I look to is: how long did it take people to take the training? If it's booked at 30 minutes and it takes most people, or a large number of them, 55 minutes, it tells me they're not engaged in the training — they've got something else playing in the background — or, you know, it was so engaging that they got a good nap out of it.
So what you just described, Douglas — a lot of the data from our regular LMSs I think of as useless, truly useless. Did the instructor show up on time? Was the lunch good? At the end of the day, we have control over the kind of data that we're collecting, and you're not going to like what I'm about to say, but in some ways we are the problem with survey fatigue. We're asking questions in surveys that don't really matter, and then when we do insert questions where the data and the outputs do matter, people are exhausted from all the scrap survey questions, so they're not going to give the right time and energy to the things that really matter. My book is called Measurement and Evaluation on a Shoestring. I was honored to be asked to write this book, and my entire career has been shoestring — not for lack of expertise, but for lack of budget. I have one or two people on staff helping me; I have managers who are donating time to help me, but they really have their priorities in many other places, and I only have so much of their energy and focus. So I've got to be really strategic, which means I don't want to be collecting all of the data — I want to be collecting the data that matters most for what I'm trying to do. If you're trying to figure out which instructors you should fire because you have to lay people off, well then, sure, I should be asking questions about people showing up on time and learner satisfaction with instructors, because I need data to help me figure out who I'm going to lay off in the next six months. But if we're not doing any of that, again, it goes back to — and Will Thalheimer talks about this in one of his latest books — everything we do with data is about the decisions we want to make; those decisions should then inform the questions we ask, and we should be testing. I have a really fun test for making sure that the questions we ask are going to give us the data that we need, so I can talk about that too. But yeah, I think we're the problem with survey fatigue. We're not asking the right questions or prioritizing the data — we're just doing the kitchen sink, and that's leading to overwhelm for the person who's taking the survey, and our data is not so good.
Yeah, I totally agree with that. I've always said: ask the questions that you need the answers to, and then reverse engineer from that. And incidentally, any of you who've been to my website know that my latest downloadable is a whole guide on training evaluation questions that will help you target the answers that you really need. So I'm really happy to hear you say that, because I think it's important to really funnel down to where we need to go, and to ask those questions about where we need to go. And I especially appreciate that you said survey fatigue is our fault — I completely agree with that. So instead of asking 10 — I love that term, scrap questions — instead of asking 10 of those, can't you just ask the three that are very important, right?
Yeah. Shannon, one of the reasons I appreciated you so much when we first met — I was sitting in the back of the room at one of your core four sessions, and you were talking to people about how to write learning objectives, and you were advocating that learning is the vehicle through which people change their behavior and improve their performance. So our objectives should really be about what people are doing differently because they've engaged in your programming. Writing objectives from that perspective makes it so much easier to figure out what data we should collect, based upon the performance objectives we've written — versus writing a learning objective like "people will have a better understanding of..." Well, so what? What should they do differently?
Yeah — our impact questions need to be able to answer that so-what question, for sure. But it's not supposed to be about me; it's supposed to be about you, although I could talk forever about how we really screw up the whole learning objectives thing. I'm curious — and Douglas is as well, in the chat — about your test. What is that all about?
Okay, so here it is. You've got a 10-question survey — and actually, the test is more about whether you're writing your answer options correctly, because our answer options, in some cases, can be totally meaningless, and our answer options also tell us whether the question should be written differently. Okay, so you've got 10 questions. Pick three of them at random, then jump into the analytics and just pick an option. It doesn't even matter what the question is or what the options are. You pick a question, and then you say: okay, if 80% of people selected option C, what does that tell me? And does what it tells you help you make an informed decision about whether you did what you said you were going to do in your program? Most of the time, the answer is no. An example: one of my clients gave me a final feedback evaluation. She did a first draft of her questions and asked, can you please review this? So I reviewed it, and one of the questions was, "How useful was the content?" And she gave me, ultimately, a Likert scale — you know, five options on a scale. How useful was the content? And I said, okay, if 80% of people disagree that the content was useful, what are you going to do? She's like, well, I guess I'd have to reevaluate the content. Where are you going to prioritize investments in change? I don't really know. That's not the right question to be asking.
Oh, I love that. Okay, so then what's the question?
Well, it all goes back to what decision you want to make, and how you are using the data from the answers to your survey questions to make informed decisions. Keeping with the same example — I think what my client was trying to get at was something like, was the time you invested in the program worth it? So one of my favorite questions is: please rate how satisfied you are with the results that you got from the program. And the only way that question works is if you've got a very clear message that says, here are the results that are promised: you're trying to save 50% of the time in your sales closing so that you can answer more calls and sell to more clients. So what is the promise from the program, down to something measurable — like, as a fly on the wall, I could actually see whether that's happening or not. That's the kind of promise we want to put in our outcomes and results from our programming, so that it's not ambiguous. I'm not wondering, well, what results should I have gotten? We know going into the program what the result is, and then we can get feedback on the experience. So I love starting with "how satisfied are you with the results of the program" — and fill that in. And then you can ask more questions that give you insight, and you can do branching logic. If somebody said that they weren't really satisfied, you can ask tailored follow-up questions that are relevant to someone who's not very satisfied, versus someone who is, to give you feedback on what worked really well for certain people, what didn't, and what changes to make. But it all has to roll back up to: what did you promise? What did you say we were going to do? And then how do you get data that reflects back to you whether we delivered on that promise or not, and how we could have better delivered on it? That's the perspective to go into it with, so that we can make informed, smart choices about investing in improvements, scrapping certain things from the program, or adding other sources of support to help people get those results.
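A rough sketch of the branching idea described here, assuming a 1-5 satisfaction-with-results rating; the cutoffs and the follow-up wording are invented for illustration.

```python
# Hypothetical sketch: branch follow-up questions off a 1-5
# "satisfied with the results" rating. Wording and cutoffs are invented.

def follow_up_questions(satisfaction):
    """Return follow-up prompts for a rating from 1 (very dissatisfied) to 5 (very satisfied)."""
    if satisfaction <= 2:
        return [
            "Which promised result did you not get?",
            "What got in the way of getting that result?",
        ]
    if satisfaction == 3:
        return ["What one change would have made the results more useful to you?"]
    return ["Which part of the program contributed most to the results you got?"]

print(follow_up_questions(2))
print(follow_up_questions(5))
```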
I love that. And I have thoughts — as I'm sure everybody in this group knows, I have thoughts — but I really want to know what your thoughts are. Based on what we were just talking about in regards to evaluation questions and measurement and impact: what are your thoughts? What are your questions? Does this sound familiar to you? What are you doing in regards to writing your questions? I would love to hear from you. What are your experiences with evaluation questions — how are you using those questions to assess success, or what success looks like, or even the measurement and impact behind them? What are your thoughts?
So while folks share in the chat, could I share a story? I think it's really interesting. Okay, so I host a meetup group on Mondays — we're called the Measurement Made Easy Monday Meetup, because I love alliteration. In any event, on a recent Monday someone came and said, I want to prove to my stakeholders that one of our most recent programs was worth it to the organization. And I said, well, what was the purpose of the program? How was it supposed to serve the organization? And she said, well, it's a training that we do every year; it's part of our compliance. And I said, okay, well, what is it supposed to do? Is it supposed to help the legal team prove that people did the safety training so that they can't be sued — that kind of compliance? And she said, yeah, actually, it is. Her original idea for proving the value of the program was evaluating skill growth. And I said, well, in order to evaluate skill growth, we either have to adopt an assessment that has been created and validated by somebody else to show change in skills, or you'd have to write your own — and all of that takes time and money. And I said, I don't think that data is something your stakeholders are going to give a crap about. I'm sorry, they're not going to care about skills, because all they want to know is: did the program get done by the people who needed to do it? But I said, there may be an opportunity for you to get feedback from people while you have access to them — because these were field workers; we rarely have access to them all at one time — so take advantage of having access to those folks and figure out what kind of data or feedback would really help the business to better reduce safety incidents. And she was like, well, we don't really know. It was about de-escalation for public transit — being able to de-escalate without getting to violence, using language and emotional intelligence to de-escalate. So I said, well, you know what would be really useful if I were the stakeholder? Understanding what challenges your field workers have in de-escalation that maybe should be incorporated into your next training. Is there apples-to-apples between the content and the scenarios you've got in this program and what people actually face? Do they actually meet the needs? Needs are always changing — COVID happened, and that context was way different; de-escalation during COVID would look way different than before or after. So are you collecting feedback on what challenges people experience when they're trying to de-escalate? What are the barriers? What are their fears and concerns? Where would they like more support? That data can be gathered in one or two questions and would take very little time. And if you show that your training incorporates some of that feedback, well, then the loop gets stronger. People will give you better and more honest feedback, the training improves, incidents go down, hopefully, and stakeholders are happy. So it's a longer story, but it shows that sometimes where we think we should be investing our resources in measurement — creating long questionnaires that take us time to make — isn't worth our time at all.
Instead, we think from the business perspective: what feedback is the business not getting that we can help them get in one or two questions? Then we are adding value, because we're giving visibility into something that stakeholders don't currently have visibility on. That's the kind of thinking that's about being super smart — not necessarily falling into the patterns of evaluating skills or knowledge or even performance.
Right, right. I love that — that's a great story, and it really does help to shift the perspective on what data we need to collect, right? I think that's the whole point: looking at this from somebody else's shoes. It goes back to learning objectives — am I writing this from my perspective or from their perspective? And then, what are we measuring from their perspective? I think that's a great way to begin thinking about it. If we can even just ask them questions about how they work and what success looks like to them, then that becomes easier to measure at the end of the day. And then I also think: if what we're looking for is for this program to address incidents — so fewer people are falling off a ladder — well, then check that against the ladder reports, right? How many people are reporting that they fell off a ladder? Ten people did last year and five people did this year — okay, all right. So it doesn't necessarily have to be rocket science, per se. So, out in the group, I'd love to hear from somebody: what are your thoughts? How are you using information such as this, or what is your version of it? How are you incorporating evaluations, or how are you measuring impact within your business, even if it's small baby steps? I'd love to hear what you are currently doing.
So I work for a large construction company that's based in Canada — we're in Australia, we're in the US as well — and we are currently not measuring. We do smile sheets, but we're not effectively measuring our learning effectiveness. So we have started a committee — I'm on the committee; I'm a learning and development specialist — to figure out what we need to do to really be able to effectively measure our learning. We teach a lot of technical courses for those in the construction field, but we also teach a lot of behavioral courses, and we're at the very beginning stages. That's why I'm here, because I agree — it's great to see these little smile sheets, and as the facilitator I love getting the feedback, but it's not really giving us helpful information, because it comes right after they take the course or the program, and there's no follow-up. My big thing is that we need to have something — a three-month, six-month follow-up — where we're checking back in on what their performance is like. Although then I heard you mention that maybe that's not important to the stakeholders. Is it the performance? Is it the skill? So I don't know — I'm in the beginning stages and don't have a lot to offer; I'm looking to gain more knowledge about it and what we can do.
Well, thank you for sharing that, because I'm sure you're not alone — I know you're not alone. We're all in this together, and we'll continue to move through this together as we discover ideas. So I'm going to turn this over to Elena. Elena, what are your thoughts in regards to what she just said?
Yeah. So is it Shauna or Shannon? Shauna? Shauna. Shauna, you're missing a hypothesis. And I say hypothesis very intentionally, because we never have a guarantee that what we're doing is going to have the impact that we hope for, or the outcomes that we hope for. And if you bring your hypothesis to your stakeholders, it's an honest thing: we've done our best to investigate what the problems are in the organization and what's causing those problems; here's how we believe a training solution — or a people development solution, not necessarily training — can help; and here's what we believe becomes possible when people have received that help. So you're building a hypothesis. I'm 100% an educational researcher, and I had never heard of Kirkpatrick until I started working in corporate. I've always been trained in logic models — every program should have a logic model, and you can Google that to get insight into it. In the world of public health and academia, we're forced into publish or perish — which is why I left academia; I love writing, but I didn't want that pressure. Anyway, side note. Everything was based on logic models and hypotheses and testing, and I think we're missing that. If we don't know where we're going, then how do we know what data to collect, and how do we analyze it appropriately? So build a hypothesis, and it's okay for it to be an experiment, a test — until we test, we don't really know what happened or how we can improve it. So for anyone who is in Shauna's situation: what's your hypothesis? And there are a few great resources. Cathy Moore's action mapping is a great framework to help you build a hypothesis. Google "logic model" and you can learn about all the components of a logic model — you're just putting the hypothesis loosely together. It doesn't have to be right, but you've got to start somewhere. Along with that logic model and that action map, another good resource is Bonnie Beresford's measurement map — she's got some good examples of how to map out performance as it relates to business KPIs, and then, of course, revenue and profit. We're trying to map all of that out in a hypothesis, because once we do, we can ask ourselves: how do we prove it, and what data do we need to prove it? One thing I want to mention, because I know, Shannon, you led with how we're creating our own evaluations: I would push back and say we don't need to be creating our own surveys and assessments for everything. For feedback on our programming — how could we have improved the program based upon the results and outcomes we wanted to give you? — we do need to create our own questions, because that feedback and that data are not going to exist any other way besides asking. But oftentimes we already have the data we need to evaluate what is going on in the organization. Stakeholders look to data to tell them we've got a problem. Well, what is that data? If our solution was effective, that data should tell us whether something has shifted or changed. So I like to think about data and artifacts. Artifacts can be qualitative or quantitative; sometimes artifacts are conversations, and we just have to find a way to capture those conversations and then use that data to make informed decisions — find themes and patterns, or sentiment.
So, Shannon, once you build your hypothesis, the question is: what artifacts or indicators would help me see whether we're moving in the right direction or not? Sometimes that's as simple as recording monthly meetings where people are already meeting to talk about problems and solutions, or about what's happening on the floor or in the field — wherever. Where are there already conversations? Where is data already being shared that we can just tap into? Sometimes it's performance data that lives in something like Salesforce; sometimes it's informal meetings happening on a weekly basis that we can get permission to record, and then find themes that are relevant to whatever it is we're working on. So think outside of your LMS, and think about where the business is already collecting data that we can tap into as our indicators of success — so that we don't have to create our own less-than-accurate instruments. We are not researchers or statisticians or psychometric assessment creators; that's not our job. We can create great questions, but we're never going to be as good as the data that's already being collected in the business.
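One hypothetical way to write a hypothesis down alongside the indicators you plan to monitor, as discussed above. The example echoes the earlier de-escalation story, but every indicator, data source, and target here is invented for illustration.

```python
# Hypothetical sketch: an if-then hypothesis plus the indicators to monitor.
# The scenario echoes the de-escalation story above, but each indicator,
# source, and target is invented for illustration.

hypothesis = {
    "if": "field staff complete the de-escalation refresher and practice the scripts",
    "then": "reported escalation-to-violence incidents go down",
    "indicators": [
        {"name": "refresher_completion_rate", "source": "LMS", "target": "80% or higher"},
        {"name": "monthly_escalation_incidents", "source": "safety reports", "target": "declining trend"},
    ],
}

for ind in hypothesis["indicators"]:
    print(f"Monitor {ind['name']} from {ind['source']}; target: {ind['target']}")
```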
Great, great. Thank you. And Andrew, I see your hand. One quick note about the hypothesis, which is great — I think having a hypothesis is a very smart way to start. For some of you, think about a hypothesis in a very simple way: it's an if-then statement. If we do this, then this will happen; or if we don't do this, then this won't happen. There are more complicated hypothesis statements, but if you want to keep it simple, it's an if-then statement. Think about it that way, and that might be your jumping-off point for writing one to help you get to where you need to go. So, Andrew?
Yeah. I designed an onboarding for experienced hires at a consulting firm. We were able to get a question onto the exit interviews with people who were leaving the company, asking whether they had attended that onboarding for experienced hires. It was not required, and experienced hires typically needed to be productive immediately — that's why they were being hired. What it showed us was that those who tended to attend it stayed with the company, and those who did not attend — it was a two-week blended program; some of it was self-study, some of it was sessions covering specific topics — there was some tendency that they didn't stay with the company, which affects recruiting costs. Those are typically very expensive people to acquire, and we weren't getting the business benefit out of them. The result: we didn't do anything to change our training, but we worked with marketing — quick videos and encouragement to attend, even leading into the week before they started with the company — so that they knew this existed and that it was an asset that was going to help them do their job better. So it had that business effect. It had nothing to do with us changing our learning design, or even necessarily a measurement in the Kirkpatrick style, but it gave us, like you were saying, data. Also based on that, we reached out to people who had stayed, and we were able to engage with certain folks who had attended and stayed with the company, and that was also valuable. But the real insight was that those who tended not to come tended not to stay. Valuable insight.
Yeah,
So someone in the chat mentioned earlier about qualitative and quantitative data, and I think, Andrew, your example is a great one. You had a question that was closed-ended — that's more your quantitative style. It's going to give you a closed-ended response; you don't have to do a thematic analysis, and you can usually use it for some kind of aggregate: sums, averages, etc. And then you had a follow-up — a focus group, a survey, one-on-ones, however you engaged with the people who did stay. Now you've got narrative data that gives you your themes, your insights, your sentiment, which provide more context and insight into that quantitative correlation you saw between people attending onboarding and staying longer. Another thing I think is great is working backwards. Someone else in my meetup group does customer education training, and they wanted to prove that their focus on video education was having an impact. And I said, well — because we always lead with a hypothesis — there's a chance that your video training isn't having any impact at all, and you need to be open to that reality. One way to look at that: if the business's goal in doing customer education training is to retain customers, what we should really be doing is looking at the customers who stay with the business. How are they engaging with the organization? How are they engaging with the customer success team? With marketing materials? With assets created by the customer education team? What correlations and patterns can we draw? So, Andrew, your example is a great one. Similarly, we could compare the people who stay in the organization longer with the people who turn over sooner: how have they engaged in training, how often do they meet with their manager, how were they performing, what did their performance reviews say? There's a lot of data we can look backwards through to find patterns in our high performers, the customers who stick around, the employees who stick around. What do we notice in the data that we have, and then how do we double down on that in our learning and support strategy? So I think that's just a great approach, Andrew, and there are a lot of other ways to apply it too.
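A small sketch of the "work backwards" comparison Andrew and Elena describe. The records below are invented; in practice this data would come from an HRIS or exit-interview export, and the result is a correlation, not proof of cause.

```python
# Hypothetical sketch: compare retention between people who did and did not
# attend onboarding. Records are invented; real data would come from an HRIS
# or exit-interview export. This surfaces a correlation, not causation.

records = [
    {"attended_onboarding": True,  "still_employed": True},
    {"attended_onboarding": True,  "still_employed": True},
    {"attended_onboarding": True,  "still_employed": False},
    {"attended_onboarding": False, "still_employed": True},
    {"attended_onboarding": False, "still_employed": False},
    {"attended_onboarding": False, "still_employed": False},
]

def retention_rate(group):
    return sum(r["still_employed"] for r in group) / len(group)

attended = [r for r in records if r["attended_onboarding"]]
skipped = [r for r in records if not r["attended_onboarding"]]

print(f"Retention, attended onboarding: {retention_rate(attended):.0%}")
print(f"Retention, skipped onboarding:  {retention_rate(skipped):.0%}")
```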
Yeah — and I don't think we mix and match the qualitative and the quantitative as well as we could. But I also think there's a general misunderstanding about how to use summative data versus formative data, right? One, in essence, informs the other, so that you can make a decision. And when you talk about data literacy, I think those are key conversational points: the use, or the discovery, of the qualitative and the quantitative, and then how you use that formatively versus summatively, right?
Yeah, and I would even say the reality is that we don't need to overcomplicate it with formative versus summative if we just think from common sense, because sometimes learning jargon, or what we learned in our master's or doctorate programs, can overcomplicate things. Really, we just want to know: are we moving in the right direction? And what data do we need to get visibility into whether we're doing what we said we would do? I think of it as monitoring. Your formative is more about monitoring progress along the way, and your summative is, well, what happened at the end? Really, we should be monitoring constantly. We should have some data artifacts that we monitor throughout our learning programs and as people journey on into the workforce and continue learning. That's the other thing that frustrates me: we're so focused on pre-post, pre-post, pre-post. Learning doesn't just happen in the classroom. Learning happens every single moment that we are living and breathing. Learning is happening on the job as people make mistakes and learn from them; it's happening as you're having a conversation with your manager, when you're getting critical feedback in your performance evaluations. Learning is always happening. And there's data we can lean on and lean into that already exists in the organization. There's so much data in our organizations — honestly, it's scary.
Right — we're surrounded by it, if we just tap into it, right? And I think that goes to your point about indicators. So what do we mean by indicators? Well, indicators, for data, are simply all the things that you see and all the things that you hear happening around you. If you're taking note of what you see happening on the construction site, and you listen to what they're saying on the construction site, then that's going to give you data you can use to help inform future decisions, right?
Yeah. I think the problem with data is that it lives everywhere, and almost anything could be a data artifact. We just have to harness it and make it useful. And that's one of our struggles — we have to think a little creatively. Like the example, Shauna, that I mentioned: there are probably informal meetings happening that talk about safety incidents or things happening in the field. How do you find out about them? Maybe you talk to foremen or managers — I'm making the language up; I don't work in construction, forgive me — but there are managers that probably have informal conversations on some regular cadence, and those data points could be insanely informative in terms of how we support people and the business to reduce the safety expenses or the things that are problematic for the business. So how do we harness the data that's already around us, especially the data that happens informally? That just takes some really creative thinking. And if you have a hypothesis — that's, I think, the answer to the question no one has asked yet, which is: how do we get access to the data that is in our organization? I know for a lot of learning professionals that's a struggle, getting permission to get behind those walls. And my experience — again, you're not going to like it — has been that we haven't given them a compelling enough reason why. That's where the hypothesis is so important. What are you going to do with the data? Here's my hypothesis that these things are related, and here are the data artifacts that I think would be useful. And then it's a negotiation. Sometimes you get pushback because there's confidential information that needs to be scrubbed and that's too much work. Well, if not this, then what other data artifacts do you know of that we could use, that maybe I'm not aware of, that would help us test this hypothesis? But it all begins with the hypothesis and that compelling reason why, to then get access to the data, or to get support from your IT team to get access to the data, things like that.
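A very rough sketch of mining those informal conversations for recurring themes. The theme keywords below are invented; in practice you would draw them from reading a sample of the meeting notes or transcripts first.

```python
# Hypothetical sketch: tally recurring themes in meeting notes or transcripts
# you already have permission to use. Theme keywords are invented examples.

from collections import Counter

theme_keywords = {
    "equipment": ["ladder", "harness", "scaffold"],
    "time_pressure": ["deadline", "rushed", "overtime"],
    "communication": ["handoff", "radio", "unclear"],
}

def count_themes(text, themes):
    """Count keyword hits per theme in a block of free text."""
    text = text.lower()
    return Counter({theme: sum(text.count(word) for word in words)
                    for theme, words in themes.items()})

notes = "Crew felt rushed before the deadline; the ladder handoff was unclear again."
print(count_themes(notes, theme_keywords))
```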
Thank you for that. And I just put in the link, if you missed it: this week I wrote a post about this very topic, and part of that post was how do you get access to the data that might be behind the walls? So I put that link in there for you guys to look at. And we did have a question from Monica about how you apply all of this to new hire training, which kind of goes to what Andrew was talking about earlier. How do you apply that sort of thinking to what is usually generic onboarding training? And I have — well, that's another soapbox I could get on. First: stop making your onboarding generic. How about we start there? But that's a whole different topic for me. It's really a question of what it is you want to measure in regards to new hire training. It doesn't have anything to do with people learning differently or at a different pace; it's really about what you want to accomplish with your new hires, right? Is that the direction?
Yeah — time to value, Monica. That's your metric.
That's it, yeah. Ramp-up to mastery is what I call it. What's the ramp-up time to mastery, whatever that mastery is within your organization?
And I would argue that it's not about mastery, because people can still be valuable without having mastered it. (True — good point.) They're still doing what the organization needs, so that the time they're giving, and the salary they're getting in exchange, is valuable in some way. It might be: we need someone to answer calls so that when people call to get information, they at least get it. Whether that information is complete and masterfully presented and fully accurate, they at least got something that got conversations going. So what is the value? I think of it first, in onboarding, as: what's the minimum viable value for the employee that you want to see? You know, we have 30-60-90s. What is the minimum value at 30 days? What's the minimum value at 60, and at 90? Because mastery comes with coaching, meeting with your manager, experience, time. We can't expect mastery right away, but we can expect that people are valuable — and if they're not, that's why we have 90-day programs that say, hey, you know what, you couldn't meet this minimum, and we've got data now that suggests maybe we didn't put you in the right position, maybe you're not the right fit, and now we have evidence that says we're going to let you go. I've seen this happen in organizations because they had really smart 30-60-90 time-to-value expectations tied to the role and the job description, and it makes it so that we are not losing money because we invested in the wrong employees.
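A minimal sketch of a 30-60-90 time-to-value check — the names, dates, and the "first value delivered" milestone are all hypothetical.

```python
# Hypothetical sketch: check each new hire's time to first value against
# 30/60/90-day milestones. Names, dates, and the milestone are invented.

from datetime import date

new_hires = [
    {"name": "Hire A", "start": date(2024, 3, 1), "first_value": date(2024, 3, 26)},
    {"name": "Hire B", "start": date(2024, 3, 1), "first_value": date(2024, 5, 20)},
]

for hire in new_hires:
    days = (hire["first_value"] - hire["start"]).days
    milestone = next((m for m in (30, 60, 90) if days <= m), None)
    label = f"within {milestone} days" if milestone else "missed the 90-day window"
    print(f"{hire['name']}: first value at day {days} ({label})")
```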
Right — I love that. I've seen other organizations do this as well, and I wish more of our brothers and sisters would do this: include a 30-60-90 in the onboarding process. I know a lot of us struggle just to get 30 days from people, from our bosses, much less a 30-60-90. But I think you'll get the performance measurement by doing that.
Yeah, that's right, yep. And I'll tell you, at the last corporate job I had, I had a 30-60-90. I was coming from a leadership position — I made the intentional choice to go from a director role to an individual contributor, because I'd never worked in corporate before and I wanted the experience; I don't regret it, but I regret it at the same time, not kidding but kidding. Anyway, in the 30-60-90, one of my first questions was: how are our team, and my individual contribution to the team, going to help the business? What are the business's goals? What are our department's key performance indicators that show we're supporting the business? And my manager did not have an answer to that. So for anyone who is training new people, I think the number one thing we could do is have them do a scavenger hunt — and we have to have the data and create it — but: what are the department's KPIs? What are the business's annual goals? Who's on the board of directors? Read the most recent annual report, or read the mission statement and the most recent impact report if you're working for a nonprofit. There's a book — it's Christopher Neubauer's book, Shannon, I never remember the name, but it's something like Aligning Learning, or Aligning Instructional Design, to Business Goals. It's pink, and it's by Christopher Neubauer. We both know him very well, and he's going to kill me for not remembering his book's name correctly, but thank me for mentioning it. In his book, he has a chapter that literally gives a section of questions that every single learning employee, whether you're an individual contributor or a leader, should answer in a scavenger hunt: talk to people, go on the intranet, go online, go to Google, and find the answers to these ten questions. Again, I don't have them memorized, but it's a business-intelligence scavenger hunt that gives the employee the basic business details they need to do their job and to understand it in the context of the larger business function and goals. If our industry could make one change, I would want it to be that every employee does that level of institutional exploration for the organization they're serving.
Completely agree — and I'll find the book and put it into the resources for everyone. And we're at the top of the hour; it's always amazing how fast this time flies. I put it into the chat: our next Coffee Chat is not next week but the week after — our coffee chats are every other Friday — and our next conversation kind of ties in here, so I don't know, Elena, maybe you'll have time. It's about: how can we make compliance training interesting? Is it even possible to make compliance training interesting? And thank you, Connie, for putting in a link to Elena's book. I didn't know it was out — I thought it was... is it already out?
It is available for pre-order, yeah — pre-order.
Pre-order. Okay, all right, awesome.
One more link in there, sure, for y'all, which is the link to my newsletter. I'm a writer and I'm a researcher, so my newsletter is all about small, simple strategies to improve impact measurement, one idea or concept at a time, and it comes out every Saturday. I love it as a "grab your favorite beverage, sit down, and noodle on some new thought leadership" kind of read. So if that's of interest, you have the link there.
There you go, there you go. Thank you. And Heather, I see your question there: could we get directors to complete the scavenger hunt questions beforehand, so we're aligned? I think that's a great idea. You guys know I love me a good bingo card, and I love me a good scavenger hunt. I've always said that we need to be more familiar with the business — specifically, understanding how your business makes money. Even if it's a nonprofit, it has to make money; how does it make money? You find those things out by reading the annual reports, by asking about profit-and-loss statements, or by asking about revenue generation through marketing — but also by building those relationships. Right, Elena? Before you can even think about measuring impact, first you have to build those relationships and that trust within the business.
Yeah — you can't make a hypothesis that's accurate in terms of what the business needs without understanding the business's problems. And it requires that we step outside of learning; it really does. We need to be a business unit. Shannon, you've said this, and a lot of our peers are saying this: think like a business unit, not like a learning expert, and what comes about because of that will be insanely revolutionary.
Absolutely. So, Elena, why don't you tell people where they can find you?
So I'm on LinkedIn — Dr. Elena Schlachta. I also have a website; I'm at drelena.com, and you have my Substack link there for my newsletter. On LinkedIn I do an industry leader chat — I know some of you here have come to those, and thank you for being with us today — 15-minute conversations with leaders about these kinds of topics, usually every Thursday. So you can find me in any of those places.
Great. Well, thank you so much — I appreciate you joining us. And for everybody here, you will get the resources; Amanda will send you the resources after everything is said and done. You'll get the video and the audio and the transcript and all those other good things. I want to thank you again for joining us today. It's been really interesting — I always love having this conversation. Hopefully everybody walked away with a good idea, so let's go ahead and close with that. In the chat, you guys, tell me: what's the one thing you think you can do right here, right now? Not something you'll do in six months, but something you can absolutely start doing right now, based on some of the things we talked about today. And while you put that into the chat — there you go: developing a hypothesis. There you are. Yep: ask better survey questions. Consider what the artifacts are — yes, I love that. See, there you go. And here's Elena's LinkedIn profile — this is wonderful. So great. So start doing that. Get out there. Hypothesis statements. Find those artifacts. Listen, watch, observe — find those things that are out there, that we're surrounded by. Thank you, everybody, for joining us. Anyone got good plans for the long weekend coming up? What are the plans this weekend?
Hopefully relaxation. Woo hoo!
Relax — yes, right. Gotta buy the book and drop it on the desk of people who care. Yes.
Exactly — buy 10 copies, please and thank you.
Yeah — your daughter's 15th birthday; oh, Andrew, that's so sweet. Going sailing — oh, all right. Going to the country to mow — all right, I suppose, if that's what you enjoy doing, you do you. I've got a bunch of angry-looking peppers — some ghost peppers and Carolina Reapers — all growing in the backyard, and I'm going to dehydrate that whole crop outside this weekend, and we'll see what we can't come up with. I think that's what's on my agenda.
I'm going to be making tomato sauce.
Tomato sauce — love it, love it. So if you guys want hot pepper powder or something, I'm sure I'm going to have tons of it; I'll send some to you. But thank you, everybody, for joining me today. And again, thank you to everybody who is new, and I look forward to seeing you all at the next Coffee Chat.

Thank you, Shannon.