L&D Round Table: Deep Dives into Ecosystems, AI, and Measurement
6:56PM Nov 12, 2024
Speakers:
Shannon Tipton
Jessica
Chris
JD Dillon
Alaina
Trisha
Keywords:
double duty
virtual facilitation
breakout rooms
forest bathing
data measurement
AI impact
learning tech
contact centers
performance support
virtual commute
mental health
productivity
business metrics
training challenges
measurement strategies
Hello everyone. Happy Friday.
It's good to see you. Jean, great to see you. I see some familiar faces. It's always nice to see you guys.
Victoria, it's good to see you again, too. I'm glad you could make it.
Let's see who else we got in here. Oh, I see lots of familiar faces, Trisha
And Michelle and Amy. I see all sorts of familiar faces, which is fabulous. And so here's my question: how many of you did some sort of double duty by being at DevLearn, and now you're doing this? Anybody do that?
I know Alaina did.
Let's see. Yes. Well, good, so we don't have too many people who are too terribly exhausted from all of the events; that's great. So go ahead and, in the chat, share with me who you are. If this is your first time with the Learning Rebels Coffee Chat, be sure to let me know, so we can give you the warm welcome you deserve
and share with us: what are you up to today, or what are your plans for this weekend?
I'm very excited for today. Yes, Stella, it's always good to have you, Stella, and I appreciate that you know it's cocktail hour for you out there or a little bit beyond it, so I'm always happy that you're able to join.
All right, let's see niece's birthday. Well, Amy, I hope she has a great birthday.
Yes. Leslie, so good to see you again.
Chet, welcome. Chet, I believe you've been here before.
Yeah.
Welcome everybody,
sweet Omaha, Trisha,
Most of the weekend crafting. Oh, you gotta have family birthdays; that's always nice. Big group gatherings. What would also be fun is for you guys to put into the chat: was there something in particular you wanted to take away from today's round table? I would love to know that.
And you guys know the rules. The rule is there aren't any rules. But if you want to come off chat, I mean off mute, and speak to the group and share what you were hoping to get out of today, that would be wonderful.
You can feel free to do that also. Yeah, go ahead. I'd like to share, because I don't want to write this all in the chat. Okay, so I've been facilitating a long time, including virtual facilitation, and I know how to engage people with polls and blah, blah, blah. But how do you engage salespeople in their 40s-ish who feel like any sort of engagement activity is babyish?
oh, well,
That's a really good question, though not exactly our topic for today. That being said, hopefully what you get out of our interactions today will give you good insight into how you might manage and facilitate a group, especially as we do the breakouts. We're going to do breakouts today; nobody be scared. But I'm going to do them in kind of a different way. You're going to see three different speakers: you'll see JD, you'll see Alaina, you'll see Chris, and they all have different styles. Serendipitously, you're going to be able to pick up some great tips from how they speak to you, maybe something they might do in their groups. It might provoke some ideas. Also, as an aside,
our next coffee chat, not next week but the week after, we're going to be talking more about that particular topic: how are we creating moments of creative problem solving? How are we helping people think through that process? How are we doing it virtually and live? So that may give you some really good ideas during our next coffee chat as well. Thank you.
Yeah.
Yeah. JD, exactly. It's hitting them where they count, right? So salespeople are all about making that money, and if you can say that this activity is going to help you increase your commissions, help you line your pocketbook, I'll tell you what, they'll be engaged,
right?
They're coin-operated.
Absolutely. All right. And so now, as people are still sort of trickling in,
what we're going to do today, like I said, is breakout rooms, and I am so happy to have Chris Coladonato with us, JD Dillon, and Alaina. I'm going to have you say your last name, because I always mess that up too.
And I'm going to give them a moment to go ahead and introduce themselves to you as a group, and then I'm going to explain how all of this is going to work. So Chris, I'll go ahead and start off with you.
Oh, boy. Okay, so I am Chris Coladonato. I am a former talent development leader turned forest bathing guide. So yeah, that's a little bit of a switch. Look at JD's face. He's like, whoa.
So yeah, I spent over 20 years in talent development and left the workplace in 2022
as a virtual leadership and hybrid workplace leader for a large insurance company. And I left because I felt like there was something more calling me. I found out, through a lot of different coaching and things like that, that I really wanted to help people learn how to slow down, because I never had that when I was working. So part of my work is that I take people out in the forest and help them learn how to be more mindful and to slow down. But I also do this work virtually, and I speak at retreats and conferences too, to help people learn how to slow down, not just in the forest. And I know the talent development world very, very well. I spent a lot of years in it, and I know how it can be busy and lead to burnout. So that's why I'm here, to talk about a little thing called busyness and how to break free from it. Yes, especially with the holidays coming up, right? We need to take care of ourselves. We don't do enough of that. Nope. All right,
I'm going to turn it over to Alaina.
Hello. It's morning for me, but it could be afternoon or evening for you all, so good morning, afternoon, or evening. I am Alaina Szlachta, and my entire life people have been mispronouncing my name, so I really go by anything. In fact, these days I'm going more by Dr. A, because it's really, really easy. Either of those works for me. I'm a big data nerd, and I did just jump off of DevLearn, an amazing event here in Vegas, and talked a lot about data: data-enabled instructional design, the good data to use to measure our outcomes, using our resources wisely, thinking strategically about measurement. I'm an educational researcher by trade, so I teach about how to create hypotheses and research questions as we look to data to measure and evaluate our work. My mission is to simplify and take the mystery out of measurement so that we can start measuring more, and more effectively. So my time with you all today is just to figure out: what are some practical, easy things to do that will build your measurement practice? Because it doesn't have to be hard, but the goal is to be consistent, to learn from your mistakes, and to be okay being in the space of trial and error, which a good researcher is in every single day of their lives. We learn from our mistakes, we learn from our data, and we improve. That's really the value. So anything in and around that is what I talk about, and I absolutely love it. Thanks, Shannon, for having me. Thank you. I appreciate you being here. I'm sure you're exhausted, but it's always fun to have you. You can't see my dark circles,
yeah,
Last, but certainly not least: JD, how are you doing? I've been sitting here for the last three minutes trying to figure out how to explain what I do by integrating the forest in some way, and I have no way to do that.
You stumped him, Chris. You stumped JD. I think that's a good win for a Friday. I could do things, but I don't think I want to in a recorded session in a professional context. So, my background: I'm a corporate ops person who used to manage movie theaters and theme parks, who turned L&D professional at Disney. I spent 10 years at Disney and then eventually turned into a technology person. So I work in the learning tech space. I specialize in enabling frontline employees, and right now I'm doing my best to help learning and development practitioners and other folks break through the noise and the hype associated with AI, and take a meaningful look at how I believe AI is going to fundamentally shift the nature of learning and performance work, and as a result, the nature of what learning and development professionals do. So that's what I do. Fabulous. Thank you. Okay, so now that you've got some insight into our speakers today, what I'm going to do is put you into three breakout rooms. You are not going to have to rotate; the speakers are going to rotate. So all you need to do is be in one room, have your pens, pencils, and questions ready for these lovely people, and sit back and absorb. Really, it's all about the questions. You don't often get one-on-one time, especially with three busy people. We had a lot of people register for this, but sometimes not everyone can show up, and this is great: we've got
a decent amount of people to bounce ideas off of, but yet small enough still to be able to address concerns. So I'm loving this. Also be aware that the speakers are going to be recording the groups, so that way we can send it out later on. Okay, is everyone ready?
You're going to have 15 minutes with each speaker, and that'll take us almost to the close of our time together. But if you have closing questions, we'll be able to address them then. All right, any questions? Somebody give me a reaction, give me a smiley face in the chat, if everybody knows what the plan is. We're all good to go,
All right, and away we go.
Hello, hello, welcome back. So, everybody feel connected?
How to take care of all that busyness. What was your takeaway? Anyone want to volunteer? Hello, hello, everyone coming back? So, main takeaways: who's going to volunteer something? The rest of you, put it into the chat for me. Did you have an aha moment?
Shannon, I'll share. For me, it's, I don't want to say validation, but reassurance that it's okay to continue to try to find that separation between work and personal life, and to be intentional about protecting that time and not feeling guilty about it.
Yes, right? We do get hung up on that, where we feel like we have to be doing something, and if we're not doing something, we feel bad about not doing something. We all end up on that wheel, right? Awesome. Anyone else, someone from JD's or Alaina's session? There she is. I'm like, where did Alaina go on my screen? There you are. And while you guys are thinking about that, I put into the chat JD's book, Alaina's book, and the nine ways to find stillness guide from Chris. I hope you found this exercise beneficial and useful, that you were able to gather some ideas, or some reassurance, or just something that might make you say, you know what, I want to try this. Thank you, Candice. Katie's point about call centers being at the front of qualitative AI analytics: okay, I missed that part, so I'm going to trust JD. You want to sum that up?
We were just talking about the potential for advanced analytics through artificial intelligence, and the idea that contact centers are going to help teach us a lot, because they're such data-rich environments where everything is measured. So in terms of connecting the dots between how we support people and the outcomes they see in their performance, contact centers are an interesting place to start, right?
Yes, for sure. Sales as well, right? Sales is a data-rich environment, especially if you use a formalized CRM type of tool. And Alaina's question: learning is going to be the means to what end? I love that question so much, because it really is. From my perspective, we're expecting change. We're expecting some sort of behavior shift or some sort of skill adjustment. What is it? What is it that you're looking for? Right, Dr. A?
I mean, yes, that's all I have to say. Just: yes. At the end of the day, we need to build the practice of asking these kinds of questions so we can get the information we need. Because I heard a lot of people aren't getting the information they need to create a learning solution to some kind of organizational problem. So that's a great way to frame it, and there are many others, but this one's really simple.
Thank you for that. Yes, and I appreciate that everybody's putting something in here. The reminder to work backwards when planning to measure, to begin with the end in mind: what is it that you want to achieve, and how are we going to get there? I think those are all great starting points for trying to achieve what we hope to achieve from an L&D environment. And from that, I'm going to actually end this with Chris. I think, Chris, you might have some good words of wisdom as we go into the holiday season. As time starts getting busier around us, what would you leave us with?
Oh, I have so many thoughts. Go out in the forest. There you go, everybody. No, I think the biggest thing... In Orlando? Find that forest. There are trees everywhere. Nature is all around you, people. No, while I highly recommend getting out in nature, the biggest thing I'd love to share is: give yourself permission. Give yourself permission to take a break, to do something that refreshes and recharges you. And the biggest thing is finding what works for you. We've had a lot of great conversations and found everything from petting dogs to rebuilding a VW Beetle. Again, whatever recharges and refreshes you, make time for that, all the time, but especially during this really busy, busy holiday season. You're the only one who can champion for yourself.
oh, now there's a statement. Yeah,
I'll get a little morbid. I mean, you're the only person that's going to be with yourself from birth through death, so take care of yourself.
Yeah. Those are wonderful words of wisdom, really, and I'm going to leave it at that. I hope you guys enjoyed yourselves, and I hope you learned something new. Again, for those of you who haven't been with us before, the next Coffee Chat is not this Friday, not next Friday, but the Friday after; they happen every other Friday. The next one is all about building that creative mindset and some tools we can use in our virtual and live learning situations to encourage that kind of creativity and creative thought. So I hope to see you all there. On that note, everybody, have a great and wonderful weekend. Feel free to share whatever it is you're going to be doing this weekend. But on that note, you guys take care and have a wonderful rest of your day.
Shannon, you don't have to get mad at me; I hit record. But I'm curious: when it comes to AI and the impact on learning and development, what curiosities do you have? Where's your head? Is there something we could talk about to help provide some clarity or insight in our conversation today?
I actually do have a question. Fire away. All right. I love AI. I have it do like 80% of my work, so that's always good, whether it's an email or something else.
We have an AI tool in our LMS. For example, I just did one on paid search, because in my company they were just like, I need to know more about paid search, and I don't know anything about it. So what I did is I asked ChatGPT to write a really good prompt to put into the LMS to write a course on paid search. But then, how do I know if it's right? My only idea was to go back to the people in my company and say, does this all look right to you?
That's one way. So, are there subject matter experts who can validate the information, just like we would do with anything we made? Even if you just called in an intern and said, make a course on this topic, you would vet that with someone who's an expert, right? It's always good to look at AI as a really savvy intern. But I was talking to the previous group about the importance of sources. One: the content generator that's taking information and turning it into something, is it citing any sources that it's using that you did not provide? And two: do you understand the details behind how that tool works, so that you have assurance about how it generates content? For example, can it see the internet? Is it operating off of a certain knowledge base, or is it limited to the information that I provided? It can't be a black box, right? It needs to have an explanation and sources. I use ChatGPT and Gemini a lot for ideation and things like that. Like, I'm writing an article and I really want to take this perspective; what information is available in this area? And I'll ask. It's a lot faster than searching for that and going through links, because it can often get me straight to the insight I'm looking for. But then I say, where's that from? And then I go look at where it's from, because I don't necessarily trust it. I'm not having as many hallucination problems as we did in the very beginning, because the tech is getting better, but I don't necessarily believe it's interpreting the information correctly all the time, because it doesn't necessarily know exactly what I'm looking for, or the context around the content. So we just have to do the homework. So rather than gain 90% efficiency because AI is doing all this for us, let's carve that back to like 64%
efficiency, and make sure that we're validating sources. And the other thing we were talking about before the session started, with Chris, Shannon, Alaina, and I,
is also telling people when things are AI-generated. I think people have a right to know how content they consume was created, even if we're not required to tell them right now, at least in most municipalities. As an example, whenever I write an article now, there's an AI statement at the bottom of my draft. Whether the publisher wants to put it in or not is their call, but I give it to them to say: how was this article written? To specify what I did, versus where I might have used something for ideation, proofreading, making it snappier, whatever I was doing with the technology. It's in my new book in more substantial ways, and it's in every article I write. And whenever I do slides that have AI-generated images, there's a footer note every time that says, image generated by AI, so that you know how I'm putting together the story I'm telling. Our LMS has that statement, like: this content was generated by AI. Okay, so I have a question for you about that, JD. If I use AI to write something, I don't go verbatim with it. I alter it, I improvise, I add to it. So when you say this is AI generated, do you say 75% AI generated? 50% AI generated?
Is that something you're doing? Because I don't know too many people who, when AI says something, just do a cut and paste and go with it. How many marketing professionals do you know? Because... has anyone noticed that a lot of marketing language has suddenly gotten similar and dull? Not that it was always exciting, but if you stare real hard, you can start to see: oh, people didn't write a lot of this. So I get less specific than that, because, like I said, a lot of people won't care; if they get value from it and they can trust it, that's what really matters. I'm just trying to, one, demonstrate good practices, and two, clarify things for people in a world where, I'm pretty sure, a lot of content in the space, like industry articles, if you read enough of it, you start to realize, hmm,
where did this come from? So I'm trying to be meaningfully transparent, so I will explain what I did, versus trying to estimate numbers. If I had generated a first draft with AI, and I don't usually do that, but if I had and then updated it, I would just say that I did that, versus trying to say 60% of it is AI. Then we start to split hairs, and you don't really know.
Got it, yeah. And then anytime an image is AI-generated, that's tagged that way. But again, most people can tell: either people are real glossy, or it's an illustration no stock image has ever created before. People are savvy enough to figure it out. So I think of it more as a boost to help people trust you and to differentiate yourself from the avalanche of content that AI is enabling. Because the problem is, no one has ever said, you know what, I wish we had more content. That wasn't the problem, but we're using the tech in a way that solves a problem that didn't exist, by just piling on. So now the question becomes: how do you break through that noise? And I think it's solving meaningful problems, not talking about fluffy hype things, just talking about problems people really care about. So: making AI the secondary part of the story, not the headline; demonstrating good practices; being transparent; being unique and different; and even when you do use AI, clever, meaningful applications thereof that solve meaningful problems, versus just shiny objects. Because people are getting real sick of AI-everything at this point.
Another question, sorry. For non-native speakers, it's just fantastic for writing text, and you see immediately when someone is using AI to write something. But what I really enjoy very much is that you can translate it into so many different languages,
you know. And everywhere it is promoted: translated into 40 languages, into 60 languages. But I don't know which translator is the best. Is it DeepL? And Synthesia, are they all using the same one? How do I find out which is the best for, let's say, scientific biotech vocabulary? Or do I have to train it myself? But I cannot train it; I can only train it for two or three languages, not for 10 or 20. So what is the best approach for a specific vocabulary? So, one, I think language translation is one of the best things we're going to get out of this next evolution of tech, because in so many instances people were limited in even understanding information, let alone being able to learn from it, because of the complexity and expense of translations in the past. We're not quite there, especially at the scale of global language,
especially with complex or proprietary information. I don't have a specific tool to recommend, but I would say: when you are looking at tools, look for ones that allow you to install some type of dictionary of terminology. That's something I'm seeing asked for more and more. We have words that either don't exist outside of our company or that we use in very specific ways. So how do we make sure the technology understands how to use those words the same way? When you're vetting different tools, do they have that kind of proprietary dictionary capability?
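The "proprietary dictionary" idea JD describes can be approximated even before a tool supports it natively: keep a map of approved target-language renderings for your company terms, and check each machine translation against it. This is an illustrative sketch, not a feature of any particular translation product; the function name and glossary entries are made up.

```python
def check_glossary(source_text, translated_text, glossary):
    """For every glossary term that appears in the source text, verify that
    the approved target-language rendering appears in the translation.
    Returns the (source_term, expected_target) pairs that are missing."""
    src = source_text.lower()
    tgt = translated_text.lower()
    missing = []
    for source_term, target_term in glossary.items():
        if source_term.lower() in src and target_term.lower() not in tgt:
            missing.append((source_term, target_term))
    return missing
```

Any flagged pair is a sentence to send back to a subject matter expert or re-translate, which is cheaper than proofreading the whole document in a language you don't speak.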
I don't think we're in a great place to say what is best right now, because the tools are moving so fast. Anything I'd say is the best today, give it three weeks and it might not be the best anymore. But I think we're very quickly getting to a place where it's pretty level, even in terms of the capability to do different things. I'll point to the links before I get thrown out of the room. In the chat, I put two different links. One's learngeek.ai; that's my resource website, and on that page you can see some examples of me speaking other languages using different tools. I recently did a session in South Korea, and I spoke Korean for part of it. I don't speak Korean; my avatar speaks Korean. So there are examples there. And then I also put a link to my book website, because there's a free download that talks about how I believe AI is going to transform learning and development. The secret code is 1004, so if you go to jdwroteabook.com, next chapter, 1004, it'll give you a PDF document. It's a 12,000-word handout for today's conversation. So heavy. Thank you. But yeah, I don't have a great "use this," but really, anytime you're looking at a tool or thinking about purchasing something, how it handles proprietary language is going to be huge. I don't think tools are great at that yet.
Thank you very much. No problem. If I could, I would bring my avatar to meetings like this, but I don't have that yet.
That'd be cool. I'd just talk to myself all the time.
Yeah, I guess that would be the question: how do you know it's translated well? I guess you just have to assume it's translating as well as possible. So, when I did that event in South Korea, I translated all my materials myself. In the past, I used to rely on the event organizers; now I can handle it, and again, I know nothing about Korean. So what I did was: I stripped the text out of my PowerPoint, dropped it into ChatGPT, and asked it to translate all that text into Korean. Then I took all of the translation to Google and translated it back, and started to notice where there were differences, where, when the language went back from Korean to English, it didn't quite make sense. Something was off, which allowed me to go back to the original text and tweak or simplify or find the problems, and then do it again. For an hour-long presentation, it took me maybe 45 minutes to do that process. Then I went back into the slides, replaced all the text with the Korean text, and sent it across to the organizers. 95% accurate. Nice. Two years ago you couldn't even try to do something like that. So we're getting there when it comes to that capability. Yeah, nice, that's a good lesson. I know I'm going to get thrown out any second now. But any other questions or curiosities anyone has?
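The comparison step in the round-trip workflow JD just described can be partly automated: once you have the original segments and their back-translations side by side, score each pair for similarity and flag the ones that drifted. A minimal sketch using only the standard library; the threshold and names are illustrative assumptions, not from any tool JD mentioned.

```python
import difflib

def roundtrip_mismatches(original_segments, backtranslated_segments, threshold=0.6):
    """Compare each original segment with its back-translation and flag
    the pairs whose character-level similarity falls below the threshold.
    Those are the segments worth rewording or simplifying before
    translating again."""
    flagged = []
    for orig, back in zip(original_segments, backtranslated_segments):
        ratio = difflib.SequenceMatcher(None, orig.lower(), back.lower()).ratio()
        if ratio < threshold:
            flagged.append((orig, back, round(ratio, 2)))
    return flagged
```

Surface similarity is a crude proxy (a legitimate paraphrase can score low), so the output is a review queue for a human, not a verdict.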
No, thank you. Appreciate it. My pleasure. Happy to hang out, y'all; thanks for being here. Now I'm going to vamp until I suddenly get tossed. Thank you, Dillon, I enjoyed this; very beneficial. Yeah, I think my biggest thing is: the more we can focus on solving meaningful problems, the less we get distracted by the hype and the tech. As someone who's in technology, we don't start the conversation with AI. We start the conversation with the problem you need to solve, and then maybe AI is an enabler in the way we try to solve that problem. Because there's a lot of noise around AI, and not a lot of meaningful conversation about solving meaningful problems.
There should be a warning from Shannon. I feel like I'm getting beamed up, like it's Star Trek.
That's my level of Star Trek knowledge.
Shannon, I'm ready.
Doesn't work that way,
But when she sees the recording, hopefully she'll chuckle: oh, I forgot to beam you up.
Oh, here comes my replacement.
I know.
Hi everybody. I'm not Chris.
Going from a conversation about the forest into a conversation about AI in learning and development: what do you want to talk about? What questions or curiosities do you have? Where's your head when it comes to the very noisy conversation around AI and the transformation of L&D?
How hard is it to set up your own LLM?
To build your own entirely, versus using someone else's core technology? Yeah, let's say if I wanted to maybe create, like, an AI... I want to say chatbot, but more a virtual assistant, based on maybe a 30-page document.
It's a lift. I would say, one, you're going to want to use someone else's tech to do that.
So the question, I think, becomes more: what's the audience for that? The big question around AI right now is cost structure. Who's paying for that process? How much volume do you expect to come into that type of tool? Because every query to an LLM is significantly more expensive than a Google search; I can never remember the actual numbers. So
working out the business model of what you're trying to use it for gives you the answer to what tech you can afford to leverage. The cheap example is what I did when ChatGPT launched its custom GPT store. If anyone's not familiar: if you have a paid ChatGPT account, I pay 20 bucks a month, it allows you to create a custom GPT. You use their underlying LLM, but you feed your own source information to it, and then it can respond based on that source information. So I have a chatbot called Learn Geek AI, where you can just talk to it about learning and development strategy conversations. And it answers kind of like me, not quite the same as me, but my book is actually in it,
and I basically put a million words of content into it, and it comes back with the types of things that I would say.
Okay, and there's no cost to the users? They opened custom GPT access to anybody that has a ChatGPT account; you have to pay for it. So that's the no-frills example: I pay 20 bucks a month for the ability to create it, and anyone can get into it. No big cost model required. If I had to build something that was limited to a specific audience or hosted somewhere specifically, then we're getting into a more complex and expensive conversation.
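Under the hood, the "feed your own source information" pattern JD describes is retrieval: pick the passages of your 30-page document most relevant to a question and hand them to the model as context. The sketch below is a deliberately no-frills keyword-overlap version to show the shape of the idea; real custom-GPT backends use embedding search, and the function name and sample data here are illustrative.

```python
def top_passages(question, passages, k=2):
    """Score each passage by how many distinct question words (longer than
    three characters) it contains, then return the k best-scoring passages,
    highest overlap first. A crude stand-in for the embedding search a real
    retrieval-augmented chatbot performs before calling the LLM."""
    q_words = {w for w in question.lower().split() if len(w) > 3}
    scored = []
    for passage in passages:
        p_words = set(passage.lower().split())
        scored.append((len(q_words & p_words), passage))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [passage for score, passage in scored[:k] if score > 0]
```

The selected passages would then be pasted into the prompt ("Answer using only the following excerpts: ..."), which is what keeps the assistant grounded in the document instead of the model's general knowledge.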
Thank you. No problem.
What else is on y'all's minds about AI?
Tired of hearing about it? Excited about using it? Not quite sure?
It saves so much time for me, and I cheat. I probably said this before; I hate being recorded. So, I'm state government, and they're absolutely against it. But if they knew the time savings in terms of our curriculum design, or whatever it is we need to write, marketing materials... I can never really come up with enough justification for them to back off, whoever "them" is. Yeah, my general perspective is, it's kind of like the cloud, right?
Heavily regulated institutions were entirely against the cloud; it was all about on-prem technology. Then slowly that started to inch away, and now a lot of highly regulated institutions are in the cloud, because the technology evolved to a point where they could trust it, and they got tired of the cost and inefficiency of maintaining different instances of software locally. Some still very much do that, but it started to break away. I see this happening faster with this technology, not necessarily because the institutions will say, if only we had this efficiency. I don't think the technology companies are going to let them hold out as much, because the Microsofts of the world can carve off specific instances and specific capabilities and put them behind certain types of walls that different types of institutions trust. So while it may limit the options that heavily regulated organizations have, I think this adoption is going to move faster than previous adoptions have, because of the FOMO and the money involved. But,
yeah, I think it's still critical for us, even if you're in one of those institutions, to understand how the tech works — to be ready for when the door opens, even if it only opens a little. Because if we're not ready, the organization might make decisions for us in terms of how things get done, or what they can do with or without learning and development. Most of my conversations with organizations around AI start with the words "my company won't let me, dot, dot, dot." And it turns into a conversation about, well, what can we do to prepare as a function, and what can you do as a professional to prepare? Because, like in my case, I have a whole tool set of AI-enabled tools that are not part of my day job, because that's how I figure things out, instead of waiting for my team and my IT department and whatnot to evolve. We recently adopted Gemini as a digital-assistant-type tool, but I've been using digital assistants in some capacity, on different machines and different accounts, for a year and a half. So I moved faster than a lot of other people did, because I wasn't waiting for the company to give the okay. But I wasn't breaking any rules — that's the other thing. I wasn't doing the bad things. I was doing my own thing until it was time.
Yes — when we mentioned something like, oh, we really like Gemini, what does our IT department do? Block us from Gemini, right? So I move on to the next one. Now I'm in Perplexity, you know. And we do have the Microsoft everything, right? So Copilot, whatever — it'll probably be okay. I just need some good words. So thank you for using the example of the cloud. We were the last ones in there, but now it's trusted. Just prepare — that's what happens. Be ready.
Yep, there's too much money in AI for organizations to go backwards, or for IT departments to not find ways to enable organizations to adopt it. It's just too big a conversation at this point. Did I mention the links I dropped in the chat? I've been having the same conversation repeatedly for the last half hour, but in case it's a rerun: I dropped two links in the chat. One's learngeek.ai — that's my resource website for AI-related stuff. There's a list on that page of tools I use for different things, some people I recommend following, and some examples of using AI in different ways. The link to my ChatGPT bot is there too, so if you're curious, feel free to check it out. And the other one is jdwroteabook.com — my book website. There's a free chapter download on it, and the code is 1004; it's in the chat. It's a chapter about my perspective on how AI changes learning and development. If you put in the code where you find the button on the website, you get a PDF — basically a 12,000-word handout, if you're looking for some light reading. Partially written by my chatbot, and mostly written by me.
What else is on your minds, AI and learning and development?
What are some of the coolest use cases you've seen recently in learning and development, using AI as your partner? I mean, the most exciting thing for me is still translation. My entire career has been around frontline employees. As an example, at Disney, we translated our training material into English, Spanish, and usually Vietnamese for Walt Disney World. Beyond that, it took a lot of time and was really expensive to try to reach a larger audience. When I was a custodial manager, a vast majority of my team members were Haitian Creole speakers, so a lot of people had a hard time understanding information, let alone trying to learn information or use information, right? So just the fact that translation is getting this much better — it's not perfect, but it's very serviceable — is where I'd say a lot of tools are now. Different tools use different types of systems and technology, but broadly speaking, all of them are getting to the point where, for conversational purposes — informal sharing, like messaging and emails — they're very good in a large number of languages. The tech I use handles a number of languages solidly in the 60s at this point. For stuff that requires approval, you still need someone to review and tweak the content. But language translation is a big one for me, and it's the reason I use avatars — because that's the other big conversation right now that everyone seems to want to talk about: fake people. I
don't completely get it, because when I was an instructional designer and content developer, I didn't usually think, you know what, I wish I had a talking head — that would really make this course. Usually, when you put a person on screen, I find the value is in who the person is — it's their voice adding value, not just a person saying the words. That said, I have multiple platforms where I have multiple versions of myself, usually because I want to speak a different language. So I did a session in South Korea: I had my avatar introduce me in Korean, and I could translate everything in the session into Korean myself using AI, which is not something I could do in the past. That type of stuff I find particularly exciting, in addition to the digital assistant conversation — the whole conversation around, what if you had a really smart coworker? Not perfect, but always there, and you could always ask them, and they could always help you do stuff. If that works out, I think it fundamentally changes the idea of learning. What do you have to learn now, versus what can you rely on your friend for? And how many times will we slide the wrong direction with that and over-rotate to the digital assistant, versus helping people learn how to do things and solve problems themselves?
But I think there are some fundamental capabilities that are really more exciting than the overall content-generation story and things like that. If you want to see me talk to an avatar: learngeek.ai.
I did it in March 2022 and 2023 — that's when I had a conversation with AI. It was really cool back then. Now everyone can do that.
Data analytics seems like the most challenging aspect of this to me, and an area where I haven't seen a lot of L&D people talk. We tend to talk about, like, you know, Storyline, or, you know, whatever Articulate has, right? We talk about how to build things. But wow — what if I had all the data the company has access to, and I could use AI and machine learning to actually understand and interpret it? You talk about being able to evaluate the effectiveness of your learning, but also getting into predictive analytics. That, to me, seems like a very powerful application. It just feels further out — further from my reach than simply using the tools that I already have,
you know, available to me. Yeah, I absolutely agree. And I think a lot of the challenge actually has nothing to do with the tech — a lot of it has to do with existing measurement practices and how data is managed. So many L&D teams are still in a silo when it comes to data and measurement that there's only so much we can do, even if the technology gets significantly better, or if we go from a world of ad hoc reporting — where there's a consistent set of reports you generate from the data — to a world where you can query your data and ask for insights, or it can automatically share insights. Imagine you could go into your LMS and just ask, "Who from this team and this team hasn't completed this training yet?" as opposed to having to generate a report about those types of things. So before we even get to the tech side, it's: how well are we connected to the data teams within our organization, and to the types of data we'd need to connect those dots? Because I know a lot of companies where L&D wants to do that kind of thing, but they just don't have the trust and relationship with their sales team or their business operations team to get access to data, or to share their data with the rest of the organization. So before we even get to the AI-enabled side of the story, it's: how are our measurement practices doing? Because course completions and test scores are only going to get us so far, even with advanced technology. Do we have access or connections to more rigorous data sets and a bigger data warehouse that we could take advantage of, to actually start to draw lines between the value of the different things we do?

I mean, think about what information we can get from sales and marketing, but also customer support — they were talking about thousands of qualitative statements. If I had AI to analyze all of that and give me the insights, that would be a different game.

Contact centers are going to be way ahead of this conversation, because they've been data mines for years. Ten years ago, when I worked in contact centers, we were already using AI-enabled technology to do basic analysis of calls — to understand changes in tone, when certain keywords were mentioned, those types of things. Take that level of analytics and start to connect the dots to figure out who might need help, when, and in what area. I think contact centers are going to teach us a lot, because they're so data-rich compared to a lot of workplaces that just don't measure much about what people do, beyond the end outcome and maybe what they've consumed or completed to get there.
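The LMS example above — asking "who from this team hasn't completed this training yet?" instead of generating a report — is, underneath, a simple query once completion data is accessible. A minimal sketch in plain Python, with made-up names and records (everything here is hypothetical, not any real LMS API):

```python
# Hypothetical LMS records: who has completed a given course.
# In a real system this data would come from the LMS or a data warehouse.
completions = {
    "Safety 101": {"Ana", "Ben", "Cam"},
}

roster = {
    "Sales":   ["Ana", "Ben", "Dee"],
    "Support": ["Cam", "Eli"],
}

def not_yet_completed(course, teams):
    """People on the given teams with no completion record for `course`."""
    done = completions.get(course, set())
    return [name
            for team in teams
            for name in roster[team]
            if name not in done]

# "Who from Sales and Support hasn't completed Safety 101 yet?"
print(not_yet_completed("Safety 101", ["Sales", "Support"]))  # → ['Dee', 'Eli']
```

An AI assistant layered over the data would just translate the natural-language question into a query like this; the hard part, as noted above, is getting L&D data out of its silo in the first place.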
We have eight seconds — I've enjoyed this time with you. Thank you so much. Hopefully the resources are helpful. Thanks for joining me, everybody. Back to the big room?
Anyone want to volunteer?
Hello, hello, everyone coming back?
Well, so main takeaways, who's going to volunteer something? The rest of you put it into the chat for me.
Did you have an aha moment?
Shannon, I'll share. For me, it's — I don't want to say validation — reassurance that it's okay to continue to try to find that separation between work and personal life, and to be intentional about protecting that time and not feeling guilty about it. Yes, right? We do get hung up on that, where we feel like we have to be doing something, and if we're not doing something, we should feel bad about not doing something. We all end up on that wheel, right?
Awesome. Anyone else — someone from JD's or Elena's session?
There she is. I'm like, where did Elena go? On my screen? There you are.
And while you guys are thinking about that, I put the resources into the chat — so, JD's book, Elena's book, and the "Nine Ways to Find Stillness" guide from Chris. I hope you found this exercise beneficial and useful — that you were able to gather some ideas, or some reassurance, or just something that makes you say, you know what, I want to try this.
And thank you, Candice.
Katie's point about call centers being at the front of qualitative AI analytics — okay, I missed that part, so I'm going to trust JD. Would you want to sum that up?
We were just talking about the potential for advanced analytics through artificial intelligence, and the idea that contact centers are going to help teach us a lot, because they're such data-rich environments — everything is measured. So in terms of connecting the dots between how we support people and the outcomes they see in their performance, contact centers are an interesting place to start. Right — yes, for sure. Sales as well, right? Sales is a data-rich environment, especially if you use a formalized CRM-type tool. And Elena's question — "learning is going to be the means to what end?" — I love that question so much, you know, because it really is. In my perspective, we're expecting change — some sort of behavior shift or some sort of skill adjustment. What is it? What is it that you're looking for, right?
I mean, yes — that's all I have. Just: yes.
Right? I mean, at the end of the day, we just need to build the practice of asking these kinds of questions so that we can get the information we need — because I heard a lot of people aren't getting the information they need to create a learning solution to some kind of organizational problem. So that's a great way to frame it, and there are many others, but this one's really simple.
Thank you for that. Yes — and I appreciate that everybody's putting something in here: the reminder to work backwards when planning to measure, yeah; begin with the end in mind; or, you know, what is it that you want to achieve, and how are we going to get there? I think those are all great starting points for trying to achieve what we hope to achieve from an L&D environment. And from that, I'm going to actually end this with Chris. I think, Chris, you might have some good words of wisdom as we go into the holiday season, as time starts getting busier around us. What would you leave us with? Oh, I have so many thoughts. Go out in the forest. There you go, everybody. No — I think the biggest thing — in Orlando? Find that forest. There are trees everywhere. The sun, nature, is all around you, people. No,
while I highly recommend getting out in nature, I think the biggest thing I'd love to share is: give yourself permission. Give yourself permission to take a break, to do something that refreshes and recharges you. The biggest thing is finding what works for you. We've had a lot of great conversations and found out everything from petting dogs to rebuilding a VW Beetle. Again — whatever recharges and refreshes you, make time for that these holidays. All the time, really, but especially during this really busy, busy, busy season.
You're the only one who can champion for yourself. Oh — there's a statement.
Yeah, I'll get a little morbid. I mean, you're the only person that's going to be with yourself from birth through death. So take care of yourself.
Yeah. Those are wonderful words of wisdom, really, and I'm going to leave it at that. I hope you guys enjoyed yourselves; I hope you learned something new. And again, for those of you who haven't been with us before: the next Coffee Chat is not this Friday, not next Friday, but the Friday after — they happen every other Friday. The next one is all about building a creative mindset, and some tools we can use in our virtual and live learning situations to encourage that kind of creativity and creative thought. I hope to see you all there. On that note, everybody, have a great and wonderful weekend. Feel free to share whatever it is you're going to be doing this weekend. Take care, and have a wonderful rest of your day.
Hey.
Well, hello everyone. It is lovely to see each one of you. As Shannon mentioned, I'm a former corporate talent development leader turned forest bathing guide, and I'm here to talk about stepping away from all the busyness — especially as not only work but everything seems to speed up once we hit Halloween. We get Halloween; in the US we get Thanksgiving; and then we roll into all the holidays — Christmas, Kwanzaa, Hanukkah, New Year's. And, by the way, a lot of us in our businesses are trying to squeeze in all those last things, whether it's corporate or our own business, before the end of the year. So with all of that, I love to challenge people to really think about what they're doing to take breaks, to recharge, to step away from all that busyness and actually do something that recharges them.
So I'll open it up for questions before I dive into some conversations and some tips and tricks. What questions do you have about stepping away from the busyness,
Or finding things to do that refresh and recharge you.
I'll start. How do you encourage not just yourself, but also your business — or at least your department, whatever you're able to influence — to have that kind of mindset? Because everything I read says mental health is such an issue in the workplace today, and if we don't do this, it's going to have an effect: higher health care costs. It already is, et cetera, et cetera. What can we do to influence, however we can, the importance of this? Yeah, it's a great question, and I like to go at it from a couple of different angles. So, to your point, it is about well-being and mental health — well-being from the perspective of,
if we don't employ this, then we will have more sick days, and more sick days lead to XYZ, and turnover and retention. But it's not just about that. It's also about productivity.
This is about: when we allow ourselves to take a step back, we can become more focused, clearer thinkers, more creative. Think about how many times you've sat in front of your computer just trying to get that project done, or this paper written — oh gosh, how many times did I have to do presentations, and I'm just working through it and working through it, trying to think my way through it — and then I take a walk. Or how many times have you gotten an idea in the shower? It's because you're giving your brain time to relax. So talk about it not just from a well-being and mental health perspective, but also a productivity perspective: we as an organization, a department, a team will be able to think more clearly, be more focused, and be more creative thinkers if we build in this time to take a break, to recharge, and to come back refreshed, so that we can think better and have our products be better and more focused. Does that help a little bit, Amy?
Yeah, that's great. Thank you. And I'm always a big fan of tying things to the bottom line, because that's what speaks to leadership. So when we talk about well-being, we also need to talk about how it can impact retention rates, turnover, sick days. Yes, it's definitely a people issue, but it's also a financial issue. Same thing with productivity. I want my people to shine, and they can't — they can't shine if they're always stressed out, when their cortisol levels are spiking, when they can't think straight. Let's just take a break. Use opportunities to rest and recharge throughout the day so that we can be more productive, and then, again, tie it to the bottom line. That would be my advice.
You're welcome. Other questions or thoughts on that? All right.
Well, I'm going to ask a question, how many of you feel that you rest and recharge enough?
You can answer either in the chat. You can call it out. You can just shake your head.
Emojis work too. I'm seeing a lot of no's
Sometimes, yeah, it can be tough, Steven. You know, even I — I'm a forest bathing guide, and I did not go into the forest this week. Well, I did — I went on a baby forest visit. But I need a deep immersion every week, or else I'm, like — so it can be hard.
How many of you do things regularly that do recharge you?
All right, so Jessica does — I like that, Jessica: "I'm still working on resetting my mindset when it comes to rest leading to productivity, and not feeling guilty about it." Yeah, that guilt. Mine comes from childhood; I don't know where everybody else's comes from. It was really interesting — I was talking to a friend, and they said, think about it: this has been around, especially here in America, since America was founded. It was always about, what are we doing? Think about the sayings — I used to hear this one: "idle hands are the devil's workshop." Or, oh, you know, we've got to be productive, we've got to produce. That's been around since this country was founded, so it's just kind of in our DNA, almost.
Amy, I love that you go to Zumba three times a week and visit gardens — both restore you. And you started yoga, Jessica, about a year ago, and it's been really helpful. Yeah. So, you know, when it comes to the guilt, here's the thing I like to tell people:
You are the only person that will be with yourself from birth through death,
so you owe it to yourself to take care of yourself.
I know each one of you probably has these really great plans — whether it's helping your family, giving back to the community, or creating some really cool things — but how are you going to do that if you're burnt out? If you are, as I so often was, lying on the couch watching Grey's Anatomy? Which — great show. Not sorry about loving Grey's Anatomy. But I don't think I need to watch each episode a million times. So — Michelle, I love that you relax by creating images in Midjourney. Yeah, you don't have to meditate; you don't have to journal. There are a lot of people out there who say that's what you need to do. I love those practices — they're really relaxing and recharging to me —
but they may not be for you. So Jessica's found yoga; Michelle relaxes by creating images in Midjourney. You two can talk about Midjourney — I haven't used it before. I know what it is, but I have not used it. That's a little intimidating to me.
Amy goes to Zumba. Steven and Jeff, what do you do to relax and recharge? Go to the lake — I fully approve of that, Steven. I fully approve of that. In the garage, rebuilding a '68 VW Beetle — I love that. See, to you that's probably almost meditative, because it's just something you do that you really enjoy. That's why, when people ask me, "Should I journal? Should I meditate?" I say: I don't know.
Do you enjoy meditating? Yeah. For me, it's just trying to find a way to get your brain to do something different and get away from work. And I know, for us, transitioning to working at home, we find ourselves working 10- and 12-hour days, because it's easy to keep working, or work early, or work late while the kids are doing homework and you're still working. In some regards, I think working in the office was better for me, because when it was time for me to leave, I walked out and left work there. So I have to take a breath, and get my team and my supervisors to realize that we still need some of our off time. Finding a way to get my brain to do something different has been the biggest help for me.

Yeah — and Amy, that's what I used to suggest to people. So, Jeff, when I first went remote back in 2008, I was doing the same thing. I was tied to my computer. I was working for a company based out in LA — I'm in Maryland — and I'd be answering the phone and going, whoa, because I used to have that commute. So, like Amy says, get off the computer and out of the house. What I would do is a virtual commute, and to this day I still do that, even though I'm running my own business. I have a "this is when I'm stopping": I close the computer and actually get out of the house, or, if it's really raining, I go in the basement and take a walk on my treadmill. You can work on your car. Whatever it is — what's that virtual commute that gives you a transition period?

Thank you, that's great — I'm going to use that with my team. "Virtual commute," I think, is something we can identify with, yeah, because we're so used to it; all we need is that.

And it could be a fun little team conversation: what's your virtual commute? What are you going to do on your virtual commute? I know people who, because they work from home and have families with kids, just shut the door. I have somebody who reads magazines, reads a novel, colors — because if they go outside, the kids will go "Mom! Mom!" So they just shut their door and stay in their office, but they're doing something completely different. That's something I know other people do.
I get outside, but I also color. And to your point, it's getting that break — and reframing it as a virtual commute could be a really great conversation. "What do you do on your virtual commute?" could be a great lunch-and-learn, or a little team bonding before meetings. Thank you. You're welcome.
"Working on trying to keep my work at work and my home at home." Yes, Steven, it is tough. I've been there. I hated the first two years working from home — and then I started teaching people how to work from home, so I'm not sure what happened. But yeah, it's really about finding the flow that works for you, and knowing when to turn off. What does that look like for you? For me, it's actually turning the computer off — my big computer. Even though I come into this room to do other things, this computer is my work computer. That's it. That's all. And then I make sure I shift, too: I pick up my iPad, or I have a different computer, and I make that shift. So think about what you can do to shift, and then do something that, to Jeff's point, gets your brain completely off work — your brain and your body. Because if you stay sitting in the same chair, even doing something different, your brain might still think it's at work. So maybe it's getting up out of the chair; maybe it's moving to a different room to read.
Those are just a couple of suggestions, and feel free to share in the chat any other ideas you use to separate your day when you work from home. Wow — it's getting close to lunchtime. I'm not thinking correctly right now.
It sounds simple, but for me, I transition by playing with the dog. He's actually sleeping on the chair right next to me right now. He's not necessarily a COVID dog, but kind of — so he's very much attached to my side and wants my attention. He does a good job of relaxing in the office, but it's that "all right, I need to give you some attention," and it gets me up and moving and away from my desk for a while, too. That's a great idea — I love that. Yeah, pets can be a really great way to do that. It was nice to have you here, Jeff,
but in the handouts or resources that Shannon is going to be giving, there's a link to something of mine: "Nine Ways to Find Stillness." Stillness is whatever it means to you — it doesn't mean sitting still. There are some ideas in there you might find helpful; take them, try them out, and see if they work for you — everything from going outside to, gosh, reading, journaling. I crochet. That's what works for me. Crochet is my answer to rebuilding a VW in a garage.
Thank you, Chris, have a good day. You're welcome.
All right, everyone. Well, thank you so much for hanging out. And I invite you to go back to the main room so you can go ahead and hang out with the group before Shannon closes down the coffee chat. But it was lovely to meet everyone. Thank you. Thank you.
And now,
oh, you did it.
It's often hard to remember to do the small things, and then when we do them, it's like a big win, right? Awesome. Brandy, Chet, Maureen, Victoria, Leslie — is there anybody else? We have seven people, so some folks might not be on camera. Welcome as well. So, we have 15 minutes. What I'd love to do is just let you share what you need so that I can guide you — and just note that almost everybody will have similar questions and needs, so don't be shy. I just did four workshops at DevLearn, and people would wait until the end, wait in line, and ask me a question that I guarantee everybody else in the large room would have loved to hear the answer to. So don't be shy — fire away. I'm going to shut up and respond however I can help.
Like, any types of questions, or is there a specific topic we need to stick to? So, my expertise, as you heard, is in measurement and making measurement easier. I think the best use of the 15 minutes would be: what are you currently charged with measuring — an accountability or an action you have to take — and where are you maybe stuck? Where you're like, oh, I'd like to do this, but I'm not, I don't know how, I have a barrier I'd like to remove. Share that with me, and I can probably help.
So I have a question. I'm a learning consultant at a pharma company, and as part of our portfolio planning we need to identify the business metrics — the KPIs. So, first of all, you know, we get the KPIs — we may get something — but we're not necessarily, as an organization, approaching all of the things that are going to impact that KPI. So it seems superficial to say, oh, we're tracking it to the business KPIs. I don't know what advice you have for that, because the learning measurement — like the Kirkpatrick levels one through five — is done separately. In my role, it has to be the business KPIs. So what are some ways to either make that link easier, or add the caveats that, okay, this is just one piece of it? Because the business is also hesitant to share that with us.

Yeah. So I suggest — and I'm thinking from the Kirkpatrick model — that you don't do level one or two at all, because they're not indicators of the actual change you want to track, and that you start with level three. Level three is what I call the performance factor, and that's really where training has the most influence anyway — training when designed right. We're not talking about lunch-and-learns, we're not talking about one-day workshops, we're not even talking about retreats. Those are not going to change behavior — I'm sorry. Behavior change, influencing performance, requires a much more robust design, which involves larger performance support. So the trainings that are your leadership development or manager trainings, where people are engaged in a larger, longer-term initiative — those are ripe for performance-focused measurement. And to your question, Maureen: usually training is meant to solve a problem, so really get clear on what that problem is. If the KPI is sales, or the KPI is addressing conflict and having more productive teams, the question then is: how do people address that problem? With sales, maybe sales are struggling because people aren't following the process that was designed. They're not starting with an open-ended question. They're not quickly finding the problem for the customer.
They're not using reflective listening and reflecting back; they're just pushing a product, right? There's probably some expected performance that would help move the KPI. We have to drill down into that in an observable way to then be able to measure growth in it. So with the sales example: if we know that when people follow the sales-conversation protocol with fidelity, it leads to higher closing rates — which is a KPI — we have to be able to measure, in an observable way, how people are currently following the process, where they're getting stuck or stopped, and how training can support them. And then we measure before the program, so we have a baseline; during the program at a few milestones; and after the program, to show growth throughout the learning experience. If we can show growth in a key performance piece alongside participation in training, we have a higher correlation — never causation, but a higher correlation — between people's participation in training, the changes in performance we track along the way, and, of course, how that relates to increases in sales, if that's the KPI. Or productivity: maybe productivity is the problem, and we noticed that high conflict in the teams is inhibiting it. The assumption — a hypothesis, back to what we talked about before — is that if we solve for the conflict, we get higher productivity. So everything goes back to: what's the problem we can solve through training and performance change? Then get really clear on the performance expectation, and measure at baseline, throughout the learning, and at a few moments afterward. We want to see a trend line — that's all we're looking for. We want to see a trend line of individual performance improving.
The individuals should be growing as the training program progresses, and that gives us a high strength of relationship between training and the KPI. With that performance factor linking training completion and the KPI, we have to see that change in performance. But if we don't design learning to influence that performance, then that correlation can never be possible.
Great question, does that help? Is there anything else I can add in answering that?
It does help.
I think the challenge is that it's like pulling teeth even to get from the business what the KPIs are, because the training was a specific directive from somebody higher up, who also doesn't provide those specific directives, but because of maybe internal politics, we're going to do it anyway.
It's just an ongoing challenge.
The political dynamics that get in the way of us being effective are very pervasive, and the best thing that we can do is to change the way that we talk to people when they request training. So I suggest this: when someone asks for training, we ask back, happy to help. What change would you like to see after this initiative is over?
And if somebody can't answer that, if they don't know, then I would say,
I'm not going to be able to provide the best solution for you without the insight into what kind of change we're trying to influence. And I know in some cases it's difficult, it might be putting you out of a job to consistently push back, because the organizational dynamic isn't one that's used to that, but we just gotta start
Getting people used to that line of questioning. And if somebody says, oh, this isn't a change initiative, I just need to disseminate X information, well, then that doesn't need to be measured with the performance factor. We just need to give the information in a way that makes sense, in as timely a way as possible, and measure that they got the information. So I'm always looking for: are people wanting change out of our initiative, or do they just need to disseminate information, which is usually your compliance programming? That can guide the conversation going forward. But your biggest challenge will be pushing back when people say, I don't know, because then that pulling-teeth thing comes in: well, I'm not really sure of the answer to that. And it's then our job to say, if you can't give me this really important information, I'm not going to be able to deliver an effective solution. How do you want to navigate that?
It's challenging and tough, but that's really what we have to do there.
Yeah. What else?
I think you walk a fine line, especially if you're a consultant,
but everything you're saying, I think, kind of taps into consulting skill. That's right, it's called consulting skill, yeah.
Chet: Sometimes stakeholders will ask for,
I know you can have like, check the box, you know, situations.
But also I've had experience where
it's difficult for someone to define exactly what the outcome should be, but the company may want to have training for, say, a product in order to be able to sell that product. They need to be able to say, we have introductory training or overview training for this product. So as an L&D team, you're like, why am I creating this? But from a company perspective, they needed to sell the product, and uncovering that can be really challenging. Yeah, I take it back to the problem, definitely. As a consultant, I always lead with, I'd love to help. So someone comes to me and says whatever they say, and I respond: happy to help you. Can I ask you a few questions to help me give you the solution that I think would be best for what you're needing? So in that case of product training: hey, tell me more about why the product isn't selling. What are some of the bottlenecks or gaps going on that make it really difficult to sell the product? And if somebody says, you know, we've got a big group of new people on the team and they just don't know the product well enough to be able to sell it, then say, okay, so maybe the hypothesis is, if we increase people's knowledge of the product, then they should sell more. Would that be accurate? And if the person says, yeah, that makes sense, then I would come back with: but what happens if we discover that we teach people about the product, their knowledge increases, but product sales don't change? How might we address that? Trying to be more proactive, because we all know that knowledge is never enough. To Maureen's point, there are a lot of other factors that influence KPIs. Knowledge is one lever to pull, but likely not enough to get where we need to go with the organization's strategic priorities. So I like to be proactive and think, okay, sure, knowledge is the easiest lever to pull. It truly is: give people information, test them on it.
It's an easy lever, that's why we go to it, but it's not always going to be the answer. So let's brainstorm in advance: what are some other levers that we could pull in supporting people to sell more products? Maybe they don't know how to handle objections. All the product knowledge in the world isn't going to equip them, unless they're super, super passionate about the product, to navigate objections with customers. That's a whole different training than training them on knowledge of the product. So try to get at some of those levers in advance that might influence the KPI, which in this case is product sales.
How then can we support, as L&D, a comprehensive solution to that, and also find a way of measuring it?
Because I can then measure handling objections and product knowledge at the same time, and have a few other data points: oh, we're selling more product, what's going on? People are handling objections better.
So try to get some of those levers in advance and be prepared to address them in training, or with some other solution alongside it.
So I want to respond to Leslie's question, because we have two minutes. Leslie basically says, I'm doing a pilot, and what could I be measuring? Leslie, this is a great example of working backwards from what you want the pilot to tell you. You're doing a pilot for a reason: you're testing it, you want to see something, and you're trying to figure out, do we want to roll this out, do we want to adopt this more widely? So what information do you need from the pilot to help you validate whether to continue investing in this program and rolling it out further? Working backwards: what data do we need to test and validate whether this was a good investment, and then figure out how you would ask questions and collect data to give you that insight.
And Maureen, if somebody can't articulate what they want, it helps to ask, what would happen if we did nothing? Yeah. A good question I heard at DevLearn, actually, is to say: this learning is going to be a means to what end? Because learning is a tool. It's a tool to facilitate some kind of change, usually. So in a situation where somebody doesn't know what they want, or they can't clarify and articulate it, say, okay, learning is a means to what end for you, and get them to fill in the blank there. That's another way to get the intake information we need. If we don't get the right information, we don't have a good solution, and we don't have a good strategy for measurement. So try out a few different ways to get the information that we need; you'll find that one method might work better than another. How L&D... oh, am I in a new room?
I got kicked out. JD, and... there, he just got kicked out. Okay, there we go.
It was funny, I was in the middle of a sentence too as I was being moved to the new room, so I feel JD's pain. Hello. Is it just the two of you? Are there other folks? Six people, so we've got some folks off camera. Well, hello, everyone. We've got 15 minutes; that time goes by so fast. So what I did with the last group is just let people ask questions. To remind you, I'm a data, measurement, and evaluation nerd. I love it. I've been doing this my entire career, and there's probably no question I can't answer. So, big picture: what are you trying to measure? Where are you feeling stuck? When do you feel overwhelmed or needing help? Frame that into a question for me. How can I help?
There's never a bad question. All questions are good questions. Jeff: Hello, I'll start off, yeah.
So I have a couple of management and program analysts that work for me. I work for a federal government agency. We provide training to state and local corrections staff and departments of corrections, sheriff's departments, community services, probation, parole.
The issue that we have trouble with, I would say, is when it comes to data,
we always want to prove that what we're doing is working,
And we try to get data from the agencies, or data from our programs. And I think, from some aspects, if you're teaching or training in a certain program, it's easy to measure your training objectives, or measure the goals. Where we struggle is: we can take the same training program to six different agencies, but some of the issues with the follow-up, or the assessment after they've implemented, is that they all have different objectives for what they're hoping the training would do. So, for example, if we're taking a leadership program into an agency, one agency may have a goal of reducing turnover. Another one may have a goal of improved promotional opportunities for inside staff. Another one may have a goal of reduced discipline reports from the prisoners or inmates in their care. And so there are a lot of different variables going into this. But how do we take an approach of being able to have good data to take to our boss, and ultimately to Congress, to show that what we're doing is working, when so many different people have different expectations of the outcomes? So if I understand correctly, is the leadership program a standard program that you go and deliver? Sort of. It's not necessarily an off-the-shelf curriculum. We sometimes tailor that curriculum to the needs of the agency, and a lot of times we're teaching them how to create leadership programs and leadership offerings within their own agency. So a lot of it's technical-assistance-type programs, where we're going in and helping them get started and things like that. Some of our programs are off the shelf, so we have our own objectives for what we hope to transfer, but then in their implementation they sometimes have different expectations. Yeah. Well, what I hear, and this came up as almost the exact same question that kicked off the last group, so there's some synergy going on here, what I hear is that the problem in the way is that the goals for your different agencies are all over the place. And what's difficult is that a canned, if you will, or standardized or off-the-shelf program, even if it's adjusted a little bit, if it's not able to address the problem related to the goals that each agency or group has, then the training is never going to be effective. So here's what I shared with the previous group: we either are going to be delivering training that is disseminating information, with no change objective on the other side of that,
or we're going to be doing an initiative that's meant to facilitate change, but then that change is probably going to be solving a unique problem for the organization. And so our training is an intervention or a solution for a very unique problem, and in order to prove that our training was effective, we should see that problem get solved, or see a trend toward that problem changing. So what I hear, Jeff, is that maybe the approach you're taking needs to be a little more consultative, as in: you spend a little more time understanding what the priority is for this agency or this group and what's preventing them from achieving their goals, and then do a more tailored, customized approach that really targets what the problem is and specifically how people's performance can help to solve it. I think that's really the challenge, that we over-emphasize knowledge and topics and information, but we don't drill down enough into, well, how are people going to act and behave and think differently to help us solve whatever the problem is? And then making sure that our training is designed to really influence changes in thinking, behaviors, and activities, because that's where the measurement is. You know, measurement's actually really easy when you know the problem you're solving. That's a metric: if we are successful, that problem should look different on the other side of our program. And if it isn't changed, it doesn't mean we weren't successful. Maybe it means that training wasn't the solution. Maybe it means some other intervention was needed. But we don't know that unless we try it out, or we do a really deep intake and get a sense of, well, what's the real root cause of the problem, and is performance truly involved in that problem or not? Is it a behavioral challenge, or is it something else? So I think, Jeff, in some ways, it's more of a strategy that you have to change to be able to have a greater impact.
And I don't know how possible that is based upon the structure and the dynamics of your department.
Yeah, I think you helped answer the question. You made a good point: we over-emphasize topics and other things. And I think we're
trying to focus on a target as if it's fixed, but it's not a fixed target when you go out into the field. And so we need to spend some more time
being that consultant, rather than just developing a curriculum that's start to finish, cover to cover. Right? Yeah. What I think is really, really helpful is that we need to drill down on the performance and support people in that. What I shared earlier is a good example: if a company's not selling products well, why, and what can people do differently to better sell the product? And if the problem is that people aren't following the protocol of the sales and product conversations, we need data that shows where people are falling off the expected protocol. Are they not leading with an open-ended question? Are they not reflecting back what the customer is saying on the calls? There are some really specific behaviors that we need to target, and then our training is helping people to build confidence and capabilities in those behaviors. But those behaviors have to be a solution to a problem. We could train people all day long on anything, but if it's not going to be a solution to the problem, well, that's not going to be effective despite how good the program was.
A lot of our measurement challenges actually boil down to learning design challenges.
Yes, what else? Thank you. Yeah, what else? What's on your mind? How can I help?
Measurement, good data, learning design that leads to good measurement. What else is getting in the way for you?
I can go. Yeah, thank you. I'm in professional development for a school district, and we do leadership classes. One of the things we do is a continuous quality improvement project; we make our participants do one, and it causes a lot of stress. Every year I try to make it less stressful, but I don't know how to measure stress, like, did this class feel less stressful this year than the class in prior years? So how do you gauge stress level? Yeah. Well, before I answer that, I want to go back to something I think is even more important: why do the continuous improvement project at all? What value does it provide, what is it doing that is essential? Because maybe it's worth reevaluating doing it, if that's what's causing the stress,
and it might not matter how much you... you already know it's causing stress. You're measuring it, because you're hearing people give you that feedback. That's a way of measuring stress: hearing a theme in people's feedback consistently. That's enough data to tell you, let me take a look at this.
So it doesn't sound like it's a measurement issue. It sounds like it's a question of what's triggering the stress, and how we then mitigate that trigger. Okay, thank you.
Yeah, what else? We've got a couple more minutes.
What gets in your way? How might we remove that roadblock?
I'll jump in, and I apologize for joining late. Welcome. Shannon was nice enough to add me into a breakout room. So I am a learning analyst; I work for State Farm. And for me, it's about the constant pace of change, right? As processes and procedures change, we are kind of consuming those from other business areas, and being able to get a benchmark, a baseline, and work forward from that is really challenging with so many variables. So I'm not sure if you've faced that before, or if you have any guidance for me, because we're entering a whole new phase of work, and I'd love any insights that you can share. Yeah, well, a good question is so simple, but it's: what are we doing and why?
As a training and learning department, what are we doing and why? What's becoming possible, or what do we hope is possible, because people are engaging with our programs or policies? What are we hoping to do and why? And maybe, if change is a constant, and you're not the only one, Jessica, so many people talk about how much change is happening and how to navigate it, it's even worth asking: if change is the norm, kind of to Cynthia's question, if this is the culture and environment we're navigating, what's the trigger of the change? How is that influencing our workflows, and how can we better go with the flow of constant change? But all of that is answered by: what are we doing and why? What is our call to action? What's our mission? What's the most critical thing that we're doing in supporting the organization, and how can we continue that even in the face of change?
Yeah, it'll be interesting to see, because some of that comes from above, right? So as you're having those conversations, and I'm trying to react to what the ask is: the more granular we get, the more detailed and specific the processes, the more cost and time is attached to that. So what are those measurements, and how do we make sure that we're communicating the right information? So thank you for that. The thing to listen for is: what are the strategic priorities? What are the strategic priorities for your top-level leadership? What are the strategic priorities for your business units, the different departments that you might serve? What's the priority for your own learning function and department? And this came up at DevLearn: sometimes our intake process needs to take all three of those priority levels into consideration when weighing a request for training.
How do we weight a request within all the strategic priorities at these different levels, and then prioritize the work that we do based on what most closely aligns with all of the priorities? Then you have really good data and a system that helps you prioritize the work you do within the strategic priorities. I had someone at DevLearn say, we get pushback, we've been over-prioritizing X, Y, Z. But if you can say, here's how we've been weighting requests, based on these strategic priorities, and give that to your supervisor or whomever, then even if it isn't working, at least sharing your process and the data you're using to make those decisions gives you evidence, versus it being arbitrary. You get blamed for being bad, and it's like, no, I've been really thoughtful.
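The weighting idea above can be made concrete with a few lines of arithmetic. This is a purely illustrative sketch; the priority levels, weights, ratings, and request names are all invented, not part of any real intake system:

```python
# Hypothetical sketch: score each training request against strategic
# priorities at three levels (organization, business unit, L&D), then
# rank requests so prioritization is evidence-based, not arbitrary.

# Relative importance of each priority level (invented; sums to 1.0).
LEVEL_WEIGHTS = {"org": 0.5, "business_unit": 0.3, "l_and_d": 0.2}

def priority_score(alignment):
    """alignment: 0-5 rating of how well a request supports each level."""
    return sum(LEVEL_WEIGHTS[level] * rating
               for level, rating in alignment.items())

# Two incoming requests with invented alignment ratings.
requests = {
    "sales protocol coaching": {"org": 5, "business_unit": 4, "l_and_d": 3},
    "generic time management": {"org": 1, "business_unit": 2, "l_and_d": 2},
}

# Rank highest-scoring requests first.
ranked = sorted(requests, key=lambda r: priority_score(requests[r]),
                reverse=True)
for name in ranked:
    print(f"{name}: {priority_score(requests[name]):.1f}")
```

The value isn't in the exact weights, which leadership should help set; it's that the scoring rubric itself becomes the evidence you can hand to a supervisor when prioritization is questioned.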
So, Shannon, we're out of time. Thank you so much, that helps. We're looking at our intake process again, so this is timely. Wonderful. Well, good luck. I think I'm going to be popping out of here any moment. Thank you. Yes.
Here comes my replacement.
Hello, your final guest of the hour. It's great to be with everybody. I hope you've enjoyed your other talks. The way that I'm approaching our 15 minutes is just Q&A. To remind you, I am a data nerd, a measurement and evaluation specialist, and my goal is to help make measurement easier and simpler by giving you small, simple actions to take, versus trying to eat a whole apple at once: what's that first bite? So I'd like you to take a moment and think about what you are currently trying to measure and what's getting in the way, and frame that as a question, or share what's going on, and I can probably help. No question is a wrong question or a bad question; share whatever is going on, and that will help all of us. So feel free to get started right away so we can use our 15 minutes as best as possible.
I actually do have a question, because this is what I've brought up to both my bosses. Small company, I'm the first and only trainer, organizational development person, whatever, and it's probably more me than them. They're not asking me, but I feel like I don't know what to measure. I don't know what to measure that's going to make any difference to the company. The company is so small they don't have churn, because it's a great company to work for. The employees are all happy. They are just happy. We do a measurement every week with 15Five, and they're all happy. So I don't know. I just started a voluntary club where they can do voluntary learning, and the only things I can think to measure are what percentage of people are using it, what percentage of the courses they're completing, how many hours they're doing.
That's all just based on them, but I don't know how that relates to the company. Do you know what I mean? I absolutely do. So my question for you, and for everyone that might be in this position, is to reflect on: what are we doing and why?
What is the purpose of your training programs? At its core, is there a problem that those programs are trying to solve? Is there an opportunity gap, like, we need to get people ready for this thing? In onboarding, we want people to be ready to do their jobs and be productive as quickly as possible. So get really clear on what you are doing and why, and then how that supports the organization. And oftentimes it all boils down to performance. What do we need people to be doing, thinking, and acting to help the organization fulfill its mission or profit goals? That shows up in so many ways, whether it's the sales team or the marketing team. Everyone has performance expectations associated with their job that are meant to roll up to supporting the organization's mission or profitability. And so for me, the best measures come from getting a sense of what the problems are for the organization and how people are contributing to those problems. For example, I used to work for the National Domestic Violence Hotline. One of our biggest grant deliverables was to answer 80% of the calls and chats that would come in. That was one of our biggest KPIs; our grants were dependent on it. And if we had a dip below 80%, we instantly needed to figure out what was contributing to our inability to answer 80% of our calls and chats, diagnose it, and figure out: is it a systems issue, is it technology, is it bottlenecks of some sort? Is it an external issue, as in
Lifetime did a Christmas movie involving domestic violence and listed the domestic violence hotline, as in, if you're experiencing something like this, call them, and we didn't know, and we weren't staffed. So, is there an external influence that gets in the way of the organization's goals? Is it a systems issue, we need a technology update? Or is it people? People don't know something, people aren't confident in something, or they're just not doing the thing that they need to be doing, and why is that? Then training is a lever to pull to help improve performance. So there's sort of a strategy map that you want to create in terms of: what are the people expectations, and how well are they meeting them? As in my example, are we answering 80% of our calls and chats? That's an action that's really easy to see, measure, and track. Are we hitting it? Why or why not? And then, how can training help us to meet those goals on an ongoing basis? There are measures in all of that, but it all boils down to what's really important to the organization, and for us, that one metric was the 80% answer rate. So what are the metrics for your organization? What keeps your CEO up at night?
In our case, not answering 80% of those calls and chats kept our CEO up at night. So those are the kinds of metrics: knowing what those things are, how people can support or detract from meeting them, and what's going on that influences them. Those are things to measure, and then, how can training support people's development to better meet those expectations? We can measure that too, and that's the most strategic way to start measuring. And interestingly, none of that has to do with completion rates. All of it has to do with really hyper-strategic measures that go back to problems that get in the way of the organization fulfilling its purpose.
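A single critical metric like that answer rate is easy to monitor continuously. Here is a purely illustrative sketch; the monthly volumes are invented, standing in for whatever real contact data an organization would pull:

```python
# Hypothetical sketch: track the share of incoming contacts answered
# each month and flag any dip below the 80% target for investigation.
# All numbers are invented for illustration.

TARGET = 0.80

# (month, contacts received, contacts answered)
volume = [
    ("Jan", 12000, 10200),
    ("Feb", 11500, 9500),
    ("Mar", 15800, 11400),  # demand spike, e.g. after media coverage
]

rates = {}
for month, received, answered in volume:
    rates[month] = answered / received
    flag = "" if rates[month] >= TARGET else "  <-- below target, investigate"
    print(f"{month}: {rates[month]:.1%}{flag}")
```

The flag is only the starting point; as described above, the follow-up is diagnosing whether the dip is a systems issue, an external influence, or a people-performance issue that training could address.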
It's a very different way of thinking than what your LMS management people will tell you,
Hi, thank you for being here with us today.
I am an ardent follower of Dr. Will Thalheimer, and going through his performance-focused smile sheet concept was just very mind-altering. I really feel the better for it. I'm on the cusp of being able to present to my team a post-e-learning course survey. It does have to be a generalized survey; we cannot customize it for each of the courses we're making, because we're not collecting any data right now. The post-survey that I'll be proposing includes questioning set up as Dr. Thalheimer has suggested, rather than dissolving everything into a Likert scale, which means you're now weighting numbers instead of weighting answers. As I've attempted this previously, sometimes there's pushback, because people are not used to seeing questions that have so many options to choose from, even though each of those options is trying to measure an important data point. What have been some ways that you've been able to talk people through it, when they're seeing survey questions designed to get data that don't look like something pulled out of an LMS?
Right. Just to ask a clarifying question: is it the stakeholder, or is it the individual learner, the respondent, who's like, what's going on here?
Um, it's been... sorry, words are not coming well. My supervisor, so an internal stakeholder, in regards to, if we launch this, would it be successful? Where would the resistance be from the end user, the person who took the course? Because they're also not just going to see one through five, right? Well, it sounds like you've not done this before in this format, and it sounds like your supervisor is saying, I don't know that this will work, but they have no evidence to prove that it won't work. Is that kind of where you're at, Erica? We don't have data that says it won't work. Yeah.
We don't have any data that says it will, so we just kind of gotta try it, right? Yeah, yes, yeah. So what I would do is give them an example report, because I think you could get pushback from both places. Fill out a report with some dummy data, then show what that looks like and how it's going to tell a story of your effectiveness. I think it's sometimes difficult to translate answers to a survey, no matter how well they're written, into insights that are actionable. So show the difference: imagine if we did this with a one-through-five versus how we're proposing to do it now. Can you demonstrate the difference in the insights that you get, which then help you make better decisions or take better actions because the data is better collected? Just give a high-level demonstration, because at the end of the day, we want data to make better choices. How can we improve? How do we know how successful we were? And Will's work is so great because it gives us better data. But if somebody's not used to seeing that, more importantly, they're not used to using it or figuring out how to use it. So show them with some dummy data: hey, 50% of people responded this way to questions one through three, and this is what that tells us. Now imagine if we just used a Likert scale. He says "Likert"; I think you can say it either way. Imagine if the data was just agree-to-disagree. Literally show them the difference: let's do an agree-to-disagree scale, and let's do it this other Thalheimer way. Which data is more useful for us? And then let her, him, or them make that decision.
And thank you.
Yeah, encourage experimentation. Like, hey, we don't know this won't work until we try it. And you know what? If it doesn't work, we can adjust, and that's okay. We're trying something new to improve. Let me give it a whirl.
Love that. Thank you, doctor, appreciate that. Yeah, of course. What else? I think we have another minute left before we get to the final wrap-up.
And Trisha, yeah, you're so funny. You had mentioned: what are your leaders measuring, and how can you tie into that? That's absolutely right. Going back to Jean's question, what's that critical thing that keeps people up at night? That's exactly where to start getting data. And the other thing I would add, without giving you a ton of background, is that if you look at something they are measuring already and tie into that, it shows, in their eyes, that much more value that learning and development has. Your own measurements are easily brushed off or cut, but if it's a measurement they already value and you can tap into and add to it, that increases the value of your team and department in leadership's eyes. Yeah, a lot of the learning data isn't something leaders care about, right? Right. You may want it for your own purposes and decisions, but the data you really promote to others needs to be whatever they're measuring at that leadership level. Yeah. And I will say too, this is something Thalheimer mentioned at DevLearn. I sat in on a really great learning analytics panel discussion he was part of, and I agree with him. He says: warning, don't just use learning analytics, whatever is in your LMS. It doesn't tell an impact story. It may be useful for you if people aren't completing stuff or not finishing an e-learning, that's important for us, but stakeholders don't really care. On the opposite end, if we just gave them business data, like if I just gave my CEO data that said, hey, our call answer rate changed from this month to last month, what does that tell you about how learning contributed to it? Absolutely nothing. So we need a causal chain of evidence. And what's really important in that: yes, we have to show completion rates, because if people aren't completing programs, well, then we don't get any impact. So that's part of our chain of evidence. Then we need a performance factor. So, Erica, your e-learning survey questions should hopefully help you measure that, and I love Thalheimer's LTEM model, because it invites us to reflect on changes in decision-making. How we make decisions is a critical indicator of our performance. Are people making decisions in alignment with what's expected? That's a performance factor we can measure, and it can be huge in our chain of evidence. How is people's decision-making then correlating to some kind of KPI in the sales department, or whatever department or business unit you're supporting? So we must have that chain of evidence. It can't just be the learning data or just the business data; we have to show the relationship between learning, performance, and those KPIs or business-critical metrics. But if you don't know the business-critical metrics, Jean, how do you measure? How do you make the chain of evidence? So it sounds like there's a little bit of investigation you need to do: what keeps your stakeholder up at night? What data are they looking at? What performance data can we get? And then use your completion rates as well. Yeah, yeah. I guess I've asked them, and they said, no, I'm fine with just being able to offer this so that we develop our people and give them an opportunity. They don't really care if it matters to their KPIs. So
it's a great — go ahead, Trisha.
I was just gonna say, it's a great culture that they want to invest in learning even without those measurements — but those measurements will really add value and cement your value with leadership.
Hello, hello, everyone coming back?
Well, so: main takeaways. Who's going to volunteer something? The rest of you, put it into the chat for me.
Did you have an aha moment?
Yeah, Shannon, I'll share. For me, it's —
I don't want to say validation — reassurance that it's okay to continue to try to find that separation between work and personal life, and to be intentional about protecting that time and not feeling guilty about it. Yes, right? We do get hung up on that, where we feel like we have to be doing something, and if we're not doing something, we should feel bad about not doing something. We all end up on that wheel, right?
Awesome. Anyone else — someone from JD's or Elena's session?
There she is. I'm like, where did Elena go on my screen? There you are.
And then also, while you guys are thinking about that, I put into the chat JD's book, Elena's book,
and the nine ways to find stillness guide from Chris.
And I hope that you found this exercise beneficial and useful — that you were able to gather some ideas, or some reassurance, or just something that makes you say, you know what, I want to try this.
Thank you, Candice.
Katie's point about call centers being at the front of qualitative AI analytics — okay,
I missed that part, so I'm going to trust JD. JD, would you want to sum that up?
We were just talking about the potential for advanced analytics through artificial intelligence, and the idea that contact centers are going to help teach us a lot, because they're such data-rich environments — everything is measured. So in terms of connecting the dots between how we support people and the outcomes they see in their performance, contact centers are an interesting place to start, right? Yes, for sure — sales as well. Sales is a data-rich environment, especially if you use a formalized CRM-type tool. And Elena's question — learning is going to be the means to what end? I love that question so much, because it really is the question. From my perspective, we're expecting change: some sort of behavior shift or some sort of skill adjustment. What is it? What is it that you're looking for, right?
I mean, yes,
that's all I have.
Yes, right? I mean, at the end of the day, we just need to build the practice of asking these kinds of questions so that we can get the information we need. Because I heard a lot of people aren't getting the information that they need to create a learning solution to some kind of organizational problem. So that's a great way to frame it, and there are many others, but this one's really simple.
Thank you for that. Yes, and I appreciate that everybody's putting something in here: you have the reminder to work backwards when planning to measure — begin with the end in mind. What is it that you want to achieve? How are we going to get there? I think those are all great starting points for achieving what we hope to achieve in an L&D environment. And from that, I'm going
to actually end this with Chris. Chris, I think you might have some good words of wisdom as we go into the holiday season. As time starts getting busier around us, what would you leave us with? — Oh, I have so many thoughts. Go out in the forest. — There you go, everybody. — Now, I think the biggest thing — and in Orlando, find that forest; there are trees everywhere. The sun, nature, is all around you, people.
While I highly recommend getting out in nature, I think the biggest thing I'd love to share is: give yourself permission. Give yourself permission to take a break, to do something that refreshes and recharges you. And the biggest thing is finding what works for you. We've had a lot of great conversations and found out everything from petting dogs to rebuilding a VW Beetle.
Again: find what recharges and refreshes you, and make time for that these holidays — all the time, really, but especially during this really busy, busy, busy season.
You're the only one who can champion for yourself. So — oh, there's a statement.
Yeah, I'll go a little further: I mean, you're the only person that's going to be with yourself from birth through death. So take care of yourself.
Yeah. Those are wonderful words of wisdom, really, and I'm going to leave it at that. I hope you guys enjoyed yourselves, and I hope you learned something new. And again, for those of you who haven't been with us before: the next Coffee Chat is not this Friday, not next Friday, but the Friday after — they happen every other Friday — and the next one is all about building that creative mindset, and some tools we can use in our virtual and live learning situations to encourage that kind of creativity and creative thought. So I hope to see you all there. On that note, everybody, have a great and wonderful weekend. Feel free to share whatever it is that you're going to be doing this weekend — but on that note, you guys, take care and have a wonderful rest of your day.