I just started our Otter, because I want to do a summary and see the themes of all of our conversations. And we go live in just a second, so I'm going to hit the button and then we're off.
Good morning, or good afternoon, depending on where you're tuning in from. I am here with Zsolt Olah, and we are jumping into more lessons learned, all in an effort to improve learning and development, and really just the quality of our lives. I've found that these conversations are turning into sort of personal lessons learned to help us all enjoy our lives a little bit more, so I won't be surprised if that happens in this conversation too. Zsolt, thank you for being here.
I'm excited. Thank you so much for having me.
And for those of you who are on the East Coast, I hope you have a good lunch while you're listening. Exactly.
I actually had my brunch right before we started this conversation, so I'm feeling nice and energized.
Oh, right. So before we hit the on button for your lessons learned, I want to have you share, just for the listeners, a couple of words that you might use to describe yourself and the work that you're doing in learning and development, so that we have a bit of a lens to understand where you're coming from when you share those lessons learned.
Absolutely. If you go to my LinkedIn profile, you actually see them right there: less content, more impact. And all four words matter. It means we create less of the order-taking content in terms of courses, and we actually start measuring the right things. So all in all, less content and more impact. That makes me happy.
Me too. I'm all about the impact. And here we are; it's ironic, right? We're having a conversation, and it is content, but it's content designed to help us improve our impact. So I'd like to think it's the best of both worlds. Anyway, a conversation for another time. So now that we know a little bit more about you, tell us: what are some of the biggest lessons that you've encountered in 2023, and what are some of the projects you were working on that led you to those lessons?
So last year we spent a lot of time creating a data literacy program for our team inside Amazon. And I took that on the road, sort of; I went to three conferences where I spoke about data literacy, but what I really wanted was for people to experience data literacy. So I built what they still call a game. It wasn't really meant to be a game, but it has an actual game board, game cards, and a dashboard. All people do is: I put them into teams, and they need to decide for certain data cards whether they are true or false, depending on the data they have on the table. And what I learned from this is amazing. First of all, it's amazing that people come early for the session because they want to make sure they get in. The second thing I learned is that there's a huge difference between learning about doing something and learning by doing something. You see their excitement, and they're literally arguing over data. They're seeing the same thing, the same chart showing the same insight, and they don't agree. They learn from each other; the conversation isn't even me at this point, it's just the team, coming to understand how important it is that they're on the same page, use the same language, and have the same framework for looking at data. Because otherwise, when they take this home to their stakeholders, they have the same problem: they see the same data and come away with different ideas or insights. So that was basically the lesson about learning about things versus learning by doing things.
Yeah, you know, your "less content" lesson that you shared, it's in your profile, you lead with it, this is what I'm all about. I can't underscore enough how important it is that learning happens through doing, and I am so appreciative that you're giving people an experiential opportunity to engage in data literacy, because it honestly reveals all of the nuances that you couldn't just tell somebody. You couldn't tell them, oh, people might not really agree on the data. Because if you think about it, data is just facts and numbers, and yet it can be interpreted in a myriad of ways. I'm with you on the data literacy side of things; I believe we have such a lack of skill and thinking when it comes to data literacy. I know this isn't part of what I asked you to share, but can you go a little bit deeper on some of your lessons learned around just doing data literacy training, and what else you've encountered? Because we need more data literacy in our lives.
We started it all because I like to think about data as a language, like any other language we use. You can talk a lot and say nothing; you can talk a lot and say a lot. Sometimes you have to be very concise, and you're done, you've said what you wanted. It's the same with data: you need the skills to understand how to use the parts of the data, for one. The second part is that, as with language, there's a goal, and you need to understand who you're talking to and how you present things. So what I learned, actually, I did my research: I went online and took courses from the name companies out there that sell courses, and most of them are some expert talking about data. I appreciate the experts, I love listening to them and all that, but it was more like an illusion of learning rather than actually doing. So we turned this whole thing upside down: let's just put people into a situation. Here's the dashboard, decide whether this pilot was successful or not, here's the data, and then we can get into the details of, okay, if we disagree on, for example, a mean, what does that mean? No pun intended. Let's dive deeper into why we disagree about that, and can we use that number on a dashboard to make decisions? That was sort of the starting point.
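As a rough illustration of why a team can disagree about "the mean" on a dashboard, here is a minimal sketch with invented completion times; the headline number changes depending on which summary statistic is reported:

```python
# Hypothetical data: course completion times in minutes for one small cohort.
# The mean and the median tell two different stories about the "typical" learner.
from statistics import mean, median

completion_minutes = [12, 14, 15, 15, 16, 17, 18, 95]  # one learner left the tab open

print(f"mean:   {mean(completion_minutes):.1f} min")   # ~25.3, pulled up by the outlier
print(f"median: {median(completion_minutes):.1f} min") # 15.5, closer to most learners
```

Whether the pilot "took about 25 minutes" or "about 15 minutes" is exactly the kind of disagreement the exercise is meant to surface before anyone makes a decision from the dashboard.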
Yeah, that's wonderful. You know, if I were a leader in an organization, I would actually start my meetings by looking at some data from the organization, asking what it's telling us, and giving everybody a chance to share their perspective. It's a simple way to practice looking at data, letting it tell a story, and seeing how it unfolds for people based on their lens. Anything we can do to boost data literacy in our world, I think, is really powerful. But I'm going to stop you there, because I know the other lessons that you're going to share, you've shared them with me, and they're wonderful and honest. So what's the next thing that you've learned this year that you'd like to share with us?
If you're on LinkedIn or any social media, you might have this big FOMO, feeling like you're years behind on using AI. I mean, literally, you finally find somebody who put together 500 usable prompts, and by the time you download it, it's already obsolete and someone else comes out with a new tool. And I know the feeling, because I just built a video from scratch and I'm like, okay, how do I use all this stuff while it keeps changing? Google just came out with their version, and now we have two big ones. I think there was a report recently by Donald Taylor that went out globally, looking at how L&D is actually using AI and what sorts of things people use. It's a very good report; read it, because it has very good insight into how to interpret those data points. But one thing you'll notice is that, because of this big FOMO, people are actually not using AI as much as it seems they would be. Everybody's playing around, like in a playground. Once it becomes serious, inside the job it's more restrictive and more focused, and people don't even mention that they use AI. So this big FOMO is happening mostly outside of work. I share this, but I always tell people: whatever the tool of the year turns out to be, you need to calm down. What are your problems? They're not changing; they're the same problems you had before. So don't focus on all the things you could be doing. Find a problem and work backwards. Look at what you're doing, prioritize your problems, and then go figure out what your solution is, whether it's augmenting you or replacing you. You need to start somewhere small and pivot, rather than trying to solve world hunger.
You mentioned this already, but I think it's worth saying again: whatever we do, we should do it through a goal. You mentioned that with data literacy, what's your goal and how are we going to use data to help us accomplish it? I think you're saying the same thing when it comes to AI: what is your goal, what problem are we trying to solve, and then how can we use AI or anything else to help us solve that problem? Without a clear goal, we don't get as much benefit from whatever tool is in front of us.
And talk with people who are actually using it, rather than those trying to sell you all kinds of things. This is definitely a moneymaker; suddenly resumes and taglines pop up and they all do AI now, which is fine. We had the same thing back then with gamification in learning, or Bitcoin, or blockchain, that was another huge one, suddenly everyone had a blockchain. So just focus on what you actually need to do, and on practical things, because things will change all the time. It seems like every day.
I know, it's intimidating. There's a book called e-Learning and the Science of Instruction, and in the first chapter they say: when the radio came out, everybody thought it was going to revolutionize our work, and then the television came out, and then the internet, and now here we are with artificial intelligence. And what you're saying is what they say in the book, which is: don't be swayed by shiny objects and their promise to save the world or solve world hunger. Stick with what we know works, theoretically and practically, and keep filtering everything we do through the problem to be solved or the goal we're trying to achieve. I appreciate that.
I would say, I don't want people to walk away thinking there's not much going on. There is actual, real change, and it's a big change. I think nobody can tell what's going to happen late next month or next year, let alone in three years. It's changing everything we do, from how we think to how we get things done. It's coming from every direction, and I think that's the feeling; the problem is that we feel like there is nothing stable. It's not just one industry or one type of work. It's everywhere, all the time, like, you know, that movie Everything Everywhere All at Once.
I've heard of it; it's on my holiday watch list, and now even more so.
You have to know some quantum mechanics for that one, which would be my...
Well, thank you. I appreciate your honest lens on artificial intelligence. You know, this is the third conversation we've had in this extravaganza of lessons learned for the year. Change is constant, and I think every one of us is trying to figure out how to navigate change, not only with technology, but environmentally, economically, with knowledge transfer and what knowledge even is. Anyway, so much change is happening, and we're all just trying to sift our way through it. There's another big change that I know is your third lesson learned that you'd like to share. You say there's a big hairy elephant in the room that we should be talking about. Would you mind sharing that perspective with us?
Yeah, so this year there was also a big tech, hairy elephant in the room: the layoffs that started earlier in January in the tech sector and then actually expanded. A lot of people are looking for jobs out there, and you can see the posts: 500 applications, a handful of interviews, no offer, for 11 months. So there's a lot going on in the market. One thing I learned from this, just looking at how we attract talent and how we hire talent, is that the process is a huge mess, first of all. Second, even with all the AI out there, the human element is actually very important, because when thousands of resumes come in for a job posting, the person who wins is whoever knows someone who can put in a good word with the hiring manager. There's no way you can read them all; you can build all kinds of screening systems, and maybe they take out 80 percent of them, but there are still 200 or 300 resumes coming in, and they all look alike, because everyone optimized them to get through the system. So what is the hiring manager actually going to rely on? Whether it's good or not, their biases. And one of those biases is: if I know someone, and someone has put in a good word for this person and not someone else, I already have at least some of my fears addressed. I think that human element, in this whole AI world, was sort of a contradiction: yes, you do need AI, it's great, it's changing things, but then there's this very human moment when you actually need to get a job.
Yeah, and the relationship theme is also something that's come up in the conversations we've been having today: how important relationships are, of course with stakeholders, but also with members of your team and people in your network. And I think with technology and AI and social media, sometimes we don't forget about relationships, but they change and morph, so continuing to prioritize those human connections in whatever ways you can is very important, for lots of reasons. So as we transition to what you've been sitting on for next year: you've shared a few things, and I have your notes in front of me here, but for the sake of time, what is the one thing you think is most important for the goals and problems you're looking to solve next year?
I think for me, as I mentioned, we focus on data as a language, telling the story, and pulling together impact stories. We've done a lot of good baseline work on how to collect data, how to make sure we can analyze it, how to build data literacy skills, but without actually telling a good story, it's all just pre-work behind the stage. So my big goal for next year is to get much better at the end of the data cycle, which is data storytelling. How do you put together a good story that tells the audience, whether it's the C-suite or your team or stakeholders or SMEs: here's what we did, here's how it worked, and here's what we should do next. That starts from the moment a project isn't even clearly defined yet: what data do we need, and what's going to be the story we want to tell? The second part of my goal is to move from the current state of "this is what we did, this is what happened" to a more real-time, almost predictive use of data, so we can say with confidence what's going to happen. If a cohort goes through some sort of experiment, you have enough data to tell a manager: hey, you're going to have a problem, because 40 percent of your people think they're experts in this topic and they're not. So what can we do about that? Or: some of these people may struggle with this particular skill set, so what do we do about the transition? This matters, because my motto is less content and more impact. The impact is not whether they go through the training or not; the impact is on the job: how we help them do the job better, faster, easier, whatever the skill is. And I think this is a more consultative way of thinking about how I help the business, and sometimes that happens fastest by not building any courses at all. But again, this all starts with the data. It's not about egos, it's not about who owns what or who is responsible for learning. What data do we have that proves, or disproves, the theory we want to check?
Yeah, I think we have a lot of data. Some people say, oh, we don't have any data, but there are data artifacts that exist all around us all the time, and I appreciate your point about taking the data and using it to predict or to find contradictions. For example, we can run a survey where 40 percent of people say they have expert skill in this thing, but then we're seeing that they're not performing in the way we would hope or expect, and there's a gap there. What does that tell us, and how do we tie together all of the data artifacts we have so they help us make predictions and adequately respond to the challenges that are happening in real time? I'm working on that myself and look forward to getting better at it.
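A minimal sketch of that kind of contradiction check, with invented names, ratings, and thresholds, might compare self-rated expertise against an assessment score and report the size of the gap:

```python
# Hypothetical cohort: survey self-ratings (1-5) alongside assessment scores (0-100).
people = [
    {"name": "A", "self_rating": 5, "assessment": 42},
    {"name": "B", "self_rating": 5, "assessment": 88},
    {"name": "C", "self_rating": 3, "assessment": 71},
    {"name": "D", "self_rating": 5, "assessment": 55},
    {"name": "E", "self_rating": 2, "assessment": 64},
]

EXPERT_RATING = 5   # "I'm an expert" on the survey scale (assumed threshold)
PASSING_SCORE = 70  # assessed competence threshold (assumed)

# People who call themselves experts but score below the bar.
overconfident = [
    p for p in people
    if p["self_rating"] >= EXPERT_RATING and p["assessment"] < PASSING_SCORE
]

share = len(overconfident) / len(people)
print(f"{share:.0%} of the cohort self-rate as experts but score below {PASSING_SCORE}.")
```

With these made-up numbers the output is "40% of the cohort self-rate as experts but score below 70," the kind of early warning described above.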
Absolutely. And again, you can use AI for that; I'm not saying this isn't an AI sort of challenge space. But you need to be able to actually use data and speak data, otherwise what you get out of it is a black-box result, and you don't know what to do or how.
That's right. That's right.
So as we head into the last ten minutes of our time together, it's time for those who are listening to ask questions if they'd like, and to extract more of your expertise and experience. If you have a question, please go ahead and raise your hand and we'll bring you up on stage. While you're doing that, Zsolt, I have one question for you. You've mentioned that your focus area for next year is basically the end of the data cycle, and helping use that data to make more predictive insights and to make it more useful. How will you know, what are you going to be monitoring or tracking or evaluating, so that you know you've actually made improvements in that way?
I will know because they actually made a data-informed decision. And by that I mean they actually do make a decision. Because one thing with data, again, as with language: if you spend so much time collecting it, creating some sort of insight, and providing it to your stakeholders, and they're not making any changes, they're not willing to use the insight or the data or the story, and nothing happened, did you really do a good job? It's the same concept as: everybody is passing the course but the business metric is not moving. Did you do a good job? It depends on how you look at your performance. If my evaluation is that it needs to be done under budget, on time, and everybody passed so they know what to do, and I don't have control over what they do after, then I'm focusing only on the learning. Another way to look at it, in my opinion, is that if we cannot change somebody's opinion about making a decision through our insights, then our data storytelling failed. Just because you present data, it doesn't mean people are going to say, oh, this is a great insight, I'm going to do a 360, or at least a 180, and change my mind about X. Sometimes data by itself is not going to do it. I think the goal should be to provide a story that carries not only the information but also the emotional part of things. How do we make people think differently and change, sometimes against their beliefs? So this is my challenge: how do you understand people's beliefs before we tell them the story? If they're not on the same page, how do we make things change, rather than just presenting data and insights?
It's funny, you literally said "how do we make things change," and I wrote in my notes here, "is change actually occurring." You know, I would say one of our great challenges, and I don't think this is unique to learning and development, I think it's true for marketing and product and technology: if change is happening, how do we know that our intervention or our initiative actually contributed to that change? That's our struggle, and that's why we want data; we want to use data to show what contributed to the change that happened. It'll be interesting to see how artificial intelligence and other technology tools will make that easier, because I think right now we're in the weeds, especially in learning and development. We're so in the weeds with connecting the right data to show that it influenced change in the way that we hoped. So I look forward to seeing that get better and easier in the future.
I think one more thing on that: if you design things backwards, you start with the business goals. Sit down with your stakeholders, understand what their wants and needs are, and chain it backwards all the way to the actual behaviors they want to change and who your audience is. If you do it that way, the assumption is still there that the behavior is going to change the KPIs and that's going to change the business, but it's not your job to figure that out; it's the business's. They should know, and if they don't know, we're not going to help much, because we're going to train people on something, on how to use something, and it's not going to work. But if the assumption is right, then we can say with confidence: it's not the ROI we're measuring, it's the behavior, which is easier. Then we can have statistically solid methodologies: compare people who went through the program with people who went through half the program or dropped out, triangulate the data, and show that there is correlation, and maybe causation, between the program and those behaviors. And the rest follows: if the behavior is happening, that's moving the needle. That's one way to approach it, rather than jumping from measuring nothing straight to showing ROI.
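As a rough sketch of that kind of comparison, with invented numbers, one might contrast an on-the-job behavior metric between completers and non-completers and compute a simple effect size as a first step before a formal statistical test:

```python
# Hypothetical behavior metric (e.g. weekly count of the target on-the-job behavior)
# for people who completed the program versus people who dropped out partway.
from statistics import mean, stdev

completed = [8, 9, 7, 10, 9, 8, 9]
dropped = [5, 6, 4, 7, 5, 6]

diff = mean(completed) - mean(dropped)

# Cohen's d with a pooled standard deviation, as a rough effect-size estimate;
# a real evaluation would follow up with a proper significance test and controls.
n1, n2 = len(completed), len(dropped)
pooled_sd = (((n1 - 1) * stdev(completed) ** 2 + (n2 - 1) * stdev(dropped) ** 2)
             / (n1 + n2 - 2)) ** 0.5
print(f"difference in means: {diff:.2f}, Cohen's d: {diff / pooled_sd:.2f}")
```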
I mean, I think that ROI is, for lack of a better word, icing on the cake, and maybe not even that, because maybe you don't want icing on your cake or don't need it. At the end of the day, we need to show that performance and behaviors have changed, and then we can calculate an ROI, but that's not necessarily the end goal. I think we try to say people participated in training and then jump to an ROI, when there's so much data that needs to be collected along the way. What I hear you saying, and I'm a researcher and come from a public health perspective, is that we have to have a hypothesis in advance of what we're doing, so that we can test it and see: did this intervention move the needle, like you said, in the way we anticipated? If we don't have a clear hypothesis, and we don't know what data we're going to use to test it, then we're just kind of shooting in the dark. I think a lot of us are living in that world right now.
Trial and error is one way to do things; it's just that we don't have time for that. I think one of the biggest problems, especially in the corporate world, is that people are not going to give you the time you'd need for, say, an academic study. In healthcare, the risk could be a life; in our case, well, maybe satisfaction goes up or down. That's not enough for the business to say, let's just stop everything and do a real study. So how do you do this somewhere in between? There is solid methodology behind it, and that is why we don't just do trial and error, because we simply don't have time.
I would agree. Well, I think folks have just enjoyed our conversation so much that they didn't want to ask questions; they just wanted us to continue on in our little organic way. Oh wait, we do have one question. Amy, you were probably patiently waiting for us to be quiet. Okay, Amy, in a moment I'm going to bring you on stage.
And there you are. I think I initially missed the prompt. My apologies for my delayed action, but nice to meet you. And thank you both for sharing. I just wanted to ask you, I noticed from your profile that you were involved with the gamification of some learning programs at Amazon. And I just was curious if you could share a little bit about that and what you did and how that's going. Sure.
So gamification is one of those things that I started doing early on, about ten years ago or so, when "game" was practically a banned word in the corporate world. But I did it the long way, before it became something very popular at conferences with points and badges; grounded in research, it's basically motivational theories applied in a different context. When I joined Amazon, and Amazon inside is like a country, we have at least about 2,500 learning professionals across the organization, someone was asking about gamification almost daily. So for the first four months I worked with individuals who were trying to do this, taking a course and trying to add points to it, to see if I could help in any way, because it was new for them. And what I realized is that the same pattern kept coming back over and over again in different teams: instead of working together, they all tried to solve their own problems with learning. So the first thing I did was write a simple strategy document about gamification: how do you think differently? Maybe it's a pattern with me, but I start with the paradigm, how do you think differently, before you start using things, because it's much easier to make decisions after that. So we did a lot around how to start gamification from actual goals, how to use the motivational theories behind it, how to use, for example, different player types to understand what sort of people you have, and how to apply this in your job. That's how we started.