Yeah, I think one of the fundamental things is providing transparency and evidence. And evidence is the hardest thing to actually get, especially at the outset, because it's a bit of a trust exercise: okay, I'm going to do this, and I'm going to get great things on the other side. But if the business doesn't have transparency into what those benefits are going to be, even if they're the best machine learning models or AI technology in the world, they won't want to adopt it.

One of the real-world examples I've run into that punctuates this: I was once talking to the CEO of an auto finance company, and they had to update their behavior and risk models. They'd had these behavior and risk models in use for years, and they came up with really good analytics off of a new set of performance data. But no one could tell the CEO what those new models were going to do to his pricing model, and the models were core to it, deeply integrated. This is a very simple example of the transparency I mentioned. When your data scientists are building great machine learning models or analytics off the back of the data, you need to provide that connectivity back to the business outcome in a way that builds confidence that those models are actually going to help. Even if they're well-formed models that provide insight in dimensions previously unknown, until the business can get their head around what that's going to do to the outcome, it's difficult for them.

And so that's why data scientists, data engineers, business marketers, all of the functions of an organization need to connect in that process. And there needs to be some mechanism to measure that. The tools that actually provide that measurement are the ones that get the traction, because they build that confidence, and then the adoption takes hold.