Yeah, I'm doing that, and some folks have done that, but not a lot of work in the empathy space has looked at that yet. There's actually some work with AI that's starting to do that as well. There's all sorts of discussion about how AI can come back with empathic-sounding text in response to what people say, and for some people that can be a source of support. Of course, there are lots of folks who say, well, that's not really empathy, and in some sense it's not. But if you focus on the receiver, and you think about the best possible alternative, in a lot of cases there are a lot of people who don't have close friends, who don't have people they can talk to. If the best possible alternative is to go to a stranger in a chat room or on a forum and say, "This is how I'm feeling," and hope that somebody responds in a supportive way, you might get better results talking to an AI chatbot than that. If the best possible alternative is to go to a close friend, or to go to your therapist and speak with them in person, or to engage in an empathy circle, I think you're going to get better results with that. But it's interesting to look at how the large language models come up with empathic responses, and how people will rate those as more empathic, and as making them feel more heard, than a human-generated response. But if you take AI-generated responses and label them, saying this one was made by an AI versus this one was made by a human, people prefer the one that was labeled as human. And if you ask them whether they want a response from an AI or a response from a human, they tend to prefer the human response and will select it something like 40% of the time, even if they have to wait two weeks to hear from a human. So people seem to prefer to be heard by a human.
But often what the AI generates is in text form, and there's a difference between text communication and in-person communication, where I think a lot is lost. Those studies haven't been done, where you compare coming in and speaking with someone in person, or talking to a friend in person, and rating their quality of empathy and how heard you feel and things like that, versus getting a text response from an AI. But again, if you think about best possible alternatives, you're comparing it to a chat exchange with another human, which in some ways really stacks the deck against the humans. You can also look at what the AIs do in those responses, and how that's different from what a human generates, and that can be instructive in helping humans empathize better. So one of the things they do is validate. One thing that came up for me: you talk about how there's space to say, "What you're saying doesn't make sense to me," or "that's ridiculous," and that is met with empathy as well. AIs have this problem of sycophancy, where they'll over-agree with everything you say. They validate to an extreme, even to the point where, if you say, "Oh, I feel so overwhelmed, I want to jump off a bridge," they won't necessarily say, "I can totally understand feeling overwhelmed, but I wouldn't do that." They'll just say, "Yeah, that makes sense, that you would want to do that."
And they don't push back where you need to push back. That's obviously quite an extreme emotion, and it would require a therapist, or a suicide hotline, or someone who's trained, some kind of human intervention. But with the more mundane stuff, like "Oh, I had a rough day, and my boss was kind of rude to me today," what they don't do is offer personal insights from their own life that will make it all better right now, which is something humans are often trying to do. You talk about creating space, giving time, not judging people's emotions or trying to change their emotions while they're speaking in the empathy circle. And that's what people often want. When I'm feeling negative, yes, I want to feel better at some point, but there's often nothing someone can say that will make a person feel better right there because they said the right magic words. It's about just being present, giving them time to have that self-empathy experience where they're getting a deeper understanding of what they're feeling, and that allows them to work through it over time, as opposed to sending the right message that's going to make them say, "Oh, now I feel enlightened, and I immediately feel better." And that's often what people are trying to do: get people out of that negative emotion right away, give them the insight from their own life. So those are some of the differences that can be instructive in how we can empathize better, I think.