You know, I spent almost a year with it, just sort of pushing around ideas with the technology, talking to different customers, and thinking about use cases, what would work and what wouldn't. And what we realized was that to deliver on the promise of AR, you needed this digital model of your living room. But we also knew that if it was going to work everywhere, you actually needed a 3D digital model of everywhere. And that was a huge, huge problem. And as you were discussing with Victor, the research pipeline was coming. We also knew that you needed more than just geometry, more than just a mesh: you actually needed the ability to relocalize in any spot. That means getting your position, like a visual positioning service, very, very accurately.

And then on top of that, you started to need some sort of semantic understanding. The system needed to know that this part of the mesh was a different object from that part of the mesh, and it needed to know what those objects were. And we had a sense that the rapid development of on-device, real-time 3D neural networks was where the next wave of research was going to come from, and Victor was going to be leading a lot of that. And so we saw that that was the technical direction we felt the world was headed in, and what needed to be built. We also knew that a lot of this stuff needed to run at least partially in the cloud. And these were core functions for every AR type of use case, not just a particular app. It wasn't about content, it wasn't about the user experience; this was just to enable it.

And so that started to look to us more like an operating system. If you define an operating system as the APIs that the applications call, then if you look at, say, Windows, the operating system isn't the GUI.
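The two capabilities described here, relocalization via a visual positioning service and a mesh with per-part semantic labels, can be sketched as a minimal API shape. This is a purely hypothetical illustration, not 6D.ai's or Niantic's actual interface; all class and method names are invented, and the relocalizer returns a dummy pose rather than doing real image-to-map matching.

```python
from dataclasses import dataclass, field

@dataclass
class Pose:
    # 6-DoF pose: position in metres plus orientation as a quaternion
    position: tuple       # (x, y, z)
    orientation: tuple    # (qx, qy, qz, qw)

@dataclass
class SemanticMesh:
    # A shared world mesh where each face carries a semantic label, so
    # "this part of the mesh" can be a different object from "that part".
    vertices: list = field(default_factory=list)     # [(x, y, z), ...]
    faces: list = field(default_factory=list)        # [(i, j, k) vertex indices]
    face_labels: list = field(default_factory=list)  # e.g. "floor", "chair"

    def objects(self):
        """Distinct objects recognised somewhere in the mesh."""
        return set(self.face_labels)

class VisualPositioningService:
    """Hypothetical cloud relocalization API: camera frame in, pose out."""
    def __init__(self, world_mesh: SemanticMesh):
        self.world_mesh = world_mesh

    def relocalize(self, camera_frame) -> Pose:
        # A real VPS would match image features against the stored map;
        # here we return a fixed pose just to show the interface shape.
        return Pose(position=(0.0, 0.0, 0.0), orientation=(0.0, 0.0, 0.0, 1.0))

# Usage: build a toy mesh and query it the way an AR app might.
mesh = SemanticMesh(
    vertices=[(0, 0, 0), (1, 0, 0), (0, 1, 0), (1, 1, 1)],
    faces=[(0, 1, 2), (1, 2, 3)],
    face_labels=["floor", "chair"],
)
vps = VisualPositioningService(mesh)
pose = vps.relocalize(camera_frame=None)
print(pose.position)    # where am I?
print(mesh.objects())   # what is around me?
```

The point of the sketch is the division of labour the speaker describes: geometry (the mesh), position (relocalization), and semantics (labels) are separate answers an application needs from the same shared model.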
It isn't the menus and the buttons and everything; it's actually the Win32 API stack. That's what all the applications call. And the Win32 stack was kind of superseded by Amazon's AWS APIs, which enabled all your web and mobile applications to come live in the cloud. And we felt that a similar set of APIs was going to have to exist for spatial computing. All these applications, whether it was a robot trying to do its job, or a drone, or an AR app, or a self-driving autonomous vehicle, needed to be able to call APIs to ask: what's that thing in front of me? Am I on the road? Is that a person standing there? And when you bundle all these APIs for understanding the real world together, it starts to look like an operating system for accessing the real world.

And that was, I think, the vision we had for 6D.ai, and it's still the vision at Niantic. I think we called it the AR Cloud at the time, and that name kind of stuck. But this idea of an operating system for the real world, with those APIs underpinning everything, we felt that opportunity was going to be as big as Windows, as big as AWS. Someone was going to build a very, very big business on those back-end APIs, and we hoped it would be us.
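The "what's that thing in front of me?" query that every caller shares, robot, drone, AR app, or vehicle, can be illustrated with a small sketch. Everything here is invented for illustration (the world data, the function name, the 45-degree field-of-view cutoff); it is not any real spatial-computing API, just the shape of one.

```python
import math

# A toy shared world map: labelled objects at known 2D positions.
WORLD = [
    {"label": "person", "position": (2.0, 0.0)},
    {"label": "road",   "position": (0.0, 0.0)},
    {"label": "tree",   "position": (-3.0, 4.0)},
]

def whats_in_front_of_me(position, heading, max_range=5.0):
    """Return labels of objects roughly ahead of `position` along `heading`
    (heading in radians), within `max_range` metres."""
    hx, hy = math.cos(heading), math.sin(heading)
    hits = []
    for obj in WORLD:
        dx = obj["position"][0] - position[0]
        dy = obj["position"][1] - position[1]
        dist = math.hypot(dx, dy)
        if dist == 0.0 or dist > max_range:
            continue  # skip our own location and far-away objects
        # Keep objects within ~45 degrees of the heading direction.
        if (dx * hx + dy * hy) / dist > math.cos(math.radians(45)):
            hits.append(obj["label"])
    return hits

# A drone at the origin facing +x asks the shared world what's ahead.
print(whats_in_front_of_me(position=(0.0, 0.0), heading=0.0))  # ['person']
```

The design point is that the query is caller-agnostic: a drone and an AR app ask the same question of the same shared model, which is what makes the bundle of APIs feel like an operating system rather than a per-app feature.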