Hello, I'm Rob Hirschfeld, CEO and co-founder of RackN, your host for the Cloud 2030 podcast. This episode is about digital twins and the edge. Simon Crosby from Swim came in, and you'll hear him speaking quite a bit, because this is his area of expertise: they are literally building digital twins in edge locations. But a lot of us have experience here, and we really worked to expand on Simon's experience and understand how it translates into general cases — what we're seeing in the edge, the types of systems we're trying to build, and the fascinating intersection of how we build models and understand the connectedness of all the components for the edge, versus traditional models. That really is what makes edge unique. It's not a single application, but a connected system, and that's what's going to have to emerge to make all this work. Enjoy the conversation.
So let's try to transition to the topic of the day, which is digital twins — which is why I tapped Simon to jump in. Specifically, from that perspective, this was brought up by Joanne a couple of weeks ago, around the idea that we're about to have a revolution in new UI, new experience, and better engagement. And to do all that, we're basically creating digital twins of our environmental systems. Simon, you're deep in this space — could you frame this for us, or tell us what you're seeing in the market?
Sure — I mostly agree with that framing. But the way I see customers coming to it is that they want an always-on digital version of everything they care about. I'll give you an example. Our largest customer is a mobile carrier, and we have an always-on digital version of every device in the network and every one of their customers' devices. So that's 150 million plus devices as digital twins, continuously receiving data from the real thing, continuously and statefully evolving somewhere in memory in some clusters across the country, and addressable via the language of the web — they just have URIs, and they'll draw themselves if you use a web browser or whatever. This gives them the ability to quickly get hold of the digital state of everything they care about. So I see it coming from the other side: it's more of an operational need than a customer need. But ultimately the goal is possibly the same — we're trying to improve the speed of service to end users, customer service, whatever it happens to be. And having an always-on representation of a thing is a no-brainer. To my mind, that's what it is. Digital twins perhaps started life as, you know, overlays that help mechanics with really complex things like jet engines — some design-time artifact designed to help a human interact with the real world. I see it the other way: these are continuously evolving digital representations — actors, digital twins — in memory, on some computer somewhere, of absolutely everything you care about. And it's not just the state of everything: they need to continuously organize themselves into graphs, which describe their relatedness and relevance. Let me give a good example. If Rich and I are near a cell tower together, then do something. Okay?
Or: if the load on a cell tower is exceeded, then do something. So proximity, geospatial awareness, arbitrary constraints — continuously parametric functions — have to be continuously re-evaluated for every single thing. The story isn't about each single asset being a digital twin; it's about the relationships between them, and their continuous evolution over time. That is what drives the business.
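The pattern described here — parametric relationship rules re-evaluated on every state change — can be sketched in a few lines of Python. This is a hypothetical toy, not Swim's actual implementation; all names (`Twin`, `TwinWorld`, the rule API) are invented for illustration. Twins hold current state in memory, and every update from the real thing re-checks registered predicates such as "Rich is near tower-42":

```python
import math
from dataclasses import dataclass

@dataclass
class Twin:
    """Minimal in-memory digital twin: the latest state of one real-world thing."""
    name: str
    x: float = 0.0
    y: float = 0.0

def near(a: Twin, b: Twin, radius: float) -> bool:
    """Parametric relationship: true when two twins are within `radius` of each other."""
    return math.hypot(a.x - b.x, a.y - b.y) <= radius

class TwinWorld:
    """Re-evaluates every registered rule whenever any twin's state changes."""
    def __init__(self):
        self.twins = {}
        self.rules = []   # (predicate, action) pairs
        self.events = []  # actions fired, kept for inspection

    def add(self, twin):
        self.twins[twin.name] = twin

    def when(self, predicate, action):
        self.rules.append((predicate, action))

    def update(self, name, **state):
        # A state change arrives from the real thing: evolve the twin, re-check rules.
        twin = self.twins[name]
        for key, value in state.items():
            setattr(twin, key, value)
        for predicate, action in self.rules:
            if predicate(self.twins):
                self.events.append(action(self.twins))

world = TwinWorld()
world.add(Twin("rich", 0, 0))
world.add(Twin("tower-42", 100, 0))
# "If Rich is near the cell tower, then do something."
world.when(lambda t: near(t["rich"], t["tower-42"], 10.0),
           lambda t: "rich entered range of tower-42")

world.update("rich", x=50)   # still far away: no rule fires
world.update("rich", x=95)   # within 10 units of the tower: rule fires
print(world.events)          # ['rich entered range of tower-42']
```

The point of the sketch is that the relationship is a continuously evaluated function over live state, not a row you query for later.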
Simon, a question about what you just described: to what degree is the digital twin a closed loop that is taking action on a programmatic basis, as opposed to advising a human being to take the action?
So I think, in the era of analytics, we put cool UIs in front of humans, and humans make decisions. We're always going to have to do that. But ultimately, to drive automation — and edge computing is an interesting topic here — to drive automated responses, to drive machines, you have to respond in milliseconds. And so I see an increasing need, on the part of customers, to continually process, continually analyze, learn and predict, and then automatically respond — do the right thing. Retune the radios in my network so I can deal with wherever the device is now. And that, I think, is inexorable, because every product is instrumented: every pipeline, every VM in the cloud, your supply chain — everything is instrumented. So the volume of data is simply going up, and storing it to think about later is just not going to happen. What people want, to be able to move at the pace at which the business world is now moving, is to continuously analyze and then respond in time — and increasingly that's automated. So I think analytics is solved. I think the UI problem is really about where we humans are slow. But —
So Simon, I have a question. Admittedly, I'm biased by my conversations with the Air Force with respect to the concept of the digital twin — this concept of, let's take the next-generation air superiority fighter, and we'll create a digital analog of the physical device and be able to simulate all of the conditions affecting it. As such, it's more of a design methodology to accelerate the process, versus what you're talking about, which is almost more of a real-time interaction with the actual universe, as opposed to the design side. How is that? Because I think of that more as, you know, maybe AI-driven robotic process automation.
It is, for production, for sure. And I think the problem here is we've got one word with multiple meanings. When "digital twin" came out — and this is what confused me — the term was used by different groups to mean different things, and this is one example. Sorry, go ahead.
No, you're quite right. So, Rocky, what you're saying is it started out as more of a design-time artifact, right? And I'm saying it's more of an operational artifact.
All I'm saying is that some people say the Air Force's definition of it is different from the general IT world's. Your definition is more like what we see in seminars and discussions and such. The Air Force has specialized, and they actually have their own definition — and it's not clear which definition came first.
Let me change the discussion, going back to something you said before, about analytics driving automation, driving the digital twin, driving —
— Klaus, we can barely hear you. Okay, that's a little better now.
How about now? Much better. For some reason my laptop defaults to a very low microphone level. So, to repeat: Simon, you were saying before that analytics drives automation, which then loops back into analytics, and that this is one of the core tenets of digital twins. I'm wondering, though, if we incur a risk of this becoming too domain-specific and effectively excluding targets, or even demographics, because the feedback loop works on itself. We've seen this, for example, with image recognition, where people of color are not detected as well. In this case, we are driving our automation based on our analytics — but what if our analytics are flawed?
Yep, there is a problem; I agree with you. So the thing you have to do is make sure — by the way, I think what most people think of as AI is pretty ordinary mathematics. It's not that hard. You know, in the case of a mobile provider, every digital twin, every single device, is predicting where it will be, which tower is nearest, and how to connect to it. That's not hard to do — not hard like finding a person in a photograph is hard. So for most industrial implementations, most use cases, what people think of as AI is just computers doing what computers do. It's not that hard. So, to my mind, the biggest opportunity and the biggest value is that there is an always-on digital personality which represents every single thing in your world, as opposed to having to go to a database, figure it out, and put it all together. You get to respond in a millisecond or two, as opposed to minutes or hours or days. And certainly for the carrier I'm working with: they used to have 400 nodes or so and get insights on a ten-hour basis, and now we drive the network and retune the radios in milliseconds. That's the difference. And to be perfectly honest, I think most AI is just —
— not that hard, right? But is the reason that works because you have a more persistent digital version of the item?
That's the thing, right? When you think cloud, the fundamental structure in cloud is a database. Everything goes in a frickin' database of some sort. Databases are the bane of my existence — they are a million times slower than CPU memory, literally millions. So is the network, by the way. So if you can have an always-on, in-memory digital representation — somewhere in your cluster, or across the country — of a stateful process that represents a thing and its current evolution in space and time, then you're a million times faster than any app that's going to use a database representation, Klaus.
Well, part of what you're describing, and what I like about the always-on aspect, is that you have an addressable item —
— which in our case is just a URI. Every single one of these digital twins is just on the web.
But it also has attributes and persistence. The thing that keyed off for me was the relationship model. In design, you're building a design: you need all these pieces, you need their attributes, you hook them together — but it's static. If I'm building a connected environment, and I care about the relationships between things, the persistence of a digital entity is key. I can't build a connected relationship with things that come and go, or don't exist, or aren't well defined —
By the way, even a relational database can't deal with parametric relationships. Say the relationship is nearness in 3D space, or correlation in some metric: no database finds those for you, and no database can trigger the reaction. So you are continuously evaluating, for every single thing in your world, every parametric function of interest — you have to continuously compute as data flows over these things. It turns out to be a graph, which is in memory. The graph links these digital twins in some way — forget the binding to physical computers — it's a graph that these things form themselves. So as you become near to me, and you satisfy the nearness description, we link; that's an edge in the graph. And then you get to see my state and compute on it. The notion here is that everything you care about continuously computes based on its own real-world events and the state of the things relevant to it; relationships are continuously evaluated and found; and the graph is continuously in flux, always on, sitting in memory somewhere. Those are things that databases just can't do.
And I think the key here, which Simon has gotten across to me, is that digital twins aren't twins in the traditional sense of having the same properties. Twinness is the relationships and the state changes, not the properties —
That's well put, Rocky. I mean, you could say a row in a database is a digital twin — who cares? That's not relevant. What matters is the relationships between the things in the rows, and how a change in one of them affects all the others. That's the key. And it turns out, by the way, that Swim is an implementation of the actor paradigm. There are others — if you modeled your world in Akka, and went down the path of dealing with streaming data using Akka, you could do similar things. The Akka way, and the Swim way, of solving this problem is to use stateful processes — stateful lambdas, if you like — kept in memory, which represent the things. So, an actor for everything.
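The "stateful actor per thing" idea can be illustrated with a minimal, single-threaded Python sketch. Real actor runtimes like Akka or Swim add concurrency, supervision, and distribution — none of that is modeled here, and the class and field names are invented for illustration:

```python
from collections import deque

class TwinActor:
    """One in-memory stateful actor per real-world thing.
    It consumes its own event stream and keeps only evolved state,
    not the raw history of every message it has seen."""
    def __init__(self, name):
        self.name = name
        self.mailbox = deque()                    # pending events from the real thing
        self.state = {"count": 0, "last": None}   # the twin's evolved state

    def send(self, event):
        # Asynchronous-style delivery: events queue up in the mailbox.
        self.mailbox.append(event)

    def run(self):
        # Drain the mailbox, evolving state one event at a time
        # (actor semantics, minus the concurrency machinery).
        while self.mailbox:
            event = self.mailbox.popleft()
            self.state["count"] += 1
            self.state["last"] = event

phone = TwinActor("device-001")
for reading in ({"rssi": -70}, {"rssi": -82}, {"rssi": -65}):
    phone.send(reading)
phone.run()
print(phone.state)  # {'count': 3, 'last': {'rssi': -65}}
```

The design choice worth noticing: the actor owns its state privately and evolves it in message order, which is what lets millions of these coexist without shared-database contention.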
Connecting what you're saying, and also part of what I'm reading in the chat, from an edge perspective: part of the relationships here are to the people in the environment. To me, the big paradox with edge is that in order for edge to really grow, we have to have relationships between the devices at the edges and the people in the environment. The purpose of edge computing is to improve the human interactions in the environment.
Just one wrinkle there, Rob: the edge is where your data comes from. If you have a million VMs in AWS and you're monitoring them, that's the edge — the edge of the application graph, from which I'm now going to process streaming data, build associations and relationships, and come up with conclusions and actions. The edge is where your data comes from.
I actually think that's an interesting thought process, in part because there's a question to me, in edge, of how you make things that are different applications, different vendors, different silos talk to each other. Bingo, right? This, to me, is the heart of digital twinning here: I have a whole bunch of stuff in my environment, and to your point, that could all be stuffed in an Amazon zone, completely away from any physical interaction. But if I need to say, hey, there are elements in this pocket — there's a silo of analysis and data generation — and I need something else to be able to understand it, then all of a sudden, for the elements of that pocket, I need digital twins, so I have a way to look into that pocket.
You're right — and more than that, in order to respond locally to where data arises. So, for example, to bypass a fault in the northeast, I only need to be running in one place there, but the application could run across the country, and I could still see an outage in the northeast. You see what I mean? So you have to deal with partitioning and availability too.
But if I needed access to digital twins of information in the northeast, the assumption I start to build is that the closer I am to the source of that digital twin, the more advantaged I am.
Yes. But the key thing is the database approach, right? We could insist that I have a replicated, distributed database which can deal with the same state everywhere. That actually works, but it's complicated — there's a ton of machinery in databases designed to keep things consistent and available and so on, and that slows us down. So if we say, hey, the northeast can deal with the northeast, and everybody else just cares to know whether or not there is a fault there, or something like that, then you can use cached state everywhere else. Relaxing a bunch of the constraints helps us deal with partitioning and faults. And so you can have a digital twin of the northeast.
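The relaxed-consistency idea — the region owns its authoritative twin in memory, while everyone else holds a small cached summary that is allowed to go briefly stale — might look like this hypothetical sketch. The TTL value, class names, and API are assumptions for illustration, not anything from the discussion:

```python
import time

class RegionTwin:
    """Authoritative, in-memory twin of one region (e.g. 'northeast')."""
    def __init__(self, name):
        self.name = name
        self.fault = False
        self.version = 0

    def set_fault(self, fault):
        self.fault = fault
        self.version += 1

class CachedRegionView:
    """What consumers outside the region hold: a possibly-stale cached summary.
    Relaxed consistency: remote readers only need 'is there a fault?',
    not the region's full, strongly-consistent state."""
    def __init__(self, source, ttl=5.0):
        self.source = source
        self.ttl = ttl
        self._cached = None
        self._fetched_at = -float("inf")

    def fault_status(self, now=None):
        now = time.monotonic() if now is None else now
        if now - self._fetched_at > self.ttl:
            # Cache expired: pull a fresh summary from the authoritative twin.
            self._cached = (self.source.fault, self.source.version)
            self._fetched_at = now
        return self._cached

northeast = RegionTwin("northeast")
view = CachedRegionView(northeast, ttl=5.0)
print(view.fault_status(now=0.0))   # (False, 0) — fresh fetch
northeast.set_fault(True)
print(view.fault_status(now=2.0))   # (False, 0) — stale, but acceptable
print(view.fault_status(now=6.0))   # (True, 1)  — refreshed after the TTL
```

The trade is explicit: the remote view tolerates a bounded window of staleness in exchange for surviving partitions and avoiding distributed-database coordination.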
And this is a critical enabling technology for edge. If you don't have it, then there are so many things you can't act on that you need to act on — if you're at an edge and there's a disconnect between you and the rest of the world, or the rest of the world and you.
And more than that, on the edge topic: the database predilection tends to tie us to physical locations, because at some point there is spinning disk. That's really problematic when things move. Mobile devices get on planes and show up in other parts of the country, or they move from one base station, one MEC, to the next. If their state is somewhere else, sitting on a piece of spinning rust, then we have a problem, because for the application to work, I have to reach the stored data somewhere else. That's slow, plus it's not resilient to partitioning. So everything is just in memory, and we get rid of databases. Then we go a million times faster, and we're much more resilient.
Do you see a concept of digital twinning that enables different levels of resolution at different times, or for different consumers?
Yeah. Another way to think of it, Rob, is by analogy to Amazon Redshift materialized views. Say, in the case of my mobile provider, there are millions of subscribers per state. The base-level view would be: how is the state doing? Then I may have thousands of base stations per state — how is each one doing? That's a digital twin of each state, to the extent that a human would want it. And it should continuously evolve: relevant things should bubble up as the base stations change their states. So if one fails, things go red, or whatever. So, sure — various levels of abstraction.
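The materialized-view analogy — a state-level twin that is a continuously maintained rollup of its base stations — can be sketched as below. This is a hypothetical toy: a real Redshift materialized view is recomputed from SQL, whereas here each child pushes its delta up into the aggregate, which is closer to what the conversation describes:

```python
class BaseStationTwin:
    """Leaf twin: one cell base station."""
    def __init__(self, sid):
        self.sid = sid
        self.status = "green"

class StateTwin:
    """Aggregate twin: a continuously maintained rollup of its base stations,
    updated incrementally on every child state change."""
    def __init__(self, name):
        self.name = name
        self.stations = {}
        self.red = set()   # only the failures are tracked, not every sample

    def add(self, station):
        self.stations[station.sid] = station

    def on_change(self, sid, status):
        # A child pushes its change up; only the delta is recomputed.
        self.stations[sid].status = status
        (self.red.add if status == "red" else self.red.discard)(sid)

    @property
    def summary(self):
        # The human-level view: relevant things bubble up.
        return {"stations": len(self.stations), "failed": sorted(self.red)}

texas = StateTwin("texas")
for i in range(4):
    texas.add(BaseStationTwin(f"bs-{i}"))
texas.on_change("bs-2", "red")   # one failure bubbles up into the rollup
print(texas.summary)             # {'stations': 4, 'failed': ['bs-2']}
```

Each level of the hierarchy is itself a twin, so the "various levels of abstraction" are just more nodes in the same graph.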
You're making me think of when I used to work in physical security — electronic locks, the card-key business. There was always a debate there, because, you know, you walk into a new building and have to badge in twice, because you're not in the cache: no known behavior. But there's a dilemma with that, which is: how far away should the cache be, in case there's a disconnected system? In a high-security environment, the idea that somebody could get locked out because the cache hasn't been warmed up is problematic. So we spent a lot of time talking about things like that. There's an element of what you're describing here: you could have a dormant digital twin, a sleeping digital twin, in a database somewhere. Would you then look at it as constructing the digital twin on demand? So you can say, hey, there's a cell phone — here's its device ID — it shows up, I don't know who this is, I have to go build a digital twin to interact with it in this system. Ideally, if there's a cell-phone handoff, you could do a warm handoff and say, hey, you need to know about this twin; it's in your proximity.
Yeah, so here's the thing. For this mobile provider, we're doing five petabytes per day. That's more than you could store, let alone analyze — it's a shitload of data, sure. But the model — the stateful model of digital twins — is a few terabytes, and it handily fits in the memory of 40 computers distributed across the country. So the notion of sticking things on disk because that's where the associations get made, because we have a database schema — forget that. Do we need disks? It's a very good question. Disk is a handy abstraction for humans, but in reality all forms of database run a million times slower than CPU memory. We can keep a stateful model of the world, including caches and high availability, if we design the system correctly. And we don't really need disk, other than for persistence — for warm starts, in the event something fails.
You're very insistent on this being in memory. I'm not as clear on why that becomes an essential component of the digital twinning.
Two reasons, the first being automation. I consider the analytics side a solved problem — putting cool UIs in front of humans is not hard. But if you're going to drive something to an automatic response, you need a response that is literally in sync with the world. You have to respond within single-digit milliseconds. That's another good reason for edge computing, by the way, because you're typically a single-digit-millisecond hop away. So you can drive automation from your edge computing world.
So part of what you're describing, when I think about it architecturally: you're saying you don't need to replicate. And maybe this is key to all the digital twinning pieces — consumers of the data from a device don't need to recreate their own version of the digital twin, if the digital twin has the performance characteristics you're describing.
Correct. And you probably don't need to keep the raw data, either. There's a distinction here between state and raw data, right? The raw stream is still five petabytes a day — you can't store it. It's 110 gigabytes per second, 24/7.
But part of what you're doing — because when I think about integrations today, literally when you build an integration, you tap into an API and clone data, and every integration ends up building its own digital twin: its own model of the thing it needs to consume. So they have to store the data, process the data, and, to your point about databases, normalize the data. What you're describing here is the concept that you attach to something as a digital twin and count on it being able to provide the state information for you. And the consumers — the digital twins with relationships to that device, that item — don't have to invest in maintaining their own copy of the digital twin they're interacting with.
Yep. Let me insert a question here. You described earlier the notion of a network edge — not the way we've usually talked about edge compute, an edge node — with the relationship being an increasingly important part of this. In forming that relationship — for example, if you thought of those relationships as being contractual — perhaps not a mobile telephony system, but let's talk about smart grids or things like that, where in fact I'm concerned with the contract between a community of nodes and its various potential suppliers of power, including the members of the community itself with their solar collectors: do you not create a community of abstractions that also have to be twinned, in this particular case?
I mean, sure. We can agree it's turtles all the way down, but —
Yeah, certainly. But there's a very interesting thing here, which is: how do you take the evolving state of things and, in response to changing states, trigger certain things happening? Instead of having updates on a daily basis, things change in memory on the fly. And if you subscribe to them, they'll tell you continuously about state changes. They won't send you messages; it won't be REST API calls — that's all horribly slow and very one-shot. They will send you a continuous stream of state changes, and then what you do with them is up to you — that's other programs. The key thing is to have a methodology by which you can easily compose large-scale digital twins. For us, in this carrier world, there are now bindings for PyTorch, so you can do data science on live data coming out of thin air.
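The "subscribe to a continuous stream of state changes, instead of polling one-shot APIs" model can be sketched with a tiny in-process publisher. This is a hypothetical illustration — real systems like Swim stream over persistent network links, which isn't modeled here, and the names are invented:

```python
class StreamingTwin:
    """A twin that pushes state *changes* to subscribers,
    instead of being polled over one-shot request/response calls."""
    def __init__(self):
        self.state = {}
        self.subscribers = []

    def subscribe(self, callback):
        # A subscriber gets every subsequent genuine change, continuously.
        self.subscribers.append(callback)

    def update(self, key, value):
        # Identical updates are suppressed: only real changes are streamed out.
        if self.state.get(key) == value:
            return
        self.state[key] = value
        for callback in self.subscribers:
            callback(key, value)

received = []
twin = StreamingTwin()
twin.subscribe(lambda k, v: received.append((k, v)))
twin.update("load", 0.42)
twin.update("load", 0.42)   # no change: nothing is streamed
twin.update("load", 0.97)
print(received)             # [('load', 0.42), ('load', 0.97)]
```

The design point is the inversion: the consumer never asks "what is your state now?" — the twin tells it, at the moment the state actually changes.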
But the great majority of this is actually projections, right? Basically — we'll call it forecasting, or projections about what to anticipate.
If that's of interest to you, sure — and that serves some use cases. We have use cases in, for example, PCB manufacturing, where they're soldering chips onto boards and you're predicting when the machine is going to fail, that kind of thing. But just having the ability for things to find associations — correlations, for example — and then to project forward on that basis is important. And by the way, if you think forward to a state where every single product is instrumented and always talking to its producer, there's just so much data around. The interesting question is: how are people going to create all these machine learning models? There's too much to store, and there aren't enough humans to do the work. Good luck with that. So the answer, in my mind, is something like a digital twin that learns from its own history. That doesn't preclude it from evolving in the future. But what it does say is — it's like: if I ask you whether you like blueberry muffins, you kind of know the answer. You didn't have to call your mom, and you didn't go to a database. You've learned, from all the blueberry muffins you've eaten, that they're good or bad for you. It's learning something from your own past.
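A twin that "learns from its own history" without storing that history can be as simple as an online statistic. As a hedged illustration — not the carrier's or the PCB plant's actual model — Welford's algorithm keeps a running mean and variance in constant memory, which is enough to flag a reading that falls outside the behavior the twin has learned:

```python
import math

class LearningTwin:
    """A twin that learns from its own history without storing it.
    Welford's online algorithm maintains mean/variance in O(1) memory,
    so 'have I seen something like this before?' never needs a database."""
    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0   # sum of squared deviations from the running mean

    def observe(self, x):
        # One streaming update per sample; the raw value is then discarded.
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def is_surprising(self, x, sigmas=3.0):
        # Flag readings outside `sigmas` standard deviations of learned behavior.
        if self.n < 2:
            return False
        std = math.sqrt(self.m2 / (self.n - 1))
        return std > 0 and abs(x - self.mean) > sigmas * std

vib = LearningTwin()
for reading in [1.0, 1.1, 0.9, 1.05, 0.95, 1.0, 1.1, 0.9]:
    vib.observe(reading)
print(vib.is_surprising(1.02))  # False — consistent with everything it has seen
print(vib.is_surprising(5.0))   # True  — well outside learned behavior
```

This is the blueberry-muffin point in miniature: the twin carries a compact summary of its past rather than the past itself.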
So you're ascribing the notion of — I won't call them sentient, but they are at least learning machines. The individual nodes are, in fact, retaining some sort of memory, some sort of precedent, some sort of —
— and they use not only their own state, but the state of things that are nearby. Nearby in geospatial terms, or nearby in correlation terms, or —
— or nearby in some terms of interest or importance? So geography might mean nothing in this particular case? Yes. Okay.
So when we talk about twins, and we talk about edge — and I love the characterization of the edge as where your data is coming from — your edge and my edge are not the same, right? There are two sides to that same edge. So, if you and I don't share an edge, do we not have the same digital twin for the same piece of machinery, the same device?
I see — you're talking about: you might own the machine, and I might be the vendor. Is that what you mean? Right, yes. So we do have customers who are exposing digital twins of their devices to the vendors. But there is a problem here, and you're spot on: if I build a factory out of machines from Schneider and Bosch and so on, how do I put them all together? That's harder. But ultimately, as long as I can get hold of the data in some feed — some sort of event stream in some de minimis format I can hold things in, with a schema I can understand — I can manufacture my own notion of digital twins.
Right, but you're going to build your twin, and they might have their twin, and those are completely separate, disparate — and that's okay. We're just duplicating data, and they might have different states. But then we get to where we're talking about these twins being — I'd like to use the term "plastic," if you don't mind. They change. They don't exactly evolve, but they behave differently depending on their history. And they don't have to store their history; what they have to do is change their behavior.
Storing history is arbitrarily horrible. As a vendor, I like firing customers — it's very satisfying. We had a customer who operated 40 large compressors in the US, and they were getting 70 data points for every single degree of rotation of every shaft in every compressor. There are four shafts per compressor, and they run at something like 2,000 rpm. And these guys were collecting all of it. Good luck with that. Last time I saw them, they were still buying hard disks and thinking they could roll it into Mongo. The problem there is that the device is designed to run between servicings for four and a half years. So you'll see a huge amount of sameness in the data, and then a few scary things right at the end. Storing the original data is a waste of time. What really matters is the stateful evolution of the thing in some way. And the way in which we describe the relevance of that stateful evolution is important: we should be able to dynamically attach attributes to things — computational functions to compute on the data — as our understanding of them matures.
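The compressor story — keep the stateful evolution, discard the raw firehose — can be sketched as windowed summarization. This is a hypothetical Python toy; the window size, the per-window fields, and the synthetic signal are illustrative assumptions, not the customer's real pipeline:

```python
class WindowSummary:
    """Reduce a firehose to its stateful evolution: keep one small summary
    per window and discard the raw samples entirely."""
    def __init__(self, window_size):
        self.window_size = window_size
        self.count = 0
        self.lo = float("inf")
        self.hi = float("-inf")
        self.total = 0.0
        self.summaries = []   # the only thing retained

    def ingest(self, sample):
        # Fold each raw sample into the running window state, then drop it.
        self.count += 1
        self.lo = min(self.lo, sample)
        self.hi = max(self.hi, sample)
        self.total += sample
        if self.count == self.window_size:
            self.summaries.append(
                {"min": self.lo, "max": self.hi, "mean": self.total / self.count})
            self.count, self.lo, self.hi, self.total = 0, float("inf"), float("-inf"), 0.0

# 70 samples per degree x 360 degrees: one shaft revolution becomes one record.
shaft = WindowSummary(window_size=70 * 360)
for i in range(70 * 360 * 2):          # two revolutions of synthetic vibration data
    shaft.ingest(float(i % 7))
print(len(shaft.summaries))            # 2 — two summaries instead of 50,400 samples
```

At four shafts per compressor and 2,000 rpm, the raw stream is hopeless to store, but per-revolution summaries like these stay tiny while preserving the "few scary things" (min/max excursions) you actually care about.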
That sounds like self-describing, or easily described, exposed interfaces — exposed APIs. And that brings up the whole question of how the edges — the nodes involved in this conversation, or at least interacting with one another — establish a commonality of communication. If the great mass of this is reading data from your communicating node, it's a little easier to imagine a self-describing API that says, "this is what I'm presenting." But you've got a language problem here.
What I see is relatively few streaming edge protocols — it's in the tens. And in the schema world, a tendency to adopt things like Apache Avro; more and more people are using it. So it's not that hard to build adapters to things and arrive at the syntax. But the semantics — that's the hard part. The semantics is the model. Essentially, the semantics is the graph.
Okay. And this is where it becomes more contractual: this is what we mean — what we mean, amongst this community of nodes interacting with one another, when we say or expose the following kinds of numbers, or data.
Exactly. New incoming customers I bucket broadly into two groups. There are people who say, "I'd love you to find some stuff in there for me." They are generally going to go out of business — it's just a matter of time. And there are people who say, "I want to get better at this particular thing, and I think this is related to that." Those are the people who think about the analytics they're trying to compute, even if they're relatively poor at it. The big difference is between people who say, "throw your AI/ML at my shitload of data here and tell me something interesting," and people who understand the relationships between things and how they might affect the world. Huge difference. The former are enamored with the idea of AI because they think it'll solve their problems by giving them blinding insights from the huge amounts of data they have — but they're not going to make the business fly anyway, so we don't really help those people. That's kind of a tough call, but we've decided to walk away from any prospective customer who says, "here, I have this huge amount of data; find interesting things."
So what you're really coming to, I think, is the notion that we're not talking about AI — we really are talking about machine learning. And the machines we're speaking of are, in fact, these very often tiny nodes at the edge, and to some degree the abstractions that get incorporated here, which happen to be these sticky, contractual relationships —
— stateful things that interact in some way and form a graph — a computational graph. Exactly.
And the relationships in the digital twinning are, I think, easy to overlook. To me, this is where edge is about the relationships between disparate components. Yeah, that's right.
Right. I also think this is why the whole discussion around digital twins out there in our technical world is so fucked up: so few people understand this, so the term gets thrown around as a buzzword with no understanding. They don't have a digital twin of a process; they have lots and lots of data and no analysis or interpretation of the relationships. But it explains why our communities don't understand digital twins and don't have a good definition of them. The fact that it sits at a meta level — a level above syntax — is really hard, because very few people can get above the syntax level.
Right. Yeah, thank you. So you get customers that come to you with a big pile of bricks and say, "build me a house," and you also have customers that come to you and say, "I need shelter, and here are the things that I have."
We do, actually. Yeah, that's an interesting way to put it — I like it.
Yeah, those are the ones you can solve — when they say, "I need shelter." The ones who just come to you and say "build me a house" — what do you mean by house? You can get to a solution for them, but it's not worth the effort. That's right.
All right, everybody — Simon, thank you for coming in. We're going to come back to this topic, because I know it's essential to understanding how we're going to build successful pieces, successful infrastructure. All right, everybody, thank you. Talk to you soon.
There's so much going on in edge and digital twins — this emerging technology of things, how we build systems out of things, and how we build multi-vendor interactions between these systems — that really needs to be understood. Swim is doing some really interesting work here, taking a different approach in how they think about the data, and that really came through in Simon's conversation. We've had many conversations with Simon, so please go back through the archives if you want to learn more about what Swim is doing and how their thinking has evolved over time — we first talked to them over two years ago, so there's a lot of history for you if you're curious about this topic. Thank you, and please join us at The Cloud 2030. I'm looking forward to hearing your opinions in these podcasts. Thanks. Thank you for listening to the Cloud 2030 podcast. It is sponsored by RackN, where we are really working to build a community of people who are using and thinking about infrastructure differently — because that's what RackN does. We write software that helps put operators back in control of distributed infrastructure, really thinking about how things should be run, and building software that makes that possible. If this is interesting to you, please try out the software. We would love to get your opinion and hear how you think this could transform infrastructure more broadly. Or just keep enjoying the podcast, coming to the discussions, and laying out your thoughts on how you see the future unfolding — all part of building a better infrastructure operations community. Thank you.