Hello, I'm Rob Hirschfeld, CEO and co-founder of RackN and your host for the Cloud 2030 podcast. In this episode, we talk about analog computing, which is the idea of non-digital computing, not quantum, but non-digital: basically using circuits, analog circuits, either electrical circuits or potentially even mechanical or fluidic circuits, to perform calculations and control systems. These are surprisingly common, especially in older devices, less common in current and modern devices, but they're making a comeback, in part because they're very fast and efficient, but also because, with AI tooling and digital twinning, we can design these circuits much more effectively, and with 3D fabrication, we can actually create these circuits more effectively if they're mechanical. In this discussion, we talk about what is coming with it, and why and how it's interesting. I hope you'll enjoy the discussion. It's a really remarkable conversation that may change your idea about what computing is.
One of the things that came up at the conference is that there's a chance, with where we are, that we're entering a new age of miracles, in a way. The potential for us to create things that are so groundbreakingly different feels full of potential and very, very scary at the same time.
True, but you know, in my little pea brain, that really is the power of AI, yeah.
No, definitely. We don't even know enough to know what the guardrails are.
exactly that
I think you'll like this statement, Joanne. I think it's not just the power of AI; it's the convergence of all of these other threads, like Industry 4.0, quantum, all of these things coming together. When I was a kid, there was a show on PBS called Connections, where they talked about how all of these innovations and all of these separate parts of the world wove together. I loved
that show. That was high quality. It was high quality. Yeah.
There's this geometric, exponential effect associated with the convergence of all of these different technologies.
Go ahead, go ahead. One of the points that came up, from one of the speakers,
Paul from SK Ventures. I'm not going to try to pronounce his last name, but he made a really interesting point: we're used to ChatGPT, but it's really tokenization, and the robotic stuff that we've talked about, right? Some of these breakthroughs. One of the things that we're learning is that we can tokenize a lot more processes, and that, combined with learning technology, has enormous potential to completely change the way we do things. His example of tokenizing the joints on a robot, and then training robots through learning, is a game changer. The other point he made that was really interesting, while I'm reviewing his talk, is that these technologies are much more organic, meaning that you don't necessarily plan the breakthroughs. You don't plan the outcomes; they emerge or evolve. So we can't just say, oh, ChatGPT 5 is going to add these features and those features. We literally don't know what it's going to be capable of until it shows up and starts doing stuff. There are emergent behaviors out of these models. And it's not just the big ones; it's the smaller ones that make it possible to do things like what you're describing.
Yeah, well, in my mind, it's enabled ingenuity. And I say that because I think we all have that ingenuity gene, to go and create something. It's creative, it's purposeful. Every major discovery that we've ever had in humanity has been somewhat accidental, and I think we now just have a better platform from which to do it, whether it's CRISPR or, you know, generative AI or something else. But to your point about robotics, interestingly enough, I had a conversation with a lady who's the CEO of a startup, and their thing is to intervene. If you have a broken robot that's holding up a production line or a facility, they can intervene remotely and restart the robot, which raises all kinds of questions about security and breaches and whatever. But imagine that kind of capability: to a manufacturer it would be huge; to a retailer it would be huge. But irrespective, just the fact that she comes from Boston Dynamics, she worked on Spot's arm, and she came up with this idea: these things are going to break. Can you remotely solve the problem just by hitting a reset button, or by using a live operator to take control of it and get it out of the situation that it's in? It's an interesting idea. How it works as a business model, I'm not quite sure, because there would be proprietary technology in every AMR, AGV, drone, and robot.
Well, it's an interesting business idea. It's not exactly novel, though, given that we have learned over the past couple of months that, for example, Cruise, the autonomous vehicle company, relied a lot on human operators to get their vehicles out of, quote, stuck situations. Yeah. Also, just going back to your previous comment about discoveries being made largely by accident, and AI helping with that: I hope I'm not taking the conversation too far afield, but I just can't help drawing the conclusion that AI is allowing us to get to those accidents faster.
No, I think the idea of faster discovery is reasonable from that perspective. I was going to pull us into the topic of the day, if that's all right with everybody, because I think there's a reasonable bridge here. Do it.
Before you do that, Rob, let me throw something out. I'm going to try a project where I'm working on enhancing my selling ability. I'm going to attempt to train the new ChatGPT-4o, the version with all the cross-media support that was released this week, to be a sales trainer for me, so that we can do gamified sales role-playing with AI. So wish me luck on that.
I think there are some GPTs available in the ChatGPT store that have sales training or sales role-playing, if you want to look.
Yeah, that's a good tip. Thank you.
Is 4o generally available now if you have GPT Plus? Is that how you're getting access to it? Okay.
Yes. And if you ask it why you need Plus if you're using 4o, it tells you that, well, coming soon, there will be reasons to do that. In other words, don't cancel your account yet.
Yeah, they said there will be a different budget for the free tier. Yeah. Okay.
Yes. So the topic of the day is analog computing. This was a place we got to at the end of another call, where we started thinking about this. Just as a refresher, analog computing is the idea that instead of building a digital circuit, you're building an analog circuit, and that could be electrical analog, fluid-dynamics analog, or mechanical analog. You're building interlinked components that basically have an embedded stimulus response. I got interested in this when I was in high school. I went to a summer camp at the Naval Academy, and they were showing hydraulic computers that they were designing for the fighter jets at the time, to do aviation controls and things like that. There are actually very sophisticated analog computers, but designing them is so hard and complex that they're really out of favor, and patching them and updating the software is impossible, so they fell way out of favor. In our conversation, we were talking about them possibly regaining some popularity. And going back to our talk about AI, there's the idea that AIs could design analog computers; maybe we need to expand the definition from what I gave, and there could be new use cases or new avenues for it. I'm saying all this to try and rekindle where we were in the conversation. In the agenda notes, I have a couple more points, but I'll tee it up there and see if anybody wants to pick up. Yeah.
Can I share my screen for a second? I'm going to show you.
Let me try and give you permission. If you need it, you have it.
Can you see it? Yeah. Okay, so analog systems come down to controls theory, which, as you can see, requires calculus, and when the feedback in the control system is nonlinear, it's actually differential calculus. That's why it's so hard: there are very few people who understand the math and how to build these kinds of systems. Think about what we do on the digital versus the analog side. On the digital side, we create if-then-else loops and other control structures; you could have something like network monitoring as your feedback function, right? But that's the reason this is so tough: it's differential calculus that you've got to contend with. And when you start thinking about control systems theory in a way that is multimodal, your underlying units complicate the math even more, because you've got differential calculus across things like the chain rule, physics-type units, and translation. So this is an incredibly challenging design problem to solve for. But on the other hand, when you've built an analog control system, it is so much more powerful than a digital control system, because of the reduction in latency that's inherent in an analog control system. So I thought that would be a good way to start the conversation.
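To make the feedback idea concrete, here is a minimal sketch of a proportional feedback loop, assuming a first-order plant governed by dx/dt = -k(x - setpoint). The plant model, gain, and step sizes are invented for illustration; the point is that a digital controller has to integrate this equation in discrete steps, each adding latency, while an analog circuit solves the same equation continuously.

```python
# A minimal sketch of a feedback control loop, assuming a first-order
# plant: dx/dt = -k * (x - setpoint). All values are illustrative.
# A digital controller integrates this in discrete steps (latency per
# step); an analog circuit would solve it continuously.

def simulate_feedback(setpoint, k=2.0, x0=0.0, dt=0.01, steps=1000):
    """Euler-integrate dx/dt = -k * (x - setpoint); return the trace."""
    x = x0
    trace = [x]
    for _ in range(steps):
        error = x - setpoint      # measure the feedback error
        x += -k * error * dt      # apply a proportional correction
        trace.append(x)
    return trace

print(round(simulate_feedback(1.0)[-1], 4))  # settles at the setpoint: 1.0
```

With a nonlinear plant, the correction term becomes a nonlinear differential equation, which is exactly where the math gets hard.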
What I'm thinking, though, to your point, Tyler, is yes, there's a tremendous degree of complexity in the calculus, no pun intended. But why are we going back to analog? Rob, I'm trying to remember the context in which we came to this conclusion, or why we're going down this road with respect to observability.
Okay, I'm going to pull up something else. But the short answer is, you can put what would take a rack of computers inside a single chip with an analog control system. So it's really, okay,
so okay,
But Joanne, here's the list of items that I had in my notes, because there are a lot of them: edge; does AI make analog more possible; is it possible to design hardware beyond human comprehension; is there an impact on quantum; what about parallelism; and, I'm not exactly sure why I wrote this, does this create a return to the mainframe? Oh, so maybe we're talking about analog computing systems that are more than what I'm thinking of, which is control circuits, a broader use of analog.
Yes. Okay, I remember the parallelism and returning to the mainframe, I do remember that okay,
And my son, who's in the car with me, is saying that ASICs also can have analog compute components.
That's where Tyler was going. There are programmable analog CPLD or FPGA equivalents, right? Right? No, they exist. I don't know the details of that.
Which would make a ton of sense for signal processing, if you're doing signal processing, sound, or, yeah,
So my theory, since this is Cloud 2030, is that the advancement of AI is going to make the construction of analog computers radically simpler.
And if you can build better analog computers using AI, let's take that as a given, how does that help us? Are we going to see analog computers in more places, or is it going to change certain types of computing problems, the way we're assuming quantum is going to change certain types of computing?
I think so. I think it's almost like the whole question of, well, what is AI going to be, you know, in five years?
Well, but I think just go ahead, Rich. Well, I was going to the two two thoughts that that occur to me. One is it makes a lot more sense for, uh, edge devices, or edge things that are built on the back of of kind of this animal, the the next generation of analog technologies. It also leads to some of the questions about whether you can use analog
electronics, basics and so forth to actually arrive at really low power, high
low power, low latency control systems and put them in, you know, Very inexpensive. Put them inexpensively in places where two things one is they're arguably more resilient to environmental issues. Something goes wrong, and suddenly you're not you don't have a network anymore. But the other is that they are potentially more secure. And I don't know exactly why I want to say that, but for some reason that's that's coming up in my mind, the Yeah,
It's harder to alter the function of an analog circuit. Exactly.
Just like you were saying, it's harder to reprogram a, you know, fluidic computer in a jet airplane. Yeah. So the question I want to raise, though, is: if AI helps you design them, well and good, except I'm wondering if there's a new kind, or a different kind, of representation that has to be used for,
I'll call it, you know, embeddings. If you were going to use an AI to simulate an analog setup, as close to a digital twin of an analog system as you could get, something tells me that using vectors the way we're using them now is not the right way.
Yeah, I see what you're saying, Rich. I mean, think about how we have very mature simulation software that already exists for creating analog circuits. It's SPICE models, if you will, and that's all based on a digital backplane for simulation. So using AI to augment the already existing circuit simulation modeling software seems like a layup to me. Yeah,
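As a toy illustration of the kind of work a SPICE engine does on its digital backplane, the sketch below numerically integrates the differential equation of a single RC low-pass stage. The component values and step sizes are invented; real simulators use far more sophisticated solvers and device models.

```python
# Toy version of what a SPICE-style simulator does: numerically solve a
# circuit's differential equation on a digital backplane. Here, a single
# RC low-pass stage with invented values: dVc/dt = (Vin - Vc) / (R * C).

def rc_step_response(vin=5.0, r=1e3, c=1e-6, dt=1e-6, steps=5000):
    """Capacitor voltage after a step input, by forward-Euler integration."""
    tau = r * c                    # time constant, here 1 ms
    vc = 0.0
    for _ in range(steps):
        vc += (vin - vc) / tau * dt
    return vc

print(round(rc_step_response(), 2))  # ~4.97 V after five time constants
```

The real circuit "computes" this answer instantly and continuously; the simulator spends five thousand digital steps approximating it.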
Well, I think of it as neuromorphic computing. Neuromorphic computing.
yeah, I'm too clogged up.
I put it in chat. You know, we're already taking design parameters that mimic the brain's neural network, right? Those systems translate better into analog circuits, achieving low power and high-efficiency processing. They're good for AI on the digital side, but if we put them in devices, we have different parameters that we can start to play with. So
And it's interesting, because that could become the backbone, almost, of, I like your word, the autonomic systems that human bodies have, right? Those are certain built-in responses. But then the other thing that we haven't mentioned is that we've had breakthroughs in additive manufacturing too, so the ability to manufacture these very sophisticated designs opens up. And then you could do hybrids, where you have some digital processing mixed in with these systems. The capability of building autonomic or baseline functions on an analog basis could be remarkably powerful. That's really interesting.
But here's my question: the bridge between the new analog and digital. I haven't quite wrapped my little brain around how that would work. Where do you define the border between the analog chip and the digital plane?
We've had that for decades: it's an analog-to-digital converter. I've built them on-chip before. You're converting an analog signal, which is a time-varying continuum of voltages, into a digital signal.
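The analog-to-digital conversion just described can be sketched as an ideal quantizer: the continuum of voltages gets mapped onto a finite set of codes. The reference voltage and bit depth below are invented, and real ADCs add sampling clocks, noise, and nonlinearity that this toy ignores.

```python
# Idealized n-bit analog-to-digital conversion: map a continuous voltage
# onto one of 2**bits discrete codes. Reference voltage and resolution
# are invented; real ADCs add sampling, clocking, and error sources.

def adc_sample(voltage, v_ref=4.0, bits=10):
    """Map a voltage in [0, v_ref] onto an n-bit integer code."""
    levels = 2 ** bits
    code = int(voltage / v_ref * levels)
    return min(max(code, 0), levels - 1)   # clamp to the valid code range

print(adc_sample(1.0))   # quarter-scale on a 10-bit converter -> 256
print(adc_sample(5.0))   # over-range input clamps to full scale -> 1023
```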
But what form is that digital signal taking? Is it multimodal? Is it, you know, sound, light? To me, you have those bridges.
I'm glad you brought up light, because doing it with light is another approach. That's still an analog computer, like a photonic processor, right?
Yeah, I mean, like fiber optics, converting from laser light to a digital signal. The process doesn't really care whether it's multimodal or not, or what the underlying unit is; it's just converting it. On the digital side, it just has to be programmed with the nature of the signal that it's seeing. So you can have a sensor that's sensing light, sensing audio, sensing whatever; it goes through the analog computer, then through the ADC conversion, and then the digital side has context for what the sensor signal is.
I mean, basically you can have signaling systems that are side bands. You might have a very low-throughput kind of control system that says, hey, here's what's coming down this enormous light pipe; this is what's in there; adjust yourself accordingly. Now, there are lots of ways in which this can go. I like this notion of the autonomic. The introduction of autonomics fits into what I was saying before, and that is placing these kinds of fallback systems for when they're cut off, when they're in distress, when there's no power, or little power, or no communications, and so forth. How do they remain in place, how do they take care of business while being disconnected? I think that is going to be quite important. The question that you were raising a little bit before, Joanne, was more a question of telemetry, and I think that goes back to that control channel, that sideband. You may find that in order to monitor a lot of this, it's going to require a digital implant, if you will, that is used for sending telemetry, working with digital twins, all of that good stuff. But I like the idea of the autonomy. I'm still not convinced that, well, let me ask you this: does anybody know enough about the design of these analog systems to know whether the use of vector embeddings and that kind of proximity or similarity measure makes sense here? Or is it a kluge that has to be replaced?
And that's the question of where you go analog and where you go digital.
Yeah, I guess I'm wondering: the AIs that we're talking about are doing a lot of the design, and some of the simulation that goes into the design and testing. To what degree do we have to find a new or different kind of representation for them to operate on, as opposed to the linguistic model that we've got with natural language processing, the embeddings model that we're using now? I don't know; I could be completely off on a wild goose chase.
Yeah, I think we will see the emergence of new AI architectures as this space evolves.
There's no question about that. I mean, ChatGPT-4 is a fantastic tool, Claude's a fantastic tool, we've got lots of fantastic tools, but they're not at the level of doing differential calculus at this point. How long will it be before they are, and somebody creates an architecture that will make that possible? I don't think that's far out on the horizon. I also added to the chat a link to an MIT article that might actually scare the crap out of you guys, because we're actually using analog circuits to create artificial neural networks that promise a million times the power performance of existing large language models, or existing AI systems. It's probably not correct to say large language model.
What's the nature of the analog technology that they're using?
They're using programmable resistors inside of integrated circuits. They've implemented a neural network in an integrated circuit with programmable resistors as the connections between the nodes. So it's really more of an ANN architecture, as opposed to an actual biological neural network, right,
the axon, and
Tyler, it leads me, and I just took a quick scan through it, to the question of, and to Rob's point about advanced materials, will they require that, you know, silicon goes away and gallium arsenide comes back in?
No, no, they're able to use standard fabrication processes for this.
But will it lead to the development of new materials to make them even faster? That's really where I was going.
Oh, I see. You know, I don't have enough knowledge to even guess. I mean, my gut says yes, it will lead to new materials.
Yeah, because, I mean, they're referring to nano. Just FYI, if you ever happen to be in the area and want to have a real enlightening experience, check out the nanofabrication facility at the State University of New York at Albany. It's like a couple million square feet, and every OEM is in there. It's a fascinating way to see the future, because they're building that stuff there now.
So Rob, what were some of the other questions you had, the follow-ons you mentioned when we started?
Yeah. I mean, I think we're covering what we wanted to talk about, which is: is there an area of computing in the periphery of the mainstream that we've been overlooking, with the potential to really create some new opportunities if we could do it better? And I think one of the things that we've talked around, but not directly, is that analog computers are actually computationally very efficient, and can do really sophisticated, fast operations. So there's real potential there, yeah,
Let me ask you about that, because that's something I know absolutely nothing about. You said fast; how fast is fast, and is it on par with what can be done with digital? The reason I ask is that one of the nastiest jobs in the world, one of the most horrendous technologies that have, you know, been perpetrated on us, is power semiconductors. Without a doubt, I'm going to bet that electrical engineers who have to deal with power semiconductors have the highest suicide rate of any profession in the world, because they're awful. They're terrible. If one were to apply analog to power semiconductors, that might be a very, very interesting place to go. I have no idea whether that's even feasible, but power semiconductors are just finicky as hell, and so sensitive to environment and changes that they can blow up. And when they blow up, they really do blow up. They're just horrible things that we have to deal with in the electrical grid. I would wonder whether serious analog computing, analog circuitry, could be applied there and, you know, take that pain away. Yeah,
That's interesting, yeah. So I've got another use case, and that's self-replicating systems. I'm going to share my screen again. This is a very simple fractal equation, and basically, if you remember what my control systems theory equations looked like, it's the same as a fractal equation. So what we're going to see is analog computers being utilized to implement fractal creation of systems that self-build and self-replicate. I've not seen anyone bring that up, but I haven't really done an extensive Google search on it. That's definitely something that is kind of, you know, Terminator-esque, as it were.
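The equation shown on screen isn't reproduced in the transcript, but a classic example of a "very simple fractal equation" with the same shape as a discrete control recurrence is the logistic map; the choice of map and parameters below is ours, purely to illustrate the family of equations being discussed.

```python
# The logistic map, x_{n+1} = r * x_n * (1 - x_n): a one-line nonlinear
# feedback recurrence whose behavior turns fractal at some gains. The
# map and parameters are our stand-in for the equation shown on screen.

def logistic_orbit(r, x0=0.2, steps=200):
    """Iterate the logistic map and return the final state."""
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)   # the same shape as a discrete control loop
    return x

print(round(logistic_orbit(2.5), 4))  # low gain: settles at 0.6
# Near r = 4.0 the identical recurrence never settles (chaotic regime).
```

The same feedback form either converges like a well-tuned controller or cascades endlessly, which is what makes it a plausible seed for self-replicating structure.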
What's the, maybe I'm asking the wrong question, but what would be the manufacturing technology that would be employed there?
I have absolutely no clue. Okay,
What I'm saying is that's where this is going to go. That may be a topic for the 2040 call that we haven't set up yet, but that's where it's going to go.
Well, that's the kind of thing where you want to place the nucleus of something like that on the moon, and then say, all right, use the local materials to build out the world's biggest, baddest data center, based on this kind of self-replicating build-out.
So you could front-end your analog computer, the one that implements the fractal design for replication, with artificial intelligence that performs the analysis and signal processing that feed into the analog computer.
Yeah, the monitoring. I mean, basically that goes back to the whole question:
What's the composition of this moon dust for construction of lunar habitats?
You don't even have to wait for the moon, though. No, of course not. Power controllers: think EV, think lithium. The worst job that Rich was describing for the grid is the same one for EVs, so maybe the manufacturing process he's thinking of is similar to the manufacture of lithium batteries for EVs.
Yeah, you guys are probably onto something there: the first application of using analog computing for self-replication and those kinds of things would obviously be in manufacturing, right? Because that's where the business value is going to drive development. A hundred percent.
It's also more about repeatability, right? Part of the benefit of these systems is when you're dealing with a highly regular, highly repeatable process, a normalized process.
You know, maybe this is the reason Musk fired the whole charging division of Tesla: because they want to go analog now.
Yeah, of course. By the way, talk about an analog computer. Oh, my goodness, yeah. Wow.
Oh, it would certainly save a lot of dollars, and certainly save the environment even more. And it's probably doable, if somebody figures out how to take what was agreed by GM, Ford, and Tesla to be a universal standard and renegotiate the whole thing all over again. But, yeah,
But you know, high-power systems are definitely a place where, well, everything's analog ultimately; the less conversion overhead you have, the more effective the systems are going to be. And Rich, you were asking why faster? If you're dealing with an analog signal, and functionally you get to take a derivative, and second and third derivatives, of these signals in analog, by the devices themselves and how things are structured, then all of a sudden you can do control equations.
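To see why taking those derivatives digitally carries overhead, consider the sampled path: every derivative means buffering samples and doing per-sample arithmetic, on top of interrupt and OS costs, whereas an analog differentiator circuit produces d/dt continuously. A sketch of the digital route, with an invented sine test signal and sample rate:

```python
# The digital route to those derivatives: sample the signal, then take
# finite differences, with per-sample arithmetic and buffering overhead.
# An analog differentiator circuit produces d/dt continuously instead.
# The sine test signal and sample rate below are invented.

import math

def sampled_derivative(signal, dt):
    """First derivative of a sampled signal via central differences."""
    return [(signal[i + 1] - signal[i - 1]) / (2 * dt)
            for i in range(1, len(signal) - 1)]

dt = 0.001
samples = [math.sin(i * dt) for i in range(1000)]   # stand-in sensor data
deriv = sampled_derivative(samples, dt)             # approximates cos(t)
print(round(deriv[0], 3))  # ~cos(0.001), i.e. 1.0
```

Second and third derivatives repeat this whole pass, compounding both the arithmetic cost and the sampling noise, which is exactly the overhead the analog device sidesteps.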
Yeah, the overhead to do all of that derivative work,
and all of the interrupts and signal processing and all the OS overhead, yeah. So all of a sudden you can take a signal. The thing that I'm not sure of, and we're out of time, so I'll leave this as a teaser, is how we end up embedding these systems into the systems that we have, right? The ones I've seen are much more dedicated-function. It would be interesting if it becomes easier to make them sidecars or integrated components in systems that we already have. We are out of time, and I do need to wrap up. Next week the topic is: Hello crypto, my old friend. So we're going to go back to
Tokens, tokens.
Can we talk about blockchain? We will talk about blockchain; it doesn't have to be crypto. See you, guys. Everybody, wrap it up. Thank you. Fun conversation. Thank you. Bye-bye. Yeah, wow. What
a fun conversation. I love it when we go into areas of technology that are adjacent to the mainstream cloud pieces, but really think through how dramatically we could transform the industry with just a little bit of change, a little bit of new capability, some new thinking, because these systems can easily be embedded adjacent to the systems that we're used to seeing, and the results could be profound from a performance-improvement perspective. If this is interesting to you, please keep tuning in. Join us at The 2030 Cloud, be part of the conversations, tell us what you think, and participate in our book group. We are having a lot of fun here, and we hope you are too. Thanks. Thank you for listening to the Cloud 2030 podcast. It is sponsored by RackN, where we are really working to build a community of people who are using and thinking about infrastructure differently, because that's what RackN does: we write software that helps put operators back in control of distributed infrastructure, really thinking about how things should be run, and building software that makes that possible. If this is interesting to you, please try out the software. We would love to get your opinion and hear how you think this could transform infrastructure more broadly, or just keep enjoying the podcast, coming to the discussions, and laying out your thoughts on how you see the future unfolding. It's all part of building a better infrastructure operations community. Thank you.