This is a high-stakes battle for a competitive edge. The right AI strategy could decide the winners.
Hi everybody, and welcome to our session. Thank you very much for joining us today. My name is Richard Halkett. I'm the Chief Customer Officer at SambaNova Systems, and we've got a fascinating session for you today on pervasive AI: where we're going and how you compete. I've got a great panel that I'll be introducing to you in a moment. But first of all, it's worth understanding the buzz that we're feeling today at TechCrunch; it's great to feel the energy this year. Generative AI, and AI in general, is really on everybody's mind, and this feels like the hype and the excitement and the potential are really here to stay. So why is that? Why do we feel that? Well, first of all, it's almost a year ago that we saw ChatGPT go to 100 million users in two months, which is multiple times faster than any other technology we've seen before. So this really does feel like the fastest industrial revolution in history. But we're also seeing really interesting signs from the market. I read a report from Accenture which said that 97% of enterprise leaders believe that generative AI is going to be transformative. Now, 97% is not a statistic you see very often; that basically means everybody. So it's extraordinary that we're seeing this right now. And they believe that 40% of working hours are going to be changed, transformed, by generative AI. In another report, Deloitte said that 37% of businesses are already investing heavily in generative AI, and, again from Accenture, that next year 67% are going to ramp those budgets. So this is a fascinating place to be if you're a startup, if you're an investor, if you're a customer and an enterprise pioneering this. But it is complicated as well, right? We all know that AI is a relatively new technology that's exploded onto the scene in the last 12 to 18 months, and it's complicated. We see these gigantic models getting larger and larger. They take a long time to train; they take thousands of chips as well. And that all adds up to millions of dollars of investment. So if we see all of this potential with AI, and yet we see all of this challenge, how are we actually going to tread that path and ensure that we achieve that potential? Today, I've got a panel of three folks who are going to be helping me triangulate this problem. First of all, I've got Rodrigo Liang, who's the co-founder and CEO of SambaNova Systems. Welcome, Rodrigo. We've got Alan Lee, who's the Chief Technology Officer of Analog Devices. And we've got Dave Munichiello, who leads GV's digital investing team. I'm going to be asking them some questions, we're going to be getting into this, and hopefully we'll come up with some decent insights and answers. All right, so first of all, Rodrigo, I'm going to come to you if I may. This term pervasive AI, and the fact that we're seeing AI really change the market: what do you see? What opportunities do you see? What challenges do you see?
We're at a point in our journey here where people are moving from talking about specific models, GPT or Stable Diffusion, to thinking about how to solve real problems. And so the term pervasive AI is about how to deploy AI pervasively in our everyday lives, right? Solving real issues for HR, for legal, for finance, for software development, versus talking about whether this GPT is better than that GPT. And so you're going to see over the coming years that people start thinking about how to take the technology that we are now quite familiar with, continue to improve it, but really think about how to integrate it into their everyday work environment. But the challenges, as you say, we started to see this year: security, privacy, accuracy, hallucinations, which we're all familiar with. How do we tackle those? That's something that we as an industry, we as people in the technology field, have to think about, so that this technology can be pervasively deployed and used in every company and in every single one of our lives.
Thanks. Alan, I'm going to turn to you. First of all, tell us a little bit about Analog Devices, what your interest is in AI, and some of the opportunities and challenges that Rodrigo spoke to.
Sure. Analog Devices is a technology company that really bridges our physical and digital worlds. We're a global leader in analog, digital, and mixed-signal technologies, and we work at the forefront of the intelligent edge. Our solutions really transform signals into actions and actionable insights. And similarly, AI can transform data into insights in support of manufacturing, engineering, sales, and most importantly, our customers. And that's really the point. Our collaboration with SambaNova can really provide us with the means of understanding and connecting with our customers on a deeper level than ever before.
Okay, thank you. And Dave, you've been involved in this space a long time, before it was fashionable, and you have a great view across it as an investor. You're trying to look at the trends in the market, you're looking at all the startups that are coming through, and you're looking at some of the incumbents and the market dynamics. What are you seeing in the market, and what's changing right now?
Yeah, so at GV we've been involved in artificial intelligence, investing in it and using it in our own business, for more than a decade. I don't know if it's fashionable now, looking at this group, but it's definitely popular. And I think one of the things that has been amazing over the last year and a half is that ChatGPT has really educated the whole world about how exciting AI can be for almost every use case. And that has done two major things. On one hand, it has created a ton of innovation and opportunity for innovation. So we see this whole room full; we see dozens and dozens, hundreds of startups trying to find new opportunities to explore using AI. We also see almost every large company bolting AI onto their existing products. And as a result, those big large companies are not looking for a cloud-based toy that they can play with and interact with. They're looking for modern, enterprise-grade AI. And to me, that means dependability, reliability, stability. So at GV we're really excited about investing in all of the layers underneath the really powerful applications that will power our next generation of companies.
Oh, wow. Okay, so you clearly see there's going to be an ecosystem, a stack here, which the success here is going to depend on, right? It's going to be important to deliver that.
Totally. I mean, at GV, we pride ourselves on having more questions than answers. So I won't stand up here and tell you all of the different applications that will be successful in AI, but I do know that there are probably three or four major layers of that AI stack that will be developed over the coming years. We're excited to have exposure and investment in each one of those layers. SambaNova is an extremely important one, in that you present an entire system, right, from the chip all the way up to the applications that power businesses.
Okay, so Rodrigo, probably a lot of people are already aware of some major announcements from SambaNova this morning. So maybe you can talk to us a little bit about those: not just what they are, but why. I mean, building new chips takes a long time, so you've placed some bets. Why, and where is it going to take us?
Well, we chose this forum so that we could announce it only to all of you, nobody else. But no, we're super excited about this. I actually brought one of these. You know, the world's been talking a lot about chip shortages, how do I get more AI chips, and all the folks we were talking to about finding these things. So these are the chips that power the largest models in the world at SambaNova. We just announced the SN40L today. This is a purpose-built chip, so you can power the largest GPT LLMs in the world. And with these platforms, you can now power 5 trillion parameter models, which is ultimately the type of model that you want in order to deploy into the types of large companies where it's mission critical, where accuracy matters, where security matters, where data privacy matters. And so we're really excited about this. We deploy them in these integrated systems, full stack, so you don't have to hunt around for the chips. And Dave talked about reliability, predictability, availability: when we deploy these, they're always there, because they're dedicated to that particular use case. So we're really excited about this. It's been in the works for a few years, and today we're announcing it's broadly available for us to deploy starting today. So we're happy to engage with folks, and come check it out on our website. We're really, really proud of it.
Thanks, Rodrigo. So Alan, when Rodrigo talks about that, and we hear about trillion-parameter models and things like that, what does it mean to you when you're actually looking at the needs that you have in a business?
Yeah, it's really important to us, and it ties back to the idea of pervasive AI, or ubiquitous AI: AI everywhere. Our mantra at Analog Devices is to be ahead of what's possible. And if you really want to do that today as an enterprise, you really need to think of how AI can touch every aspect of the business. How can you use it to garner more customers, to understand your customers? How can you use it, for instance, in your front office and in your back office? And how can you use it to design your chips or your software or whatever else your company is producing? Thinking about that both from the high level and then getting down to the details is one of the things that really attracted us to SambaNova: that ability to really be on the leading edge of AI technology, and to have the largest and most robust models and underlying technology available today.
So Rodrigo, I want to get back to you, because Alan talked earlier about, effectively, these multiple functions, these multiple domains within a business. I often think about this with healthcare: we automatically think about hospitals, and yet the healthcare business has an enormous amount of other things going on in it. And I think about Alan's business, right? Part of it is actually building these devices, but part of it is also just running a day-to-day business. So how does what you talked about today, not just on the chip side of things but on the models, actually support that?
And we're really excited about this. This is where partnering with somebody like Analog Devices is so important, because these are complex enterprises. They've been around for a long time with a lot of mission-critical applications. The way we think about this, and we're really excited about this architecture, is what we call the composition of experts. You can build these trillion-parameter models where they're made up of open source models; today we use Llama 2, right? As you know, Llama 2 is a Meta-released model, and we're all in on open source. You can take these models and compose them side by side, each trained for a specific task, a specific function in the enterprise, and put them together as a trillion-parameter foundation model that then allows the machine to select which expert answers. It gives you several things. One, it gets you the highest accuracy. It gives you the ability to train those models dynamically and asynchronously: you can fine-tune some for days, some for weeks, and you can do that asynchronously. But most importantly, it gives you access control, so that different people can query different experts, because you have access to, say, the HR expert versus the legal expert versus the customer expert. And so for enterprises as complex as ADI, being able to handle all of those things, it's a tremendous kind of platform that gives you that level of control and flexibility.
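For readers, here is a minimal sketch of the kind of composition-of-experts routing with per-role access control that Rodrigo is describing. It is an illustration under assumptions only: the expert names, model identifiers, and keyword-based router are hypothetical placeholders, not SambaNova's actual implementation.

```python
# Minimal sketch (not SambaNova's implementation): a composition-of-experts
# router that picks a fine-tuned expert model per query and enforces
# role-based access control. Names and the routing rule are hypothetical.

from dataclasses import dataclass, field


@dataclass
class Expert:
    name: str                       # e.g. "hr", "legal", "customer-support"
    model_id: str                   # a fine-tuned open-source checkpoint
    allowed_roles: set = field(default_factory=set)


class CompositionOfExperts:
    def __init__(self, experts):
        self.experts = {e.name: e for e in experts}

    def route(self, query: str) -> str:
        # Placeholder router: a real system would use a learned classifier
        # or embedding similarity to pick the expert, not keyword matching.
        q = query.lower()
        if "vacation" in q or "benefits" in q:
            return "hr"
        if "contract" in q:
            return "legal"
        return "customer-support"

    def answer(self, query: str, user_role: str) -> str:
        expert = self.experts[self.route(query)]
        if user_role not in expert.allowed_roles:
            return f"Access denied: '{user_role}' cannot query the {expert.name} expert."
        # In a real deployment this would call the expert model's endpoint.
        return f"[{expert.model_id}] would answer: {query}"


coe = CompositionOfExperts([
    Expert("hr", "llama-2-hr-finetune", {"hr", "admin"}),
    Expert("legal", "llama-2-legal-finetune", {"legal", "admin"}),
    Expert("customer-support", "llama-2-support-finetune", {"support", "admin"}),
])

print(coe.answer("How many vacation days do I have?", user_role="hr"))
print(coe.answer("Review this contract clause.", user_role="support"))
```

The point of the sketch is the separation of concerns Rodrigo mentions: each expert can be fine-tuned on its own schedule, and access control is enforced at the expert boundary rather than inside one monolithic model.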
Okay, so Dave, coming to you. Again, you observe this whole ecosystem, and you laid out before the opportunities you see and the multiple pieces that are going to come together. What opportunities does this address, what gaps does it address, and what do you still see to come? What's the challenge to Rodrigo now for the next phase?
So I think one of the amazing things about SambaNova is their ability to continually keep up with the state of the art on the hardware side, but it's also the ability to continually bring in the best of the open source models. You can imagine models will continually come out and be open sourced over time; eventually each model will make its way into the market. This company has the ability to, like Nvidia, continually be at the cutting edge of compute, and unlike Nvidia, have a software stack from the chip all the way up to PyTorch and TensorFlow, the places where data scientists spend their time and energy. And that holistic, whole-system approach: it's almost like Apple providing you with a MacBook, but for your data center. And the thing that I get excited about is not just the chip. You know, Rodrigo's background is in hardware, and the company has a mix of hardware and software. The thing I get excited about is the trend that this company is on, which is state-of-the-art hardware, best-in-class open source models, and an amazing software team that bridges between those two best-in-class approaches.
So, Alan, when you think about getting started here, you've mentioned these different domains, and Rodrigo talked about privacy and the ability to have role-based access, which I think maps to those functions you've got. What's the tension that you see here between, I guess, the speed of getting started and all of those other considerations you've got to take into account as well? So as a business, where do you begin? Where do you stop?
Yeah, it's a very interesting question, and we're going through that right now at my company. It's something that we're doing, and one of the reasons we're partnering with SambaNova is the scalability. You have the ability to start relatively small, but then you can go up, of course, with the release today, to up to 5 trillion parameter models. And that's important to us, but there are a lot of vectors to consider here. We do have healthcare units, as you pointed out earlier, and that's important. Of course, security and privacy are really important for any business when it comes to data. And when we think of how to get started, flexibility is really the key. We want to make sure that with the SambaNova systems, we're starting with cloud and then bringing it in, so we can integrate it with all of those business functions and things that we listed before. Not en masse; you don't have to do everything all at once, but you can bring it in as needed for those groups in the enterprise that are farther ahead, and then bring in the others and scale as appropriate for the business.
Rodrigo, I saw in the press release this morning there's something about the fact that this provides the opportunity for multimodality in a way that previously wasn't possible, and I was hoping you could get into that. Because of course, we're talking here about LLMs, large language models, and then we're talking about multimodality. So how does that happen? And again, what does it actually do for real-world use cases?
Well, I mean, as we think about these models, and they've touched on this, these are open source models that all of us here are innovating on every day, and they're getting better at this exponential rate. And so you think about language models now moving into vision models, and moving into models that do a broad range of things. Now, you can do two things if you're thinking about actually taking advantage of those. You can try to make these big monolithic lifts, where you go from this type of model to that model as they upgrade. Or, and this is our thesis, why don't we create a platform that allows you to incrementally install these open source models, train them, and then compose them into these foundation models that give you the most multimodality? It's a much easier path for people who are already using AI in enterprises, who do a lot of work verifying those models and integrating those models. It's a much easier way to stage that journey and not have to move to a new model, reset, and start over again. So that's our thesis as a company on how enterprises want to integrate these multimodal models: not as a big monolithic lift every time a new model shows up, but how do I incrementally get better and better, and gain incrementally better capabilities, as the innovation of our open source community continues for the next 10, 15 years.
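To make the "incremental, not monolithic" idea concrete, here is a small standalone sketch of what registering new open source experts into an existing composition might look like. The model names, modalities, and dates are hypothetical placeholders for illustration, not an actual SambaNova API.

```python
# Standalone sketch of incremental composition: new open-source expert models
# (text, vision, code, ...) are registered into an existing composition one at
# a time, each with its own fine-tuning cadence, instead of swapping out the
# whole foundation model. All names and metadata below are hypothetical.

foundation = {
    "hr": {"model": "llama-2-hr-finetune", "modality": "text", "last_tuned": "2023-09-01"},
    "legal": {"model": "llama-2-legal-finetune", "modality": "text", "last_tuned": "2023-08-15"},
}


def add_expert(composition, name, model, modality, last_tuned):
    """Register a new expert without touching the experts already deployed."""
    composition[name] = {"model": model, "modality": modality, "last_tuned": last_tuned}


# Later, a vision model lands in open source: add it incrementally.
add_expert(foundation, "defect-inspection", "open-vision-checkpoint", "image", "2023-09-19")

for name, meta in foundation.items():
    print(f"{name}: {meta['model']} ({meta['modality']}), last fine-tuned {meta['last_tuned']}")
```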
So Dave, obviously SambaNova has made these announcements today, and we've got a room here full of businesses of all different shapes and sizes. So again, as you look at this overall market, and as we look into this era where AI is going to become more widespread, are there areas that you see as particularly ripe for future development by startups and other businesses? What's hot?
Yes. I mean, within our partnership, we think that anything that is done well with software today can be done even better with AI in the future. It's a matter of time before we close that gap. There are also a lot of zero-to-one capabilities: things that could never be touched before by software, because the data was too big or the insights were too complicated to get to. So you can imagine tracking weather patterns or understanding human genomics; you can now do that in one system. And there are a bunch of other zero-to-one applications that we're talking about with large banks and enormous customers that are very exciting. Those are exciting to unlock. But the reality is that almost every business is using AI to compete, and those that aren't using AI to compete might not be around in the next couple of decades.
Right, so that's the opportunity. So Alan, coming to you here: what's your challenge to Rodrigo? What are you looking for next, not just from Rodrigo, but from all of the startup founders and entrepreneurs that we've got in the audience today? What do you need in order to make this potential revolution a reality?
We really need to push the limits of heterogeneity, not just on the chip but in terms of multimodality. We need to be able to have state-of-the-art language models, but at the same time, without any downtime whatsoever, instantly add in the compute and the things we have to understand, and evolve as more and more use cases emerge. You talked about incremental addition, but also think of federated models, adding new data, and training on the fly. It's that ability of taking everything that we have right now and building upon it, and building upon it, and building upon it. That's really, in my opinion, the future of technology.
Yeah. Well, it's a great chance for us to collaborate on exactly that kind of ambition. These partnerships are just so important, because folks like Alan really help guide us on what to do next. Some of it we put in our software stack; some of it we embed all the way into the chips. And so it's a great way for us to learn from our partners and figure out what our clients need.
Well, I want to thank our three panelists here, and I want to thank all of you for coming. I feel like we've understood the potential that we're now seeing in pervasive AI. As I said before, there's a tremendous amount of potential and hype going on here, but we're now beginning to see this move into production reality. I think we heard from Alan about the diversity in models and capability, as well as the reliability and predictability that's required in a real business actually trying to do this. And Dave, I think you're seeing not just the potential that SambaNova shows, but the gaps in the ecosystem that are going to need to be filled. And thank you, Rodrigo, for joining us and giving us an insight into the announcements today. And thank you all for joining us. Thanks very much. Thank you.
My goal is to become the place where parents and caregivers come to find childcare that works for them.
My goal is to build the largest university in the world. My dream is to be the telemedicine provider for kids and families to stop generational trauma.
I want to change culture. I want to enable these gaming communities to become self-sufficient. That's the goal.
All right. Hello, everybody. We have a really killer session for you up next, so stay with me. But also, a quick reminder: we do Q&A on the stage for all of our sessions. There are mics in the middle; we're not doing mic runners, I was wrong about that. So if you want to ask a question in the next session, the mics are, I think, there and there. And of course, don't forget we're using Otter for transcriptions, and we have more space over in the TC+ lounge. Now, this next panel is one that I'm actually most excited about for the entire day. Venture capital is absolutely not new. Solo GPs are absolutely not new. But solo GPs building new firms in public is somewhat new, and from a very staid industry, seeing memes and jokes is a welcome change. So please welcome to the stage: Noramay from Supply Change Capital, Mac from RareBreed Ventures, Turner from Banana Capital, and your moderator, Anna Heim.