Keynote: Leap into the Future with Advancements in AI Technology - Ren Wu | SVIEF
5:23PM Sep 30, 2018
And now let's kick off the summit. Please join me in welcoming to the stage our first speaker of the day, Dr. Ren Wu, the founder and CEO of NovuMind. Please come up to the stage, Dr. Wu.
Thank you very much. Thank you.
Good morning. I notice a few of my friends are late to this session, even though the session's title is "Leap Forward to the Future." I recognize all my friends who are late, and I think they can afford it: they know that with Moore's law slowing down, we can no longer count on it to accelerate our path to the future, but with our technologies they will still be able to get to the future earlier.
Anyway, the goal of this session, and of my talk, is to give you an introduction to NovuMind's view on how to get to the future: what our approach is, and how we see the path from past to present to future. I would like to share some of our beliefs and our technologies, and hopefully convince you that the future is very near, so that together we can reach that destination faster.
Many of you will remember, even back 21 years ago, in 1997, IBM's Deep Blue machine beat the world chess champion Garry Kasparov. And I am sure everybody in this room remembers what happened last year: Google's AlphaGo beat the world's number one Go player, Ke Jie. It was no match at all; the machine completely destroyed the human players. Those two events, twenty years apart, in fact tell us something very important, because human intelligence, or even animal intelligence, has two sides. One side is the ability to do calculations.
This is actually not the right version of the slide, but it is the version we have, okay. Anyway, on the left-hand side is calculation, where our own capacity is limited. And on the right-hand side is the other capability, the one we had to wait twenty more years to see machines realize: instinct, intuition. Those two combined tell us that we are in an era where many of the things we thought machines would never be able to do are becoming very realistic. Calculation plus intuition can lead us a long way into the future. Autonomous driving has become such a hot topic precisely because calculation and intuition have come together.

From calculation to intuition, there is one common element that made it all come to fruition: computation. Computation is the key that enables us to do many things that seem unrelated to computation; it is the internal driver that has enabled us to do so many things more efficiently than ever before.
In other words, in summary: up to today, or even just last year, we have been pushing big data, pushing deep learning, pushing high-performance computing, because we want to make our AI smarter, able to do more things in a more accurate, better way. But that, I think, is the past. Once you have the AI technologies, once the AI has become very smart, what can you do next?

Next, we want to deploy AI technologies everywhere, in any place, available at any time. That, I always think, requires a completely different approach to bring AI technology into our lives and in turn improve our lives. It will require not just cloud computing but also all kinds of intelligent devices, also called edge computing. Cloud plus edge will bring AI into our lives and in turn improve our lives.
And that, we believe, is where the future goes. But of course there is a tremendous challenge, because once you have a very good AI and you want to make it available any place, any time, you will need tremendous computational power, and at the same time a very power-efficient approach that is still very cost-effective. Those three elements pose a serious challenge to making AI available everywhere: computational power, power efficiency, and cost effectiveness. Those are really the factors we have to address.
Fortunately, I think people in the industry like us, and also the very top people, like the Turing Award recipients, our legendary figures, have figured this out, and they have been open in telling others: once you want to push performance, power efficiency, and cost to the extreme, you need a new way. And although it is hard, the only way to move forward, according to John Hennessy, is to use heterogeneous, domain-specific architectures. Let's listen to what he himself said.

[Video clip of John Hennessy plays, discussing the end of Moore's law and domain-specific languages and architectures]
Yeah, so the keyword is domain-specific architecture, because a general-purpose architecture runs these workloads terribly. Heterogeneous computing plus domain-specific architecture is the only way to move forward; that was the highlight of his talk.
And of course, to do that, it really comes back to efficiency and, not surprisingly, cost. Intelligence per watt is really the key metric: how much intelligence can be delivered for the energy spent. That again goes back to evolution, because our species on this earth followed precisely that direction. Getting enough intelligence with enough power efficiency is really the key.
That challenge was also mentioned by another legendary figure, John Connor: we all face great opportunities, but most people only see the challenges. Behind great opportunities always hide great challenges. We as a company want to work with you to take on the challenge and seize the opportunity. That is precisely what we have been working on in our first three years. Our first chip, the NovuTensor chip, has just come back from the foundry, and here is one of the early engineering samples.
We offer an order of magnitude better power efficiency than the alternatives. There is a reason for that: we designed the chip from the ground up as a true domain-specific architecture for AI computation. That is why we can compute in a much more power-efficient way than any competition. In fact, our major patent was awarded by the US Patent Office just this month, and it distinguishes us from all the competition: we are the only company, as the patent holder, able to do native tensor processing, operating on tensors in their native format, which is key for AI operations.
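To make the idea of "native tensor processing" concrete, here is a small sketch of the distinction involved, under my own assumptions rather than any detail of NovuMind's design: general-purpose hardware typically lowers a convolution to one large matrix multiply by flattening the input patches (the im2col trick), while a tensor-native view slides the kernel over the tensor directly. The two computations are mathematically equivalent:

```python
import numpy as np

# Illustrative only: contrasts the "lowered" (im2col + matmul) view of
# convolution with the direct, tensor-native view. This is a generic
# sketch, not NovuMind's actual architecture or API.

rng = np.random.default_rng(0)
x = rng.standard_normal((6, 6))   # single-channel input "tensor"
k = rng.standard_normal((3, 3))   # 3x3 convolution kernel

# Direct (tensor-native view): slide the kernel over the input.
out_direct = np.empty((4, 4))
for i in range(4):
    for j in range(4):
        out_direct[i, j] = np.sum(x[i:i+3, j:j+3] * k)

# Lowered (im2col view): unfold every patch into a row, then one matmul.
patches = np.array([x[i:i+3, j:j+3].ravel()
                    for i in range(4) for j in range(4)])
out_lowered = (patches @ k.ravel()).reshape(4, 4)

print(np.allclose(out_direct, out_lowered))  # True
```

The results match; the difference is in data movement, since the lowered form duplicates every input element into many patch rows, which is part of why a tensor-native design can be more efficient.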
And here is an early example that we showed at CES this year. Even with an FPGA prototype, an FPGA implementation of our chip, which is a completely new architecture, we use 5% of the power and achieve half of the computational throughput compared to state-of-the-art Nvidia GPUs. That is an order of magnitude better power efficiency, from an FPGA prototype running against a state-of-the-art 16-nanometer process.
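As a back-of-the-envelope check on the "order of magnitude" claim, using only the two ratios quoted in the talk (these are the speaker's figures, not independent measurements): half the throughput at one twentieth of the power works out to roughly ten times the performance per watt.

```python
# Performance-per-watt comparison from the ratios quoted in the talk
# (illustrative arithmetic, not measured benchmark data).

def perf_per_watt_ratio(throughput_ratio: float, power_ratio: float) -> float:
    """Relative performance-per-watt versus the GPU baseline."""
    return throughput_ratio / power_ratio

# FPGA prototype: ~50% of the throughput at ~5% of the power.
advantage = perf_per_watt_ratio(throughput_ratio=0.5, power_ratio=0.05)
print(round(advantage, 3))  # 10.0 -> roughly an order of magnitude
```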
Back to our company. We believe the way to the future is not just for AI companies to develop AI technologies; we want the technologies to be available to any company out there, because there is no way for a traditional company to hire the experts, to get a supercomputer for the training process, or a highly powerful engine to run the applications. So what we believe in is a vertically optimized solution; in other words, a full-stack AI solution provider. We provide everything a traditional company needs, so that they can very easily plug AI solutions into their existing business and use AI to accelerate their business in a very efficient way.
Here is the latest example. As I said, for AI as a technology, if you want to make the technology useful, you have to deploy it; you have to bring the technology into our lives and improve our lives. Here is one example: we have a joint venture company in China that uses AI to change medical practice, in this case endoscopy diagnostics. I will show you the video and then come back to explain why this is an example of how AI technology, running in real time, will change the practice, which in turn will improve people's lives. I'm sorry, this is in Chinese, but there are English subtitles on screen.
[Video plays in Chinese with English subtitles]
Yeah, I think the key is that for medical diagnostics, the doctor, the expert, is really the bottleneck. Even in the US, and in China as well, digestive disease is a very common problem, and we just don't have enough doctors to give accurate diagnoses; the miss rate has always been a big challenge. With AI, with an AI chip able to do the analysis in real time and give feedback to the doctors as they work, we significantly reduce the entry barrier for the doctor operating the equipment. Working with the AI-guided diagnostic process, doctors can almost eliminate the missed-diagnosis problem that has been such a challenge all along.

Again, the reason we are able to do that is that we designed AI models trained on hundreds of thousands of images to a very accurate state. But more importantly, we can run the model in real time, in fact faster than real time, during the diagnostic process, so that the AI model gives the doctor real-time feedback on the problematic areas. That, I believe, will completely change medical diagnostic practice and in turn reduce the misdiagnosis rate, which in turn gives early detection, which in turn gives early treatment, improving people's lives. That is one very good example, but it is just one example. We have been working with many other companies, and we have a pipeline of applications like this that will be advancing in the following months and years.
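The real-time feedback loop described above can be sketched in a few lines. Everything here is a hypothetical illustration under my own assumptions: the model, the `Region` type, and the per-frame budget are invented placeholders, not NovuMind's actual system or API. The point is the shape of the loop: inference must fit inside each video frame's time budget so the doctor sees suspect regions as the examination happens.

```python
# Hypothetical sketch of an AI-guided, real-time diagnostic loop:
# run a detection model on each endoscopy video frame and record
# whether inference stayed within the frame budget. StubModel and
# Region are illustrative placeholders, not a real medical model.
import time
from dataclasses import dataclass

FRAME_BUDGET_S = 1 / 30  # "faster than real time": fit a 30 fps frame budget

@dataclass
class Region:
    x: int
    y: int
    w: int
    h: int
    score: float  # model confidence for this suspect area

class StubModel:
    """Stand-in for the real detection model (illustrative only)."""
    def detect(self, frame):
        # A real model would return suspected lesion regions for this frame.
        return [Region(10, 20, 64, 64, 0.93)]

def diagnostic_loop(model, frames):
    """Per frame: detect regions, note if we met the real-time budget."""
    results = []
    for frame in frames:
        start = time.monotonic()
        regions = model.detect(frame)
        elapsed = time.monotonic() - start
        results.append((frame, regions, elapsed <= FRAME_BUDGET_S))
    return results

results = diagnostic_loop(StubModel(), ["frame0", "frame1"])
print(len(results), results[0][1][0].score)
```

In a real deployment the third element of each tuple would instead drive an on-screen overlay highlighting the regions for the doctor before the next frame arrives.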
So, in summary: we have been pushing on the computational side to get better AI. But now it is really time to take the AI technologies we have developed, plus heterogeneous computing, so that we can deploy the AI technology in all kinds of forms. That is where I think the future goes, and of course where our company goes.
Now, back to what one of my HP Labs colleagues once said: the best way to predict the future is to invent it. I think we have worked very hard in that direction. I would like to work with everybody in this room, and even outside this room, to really bring AI, deployed at very large scale, into our lives and improve our lives. And of course, as I mentioned, this is the one slide I have here, I'm sorry, but with our technology, especially our AI processor and our AI software stack, we will reduce the entry barrier dramatically. We will offer very good computational power with very power-efficient and very cost-effective solutions, so that AI applications can be deployed into our lives much earlier. Thank you very much.