The AR Show: Karl Guttag (KGOnTech) on the Many Hurdles to µLED and Broad AR Adoption
12:19AM May 3, 2023
Speakers:
Jason McDowall
Karl Guttag
Keywords: waveguide, AR, pixel, problem, micro, put, optics, display, technology, talking, light, LEDs, people, work, DLP, color, VR, eye, emitter, company
Welcome to The AR Show, where I dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is the second part of my interview with Karl Guttag, who's an industry analyst, speaker, and the author of the KGOnTech technology blog at kguttag.com. Karl has 40 years of experience in graphics and image processors, digital signal processing, memory architectures, and microdisplays. He's got 150 patents to his name related to these technologies, and many billions of dollars of revenue attributed to those inventions. Karl spent nearly 20 years at Texas Instruments, and he's been CTO at three microdisplay technology startups, in two of which he was also a co-founder. Most recently, he was also the chief science officer at RAVN, a company developing a hardware and software platform to deliver mission-critical intelligence to military and first responders. In the second part of my interview with Karl, we talk about the current state of micro LED technology, challenges with AR displays and optics, diffractive versus reflective waveguides, the unique opportunity for Lumus, historical comparables of DLP and LCOS to today's pursuit of micro LED in AR glasses, the importance of hands-free and the implications for device input, if or when AR glasses will replace a smartphone, and which technologies will win in the mid and long term. As a reminder, you can find the show notes for this and other episodes at our website, theARshow.com. And please support the podcast at patreon.com/theARshow. Let's get back to it. Let's shift gears a little bit and flip to the other end of this AR-VR spectrum. We've talked a bit about video pass-through on the VR side, and a bit about the opportunity that HoloLens and Magic Leap had with really fully functional see-through AR devices.
But at the other end of that scale are AR glasses — truly wearable glasses that incorporate the sort of overlay on the real world that AR promises. One of the big barriers to smaller, lightweight AR glasses is small and efficient displays. And the technology we've talked about in the past, which has been in this industry's hopes column for a number of years and is now maybe beginning to move to the commercialized column, is micro LED — inorganic micro LED technology. Maybe we can start with you sharing your perspective on the hopes and the challenges of micro LED.
Well, I think there's a lot on the hope side. It has a lot of theoretical advantages, as a technologist. There's a lot of development of LEDs — LED technology is there, and you can kind of see, by there being so many entrants, that a lot of technology has been developed; the fabs exist to make LEDs and all that. The issue is — I've been following micro LEDs since around 2015 or 2016, somewhere in there, when I first started hearing about inorganic micro LEDs — that, as I sometimes say, there's a slope to these things. People have this so-called Moore's Law, where they think everything doubles every so often. Well, a better tool is called the learning curve, and I think it factors in a bit more. The learning curve says that for every doubling of cumulative volume, costs decrease by a certain rate. So you have to keep doubling. One of the reasons technology tends to slow down is that it runs out of doublings: if you do your first 10 units, then your next 10 are going to cost this much less, and your next 100 are going to cost that much less, and so forth. The problem is that you can't double fast enough to solve all these problems. We've now been at it for about seven or eight years that I know of, and probably going on before that, so you've kind of got a feel for the slope. We're seeing some interesting stuff happen, but fundamentally, micro LEDs as a display device are not very good yet. If you look at the pixels, the pixel-to-pixel non-uniformity is huge. So in some ways it's telling that we're seeing green-only ones — the only ones going to market right now are green-only. And I've also developed a category — in one of my recent blog articles I talked about full color versus true color.
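Karl's learning-curve point can be put into rough numbers. Here is a minimal sketch in Python, assuming an illustrative 20% cost reduction per doubling (Wright's law); the specific rate is an assumption for illustration, not a figure from the episode:

```python
import math

def unit_cost(cumulative_volume, first_unit_cost=100.0, learning_rate=0.8):
    """Wright's-law learning curve: each doubling of cumulative volume
    multiplies unit cost by learning_rate (0.8 => 20% cheaper per doubling)."""
    doublings = math.log2(cumulative_volume)  # doublings since the first unit
    return first_unit_cost * learning_rate ** doublings

# Costs fall quickly over the first few doublings, but each further
# doubling needs far more units -- which is the "runs out of doublings" point:
for volume in (1, 2, 4, 1_000, 1_000_000):
    print(f"{volume:>9} units -> {unit_cost(volume):6.2f} per unit")
```

Going from 1,000 to 1,000,000 cumulative units buys only about ten more doublings, which is why a young technology can't ride the curve down arbitrarily fast.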
And I think we're going to have to start making that distinction more and more. What I mean by that is: we have monochrome, or mono-color, which could be green — it doesn't have to be black and white. People say monochrome and think black and white, but it could be green-only; green and black, so to speak. Then we have full color, where you have red, green, and blue, so in theory you can produce any color in between. The distinction I make between full color and true color is that with true color you actually hit a color point: if I put up a color, it's the color I meant it to be, not a color in the ballpark of what I meant it to be. For example, if you can't control your red, green, and blue very well: if I take red and green and put them together, I should get yellow. But if I have too much red, it's going to be a very reddish yellow. If I have too much green, it's going to be a very greenish yellow — kind of a pea green, kind of off-green. Or if I do black and white and put up a white image, it's going to be blue or pink or red or something in between; it's not going to look white to your eye. The problem we have with micro LEDs is that the pixel-to-pixel non-uniformity is huge, and the non-uniformity across the display is large-scale. So if I want to put up something that looks red, I can do that. If I want to put up something green, I can do that. But if I want to put up something that looks like a human face — if I want to pop up and show a picture of somebody — it may look pretty bad. And we see this a lot with waveguides.
We know that waveguides — particularly the diffractive waveguides — have a really hard problem with uniformity across the image. Generally the color varies; you see a change in color as you go across. If you're putting up just red, green, or blue, okay, the green will be a little darker on one side of the screen than on the other, but it's still green on both sides. But if I put up white, all I'm going to see is that color shift: it's going to look, say, red on one side and green on the other. It's not going to look uniformly white across. So when you try to get to true color, where you're really trying to hit the colors you want, it gets to be a big problem, and they're a long way from that. The other thing I see in all micro LEDs today is a huge — what I call grain. Basically, pixel-to-pixel non-uniformity is very hard. When you buy LEDs — I was building projectors for a while, doing LCOS that was used in projectors — when you buy a bag of LEDs, they come in different quality levels, different brightnesses for the same power. The reason why is that they literally take a wafer full of LEDs, cut them all up, test each one, and bin them accordingly. So they're binned — put in different bins based on their quality. Well, when you do a micro LED and you do a flip-chip, where you take all the LEDs and flip them over — basically take a whole chip of LEDs and put them on a CMOS device to drive it —
you get all the bins at one time. So now, due to either a bad contact or a defect in the LED, you might have a bad LED next to a bright LED next to another LED. And they do tend to cluster, too — a lot of times, if you have a bad LED, there's a higher probability that the LED next to it is bad. So sometimes you might have two or three weak LEDs together. So there are some serious issues there with pixel-to-pixel uniformity and uniformity across the display. With micro LEDs, therefore, your ability to make full-color ones — or, with a lot of colors, maybe you'd call it multicolor — if you want to do multicolor, you can do that, and that's kind of going to get there. But you're still a long way from competing with an OLED or an LCOS display or a DLP when it comes to image quality. We're years and years and years away. So it depends on what you're doing. Now we get back to applications. If you're doing AR, augmented reality, maybe you don't need perfect photographic quality for a lot of your apps. As I often say, HoloLens 2 proved that image quality wasn't important in the apps they served, because their image quality was so poor. So what you might do is find ways to work with that when you're doing an informational display. Maybe you need just red, green, and blue or a few colors — the classic thing is green is good, red is bad, yellow means caution — plus a few other colors to fill in between, and some white. Of course, in AR we don't have black; in AR we get clear. Black is clear. So dark colors don't work very well. You can't do a dark blue; a really solid blue is almost invisible against the real world because there aren't a lot of — we call them nits — there are hardly any nits in blue, so you can't see it very well. Blue tends to colorize things: if you mix blue and green together, you get cyan, which a lot of people interpret as light blue.
So a lot of times, what you see as blue in an AR headset is really cyan — it's got some green mixed in just so you can see it. I oftentimes say the blue colorizes the green. Anyway, micro LEDs — I could see them maybe in watches, things like that; it's going to be hard to see how they get to cell phones and high volume anytime soon. The other thing I'll point out to people: look at, say, PlayNitride. PlayNitride is trying to do large displays, like phones and watches, direct view, and they're also trying to do microdisplays for AR and such. They're using quantum dot technology to convert blue. I think that technology might be a good way to go in the large display market — large being direct view, which could include watches — but I don't see it being a very good way to go in the AR space. I just don't. I think there are some basic problems with trying to make that work, because quantum dots really need a bigger pixel; there are issues with getting the quantum dot conversion onto a really small pixel. I didn't know all that five or six years ago — that's all emerged from having studied the market, getting feedback from people, and seeing how things work. But anyway, I think micro LEDs are interesting. From a microdisplay point of view, I'm really interested in the technologies where — well, I kind of got excited about Porotech, because Porotech had a single emitter for all colors. And I got reminded that Ostendo had also developed that technology years earlier. And there are actually a couple of other companies — all you have to do is go out there and say something. I'm fond of saying that nobody will volunteer information, but everybody will correct you. So once I went out with the Porotech story,
I found out about several other things that I've since been reminded of and/or notified of — other people who had developed a single emitter with variable color. I think that's really important in a microdisplay, because we can't afford spatial color; I don't think spatial color works very well there. That's been LCOS's big advantage: it's not spatial color. Most of the LCOS and DLP that's used in AR is field-sequential color, which lets them get a very small pixel, because all the colors are made from a single element. These things are so small that you start talking about pixels that are two to three microns now, some of them. Now you're talking things that are only three or four wavelengths of light. There's no way you're going to make a spatial-color pixel that's, say, three microns — you're going to have other physics problems just trying to get the light to emit when you start talking about a one-micron-or-so emitter. So I'm kind of a big believer that if there is a market for micro LEDs in the AR field, these displays are likely going to have to be single-emitter. I just think that's one of those holy grails. The problem there is it's hard to control. As I sometimes say, I should play this quiz show: give me a technology, and I'll tell you what's wrong with it. All these things have problems — that's why they're not in the market now, that's why you're not seeing a high-volume market these days — because there are these other things they have to solve. What you try to assess as a futurist, when you try to say which of these is most likely to succeed, is which one has a realistic chance of solving all its problems. Because if anybody has a physics problem — even one physics problem — they're already eliminated; they can't even participate in the game. But some of these things don't have one — like the single-emitter thing.
Yeah — at some point, you're going to figure out how to control it. There's a complexity barrier, a difficulty barrier, but there's no physical barrier to it, nothing stopping you from doing it, which is kind of why I get excited about the single-emitter stuff. However, on a realistic level, they're still many, many years away from a product. I think they're several years away from even having what I would consider a multicolor-slash-full-color device, and even farther away from doing what I'll call true color — by true color I mean you can show a photograph on it, look at a picture of a human face. If you've ever looked at my test patterns, you'll see I tend to put up a lot of simple patterns that do specific things. In all my patterns, I put lots of white spread out over the entire area, so I can look for color shifts across the display. And the other thing is I put a human face on there, because people are really sensitive to the human face. They can tell when a human face is yellow. Some people think it's primordial: basically, if you see somebody whose color has changed a little bit, you know they're sick, and you know to stay away or to get them help. A lot of people think it's primordial — that's why people are very sensitive to the color of human flesh. That's one of those little stupid things you learn from being in the industry for 40-plus years.
A couple of comments on this — since you brought up Ostendo, which, of course, is very near and dear to my heart. On the one hand, I'm not necessarily dismayed, looking from a historical perspective, that over the last 5, 7, 10 years we haven't yet seen micro LED emerge as a commercialized technology. It's not that I don't want it to happen sooner; it's just that, historically, light is hard. TI was the most successful of any company at developing a brand-new display technology, with the DLP — the array of micromirrors they used in the DLP technology. That concept took 20 years from conception to commercialization, and that was the most efficient anybody has ever been at introducing a brand-new display technology. OLED took over 30 years; LCD was over 40. It turns out that creating new ways of generating and controlling light is hard. Maybe on the flip side, the implication is that if you've only been at it for 10 or 15 years, does that mean we have another 5, 10, 15 years to go before micro LED is going to be ready? I don't know. But light is hard. I don't think anything you've stated suggests to me that micro LED cannot get there. Yeah, I
think that's fair. I think everyone believes that's where the big bets are: if you look at where the big money in display devices is going, it's going into micro LEDs. As a strict technologist, micro LEDs hold all the long-term cards. But you could have said the same thing about OLEDs, or some other technologies going back to LCDs. And still, 90-something percent — 98% or something — of all TVs are still LCDs, and most display devices are LCD, even though OLEDs have invaded. By the way, OLEDs have some disadvantages, some problems too. The OLED TVs are generally considered the best because they have very good blacks — though they actually have problems at the very dark levels, in some areas, and they're not as good as LCDs at all things. But they are good; they've got really good black. Same with micro LEDs — I think everyone agrees, and that's why you see the big money from the big companies going into the micro LED companies. I think there's broad agreement that somewhere down the road, everything will be micro LED. It has all the high cards: more efficient than OLEDs, more durable than OLEDs — more efficient than anything. Now, there are some issues, and one of the things I get onto is the degree to which it's Lambertian. Most micro LEDs today are Lambertian in the light that comes out. Now, they put micro lenses on them to try to collimate the light a little bit.
Yeah, right — in a wall TV, you want it to be Lambertian; you want the light to spread out so you have a decent viewing angle. What people

want for direct view is very spread-out, very diffuse light, so no matter what angle you look at the TV or monitor from, it always looks the same. You want Lambertian. But you're illuminating the whole room, and you realize how small a percentage of that light is making it to your eye. Just imagine a cone coming from a pixel on the display into your eyeball, and think about how narrow an angle that is. Then think about how much of the rest of the room you're lighting up just to get that one little bit of light to your eye. I guess you could argue for a much more efficient monitor that's looking at you — you can imagine steering the light to your eye; there's no need to illuminate the entire room, it should only illuminate your eyeball. There's an interesting company called Animorph that's working on that principle for near-eye displays. The point is, in near-eye, we only care about the light that goes in your pupil; only the light that goes in your eye counts. So in a near-eye display, we don't want Lambertian light. Plus, if you look at almost all the mainstream waveguide technologies, they all require collimated light, so all that Lambertian light is thrown away; only the very small part that's fairly well collimated will even be accepted into the waveguide — they block the rest off. So even with a micro LED, if you get down to it right now, today: from everything I understand, if I wanted to build a white display — say a white screen, like I'm looking at a web page — LCOS today, due to the physics of étendue and how you can start with a really tiny LED illuminating an LCOS device, is about 10 times more efficient than the best micro LED. However, with LCOS it takes the same amount of power to turn one pixel white as it does to turn the whole display white.
Whereas with a micro LED, the power is roughly proportional to the brightness of the average pixel. Well, on a typical AR display, you definitely want to be see-through, so most of the pixels are black. If your average pixel value is 5 or 10 percent — at 10%, it's kind of breakeven with LCOS. If you're at 5% — which would be text and arrows, the typical stuff you see in a lot of AR demos, where maybe 5% or less of the pixels are on, or the average pixel value is less than 5% — all of a sudden you've got a win, because now you're not turning on much. The problem is, heaven forbid you go and browse a web page, and then you have the heat dissipation to deal with, because now all of a sudden you're burning 10 times the power that you would have with LCOS. It's one of these kind of weird trade-offs. And by the way, those numbers are very, very approximate — I've just kind of worked the numbers with one guy, to say, okay, if I did this and I did that, and I substituted this for that.
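Karl's breakeven arithmetic can be sketched directly. This little model assumes his very approximate numbers: LCOS illumination power is roughly constant regardless of content, micro LED power scales with the average pixel value, and the best micro LED is roughly 10x less efficient light-to-light:

```python
def relative_power(avg_pixel_value, microled_efficiency_penalty=10.0):
    """Rough relative power for the same perceived brightness.

    LCOS lights the whole panel whether one pixel or every pixel is on,
    so its power is normalized to a constant 1.0.  Micro LED power scales
    with average pixel value, but (per the episode's rough figure) each
    emitted nit costs ~10x more than with LCOS.
    """
    lcos = 1.0
    microled = avg_pixel_value * microled_efficiency_penalty
    return lcos, microled

# Sparse AR overlay (text and arrows, ~5% average): micro LED wins.
# ~10% average pixel value: roughly breakeven.
# Full white web page (100% average): micro LED burns ~10x the power.
for avg in (0.05, 0.10, 1.00):
    lcos, microled = relative_power(avg)
    print(f"avg pixel {avg:4.0%}: LCOS {lcos:.1f}, micro LED {microled:.1f}")
```

The crossover point moves with the assumed efficiency penalty, which is why Karl stresses these numbers are very approximate.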
Does that include the energy cost for the control and pixel
drive? No, it was really just light-to-light — basically, how much power you put into one LED versus how much power you put into the LEDs in the micro LED array. The reason it's so much more efficient is that a single LED is basically small enough to fit into the entrance pupil of the waveguide. It turns out you can trade that étendue: you can take the light with a lens, spread it out over the LCOS device, and then squeeze it back down. The étendue is primarily set by the size of the initial light source. The problem you have with a micro LED display is that the size of the microdisplay is the size of the light source. Now, you can put little collimating lenses — micro lenses — on top of each pixel, each little emitter, but you don't have much room for that. You can collimate a little bit, but the only thing you can improve by is roughly the ratio of the pixel pitch to the size of the emitter, by putting a lens on there to try to collect that light a bit. And you have to get those lenses right on top of it, or you've already lost the game. So it's a question of how close you can get the lenses and optics, and what that ratio of sizes is — the size of the emitter to the pitch of the array. Anyway, that's just kind of the physics of it without getting into too much of it. The problem is that micro LEDs tend to emit Lambertian light, and what we're really looking for, at least for a waveguide-type solution, is an LED that emits collimated light, basically like a laser. We've had SLEDs, and SLEDs made a big hurrah a while ago; the problem was that SLEDs weren't very efficient. So even though the light was collimated, you weren't getting a lot of brightness.
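The étendue argument here can be made concrete. Étendue (emitting area times projected solid angle) is conserved through lossless optics, which is why a tiny single LED can be reshaped to fit a waveguide's entrance while a Lambertian micro LED array mostly cannot. The areas and angles below are illustrative assumptions, not measured values:

```python
import math

def etendue(area_mm2, half_angle_deg):
    """Étendue of a circular emission cone: A * pi * sin^2(theta), in mm^2*sr.
    Ideal optics can trade area against angle but never reduce this product."""
    theta = math.radians(half_angle_deg)
    return area_mm2 * math.pi * math.sin(theta) ** 2

# Tiny single illumination LED (0.25 mm^2, Lambertian): small étendue, so a
# lens can spread it over an LCOS panel and then squeeze it back down.
single_led = etendue(area_mm2=0.25, half_angle_deg=60.0)

# Micro LED microdisplay: the whole array is the source, still Lambertian.
microled_array = etendue(area_mm2=30.0, half_angle_deg=60.0)

# Waveguide entrance: a small pupil accepting only the system's field of view
# (each pixel collimated, pixels spread over ~+/-20 degrees -- an assumption).
waveguide_entrance = etendue(area_mm2=4.0, half_angle_deg=20.0)

# The waveguide can only accept étendue up to its own; the rest is lost.
print(f"single LED {single_led:.2f} fits; array {microled_array:.1f} does not "
      f"(entrance {waveguide_entrance:.2f})")
print(f"fraction of the array's light even acceptable: "
      f"{waveguide_entrance / microled_array:.1%}")
```

With these made-up but plausible numbers, only a couple of percent of the Lambertian array's light is even geometrically acceptable, before any waveguide losses.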
But what you'd really like is an LED that emits collimated light — that's the holy grail: an LED that emits collimated light, and where you can tune the color, change the color of it, at the same time. Then you can do a single emitter for all colors, and the light comes out collimated. Nobody has figured that out yet. If they do, then they'll eventually win. But the bottom line is — yeah, I agree. I think everyone tends to agree, physics-wise, gut-feel-wise, that eventually micro LED does win. But the question is: is that five years, 10 years, 20 years, 30 years? It took OLEDs a long time — OLEDs have been considered one of the best TV technologies for a long time, and they still don't dominate the TV market. Yep.
So I think there are manufacturing challenges, ultimately, on that side, on the micro LED. I want to come back to a couple of points you mentioned. One was this challenge of getting all the colors to come out of the same spot. One of the benefits of having them all come out of the same spot is that you avoid what's typically called spatial color. The displays we're staring at right now, on our computer monitors, are spatial color: they separate the red, the green, and the blue horizontally in some sort of pattern. Because our eyes are far enough away, we integrate each one of those areas into what our eyes tend to see as a single source of light. The challenge, of course, is that the full size of the pixel is the collective size of all of the sub-pixels — the red, green, and blue sub-pixels. And as you noted, in micro LED land and near-eye land, the goal is to get each one of those full pixels to be less than five microns. That means each of the sub-pixels needs to be a third of that, or less, which really is a challenge when you're starting to bump up against the wavelength of the light itself.
Yeah. We're talking green light, which is nominally about 530 nanometers, or roughly half a micron. Red is like 620 to 640 nanometers. So on the order of half a micron. And you start getting really deleterious effects as you get within five or ten wavelengths of light — even with a one- or two-micron emitter, you're getting diffractive effects. As you get that small, all these secondary problems start to become primary problems, like diffraction. When you're talking about a television set, you're not talking about diffraction — who cares? People don't understand: a microdisplay pixel can be like 10,000 times smaller — you can fit 10,000 microdisplay pixels in one TV-set
pixel. Okay — 10,000 pixels for every one pixel. What's that, four
orders of magnitude in area difference — it's just huge. Even a cell phone pixel — a cell phone pixel is typically 20, 30, 40 microns, depending on how much they're gilding the lily in terms of image quality and so forth, but nominally say 30 microns. Our LCOS pixels, meanwhile, are now getting down to about three microns. That's 10x in linear dimension and 100x in area, so I'm fitting 100 LCOS or microdisplay pixels in the area of one cell phone pixel. That's still a big difference, and the rules of the game are quite different — you're right up against the boundaries of light. And as you say, in the direct-view case, if I can have that much pitch and make the emitters small enough, I can collimate. But at LCOS pixel sizes, the problem is we've got nowhere to go. And this is the thing that faces AR all over the place — it comes back to where you can put the display. In AR, you've got to put it around the head someplace. There's this kind of fundamental difference between AR and VR. The VR guys basically get their field of view almost for free: they can take a really large display, put it in front of your face, and put a dollar or two of optics in front of it. That's basically Oculus: you take a cell-phone-size display, put a couple dollars of optics in front of it, and now you're into VR — from a display point of view, anyway; there are a lot of other things to do, but the display side of it is pretty easy. The problem with augmented reality is that I can't put the display in front of your eye, and I want it small and light. So I have to make it really, really tiny, and then I start placing it away from the eye, right?
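The pixel-size comparisons in this exchange reduce to simple area ratios. Using round, illustrative pitches consistent with the numbers Karl quotes (3 µm microdisplay pixel, ~30 µm phone pixel, ~300 µm TV pixel):

```python
def pixels_per(big_pitch_um, small_pitch_um):
    """How many small pixels fit in the area of one big pixel
    (a 10x linear ratio is a 100x area ratio)."""
    return (big_pitch_um / small_pitch_um) ** 2

print(pixels_per(300, 3))  # TV pixel vs. microdisplay pixel: 10000.0
print(pixels_per(30, 3))   # phone pixel vs. microdisplay pixel: 100.0
```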
Well, it turns out with optics — and this is a point I always try to impress on people who don't understand this stuff — that most of the magnification that goes on, most of what gives you that big field of view, comes from putting something small very close to your eye. The VR guys have big pixels. As you may know, VR headsets typically have pretty chunky pixels — the much-complained-about screen-door effect and all that. Well, they're eventually going to get there: they're going to keep making the direct-view display pixels small enough that that will start to go away. In AR, I think it's a much harder problem, because we start with these tiny pixels, and yet, because we can't put the display right near your eye, we have to come up with complex optics to blow those pixels up — we have to figure out how to actually magnify the pixels. In VR, they don't have to magnify that way: they take a fairly big display, put it up near your face, and you get a big field of view. Now, if you make the pixels another 2x smaller — half as large, diameter-wise — you start to approach the point where it's good enough for the eye. They're not quite there yet. Even the Meta Quest Pro is at about half of what they need: if they could halve their pixel size, they'd almost be home. Whereas in AR we have the opposite problem: we're starting with these teeny, teeny tiny pixels, and how do we make them bigger? The cheap way to make them bigger would be to put them close to your eye. The problem is we can't do that — we can't put the display in front of your eye; we've got to put it back around on your temples, or someplace behind the side of your face, and then route it to your eye.
So now I need all kinds of complex optics to make it bigger. The other problem is that our pixel sizes are only single-digit multiples of the wavelength of light, and now we're into all these other effects we talked about, like diffraction. So there are a couple of problems that hit AR. But AR has the advantage that, in theory, it's small and lightweight and lets you see the real world, so it's a lot safer in a lot of regards — you're not blocking stuff off. And like I say, even the best of them — say Lynx, which is among the best of the pass-through AR solutions — still blocks off a lot of the real world. It doesn't have the dynamic range; it doesn't focus right; there's just a ton of things it doesn't do right if you measure it as an AR device. As we talked about, I do think there's a lot to augmenting VR with pass-through. There are kind of two levels to it. One is whether you want to be able to see where your hand controller is — it's kind of handy to just pass through and look for your hand controller rather than having to take the headset off. I'm sure there are also some game aspects: a lot of the games we've seen for AR, where monsters come out of the wall and stuff like that, may work in VR with pass-through. But it's a different level if you're really trying to do everyday-life things, and also for safety-critical markets.
So this notion of a single emitter — Ostendo solves that by stacking the colors. You noted in a blog post that MIT demonstrated something similar, stacking the pixels. There are some trade-offs in doing that, but the benefit is you get all the light — all three colors out of the same emitter.
Yeah, that goes back to that Foveon-like thing I mentioned. There are pros and cons to it. The issue I have is the stacked ones versus the single emitter. And I don't understand all the physics of the single emitter, so there may be drawbacks there, but Porotech, for example, is claiming they get very good efficiency even in red, compared to other people who've done just red. They're not quite there: their red-only emitter is still better than their single-emitter red in terms of efficiency, but they claim to be narrowing that gap. When you stack, your problem is that most of the people who've done stacked devices are talking hundreds of thousands of nits, where the people doing unstacked, single-color devices are talking millions of nits. And that's where I wonder what happens. Maybe there is a market in between. The 100,000-nit class is really the kind of thing you might want when you're doing high-dynamic-range VR, but it may be a problem for AR unless you come up with other optics. The problem the waveguide optics have is that they need collimated light for the waveguide to work, and once you do that, you see the diffractive waveguides are only getting a percent or two through. Nominally you put a million nits in to get a thousand nits out; it's like 0.1%, a million in, a thousand out. The reflective type of waveguide, the Lumus type, they claim is about nine times as efficient. They do seem to be a lot better. Their big advantage is basically that they have a larger entrance area; this gets back to that nasty étendue term. Because they have about a 3x bigger entrance area, they're about nine times inherently more efficient. So maybe they're nine or ten times better. That was the trend I saw, funnily enough, at the AR/VR/MR show.
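The throughput arithmetic Karl is citing works out as follows. The ~0.1% diffractive figure and the ~9x Lumus claim come from the discussion; the rest is just multiplication, written out as a sketch.

```python
def nits_to_eye(panel_nits: float, waveguide_efficiency: float) -> float:
    """Brightness reaching the eye, given panel brightness and optical throughput."""
    return panel_nits * waveguide_efficiency

DIFFRACTIVE_EFF = 0.001               # ~0.1%, per the discussion
REFLECTIVE_EFF = 9 * DIFFRACTIVE_EFF  # the claimed ~9x reflective advantage

print(nits_to_eye(1_000_000, DIFFRACTIVE_EFF))  # a million nits in -> 1,000 nits out
print(nits_to_eye(1_000_000, REFLECTIVE_EFF))   # same panel -> 9,000 nits out

# Flipped around: panel brightness required for 1,000 nits at the eye
print(1_000 / DIFFRACTIVE_EFF)  # 1,000,000 nits
print(1_000 / REFLECTIVE_EFF)   # ~111,000 nits
```

That last line is why a ~100,000-nit stacked micro-LED pairs naturally with a reflective waveguide but would barely be visible through a diffractive one.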
And I've mentioned this on my blog: I'm seeing maybe a trend with micro-LEDs toward coupling to the reflective waveguides. We saw that in the PlayNitride booth. They only had, and I say "only" though it sounds huge compared to OLEDs, maybe a 100,000-nit-class device, and they were showing it with Lumus reflective waveguides. They weren't as bright as they probably would like to be, particularly with a 2D waveguide, but I'm sure if they had put a diffractive waveguide in there, you'd barely have been able to see it. And then, interestingly, JBD, whose customers have so far been using diffractive waveguides, were showing a reflective waveguide in their booth, kind of a clone of the 2D Lumus waveguide, a 2D-type reflective waveguide they were demonstrating during their demos. So we'll see how that plays out. I mean, that's the story from Lumus, that they're better. It seems to hold up from what I've seen so far: they do have an advantage, and they claim it's in the 9x range, which is huge.
Let's pause for just a second on this thought. So if reflective waveguides are 10x-ish more efficient than diffractive waveguides
With micro-LEDs.
Whether they're ten times more efficient regardless of what's passing through, yeah. But if they're ten times more efficient, and if the argument is that we as an industry can theoretically make, and there are so many companies, JBD talks about millions of nits with their green LED, then you have the choice of driving that thing to produce a million nits, or driving it to produce only 100,000 nits so that you still get 1,000 nits out to the eye in an AR rig. When you think about the amount of battery power available on a device, and the amount of heat generated by these sorts of devices, the option, all else being equal, is always to go with the solution that allows you to drive it at a lower amount of power, right? Because you get all of the other benefits in the system.
Oh, yeah. You've got, I call it the coin: you can spend your coin on brightness, you can spend your coin on lower power, or you can mix and match how you spend it. Yeah, it's huge if it's true, and I've not had anybody refute it. I mean, it's a claim by Lumus, but it appears to be true based on what I'm seeing. I'm not saying I'm 100% sure, but I tend to believe it's true.
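Karl's "coin" can be written as a toy model, under the simplifying assumption that display drive power scales roughly linearly with the light the panel emits (my assumption, not a measured relationship; the 9x is the Lumus claim from the discussion). An efficiency gain can be split between extra brightness at the eye and power savings.

```python
def spend_the_coin(efficiency_gain: float, brightness_boost: float) -> float:
    """Relative drive power (1.0 = the less-efficient baseline) after taking
    `brightness_boost` x more light to the eye through an optic that is
    `efficiency_gain` x more efficient."""
    return brightness_boost / efficiency_gain

print(spend_the_coin(9, 1))  # same brightness  -> ~0.11x the power
print(spend_the_coin(9, 9))  # 9x the brightness -> same power
print(spend_the_coin(9, 3))  # 3x the brightness -> 1/3 the power
```

The middle case is "spend it all on brightness," the first is "spend it all on battery and heat," and anything in between is a mix, which is exactly the trade space a system designer gets to play with.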
If efficiency is such an important factor in the optics, why is it that diffractive seems to be the more common approach relative to the reflective approach of Lumus? Why is Lumus not dominating the market?
Yeah, that's the multi-hundred-million, or billion-dollar, question, I guess. There are a lot of stories. One is that everybody and their brother can do their own diffractive waveguide. If you look at the diffractive ones, a lot of them are captive. HoloLens did their own; it's actually a variant of technology they bought from Nokia. Nokia developed the slanted-grating waveguides, which were then bought or licensed by Microsoft and also licensed by Vuzix, though Vuzix claims they've since moved off to some other, different technology. But basically, the captive houses can build their own waveguides. It's a little more customizable; you can make them different sizes and shapes, and you have a whole bunch of engineers who can do it. In the case of the reflective waveguides, it's only Lumus. Now, the knock on Lumus has been that they can't build it at cost, that they can't build high volume at a low cost. That's been the knock on them. They claim, and once again I have no way to prove it, but they've got a new deal with Schott. I was in the Schott booth at the AR/VR/MR show, and the representatives from Schott said, yes, we can make it; it's not going to be a problem to make this thing in volume. So I think they can make it at a cost. Because, you know, everyone says the others are cheaper; well, I have yet to see a cheap diffractive waveguide. Show me a product that's inexpensive that's using a diffractive waveguide. So I don't think either one is inexpensive. Now, Lumus has been shipping them for what, about 20 years now? A lot of that went into military and medical; that's been their market, which can pay whatever the price needs to be. But they've got to figure out how to get their costs down.
They had a deal with Quanta to lower the costs, but apparently, well, it goes back to a good analogy you made with DLP back when. I used to work at TI, and people think I worked on DLP; I never worked on DLP, but I was a TI Fellow, so I knew a lot about DLP. People don't realize, by the way, that DMD, which is the other name for it, originally stood for deformable mirror device. It really worked like a keyboard: on a keyboard you have the clicks on keys, where you push down, a little bubble bends down and makes contact, and the bubble snaps back. Well, that's the way DLP used to work, and they needed a thing called Schlieren optics to discriminate, because the problem with that bubble popping down and up is that you didn't get much discrimination between the on and the off state. So you needed really complex optics to see the image, to get any contrast. Well, then they came up with the tilting mirrors, and at that point they changed the name from deformable mirror device to digital micromirror device, still DMD; they changed the name, but originally it was deformable mirror. And then they came up with Digital Light Processing. Anyway, the joke, and this is probably pretty true, is that TI spent over $2 billion, probably equivalent to $20 billion today, and lost $2 billion before they made a DLP that sold for more than it cost to make. The joke was that in the LCoS industry, 50 companies also spent $2 billion, but it was divided between 50 different companies.
And because of that, TI got a much more manufacturable device in that time period than LCoS did, because TI, as one company, focused and concentrated on it, even though it was a much more complex process. I mean, DLP is much more complex. Going back to my story of LEDs and other things: DLP really should not exist. It's a really complex process, very different from all the rest of the semiconductor industry, whereas LCoS is very similar to everything else done in semiconductors. That's why I got into it as a semiconductor guy. I said, hey, this is pretty close to plain vanilla semiconductor: make the top level of metal flat, and then you have some process-compatibility issues. That's why there have been literally dozens of LCoS companies through history; anybody could get into it, because it's a relatively easy, relatively small change to a fab. Now, the problem, and this gets back to my story about video RAMs: even though it's almost the same, it's a little different, and that little difference in the top layer of metal, and the process and chemical compatibility with the liquid crystal that will touch it, creates a big headache, and a lot of fabs don't want to mess with LCoS. But still, many, many companies could do it, whereas TI took all their money and focused it in one place. So nobody at any point in time put $2 billion into LCoS; no one company really developed it fully. So DLP was able to get there because they put all their eggs in one basket and then guarded the basket. However, going back to the early days of TI, people forget, I mean, I was at TI, and the whole reason to do DLP was to go after the rear-projection television market. It was Jerry Junkins, who died in the middle of all that; it was his faith, very similar to Tim Cook's. For whatever reason, Junkins, who knew nothing about DLP, believed it.
He was from the military group; when Bucy got kicked out, Junkins came over from the military group, because there was all kinds of chaos at that time. And for whatever reason, Junkins was convinced, not knowing anything about DLP, that it was the future of television, that rear-projection television with DLP was it. As you may know, that lasted basically until it got to market; they had just gotten to where they were making a profit when the whole rear-projection market collapsed. They had like two good years: they spent $2 billion and had two really good years in that market. Now, they've done data projectors, which was kind of a transitional way to get into it, and then the whole digital cinema thing. But that market, the movie industry, has no volume. I think at the time there were like 30,000 screens in the US. You get three chips per projector; that's like 100,000 chips, and they last 10 years. So you're talking about nothing; you have tens of thousands of chips a year. In my days at TI, when I went there, we never built a chip if the market didn't look like many millions of units, if not 10 million or more, much less developed a custom, unique process for a chip to go after a market, the movie industry, of a few thousand, tens of thousands of units. Maybe the whole market put together is tens of thousands of units. You don't do that. So it never really quite made sense unless you had the illusion that the rear-projection television market was going to be there. That also gets to a saying I developed with my partner at Syndiant, when we did LCoS. We made a rule that says: never go head to head with LCD. When you find yourself going head to head with an LCD guy, those guys are crazy. Even Samsung has gotten out now, but in those days Samsung would literally build an entire city.
I mean, they didn't just build a factory. They built a factory, they built schools, they built infrastructure; they built an entire city in order to build LCD panels. So you're talking billions of dollars to build a new fab, including the housing, everything. And when you're going up against that, their ability to drive prices down is enormous, and they sell them at cost.
Why would you do that? It's insane. I mean, as business people, it's insane to try to go up against somebody who's doing that. So anyway, TI got into it. They had two good years with DLP. I imagine it's made money in the out years; it finally made a little money. But you would never have started DLP if you knew what the market was going to do to it, if you followed the amount of money they put in and how much R&D they sunk into it. I mean, basically TI today is an analog company. They have DLP, and that's one of those weird things: there's been talk for decades that TI was going to get out of DLP, but it's one of those businesses that lingers on making a little money, even if it's not huge. It's not even on their balance sheet anymore; you can't even see it. It is so small a part of TI that they don't even line-item it individually. But you just wouldn't have gone into this business if you realized where it was going to end up. It just makes no business sense to put in that level of investment. But that was all Jerry Junkins. Junkins believed. And you could argue the same thing could be happening with Tim Cook: somebody has convinced him, just as I knew Junkins didn't know anything about DLP. He was a military guy; he knew nothing about semiconductors. That was one of the weird problems: the whole company was being run by former military guys who, I felt, were being highly misled by people in the semiconductor group, who could lead them around because they didn't know anything about semiconductors. But anyway, TI still does it today because you can't sell it. That's the other funny part about DLP: why didn't TI sell DLP?
Well, one of the problems you have is who would buy it. There was a lot of talk they'd sell it to Samsung, because Samsung was their biggest customer for a while. The problem is, had Samsung bought it, everybody else would have gotten out of DLP, because they would have said, why am I going to compete with Samsung, who I'd now be buying my DLPs from? All the other DLP customers would suddenly be buying from a competitor. So they didn't sell, and it's kind of settled into the state it's in today. And they still do some stuff; I just ran into some TI guys who said they're doing some new things with it, and they may do a new entry into the AR space. So we'll see. But it's a tough business.
On this notion of the efficiency of the optics and the benefits at the system level of having more efficient optics, but the potential challenges, maybe being overcome with partnerships, the potential challenges on the manufacturing side of the reflective-optics bucket, which is represented solely by Lumus these days: there are a couple of other companies kind of floating around out there, and maybe you can help put them into perspective. One of them is a company that Meta recently purchased, named Luxexcel. Their claim to fame, and they were another guest on this podcast in the past, was that they found a way to create these really teeny tiny droplets, and they could 3D print with these tiny droplets, cured very quickly, lots of different shapes of optics, optical-grade lenses. That made sense, potentially, within a custom-eyewear sort of business, but within the area of AR, the thing they were talking about was that they could encapsulate a waveguide, diffractive or reflective, in a way that works for the waveguide, allowing them to add prescription vision as part of the lens.
One of the tricks with waveguides, and this is something different with Lumus that we should probably talk about, is that most waveguides work by TIR, total internal reflection. What happens is that when light moving through the glass hits a surface, there's a certain angle beyond which it will not exit; air has an index of one, so if you have enough of an index mismatch, the light will not exit until it hits the surface below a certain angle. The way diffractive and reflective waveguides work is that you send the light in at an angle, and to make the light come out, you change the angle of the light so it comes out more or less perpendicular to the surface, and therefore it can exit. Well, what happens if I take glass, which is fairly high index, and put a plastic on it, which is lower but still has a much higher index than air? The light will tend to go into the plastic; it will start exiting the waveguide. Well, the newest Lumus designs are kind of different. The newest Lumus is apparently sending the light in at a much steeper angle than the older Lumus designs. Lumus used to also need that air gap, but the newest design, the so-called Z-Lens, is apparently sending the light through at a much steeper angle, and the steeper the angle at which it travels through the waveguide, the less mismatch you need to keep it in TIR. And they claim now, with current plastics, that they can actually mold against and touch the waveguide.
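The TIR geometry Karl describes follows from Snell's law: total internal reflection happens beyond the critical angle arcsin(n_outside / n_core), so a smaller index mismatch (glass against plastic rather than air) pushes the critical angle up, and only steeper rays stay trapped. The refractive indices below are my assumed round numbers for illustration, not figures from Lumus.

```python
import math

def critical_angle_deg(n_core: float, n_outside: float) -> float:
    """Incidence angle (measured from the surface normal) beyond which light
    is totally internally reflected at the boundary of a waveguide core."""
    if n_outside >= n_core:
        raise ValueError("no TIR possible: outside index must be below the core index")
    return math.degrees(math.asin(n_outside / n_core))

# Assumed indices for illustration:
print(critical_angle_deg(1.8, 1.0))  # glass/air:     ~33.7 deg, easy to keep rays trapped
print(critical_angle_deg(1.8, 1.5))  # glass/plastic: ~56.4 deg, rays must be much steeper
```

The jump from ~34 to ~56 degrees is the whole story: a design that sends its rays in steeply enough can tolerate plastic touching the glass, which is what lets you mold a prescription lens directly against the waveguide.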
So that has reduced the need to leave an air gap. There's also debate; I've heard different people say whether they're really leaving an air gap or whether it's really a two-part molding process. I don't know how it works. But the deal with Luxexcel is that they could maintain an air gap while printing around the waveguide. And once they were out there, once they got bought, all these other companies started coming out of the woodwork. There are several other companies in the prescription-lens area coming up with different ways to do prescription lenses. Meta Materials had been doing it for years; they actually bought a process from another company. And then there's Tobii, which has a very similar process. It's one of those classic examples: a company was broken up, and the technology assets went to Meta Materials, but many of the engineers went to Tobii, which is most famous for eye tracking these days. So Tobii also has an optics solution. And then there's a company that kind of spun out of Luxexcel, which I've written about on my blog, and they're also doing these optics. Now, there are ways to create that air gap without having to print it in, and there's some debate about that, and pros and cons to the various ways of doing it. But these are all ways of saying you're going to build a prescription lens into the waveguide. And now I've got to convince you that you're going to wear glasses with a waveguide in them. In other words, once you've done that, you've sealed it all up, and now the question is: what do you do if you want to change glasses?
Does that mean you have to buy new displays if you change your glasses every year? Do you change your waveguides every year? Are they disposable? It gets to be a very expensive proposition. I've also always been concerned about the 3D printing. When I was doing military work, I was working with a military company for a while, Raven, and when we were looking at that, you know, if you're doing military stuff, yeah, I can afford to build a custom lens per person. But the question, thinking back with my old semiconductor background, is: yeah, but I don't want to 3D print every lens I ever make. That's fine if I'm going to sell onesies and twosies, or sell a few hundred. What people don't necessarily understand is that the really high-volume stuff is done with some form of injection molding. You want to injection mold; the problem is that optical injection molds are very, very expensive. If you're going to support a range of prescriptions, they're usually done as big blanks that get cut down to cover a range of prescriptions. And even when you do something like Varilux, with variable focus, most of those are reground; it's very hard to mold glasses that have Varilux-style progressive surfaces in them. But if you're going to handle even astigmatism, you might need like 400 different lenses to cover 98% of the population, just with sphere and astigmatism in diopter steps. That's tractable if you're talking billions of people in the world, billions of lenses; you can justify building pretty expensive molds for those 400 variants. But that quickly falls apart when you start saying I'm going to do a one-off, or a per-person lens. If I'm only going to build 10,000 of these things, I can't build 400 very expensive molds.
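The "400 lenses for 98% of the population" figure, and why fixed tooling kills low-volume products, can be sketched numerically. The prescription ranges, mold cost, and volumes below are my illustrative assumptions chosen to land in the ballpark Karl describes, not numbers from the interview.

```python
# SKU count: one lens blank per sphere/cylinder combination (assumed ranges)
spheres = [s / 4 for s in range(-24, 17)]   # -6.00 D .. +4.00 D in 0.25 D steps (41 values)
cylinders = [c / 4 for c in range(0, 10)]   # 0.00 .. 2.25 D of astigmatism (10 values)
skus = len(spheres) * len(cylinders)
print(skus)  # 410 distinct blanks, in the ballpark of the ~400 figure

# Mold amortization: fixed tooling cost spread over the units sold per SKU
def mold_cost_per_lens(mold_cost: float, units_per_sku: int) -> float:
    return mold_cost / units_per_sku

print(mold_cost_per_lens(50_000, 1_000_000))  # billions-of-lenses scale: $0.05 each
print(mold_cost_per_lens(50_000, 25))         # 10,000 units across 400 SKUs: $2,000 each
```

At consumer-eyewear volumes the mold cost vanishes into the price; at AR-headset volumes the same tooling dominates the cost of every lens, which is exactly the gap 3D printing and soft molds try to fill.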
So then you start to say, well, if I go 3D print, my problem is: how much does each one cost? How long does it take to make each one? And then you start to look at other methods, maybe kind of in-between processes. For example, there's a company where you make a mold, and then use that mold to stamp out a whole bunch of lenses. These molds don't last as long as an injection mold, by the way. That's the other thing: they're using what we'll call resin lenses. That's Addoptics, and so does Luxexcel; the difference is in the levels and types of resins they use. Most of the ones that use a soft mold, a relatively disposable mold, or no mold at all, are using resins that work like epoxy: with epoxy you have one part resin, one part hardener, you mix them together and it hardens. That's basically what these guys are doing, only most of the time the resins are UV-cured. Now, I believe Addoptics is using a heat-cured resin, while Luxexcel is using a UV-cured one, so the UV light hits it, causes it to react, and turns it into a hard plastic. There are pros and cons to all of these. The lenses you're going to buy commercially are typically injection molded, because that's what you do at really, really high volume. Whereas with something like Addoptics, I think they're talking maybe 100-ish copies per mold. Then you've got to throw out the mold; any kind of soft mold is going to degrade as you use it, even with these resin plastics. So anyway, with these resin-type plastics, you take two parts, or in the case of Luxexcel a one-part UV-cured resin, but it's a very thin layer that you've got to keep curing each time.
Well, that takes time: you've got a really big, expensive machine tied up making all those layers and stacking them up, because they have to be thin. If you want high quality, you've got to stack those layers up to finally make a single print. So you've got this classic business trade-off between these approaches, and then you have to look at the overall AR market and how big it is or isn't. Now, Luxexcel was extremely attractive for low-volume, high-value products, and very attractive if you're a startup or a new company, because there's no mold to make. I could see where the Addoptics make-a-mold approach becomes interesting if you're in that mid range, where you're not making millions, but you may be making tens of thousands of these things. And then you've got the Meta Materials and Tobii approach, where they use glass molds. The molds themselves are made out of glass, fairly cheap to make; they take two molds, and they can actually shift the molds in different ways and get different astigmatism corrections and such, so they don't need quite as many molds as you might think, and then they fill them with a UV-cured resin. So those guys are kind of in between. Each approach has its place, but they're basically ways of addressing prescription glasses short of inserts. And to me there's a huge gap here. Like I say, I'm just writing this article on Argo, the DigiLens Argo, and I think Argo kind of ended up landing in a void. There's a lot I liked about what Argo did, but at the same time, I believe they've said they have enough eye relief to allow prescription lenses, yet the design of their headset is such that you can't put a prescription lens in there; they didn't leave space otherwise for a prescription lens. So it's kind of like, well, then why did you do that?
You know, it's like, why? Well, they've still got inserts. And if I was designing that product, if I'd had input in the early days, what I told them at CES was: you should have done it like HoloLens and realized that these aren't really glasses. That's the other thing I talk about; I call them skull-grabbers. If you notice on some of these headsets now, if you look at the so-called temples, they start up at the front of your face, and now they call them temple tips, but these things are reaching around and grabbing the back of your head. I mean, the ones from Argo would come around so far they nearly touch on a normal head. You also see it on the Panasonic pancake-lens ones; they also have these big things that reach around the back. Well, once you start realizing that you're not really building glasses anymore, and they are still the biggest, ugliest glasses, they kind of look like glasses on steroids, but they don't look much like glasses anymore, once you realize you're not making glasses, maybe you should just say forget this and build something functional. So I think in some ways Argo ended up in between, in that it has capabilities; it's got most of the features, at least. I haven't done a professional side-by-side comparison yet, but once you look at it on paper, you'd say it's better in a lot of ways than some of the others. But then they didn't make the ergonomic choices HoloLens did, because HoloLens put, I call it the bustle, on the back, or they tend to call it a bun, you know, like a hair bun. Basically they have this little thing on the back, and you put all your computer stuff back there.
Now what you can do is take weight off the front, which is pushing on your nose, and move it to the back to balance the headset. But to do that, you have to admit that you're not glasses anymore, and if you won't make that admission, I think it sends you down a road of a series of mistakes. When I looked at Argo, I said, you did this, this, and this, and as I described it to Bradley, it's like they fumbled on the one-yard line. You've got the eye relief, you've got this, you've got that, you've got this other stuff. Why don't you just take what they did, go ahead and put the thing on the back, weight it up better, make it more comfortable, and allow people to wear their glasses? Because you end up in a no-man's-land if you don't. I think the prescription insert means you've locked yourself into a market where people are going to have to buy custom prescription inserts, because really, if you're going to do work, you need individual ones for each eye. You can't just have a diopter adjustment; you've got to have something that handles astigmatism and other things. And if you don't have that, then nobody's really going to wear it. So I think if you're going to do it, you should just let people wear their glasses. And if you're going to be big and ugly, why not own it and make something functional? Because otherwise what you end up with is something that's big and ugly and has functionality problems, and I think that's not a place I'd like to be.
Do you think that AR glasses, the Tim Cook vision of the world in which we have this pervasive, relatively lightweight pair of glasses that incorporates some sort of display system overlaying the digital on the physical world, whether or not it's mapped to the physical world, maybe that's a separate question, do you think this idea of see-through AR becomes widely adopted eventually?
Depends upon your definition of widely adopted. I mean, I think it has the potential to get fairly pervasive in some markets, like industrial use, in the sense of you seeing it all the time. I don't know how it gets to cell-phone-like volumes. Could it get to a million units? Yes. The question is whether it gets to the hundreds of millions and billions, and that's where it's harder to see it anytime in the next 10 years. And maybe Apple came to that conclusion and said, well, we're just going to fiddle around; we can easily afford to dabble with VR, but maybe we don't go there. It's just a hard market. It's got obvious utility in military applications, and the place where it always seems to make the most sense is in industry. There are just all these things, all these applications, all these use models, where you need to keep your hands free. And by the way, this is a point I make: hands-free does not mean you have a controller in your hands. Hands-free doesn't really mean gestures, either. And this goes back to something Argo has a legacy in through some of its people, the way RealWear has a legacy back to Golden-i, which was developed back around 2010 or before, and that used voice recognition. I think voice recognition is going to be really important in a lot of these AR headsets, because you've got to get your hands free. It's no good if my hands aren't free, if I have to make gestures, or even with the wrist method. I mean, there are all these things people are doing with wrist control, trying to detect your motor-nerve signals and such.
If I'm having to manipulate my hand to make wrist signals to whatever is on my wrist, then my hands aren't really free, because I can't be doing that while my hands are full, while my hands are doing something. So I think there's a lot to be said for apps that are truly hands-free. And I think there are some big applications people aren't anticipating. There's a company out there I've talked to a few times that's working on something for vine pruning. They're saying, look, we've got all these inexperienced people out in the vineyards, and they've got to cut a vine, and it turns out that if you cut it the wrong way, you can kill the plant or severely damage its growth. You've got to trim them properly to get the maximum yield; if you just let them grow wild, you don't get maximum yield either. So you can imagine an application where the guy is wearing a headset, and it just shows him where to cut the vine. That could be a pretty decent volume application by AR standards. So there are a lot of things like that which aren't even in the most obvious areas. So, you know, I can see it getting to a few million units a year; it's how you break through and replace the cell phone that even today gets a lot of scoffing, and I don't think it's Luddites. I mean, I'm old enough to have seen a bunch of turns of technology. Some things are just really hard; technology doesn't move the way you think. For example, I was a kid growing up in the 60s; I was born in the 50s, grew up in the 60s. If you had said in 1968 that we'd be on Mars by the year 2000, everyone would have said, why is it going to take so long? Right? I mean, we went from not even having put a man in orbit to putting a man on the moon in about 10 years.
Why can't you get to Mars in another 10 or 20? And we haven't even been back to the moon since then. So you've got to realize that some technologies are actually really, really hard to do, hard even to replicate: we got a man to the moon, and we still don't go back and forth. If you were in the 1960s, you would have sworn we'd have colonies on the moon by now, and that we'd certainly have been to Mars a few times by the year 2000, no less the year in 2001: A Space Odyssey. The reason they picked 2001 is that it was so far out at the time, but surely we'd be doing what they show in the movie by the year 2001. So it shows you how things don't move the way you think. The other big thing being talked about these days is ChatGPT and the AI-type stuff, and we'll see how that works out. But I just think things are going to slide in a different direction; AR is a little too hard to get to volume. And then the question is, are people who don't wear glasses today going to wear glasses? There are just a lot of things unsolved. We see this in VR, by the way, where every now and then problems get swept under the rug. Those of us in the industry all know about vergence-accommodation conflict; there are hundreds of papers about vergence-accommodation conflict. You don't hear anyone talking about it in this generation. I jumped ahead thinking maybe the Quest Pro would have some level of vergence-accommodation support built in; it didn't. And that was the entire reason to exist for Magic Leap: their entire value proposition was that they were going to have two different focus distances.
That was the Magic Leap One. Then the Magic Leap 2 comes out and they got rid of it, because what they found was that in order to get two focus depths, they gave up a lot of image quality. If you draw that spider diagram of all the things you did well, what you actually did was hurt one of your most important things, image quality, making it much worse in order to address vergence-accommodation conflict. But now we're not addressing it at all. And if you're not addressing it, you're not addressing a large percentage of the market; you're only addressing the people willing to put up with the headaches and the nausea, and hoping they stay with you long enough to work their way through it. People aren't doing a lot of the basic blocking and tackling, either, like making sure you deal with IPD really well. If you're going to do pass-through AR, really, those cameras on the front should be on gimbals, sliding around with your eyes. And they're not going to do all that; we're just partway through it. So I guess that's why I'm somewhat skeptical, even while trying to factor in that yes, we get continuous improvement, things keep improving, people keep doing better. By the way, in this article I'm about to publish on my blog, talking about Argo again, it so happens that one of the people at DigiLens was also at ODG, and I happen to have pictures of him in almost the same pose, from different angles: from 2017 wearing an ODG headset and from 2023 wearing the DigiLens headset. You can see they've made progress; it's clear, you can see the eyes, and there are a lot of advantages over the ODG R-9, which is a birdbath design, which is what we see in most of the stuff out there.
You can see there are a lot of advantages, and a lot of things have changed, but you can still see how far we have to go in terms of improving things. So it's a mixed bag. I do believe there are some real markets and serious business; the enterprise cases are easy to make. The issue is whether those markets are big enough to attract the big boys, the Metas and the Apples and so forth. And I think that's what we see as they keep delaying. Nobody wants to be seen as the Luddite, the guy who didn't believe in it, but you see them keep delaying. So that kind of answers it: people think all you need is money. But, like I said, I believe we're farther away from getting to Mars now, in my perception, than we thought we were when we landed on the moon. And the reason is that we know so much more; we know all these things we didn't know then. We just thought it was a long moon voyage. Well, yeah, but it's two years away; Earth and Mars only get near each other every couple of years because of the way the orbits go. There's all this radiation out there, and what are you going to do if there's a solar flare while they're on their way? And what are you going to do to protect them from all the radiation they're going to get? The problem gets a lot harder. And this is the one that drives me crazy: about three or four years ago, maybe five, they were recruiting people to go to Mars. I'm kind of like, they should have been recruiting those people's grandkids. It's going to take a while.
I think we'll be sending a lot of stuff there; I think we can send vehicles there, and I think there's a fair chance we'll inhabit the moon sometime in the not-too-distant future. There's more water up there than we thought, which is a good sign. But you've still got to protect yourself from radiation. There are nutcases who say we didn't make it to the moon because of the radiation. No: they were really worried about it, and they did a lot of things. They had dosimeters on the astronauts, they were careful about the trajectories, they had people monitoring the sun to say, hey, if this happens, you've got to get home. They worried about it even back then, and they dealt with it. It's a little different if you're going to live up there: when there's a solar event and you're about to get dosed like crazy, you'd better have the shielding and whatever else to deal with it.
In this world of AR, if you had to bet on technology categories, for the display side and for the optics side, where would you place your bets? Let's say the timeframe is five to ten years, the point at which we have some sort of broader adoption, even if it's not hundreds of millions or billions of units. Which technologies win?
People will say I'm biased because I worked in it, though I have no vested interest in LCOS anymore. But I would say for the next five to ten years, it still feels like LCOS is the winner. I think the laser scanning stuff has been a loser, is a loser, and will always be a loser for displays. I do believe in laser scanning for things like detection: SLAM and all kinds of sensing. There's some neat stuff that can be done with lasers for detection, but it's a lousy display device. Laser scanning is much easier when you don't need to put up an image a human can see; when you have to put up an image, it's a whole different class of device. Long term, at some point micro LEDs win, but the question is when. I don't think it's five years; I think it's at least ten before they win out, and it could be 20, just because they're a long way off. If you told me, I want to put up a photograph of a human, they're just so far from doing that right now. They've got to solve a lot of other problems and get the cost down; they've got to work on too many vectors at once. We used to say at TI: you can keep the same capability and come down in price, or you can keep the price the same and improve the quality, but you usually fail when you try to do both at the same time. You can kind of staircase, go over a bit and down a bit, but if you try to go both directions at once, you tend to get screwed up. And I think that's the thing with micro LEDs: we've yet to see a really good quality microdisplay for any amount of money.
I've looked at micro LED displays under microscopes; I've looked at them with my camera using macro lenses. I can show you dead pixels. Every display that's out there has super high variability between pixels. I did a simulation on my blog (it was all done in Photoshop) to explain this: take a green-only display with those defects, and imagine matching it with two other colors, a red and a blue chip with similar issues, to make a full-color display. It's going to look horrible, because the colors will go all over the place. You've got to get to the point where you can really control each and every pixel on that display, and micro LEDs are a long way from that. I think I'm fairly safe in saying it's more than five years away. That doesn't mean there might not be some decent multicolor AR displays sooner; I think we're going to see some of those come out. But they're going to be more "multicolor" than the kind of thing a consumer would want to look at. A consumer is going to say, I want to see somebody's face, and that's the acid test. Even when you looked at PlayNitride's big displays, they had pictures of people, but with all kinds of colored dots on them, false color, kind of purple and blue. If you really could control your colors, you'd put up a natural-looking human face; you wouldn't splat a bunch of shifting color on it. I often say that when you see a demo with lots of motion and movement and all this going on,
and they won't show you a still image, it's probably because they have dead pixels or bad pixels or something going wrong, because the more motion you throw in, the more you can confuse the eye. If they show you a cartoon instead of a human face, that means they can do saturated colors (cartoons are really highly saturated) but they don't look human. If they're serious and they really have good color control, they'll show you a human face. If they won't show you a still human face that you can really look at, then they're not ready for prime-time replacement of televisions, and you're not going to put it in your cell phone at all. So that's the kind of thing I see. I can also see augmentations to LCOS coming along; this is one of the things that may happen. It's like with LCDs: if you had said LCDs would still be the dominant display technology today — they took off in the 90s, and people have been saying they're going away for a long time now. LCDs have had roughly a 30-year run as the dominant display technology, and in terms of volume they still haven't really been overtaken. OLEDs are there: they're making it into cell phones, into watches, into things. But they still haven't taken over the computer monitor and TV space in any huge way. So it's going to take a while for them to get there. I think the long-term trend does seem to be micro LED, but I could see LCOS improving. There's a company, I forget their name right now, that put a laser in; maybe you could laser-illuminate LCOS and do things.
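The full-color variability problem described earlier — three imperfect monochrome panels combining into visible color errors — can be sketched numerically. This is a toy model, not Karl's Photoshop simulation; the Gaussian gain spread and dead-pixel rate are made-up illustrative parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
H, W = 8, 8  # tiny panel for illustration

def panel(mean=1.0, sigma=0.15, dead_rate=0.02):
    """Per-pixel gain of one monochrome micro-LED panel:
    Gaussian brightness variation plus occasional dead (zero) pixels."""
    gain = rng.normal(mean, sigma, size=(H, W)).clip(min=0.0)
    gain[rng.random((H, W)) < dead_rate] = 0.0
    return gain

# Three independently imperfect panels standing in for R, G, and B chips.
r, g, b = panel(), panel(), panel()

# Drive every pixel at the same nominal white level...
white = np.stack([r, g, b], axis=-1)

# ...and measure how far each pixel's channels drift from neutral gray.
mean_per_pixel = white.mean(axis=-1, keepdims=True)
color_error = np.abs(white - mean_per_pixel).max(axis=-1)

print(f"worst-case channel imbalance: {color_error.max():.2f}")
print(f"pixels off-gray by more than 10%: {(color_error > 0.1).mean():.0%}")
```

Even with only 15% brightness spread per panel, a substantial fraction of "white" pixels land visibly off-gray, which is why uncorrected three-chip micro LED color looks splotchy.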
But right now LCOS seems to have the high cards. The reason I keep coming back to LCOS is that they can make their pixel small, and if you can make your pixel small, that's a huge advantage in AR. Now, when we talk about VR, it looks like micro-OLEDs have most of the high cards; for quite a while I think VR is going to be dominated by the higher brightness of micro-OLEDs. Pancake optics are all the rage now: almost everybody in VR is doing pancake optics, and it seems like everybody is saying their next generation will be pancake optics. Pancake optics have the disadvantage of being highly inefficient. People may know that the Meta Quest Pro used an LCD, and they're pumping a ton of light behind it to get the brightness, because LCDs themselves are not very efficient. One of the problems with pancake optics is that only about 10% of the light from the display makes it to your eye, and you may lose half of that again to polarization. LCDs output polarized light, so maybe 12% or so makes it to your eye; with an OLED you'd have to polarize the light, so by the time you're done maybe only 5 or 6% gets to the eye. So you think you've got a lot: you start with an OLED at, say, 3,000 nits. But you find you're down to maybe 100 or 200 nits, because you've gone through all that reduction.
So that's the problem with pancakes. Pancakes let you make the headset smaller. The biggest thing: if you compare a Meta Quest Pro and a Quest 2, the resolution is almost the same — the number of pixels and the angular resolution are about the same — and contrast is better. But the Quest 2 used a Fresnel lens, and unless your eye is dead center in a Fresnel lens, you get a lot of problems. The pancake optics' biggest advantage is that no matter where your eye lands, the image quality degrades very slowly. If your eye goes way off to the side, the image starts degrading with pancakes too, but you have a very large sweet spot where your eye is happy to sit and not really notice much of a problem. Because of that, there's a strong drive toward pancake optics, both for ease of use at the low end and for desirability at the high end. But like I said, they're about 10 to 20 times less efficient optically, because of the way pancakes work: the light gets recycled through the optic a few times. You have to polarize the light, you have to reject light, and there's a layer that's literally a 50/50 mirror; by the time you go through that whole process, you've thrown away a lot of light. So you need roughly a 10x brighter display, give or take, which means you're going to consume a lot more power, and therefore fans and whatnot come back in. Eventually you might say micro LEDs will take that over too, but they're not anywhere near: the best of the micro LEDs is not in the ballpark in terms of resolution, image quality, color control, or just about anything else with respect to micro-OLEDs. Right now the micro-OLEDs are going to win.
And in that arena, maybe some kind of tricky transmissive LCD will also be there. But it looks like high-brightness micro-OLEDs will probably win out in VR for this next period, so you'll see the LCDs gradually replaced by them. At some point farther out, past 10 years, maybe the inorganic micro LEDs will eventually surpass them, but it's probably at least 10 years away to get to the cost, performance, and image quality you need. That's a lot of prediction, but that's the way it looks to me right now.
On the optics side, do you think diffractive wins? Or is it going to be reflective, or something else?
The reflective guys aren't absolutely perfect: they've got the slats in there, and you can see some defects at the slat level. But their uniformity — I have yet to see an image from a diffractive waveguide that comes close to a reflective waveguide. I'm reporting on this stuff, and I hate to say it because I've got friends at all these companies, on both sides. But objectively, the image quality of the reflective is better and the efficiency is better; it has a lot of aces up its sleeve, a lot of technical advantages. It may not be perfect, but it certainly gets there. And as an engineer, there's nothing in the physics that says it has to be more expensive than the other. The other guys talk about how they're theoretically cheaper to make, but they're still talking theory, while Schott says they can make the reflective ones. It may be like the DLP thing: when one guy focuses and the effort on the other side is spread across twenty companies. It's like the flip of the LCOS situation: twenty companies collectively spending $2 billion versus one company spending $2 billion. In the case of DLP, that gave DLP a big lead, particularly in the projector space, where it's never been unseated. But as sizes got smaller, and because we weren't trying to leverage a large light source, LCOS started to dominate the AR space, because you can make small pixels and the power of the display is lower; LCOS takes a lot less power than DLP to produce an image. DLP had an advantage in brightness — you can argue the numbers, somewhere between 30% and 60%, maybe 2x, over LCOS — but you never got to exploit it, because you're only talking a few lumens, whereas a theater projector can be thousands of lumens. A projector for the eye is about a lumen.
So when you're only talking a few lumens, what counted was the power of the display, and that made LCOS win out. Anyway, back to the analogy. In the case of Lumus, Lumus is all focused, while lots of companies are making diffractive waveguides — a lot of different efforts, with some big names and big materials companies behind them. But Lumus is focused, and they seem to have the technical advantages. Their image uniformity is always better than any diffractive's, and they get all their colors in one waveguide. There's a slight disadvantage: you could argue they're overkill for green-only designs, which might be why you see the green-only products using diffractive. If you're doing green only, diffractive may have a little advantage relative to Lumus, because the Lumus waveguide supports all colors even if you only use green. Lumus tends to be more advantaged when you go to color: they have one waveguide, which is thicker, but they only need one, versus the two or three precision-aligned, stacked waveguides the other guys need. That's where the balance comes out when you start talking color. You've got to take the whole thing through: you don't just say, well, I made this waveguide and it's cheaper. You've got to finish the design, look at the projection optics going into it, look at the power — take the whole thing through. And I do think the reflective waveguides are advantaged there. Then the question is whether there's something else. There are companies out there, some I've talked to, trying to do things that are not waveguide-based, because even as efficient as Lumus is relative to the diffractive guys, if you could come up with a non-waveguide approach, it could be orders of magnitude more efficient than even Lumus.
And some of this depends on the light source, on the display source, too. Say a company came out with a collimated micro LED: instead of each pixel emitting nearly Lambertian light, it emitted nearly collimated light. That might swing things in favor of waveguides versus other optics, because all of a sudden that LED would be really good for use with whichever waveguide. So it's a little hard to say, because I do expect that at some point we'll see a highly collimated LED emitter that's not a laser — a laser creates a whole other set of complications. This is the grass-is-greener thing we always see. Look at HoloLens: HoloLens thought their problem was brightness, so they decided to go with lasers, and then to get the brightness they needed, they put in two lasers per color, so they had six lasers and a complicated-as-heck drive scheme. For the money they put into that, they could have just bought a good LCOS device, been done, and had a much, much better image. It's amazing how much money they spent trying to make that work. What they found out was that while the laser solves one problem, getting the light into the waveguide, laser light inside a waveguide causes interference, which meant really extreme precision problems in making the whole thing. So they never got a very good image out of it, and the resolution sucked; their whole drive scheme, as I've documented on my website, was just terrible. So this is the grass-is-greener stuff. The problem is, as I always say on my website, I tell you what the other guy is not telling you. The marketing guy at the company is going to tell you all the virtues,
and he never seems to talk about the drawbacks. So some of what's on my blog seems very negative, but it's that I see the claims and then say, yeah, but here are all the problems that come with them. I say I could play a quiz show: give me a technology, and I'll tell you what's wrong with it. There's a lot of that, and eventually things do break through. But as I say, Lumus does have the high cards in the waveguide area right now; it feels like the reflective waveguides have the long-term edge. And this goes back to the LCOS/DLP thing: it's a very bad argument to say your technology wins because "they can't make theirs." If your only argument is that your technology is better because they can't make theirs, then all they have to do is make it, and they win. That's what I call a temporal argument: an argument that only says these guys have to get better, and when they get better, they win. From a business standpoint, that's not a position you necessarily want to be in. Now, in the case of micro LEDs, it's a little different game, because we don't know — it's not like it's this year or next year or just a change of process. It could be a lot of years before micro LEDs get there; we could literally see a generation go through the whole market before micro LEDs totally dominate and win. That's not to say I don't expect to see them; I expect to see them in television sets, which is the easiest area for them. But here's the other thing people should look at. The OLEDs in your watch and your cell phone are similarly structured OLEDs: they are color-emitting, each subpixel emits its own color. LG, who won out to a degree in televisions, built a white-emitting OLED with color filters. The reason, if you look at the use case: a TV has to be on all day, and you expect it to last 10 years — 10 years at 10, 12, 15 hours a day. As we already know, a lot of OLEDs burn in. Sony actually went to market with a small OLED TV, which most people don't know, and they had to pull it from the market because it burned in: the red, green, and blue were literally shifting color at different rates. Within months, the colors started shifting, because the red, green, and blue subpixels were burning in at different rates.
So the OLEDs used in watches and cell phones are a different technology than the structure we saw in television sets, because for a television they needed something that would last a long time, and they could sacrifice some efficiency to get stability of color. In a cell phone or a watch, power is much more of an issue, so you sacrifice longevity, you accept burn-in, you sacrifice a lot of other things to get something that works in a phone. So the cell phone uses a different OLED. That's why Samsung tends to dominate in cell phone OLEDs while LG dominated in televisions: they went in different directions. Samsung tried to take their cell phone technology and apply it to televisions, and you know where they are today — they're not doing it. I believe we're going to see the same thing in micro LEDs. I think the manufacturing process to build microdisplay micro LEDs for glasses is going to be different from what we use in watches and cell phones (watches and cell phones are very similar, both near-to-face direct view), which will in turn be different from what we use in televisions. So you're probably going to see three variations of micro LEDs: the ones used for small microdisplays; the ones used in watches and cell phones, which will share a similar technology; and then what we see in TVs and walls, which will probably be different, because for TVs and walls you can singulate the LEDs. We see the same thing with LCDs today: when we build a microdisplay, we build it on a silicon chip — they're usually made on a CMOS substrate of some kind one way or another, literally flipped over. Watch and phone displays are built with polysilicon transistors on glass (I'm talking LCDs here).
Anyway, those LCDs are made on glass using a more expensive process to build the transistors. When we make them for television sets, they almost literally print amorphous silicon transistors onto the glass. They're terribly, terribly inefficient transistors, but they're huge, so they use sheer size to get enough power to the pixel. So while you might think of them as the same, they're actually quite different: all three are LCD technologies, but we use radically different technology to drive them and make them work. I suspect we'll see the same thing in micro LEDs: those three tiers will probably break out, maybe even something totally different if you're talking walls and stuff. I think the TV market will be different, because what they're going to do is singulate: take every LED, cut it apart from the wafer, and then mix and match them — they'll come up with some clever schemes for that. Another technology I forgot to mention on micro LEDs, one that might make them happen a little faster, is what I call pixel shifting. We talked about how bad the pixel variability is: no two pixels are the same brightness, and you might even have some dead pixels. So how do you build in some fault tolerance? That's important. With a camera, you can fake it out: many cameras have dead or weak pixels and get calibrated. Same thing with OLED displays and what's called mura correction, where they know the dark state has really bad variation and calibrate it out. What we might do with microdisplay micro LEDs, what I call pixel shifting, is basically vibrate them by maybe a pixel or a sub-pixel, shaking the image back and forth, which works because they switch in nanoseconds.
Unlike LCOS or even DLP, micro LEDs switch extremely fast; you can turn them on and off really quickly. So you can imagine shaking them around or moving them. Now, you've got to figure out a way to move it: you might move it physically, with something that moves the whole die, or you might shift it optically, with something that shifts the image optically. I can see something where you do some shifting, and that might accelerate their acceptance quite a bit. But still, I think you're looking at quite a few years, five or ten. People are going to get clever, as I sometimes say. It's just like when people look at crossover curves: they always say, this is going to cross over here. I saw this in television. We saw LCD TVs coming for a long time, but the old televisions hung on for quite a few years past what everyone thought the crossover point would be, and the reason was that people kept making those televisions better. Same with OLEDs: if you listened to what people were saying back around 2010, we should all be watching nothing but OLED TVs today. But we're not, because the LCD guys kept improving. So generally, old technologies hang on better than expected, because there's a tendency to think the old technology is flat when it's actually still making progress. So the intersection point keeps moving out in time.
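The pixel-shifting idea described above — using fast switching to let several physical emitters take turns producing each image pixel, averaging out variability and masking dead emitters — can be illustrated with a toy simulation. The 2×2 shift pattern and the wrap-around shifts (np.roll) are simplifying assumptions for the sketch, not a description of any real drive scheme:

```python
import numpy as np

rng = np.random.default_rng(1)
H, W = 6, 6

# Per-pixel gain of an emitter array with heavy brightness variation
# and one completely dead emitter.
gain = rng.normal(1.0, 0.2, size=(H, W)).clip(min=0.0)
gain[2, 3] = 0.0  # a dead emitter

target = np.ones((H, W))  # we want a flat gray field

# Pixel shifting: show the image 4 times, shifting the die by one pixel
# each time, so each image pixel is produced by 4 different physical
# emitters. Micro-LEDs switch in nanoseconds, so the eye is assumed to
# integrate the four sub-frames into one perceived image.
frames = []
for dy, dx in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    shifted_gain = np.roll(gain, shift=(dy, dx), axis=(0, 1))
    frames.append(target * shifted_gain)
perceived = np.mean(frames, axis=0)

print(f"static worst-case error:  {np.abs(target * gain - 1).max():.2f}")
print(f"shifted worst-case error: {np.abs(perceived - 1).max():.2f}")
```

The dead emitter goes from a fully black pixel (100% error) to four pixels that are each only modestly dim, and the random gain variation is averaged down as well, which is the fault-tolerance argument for shifting.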