The AR Show: Karl Guttag (KGOnTech) on Mapping AR Displays to Suitable Optics (Part 2)
4:49PM Mar 22, 2021
Welcome to The AR Show, where we dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall.
Today's conversation is the second part of my interview with Karl Guttag, who you may know from his technology blog, KGOnTech, at kguttag.com. Karl has 40 years of experience in graphics and image processors, digital signal processing, memory architecture, and microdisplays. He's got 150 patents to his name related to these technologies, and many billions of dollars of revenue attributed to those inventions. Karl spent nearly 20 years at Texas Instruments, and he's been the CTO at three microdisplay system startups, in two of which he was also a co-founder. These days, he's also the Chief Science Officer at Raven, a company developing a hardware and software platform to deliver mission-critical intelligence to military and first responders when they need it most. In the second part of my conversation with Karl, we talk about matching display technologies to the right combiner optics technologies. Karl also talks about which of these he thinks has the best chance of being successful. He also discusses the importance of matching what the devices can do well to the user and the use case.
Subtlety is lost in AR. That's another big thing. You know, the whole thing about watching movies or using it as a computer monitor is, in my opinion, insane. It's so far beyond being wrong, and I'd love to debate that with somebody. Because, unfortunately, if I go on a stage at an AR conference, all the other people on stage will nod their heads, and I'll be saying, that's crazy.
We get into some of the use cases Karl likes across consumer, enterprise, and military. On this last one, Karl goes deeper into the hard trade-offs in delivering something of essential value to military and other first responders in the field. As a reminder, you can find the show notes for this and other episodes at our website, thearshow.com. Let's dive back in.
You've recently been sharing this mapping between all the different sorts of optics technologies. We talked about diffractive waveguides, Lumus's reflective waveguides, and other variations of diffractive waveguides that incorporate a holographic approach — like North was using, with a mirror-like surface using a holographic reflector, which is kind of a mirror, except different sorts of angles are at play and you don't have to reflect in quite the same way with a holographic element. You talked about tooz and their sort of freeform optic. You talked about birdbath, with what Nreal and Lenovo and ODG have done in the past. And then you even mentioned Oorym with their super-slanted waveguide. Alongside all these optics technologies, there are all the display technologies you've described: LCOS and laser beam scanning, laser-illuminated LCD, laser-illuminated LCOS, DLP, inorganic microLED, micro-OLED — all this stuff. What fits? How should we think about the right sort of pairing between the different display technologies and combiner optics technologies?
Yeah, well, I think it's pretty clear that if you don't have really great étendue, you're not going to get into these pupil-expanding waveguides very well. So that kind of limits you to LCOS, DLP, and laser beam scanning with the waveguides. In the question-mark area — and it's not looking great — is micro-OLED, for example. I've actually seen a headset with a micro-OLED in it. I think it output somewhere between a half and one nit.
And for those who don't know, your cell phone is probably 500 to 700 nits. Your computer monitor is typically 150 to 200 nits. So you can kind of imagine what a half to one nit looks like — you have to be in a black room and let your eyes adapt. That's like night vision goggles. When they do night vision goggles, they're down to like a nit or less — they go below a nit because they don't want to bright-adapt the eye; they want the soldiers to be able to see at night. But anyway, you just can't get any light through it, because you start with 1,000 nits and you cut it down by a few thousand x. This was into a Lumus-style waveguide, which is much better — it's actually much more efficient than the diffractive waveguides — but it was only a 1D expander. And the 1D expander makes a big difference, because instead of losing like 5,000x, you may only be losing 500x or something. So you started with 500 or 1,000 nits, and you're still not getting a nit out the back end. So anyway, OLEDs are totally out with those. MicroLEDs I put in the question-mark category. They can kind of get to like millions of nits, so you can suffer the losses. But why start with a million nits and only get a couple hundred nits out? It's pretty inefficient. You can kind of
make the equation work if you say, well, I'm only going to turn on some of the pixels — I'm not going to turn on a bright, full-on display, I'm just going to put up some information. You know, in a typical AR headset, you really should ask yourself: if more than about five to ten percent of your pixels are on, are you really doing AR anymore? And I also like to point out — this applies to anything see-through; I worked with automotive HUD for a while — with anything see-through, all subtlety in color differences is lost. You pretty much only have color on or off; there's not much subtlety in it. The reason why is you don't know what you're looking at. If you're really doing AR and you're looking at the real world, you can have a black couch and a white wall in the background. If you look at a given color against the black couch, it's going to look nice and bright. If you look at it against the white wall, it's going to look totally washed out or dim. You don't know what you're looking at, so subtlety is generally lost. This, by the way, also works in the case of the HoloLens 2: yeah, it's got crappy image quality, and they can't control their colors very well, but who cares — you can't really control colors well anyway, because you don't know what the background is going to be. You kind of get into this mode of, the hell with image quality. And frankly, the guys who are doing the best are probably saying, the hell with image quality — I'm just going to put up a red dot.
And all it has to do is be distinguishable from a blue dot or a green dot. I don't really care whether it's a perfect shade of red, I don't care about the NTSC color space, I don't care about this stuff — I put up a dot. So subtlety is lost in AR. That's another big thing. You know, the whole thing about watching movies or using it as a computer monitor is, in my opinion, insane. It is so far beyond being wrong. And I'd love to debate that with somebody, because, unfortunately, if I go on a stage at an AR conference, all the other people on stage will nod their heads, and I'll be saying, that's crazy.
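[Editor's note] The brightness arithmetic Karl walks through above can be sketched with round numbers. The loss factors and nit levels below are the illustrative figures from the conversation, not measured data:

```python
# Rough photon-budget sketch for AR combiner optics, using the
# round-number loss factors cited in the discussion.

def nits_at_eye(display_nits: float, loss_factor: float) -> float:
    """Luminance reaching the eye after combiner losses."""
    return display_nits / loss_factor

# Micro-OLED (~1,000 nits) into a 2D pupil-expanding diffractive
# waveguide that loses a few thousand x:
print(nits_at_eye(1_000, 5_000))      # 0.2 nits -- unusable

# The same micro-OLED into a 1D expander losing only ~500x:
print(nits_at_eye(1_000, 500))        # 2.0 nits -- still far too dim

# A microLED driven to ~1,000,000 nits through the same 2D waveguide:
print(nits_at_eye(1_000_000, 5_000))  # 200.0 nits -- usable, but wasteful
```

For comparison, a phone screen is roughly 500 to 700 nits and a desktop monitor 150 to 200 nits, which is why only the last case lands in usable territory.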
So I'm sorry, I've gone far afield there again. But yeah, you end up looking at maybe a compromise area. One of the things I'm curious about — I know the Oorym people said they're looking at something with some level of pupil expansion; there might be something at a moderate level. We kind of saw North try this. It's kind of interesting to look at the spectrum of pupil expansion. Pupil expansion is basically this idea of routing the light through multiple paths. If you have no pupil expansion — let's go back to North, and also Bosch and their thing with laser scanning — their glasses are what we call Maxwellian displays, which basically means it's focus-free. But it's not a bowl of cherries either. You can make a really small display, but you have a near-zero eyebox. It's kind of like the Faustian deal you make: if you end up with this focus-free characteristic known as Maxwellian, you also end up with almost no eyebox. In other words, the eyebox of that laser scanning display is as big as your pupil. When they do this laser scanning like North did, it either hits your pupil or misses your pupil. If your pupil moves, you don't see anything. If your glasses move the tiniest bit, you don't see anything — that's why you saw those fitting sessions where they had to absolutely fit the device to your eyeball. If you wear your glasses a little differently, you don't see the display. If you move your eye, you don't see the display. Now, what they tried to do is they came up with a four-way pupil expander. They basically split the image and came up with four images. The problem with that is it kind of gave you four chances — it was like four tickets — but if your eye ended up kind of in between, you'd see double images.
So if you look at HoloLens, HoloLens went to the other extreme, where they replicate the pupil many, many times, so they make lots and lots of little exit pupils. You may see a bad image, but it's always all together — you never see a split image, you never miss it. So HoloLens, with its diffractive waveguide, has very wide, very good pupil expansion. But one of the things you'll also see — if you ever look at a picture of somebody wearing these, and I've done this — is that they light up the entire area around the eye. If you're wearing this thing at night, you will see a rectangle projected over the entire eye area. And by the way, every ounce of that light is wasted — any light that doesn't hit your pupil is wasted light. So then you go to the other extreme, and there are people talking about doing this, and it's theoretically possible: you try to track the eyeball, and you say, well, what if I direct that light — I have a really small Maxwellian light source, and I'm going to steer that light source. And as I said, you can be Maxwellian with LCOS too — that's a big thing people don't get — because if your illumination source is Maxwellian, the LCOS device is just acting like a mirror, so you're going to be Maxwellian. So you start tracking the eye, and you can start trying to steer to it. Nobody's figured out how to do that well enough yet, but that is one of the theoretical ways out: to actually know where the eye is, have a secondary system that's tracking the eye, and follow the eye around to make sure the light goes into it. But that's a lot easier said than done. You may remember Varjo — they came out, what, three or four years ago now — they had their thing where they were going to do their foveated display.
And they were talking about how they were going to track the eye and whatnot. What they ended up making was a product that didn't track the eye. Going way back earlier in our discussion, when I talked about how your eye tends to sit within about 30 degrees: what they did was put a high-resolution display in the center of vision, and a low-resolution but bigger display on the outside, with a beam combiner. And they realized 95 to 98 percent of the benefit is found that way, without the massive complication of tracking the eye and making the image follow it. When I first saw that thing, one of my first comments was, I bet they're not tracking the eye yet, because that is such a hard problem. And as a matter of fact, it's not just putting a beam combiner in there. I saw a paper, or a presentation, by Lanman at Oculus-slash-Facebook, and he talked about how you can't just track the eye — it's how you track the eye and how you correct for it. It's a nasty problem to try to track the eye; it's not a simple problem. Now, maybe someday, somehow. The thing I always like to do — and I think it's a good exercise, oftentimes — is to ask: where do they violate the physics? I mean, if you violate even one law of physics, you're toast. If you have to break two of them, you're in really bad shape. So how many laws of physics do they have to break? What you have to do is keep trying to cheat the physics — you have to keep saying, I can't break physics — when you play the futures game. Futures games that require breaking laws of physics don't usually end well. So what you have to do is start asking, what are the physics problems? That's why it's important to have skeptics. In some ways that's my position: it's important to know what you haven't solved. What are the problems?
And then you have to say: are these problems solvable with technology? Will technology be able to solve them? And don't just say Moore's Law. Moore's Law is like saying magic — why don't you just say, oh, I wish I had a magic wand, so I can play Harry Potter and wave the magic wand and make it so. So you sit there and ask, does it violate a law of physics? If it doesn't violate the laws of physics, then you can turn around and say, well, how hard is it going to be to get there? What's it going to take? What kind of factory do we need? You know, one of our big problems, by the way — and you've asked why AR and optics in general are so hard — is that the wavelengths of light are fixed. I was involved in semiconductor processes; they're using deep ultraviolet now to make semiconductors, and the reason why is that those wavelengths are shorter. Well, we're stuck using visible light: nominally, green light is about 530 nanometers and red light is in the 620-nanometer range. We have pixels on display devices that are like three to four microns, so they're only about six wavelengths of light across. Light starts behaving differently as structures get within about 10x of a wavelength of light; things like diffraction start to become really dominant factors — things we never worried about before start to become major factors. When I started doing IC design, going back to the 9918 days we talked about at the beginning of this: when we had a signal running in aluminum on the 9918 and we modeled it, we didn't factor in inductance. We had to worry a little bit about capacitance, but we treated the resistance as if it were zero —
it was approximately
zero. Now, by the time I left the semiconductor side, 20 years later, we cared about everything. We looked at any metal line: how much resistance was in it, how much inductance, and the capacitance. In the good old days, all I had to worry about was the capacitance of that line — if I sent a signal through, I'd put a big enough driver on it to overcome the capacitance. It took effectively zero time to get through; it was only a question of having a big enough driver. By the end, you were effectively building a delay line on your semiconductor chip — you cared about everything, and you had to build things closer together and all that. Well, the same thing kind of happens here. We're building these pixels approaching the wavelength of light, and they're getting so close that what were secondary factors become dominant. When you build a TV set, nobody talks about diffraction — diffraction is not an issue, because the pixels are humongous compared to the wavelengths of light. But now, with all of our structures, you're going to take that image and cram it through a tiny hole, so the image content is effectively at fractions of a wavelength of light. At that point, you can't help but do damage to it. The problem you have with the diffractive waveguide is that you're getting effectively crosstalk between color and resolution: the image content is too close to the diffraction grating size, so you're effectively getting crosstalk between color and resolution. You just can't help it. The image quality, I don't believe, is ever going to be what people are expecting from, like, their TV set. And this is the problem that I call the gap between expectation and reality.
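[Editor's note] The pixel-size-versus-wavelength point above reduces to simple arithmetic, shown here with the figures from the conversation (530 nm green light, 3 to 4 micron pixels, and Karl's rough 10x rule of thumb):

```python
# Pixel pitch vs. wavelength: why diffraction dominates in microdisplays.
# Figures from the discussion: green ~530 nm, pixels 3-4 um across.

wavelength_green_um = 0.53  # 530 nm expressed in microns

for pitch_um in (3.0, 4.0):
    ratio = pitch_um / wavelength_green_um
    print(f"{pitch_um} um pixel is ~{ratio:.1f} green wavelengths across")
    # Rule of thumb from the conversation: once structures are within
    # ~10x of the wavelength, diffraction becomes a dominant effect.
    print("  diffraction-dominated regime:", ratio < 10)
```

Both pixel sizes come out at roughly six to eight wavelengths across — well inside the regime where diffraction can no longer be ignored, unlike a TV pixel, which is hundreds of wavelengths wide.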
People are expecting something that looks like an OLED TV — that's what they come in expecting. There's also what I call the expert-versus-amateur saddle curve. An expert who measures the display in an AR headset for image quality would say it looks like crap. A consumer who looks at it would say the image looks like crap — it looks like crap compared to a TV. But the people in the middle, the AR guys, would say, you don't understand how hard it was to generate that image; it looks great considering how hard it was to generate. The problem you have is the consumer doesn't care how hard it was. The consumer doesn't care that it took amazing amounts of science. I know it took geniuses — really smart guys — and a lot of money to make an image look that bad. It doesn't matter; the consumer is going to say, I looked at my television set, I looked at this, and this looks like crap. Not to mention all the other factors when you look at the real world. I think a lot of this comes back to the user and the use model: how are people going to use the technology? What are they going to do with this AR stuff? It's not about the marvel of the science — though there's good reason for science for science's sake. And I will say there are a lot of really smart vagabonds who go from company to company. They're really smart — these guys know stuff I will never know. I'm more of a generalist, particularly these days; I'm an electrical engineer looking at optics. And I think the popularity of my blog is because I don't write in what I call optics-ese. You know, you're only about three steps away from Maxwell's equations whenever you talk to a really deep optics guy.
And I tend to try to translate it into more normal terms for somebody who might be technically minded but doesn't know much about optics.
So anyway, we're just not looking at what's real and what's realistic to do. Whenever I see somebody say, oh yeah, this is going to replace your computer or your computer monitor, they're either delusional or lying. It's just not happening. And, in fact, it's good for me — I do some consulting and stuff on the side, and the best thing a consultant can do is say, yes, pay me a lot of money and I can solve your problems, and then at the end say, well, I tried hard. I mean, you're seeing billions of dollars getting spent by smart people — and there are some really smart people. That's the argument I always get: well, these guys are really smart. Doesn't matter. Sometimes the problem is harder than smart people can solve. You know, I hear about the Manhattan Project and stuff like that. I also lived through, when I was younger, the SST. People forget — people think the Concorde was the only SST development. Boeing was going to do their own supersonic transport, and it almost drove the company to bankruptcy. If it wasn't for that freighter plane — they figured out, well, we can put a lot of passengers in it, and they called it the 747; it was developed originally as a freighter. If they hadn't had that program, Boeing goes out of business back in the 60s, because they were depending on the SST. They put their A team, all their smartest guys, on the SST, and their B team on the freighter — but the freighter would save them. To some degree you need that in AR. It's kind of like everybody is focused on the SST, and what they really have got to be thinking about is: what's the 747 equivalent for the AR industry? I think some of it is that, rather than small and sleek and beautiful and going the speed of sound like the SST, it might be more the big ugly thing that gets the job done.
Unfortunately, that does not lead to the Valhalla. And then there's this thing where people think this is going to replace the phone. I get that all the time, and it drives me crazy. Sometimes I hear somebody say, yeah, we're not going to get there next year, but five years, no problem, or ten years, no problem. And I'm like, let me go to my list of 23 issues, and let me see how you're going to address all of these. Let me show your image to a consumer and see if they say that's good enough. It's just not the phone. People want their time to be the important one, and sometimes you're living in a more interesting time than you think. Like, I thought I was too late when I got into the semiconductor industry. In some ways I thought, man, I wish I was there when they invented the microprocessor. I lived in another generation — I'm kind of a second-generation microprocessor guy; I did microprocessors after they were invented, I did early graphics accelerators and whatnot. It turned out there were a lot of interesting things to do. That's somewhat played out now; we're really bumping up against the limits. I mean, you see that even Intel can't compete with TSMC — we're seeing them fail now. I was seeing that back then. Going back to our very early discussion, TI was probably five to ten years behind Intel and Motorola in process. We bet on the wrong things — on calculators being the driver, and then they bet on DRAMs — and those were such big bad decisions that they doomed TI forever. By the time I was doing my graphics accelerators, startups were going to TSMC, which had been founded by the late 80s.
Startups were building graphics processors — ATI was the big one back then; I don't think Nvidia came along until a few years later — and ATI was going to TSMC, paying less for a better process than I had to pay in transfer price inside TI for the same process. Intel was able to stave it off for a long time. And then TI basically got out of digital, probably about ten years ago now — they finally just wrote it all off. It was amazing how far behind the industry TI's process was. What happened is that TSMC, by building for thousands of different companies, gets so much more R&D investment than even Intel could put into their processes. And finally even Intel fell under the bus.
So, two things I want to touch back on. One of them is to go back to this pairing, this matching, between the display technologies and the combiner optics technologies. You noted that for those who are infatuated with the potential of diffractive waveguides, the display technologies that make sense there are LCOS or DLP or laser beam scanning — those are the ones that can be driven bright enough, and get enough of that light into the waveguide, because of this challenge.
It almost goes back to: the étendue is good enough that you just don't get totally creamed by it. You can't just turn the power knob — if your only option is to increase the photons out, you just can't get enough; your power will go through the roof.
And that's part of the challenge with microLED as it relates to diffractive waveguides: the light is more spread out, so you have more of an étendue challenge, and they can only, theoretically, be driven into the low millions of nits. Given the inefficiencies that go along with cramming all that light into the diffractive waveguide and getting it back out, it's still not bright enough to be usable in a normal situation, especially if you want to wear it around outside. So microLED — the big hope that a lot of the industry has pinned on a future display technology — really begins to match and make sense with other types of optics that are not diffractive or reflective waveguide oriented: the freeform stuff you talked about from tooz, the birdbath stuff that Nreal and Lenovo are now using, the super-slanted stuff that maybe Oorym is doing, or some other sort of curved-mirror approach. That's what makes sense for microLED and micro-OLED?
Yeah, it has to be. You might be able to get away with moderate pupil expansion — sorry, I went far afield earlier on this — there may be some way to do a moderate pupil expander; not what North did. North broke it into four ways, and it just doesn't work. One of the problems you have with pupil expansion: think of your eye as a little black circle. Well, one of the problems is your pupil gets bigger and smaller, so if you're trying to make something fit into a pupil, you have to deal with the size changing. But you also have to deal with the movement. So if you do very, very little pupil expansion — like, say, I'm going to put four of them up there — well, now you've got to worry about the eye: do you have gaps where you don't see anything, and then pick it up again? You have to worry about those things. But there may be some way, with some level of moderate pupil expansion, to find a compromise between the huge pupil expansion you get in a diffractive waveguide and the only-four-way pupil expansion that North tried. There may be something in between. I don't know — I've not seen it yet. But there may be something in between where you say, look, the microLEDs are pretty dang efficient at generating photons, so I can get nits. Maybe they're 20x better than OLEDs. That's a pretty big gain; I can maybe turn around and spend some of that gain on pupil expansion. I can't spend 5,000x — that's the problem with the diffractive waveguides — but there may be something in between that's promising. I don't know. I'm curious to see what the Oorym people have done. Also, Lumus is doing some interesting stuff that may also come into play.
But I know it's not diffractive — you just never get home. I don't see any roadmap where a rational guy will sit down and say, here's how you're going to do this, because you're already getting down to where the electron-to-photon efficiency is pretty good out of these things; it's not horrible. Now, one of the things you'll find is that as you make LEDs smaller and smaller, they get less efficient than a normal LED. That's one of the ways LCOS gets to stay in the game: not only is the étendue smaller, but the electron-to-photon efficiency of, say, a one-millimeter LED is vastly better than that of the one-micron-size emitters you have in microLEDs. So there's some basic design and physics there — they can get a lot more photons out. They've got that and the étendue and everything; that's why they're able to make it work. But yeah, I can see something where there's some compromise optics in between. The problem we have is that if we go to totally conventional optics — totally refractive, simple reflective, no pupil expansion — things tend to get kind of big. The other side of the equation is maybe we can restrict the field of view. I think part of the problem is this insanity of going for a VR-type 110-degree field of view. People going for a 110-degree field of view are definitely servicing a very vocal but small audience — I think they think they're serving the world; I think they're thinking video games. And if you want a 110-degree field of view and you want good image quality, then why aren't you looking at VR? That's VR. I keep trying to come back to this first principle of the user: the user sitting there — what are you trying to put on those glasses?
What are you trying to do? If you're trying to watch a movie, I would suggest that AR is about the stupidest thing you could be doing, because immediately you say, well, if I want to watch a movie, I'd better put black over the glasses. Well, now you've just gone VR with it — but you've come up with the hardest, most difficult way to produce VR, because I can produce vastly better image quality at probably a tenth of the cost if that's what you wanted. But you want this thing to do everything, and I think you're trying to make it do things where it's like, okay, I can give you VR if that's what you really want, but you want 110 degrees and you want this and you want that and you want the other thing, and I think you're trying to solve an impossible problem. Look, I've lived this, by the way, in my own
background. Back in my early Syndiant days — we formed Syndiant, I was one of the founders, back in 2004 — we were originally going to go after rear-projection television. The timing looked excellent; the rear-projection market was just heading up. We produced an incredible-looking display with beautiful, phenomenal colors and 14-bit color control. The only problem was that the rear-projection market was starting to peak by the time we got done, and before we could ever get to production, it had already collapsed. The rear-projection market had about a three-year heyday where it was making money and had any volume, before the LCDs crushed us, basically. My partner at the time, Mark Howard — a great guy, still is a great guy — and I said, if we have to retreat — at that point LCDs were at 48-inch televisions — we have to retreat beyond 55 inches; if we can only show an advantage above 55 inches, we're dead. Well, the next year they were at 55 inches. So we got into pico projectors. At first — we'd been doing about 500 lumens for rear-projection television — we had all these guys telling us, yeah, you only need 10 lumens for a pico projector. By the way, lumens are total light out, versus nits, which measure light going toward your eye, in a direction. So anyway, they said 10 lumens, and we said, that seems kind of crazy — we've been doing 500; how is anybody going to do anything good with that? Well, we said, this rear-projection thing is dead; let's see if there's a way to pivot the company and do something different. So we went to Nokia, we went to all the big cell phone guys, and they all told us, yeah, the next big thing is going to be projectors — we're going to start putting projectors in phones.
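[Editor's note] Karl's lumens-versus-nits aside can be made concrete with the standard relation for an ideal matte (Lambertian, gain-1) screen: luminance in nits equals lumens divided by pi times the projected area in square meters. The screen sizes below are illustrative assumptions, not figures from the conversation:

```python
import math

# Lumens (total light out of a projector) vs. nits (luminance toward
# the eye), for an ideal Lambertian screen with gain 1:
#   nits = lumens * gain / (pi * area_m2)

def screen_nits(lumens: float, area_m2: float, gain: float = 1.0) -> float:
    """Approximate luminance of a projected image on a matte screen."""
    return lumens * gain / (math.pi * area_m2)

# A 10-lumen pico projector filling a tablet-sized (~0.03 m^2) patch:
print(round(screen_nits(10, 0.03)))   # ~106 nits -- dim-room viewing only

# The same 10 lumens spread over a 60-inch-class (~1 m^2) image:
print(round(screen_nits(10, 1.0)))    # ~3 nits -- washed out by room light
```

This is why a 10-lumen projector only ever worked on a small, white surface in a dark room — which, as Karl notes next, ended up being about the size of a tablet anyway.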
Every one of these cell phone companies had a program for it. So we went after it. We had a three-chip, rear-projection television design, but it was highly programmable; I was the CPU architect, I made the thing kind of smart. So we were able to reprogram that same chip and show that we could do field-sequential color with the same chip we'd used for a three-panel design. We proved that worked, and then started building smaller pixels, which we knew we could do, because we were semiconductor people. And the thing we found out was that every year the demand went up. They said, oh, 10 lumens is not enough, now we need 20 lumens, now we need this, now we need that. And after about five years, you started to realize... people tried; you probably saw several of these pico-projector phones on the market. They all died. The only thing that went to any volume was the standalone projectors, a little bit. What was happening is cell phones were getting bigger. This was all about the same time as the iPhone; we started doing pico projectors the same year the iPhone came out. And then of course they did tablets. So you start asking, well, how am I going to project this thing? The use model was the problem too. I finally said in the end, the use model is crap. Because with any kind of little projector in a phone, you have to ask: where are you? What are you going to project onto? This comes back to AR. In my briefcase I had a little white piece of foam board with a white piece of paper on it, in a little plastic sleeve to protect the white screen. I'd pull out this little board, and we started to realize that thing was just about the same size as a tablet. Well, why not just put a tablet in my case?
And then I had to hold the projector. Where are you going to put the projector? You've got to hold it up while you're presenting? Oh, and by the way, would you please close all the blinds in the room? The restaurant's great if you're sitting by a window. There are all those things. So that's a bias I bring into the AR world: I'm thinking a lot more now about what the world you're dealing with actually looks like. AR is very much like pico projectors: you're stuck with the light that's there, you're stuck with the scene that's in front of you. Same with a pico projector; if I put a screen up, I'm stuck with the room lights. You're not in a dark movie theater. It's a very good analogy, I think, between trying to do a front projector, where you have all these problems like needing a place for your screen, and AR, where the question is how do we block the real world out so we can see the image we're projecting. The way we do it today is with what I call dominance: basically, I have to be much brighter than the real world for the image to look solid. If I can get 10x brighter than the real world, I can get pretty solid. Bad piece of news for everybody: if you look at fairly white concrete in bright sunlight, it's about 10,000 nits. If you're looking down at a sunlit sidewalk, your background is 10,000 nits. If you think you're going to beat that by 10x and put 100,000 nits to the eye, first of all your power consumption is going to be out the ying-yang, and second of all the user is going to complain that their eyes are sore almost instantly, because you're putting 100,000 nits into the guy's eye. So you can't do it. But that's the way it works.
With AR, the way you make things stand out, the only way you can make them really solid, is to be much brighter than the ambient light. So we accept things like, oh, well, accept 2:1 contrast. So now, if I want something to be readable against that sidewalk that's 10,000 nits, I've got to be 20,000: I've got to add about another 10,000 on top; it has to be at least as bright as what's there. And that's not going to give me a great image. If you want a pretty good image, where color shows reasonably well, you need about 8:1. So now, against that white sidewalk, I'd be putting 80,000 nits into your eye. But nobody's going to do that. It's not good for the eye, and it's not good for the power consumption and everything else. So I think there are applications, there are things AR can do; HoloLens is proving that, for the right markets, there are people willing to use these things. I have to give a nod to Paul Travers at Vuzix. He's one of the first who kind of clued me in on this, maybe five or six years ago, with what they were doing. Now, they haven't been perfect; they've spent a lot of money, they've had their issues too, and Paul can get a little flowery, go off on flights of fancy occasionally. But he made a good point: you have to be somewhat realistic about what the guy is going to do with this. If you've got a UPS truck driver and you're trying to give him a little bit of information, like where to deliver a package, he's going to take those glasses off and throw them on the dashboard. He's not going to put them in a nice case. One of my big things is, I used to joke: I had a carry-on, and I could only fit a HoloLens and a Magic Leap One in it, and maybe a T-shirt and some underwear.
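The "dominance" arithmetic here is easy to reproduce. The sketch below just multiplies the ambient luminance by a target contrast ratio; the 10,000-nit sunlit sidewalk and the 2:1 and 8:1 ratios come from the discussion, but the simple multiply is a rule of thumb rather than a real radiometric model (a real combiner also attenuates the ambient light, which helps somewhat):

```python
def required_display_nits(ambient_nits: float, contrast_ratio: float) -> float:
    """Rule-of-thumb luminance the virtual image needs at the eye for AR
    content to stand out against a see-through background: roughly
    contrast_ratio times the ambient luminance behind it."""
    return ambient_nits * contrast_ratio

sidewalk = 10_000  # sunlit white concrete, in nits (figure from the discussion)
print(required_display_nits(sidewalk, 2))  # 20000 -- barely readable text
print(required_display_nits(sidewalk, 8))  # 80000 -- reasonably solid color
```

This is why outdoor-capable AR displays chase tens of thousands of nits while indoor headsets get away with a few hundred.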
That's all that fit in a rollaboard. That's not really portable. Portable is: I take my phone and put it in my pocket. Nothing's portable if you have to take it off. If you have an AR headset and you can't wear it all the time, 100% of the time like glasses... I never take my glasses off; I only take them off when I go to sleep. If it's not that, then you hit this chasm where you have to say, well, I have to put it in my pocket. If it's going to be portable, it's got to fit in my pocket. I've got a HoloLens case here; it fills up half a suitcase when I put the thing in its protective case. That's not exactly portable. So there's just a lot of reality. The thing that frustrates me is the gap between here are the real issues, here's what you've got to do, and people just saying Moore's Law, Steve Jobs, Apple. Apple doesn't get different physics than everyone else; no amount of money buys you different physics. You can do better, you can put things together better, you can invent new technologies, but you never get to change the physics.
Do you think there is a consumer use case? You've talked about enterprise; there's some utility there. Military, there's some utility there, and we'll talk about that in a second. But on the consumer side, do you see a viable purpose for normal people to have a pair of AR glasses, given all these limitations?
I think it's a really hard use case. I hate to say never. I can kind of see Pokémon Go and stuff like that, if you make it really, really cheap, like Mira did. And you know, I love Mira in some ways; those guys are fans of my blog. At a conference someplace they had T-shirts with me on them, so I have a certain affection for them. And I love the fact that for their very first prototype they literally took a plastic fishbowl, cut it up, sent it off to a shop to have a translucent mirror coating put on the inside, and glued the pieces together. They showed it to me; their first prototype is fascinating. That's classic: they spent next to nothing, basically college students going out and doing a startup. And now they're in an attraction at Universal Studios in Japan with their video games and such. So I could see something like that for Pokémon Go. But you kind of wonder a little bit, like, where is Google Cardboard anymore? It was really okay; everyone tries it out once. Is it something that only works the way a lot of things work in theme parks? I'm an old-time Disney fan; if you're looking at my video background, you probably see a bunch of Disney stuff. But I'm also an engineer; I kind of go back and forth between the fantasy and the engineering reality. You've got to realize that a lot of things that work in a theme park, on a five- or ten-minute ride, are not the product you would have for your everyday life. It's like the thing about user interfaces. Like I said, I was in on the very early end of video games. A software guy I worked with used to say, gameplay is king. Some of the things with the simplest graphics end up being really, really good.
It's about the gameplay, it's about the interaction. When we look at AR, it's got some really serious issues. You look at some of these things, and some of them make really good demos that are not good products. They're very impressive when you first see them, but then you have to ask yourself the next question: would I do this all day? Would I do this for hours on end? And very, very few products make it there. The phone is more the exception than the rule, the way it expanded and evolved and started to consume every other product. It consumed your flashlight; I always thought one of the amazing apps was when people started using the little camera LED as a flashlight. And now it's killed cameras; something like a tenth of the cameras are sold compared to what they used to be. There are still big cameras, because a big camera can still blow away a cell phone on image quality most of the time. But if you just want a snapshot, the cell phone will do better than an amateur holding up a good camera. So we're back to gameplay. One of the big problems, say we do an AR game: almost every AR game I've seen revolves around one of two themes. The better-done ones are basically 2D games, or they're things stuck on the wall, aliens stuck on the wall coming at you. By the way, here's the number one gameplay pattern: almost all theme park interactive rides turn into shooter games, they almost all turn into guns shooting. You see the same thing in AR; look at most of the games you end up with. They may do variations where you grab things or do stuff, but most of them end up as shooter games.
The reason why is zero training. Almost anything in a theme park needs zero training, because you're talking about all age groups. That's why a lot of these interactive queues and things don't work too well: you've got too broad an audience, so you have to dumb the thing down, and the only thing that really worked is the shooter. Look at Disney, look at how many of their quote-unquote interactive things all turn into shooters, or maybe a colorful version, like Midway Mania where you pull on a plunger, but it's still a shooter game. And the reason why is people understand the idea of point and shoot. Get much beyond that, and now you need training, and you don't have time in a five-minute theme park ride to do training; you have to keep it very basic. Same thing with demos. But anyway, Magic Leap and HoloLens both had this thing with aliens coming out of the wall at you, right? Really impressive. It's kind of neat: you look at what was a wall, you see a hole open up in it and an alien coming out of it, and you shoot at it.
We understand what to do. Here's your problem, though. If your room is too big, they tell you your room's too big for this to work; or your room's too small. What if you have furniture? What if you have windows? What if you have mirrors? What if the light's too bright? All of a sudden I need a special dedicated game room that's just the right size, with walls that are just the right color, so I can play my game. The problem is the room is too diverse. So then you've got to design the game around that. Imagine playing basketball where there was no standard court: some courts are L-shaped, some are circles, some are long and thin, some are wide. Could you have a basketball game? You can't play the game. Oh, and we'll make the basket all kinds of different heights on different days. No: we say what the shape is, we say what the basket height is, we standardize all that. The problem in many of these AR games is you're saying, I'm going to look out at your room, and your room's a mess, plus you can't walk around in it. One guy's room has clear floors with nothing in it, so he can play the game; but the guy who's got a couch in the middle, or a chair he can trip over? No. And that's kind of the problem: I don't see these people thinking about the use model. How are you really going to do this thing? How are you going to fan it out, and how broad a base can you reach? By the way, here's something else people never seem to talk about that bothered me from day one. To play futurist, sometimes you have to suspend disbelief, live out there. But, and this is the part people don't do, you have to come back and cover that bet.
You have to say, okay, I'm going to imagine... I always say you get a wish when you try to predict the future. It's a good idea to take a wish, but wish for something that could be true, that may not be true yet. You can't violate the laws of physics; wishes that violate physics never happen. But you might wish that microLEDs will get really cheap, that some really good company will come up with a microLED that's full color, maybe multi-layer, which has some advantages because the pixels all come out of the same point. There's a company in Carlsbad, Ostendo, I think, working on some interesting stuff in that area. So imagine you can make it at cost. I can believe you'll solve the cost problem; I've seen that happen a lot of times. I can believe the manufacturing problems will get solved. So you go through a bunch of stuff, and I'm just going to imagine you figured out how to make it for ten bucks or whatever. Okay, I grant you that wish; can you solve the rest of the problem? Take the thing that's bothering you and get rid of it. I'm saying: give me the infinitely best possible theoretical display, and you still can't solve the problem, because now you've got to put optics with it. You still have to deal with how big the display is. You can't make the pixels smaller than a wavelength of light; in fact, you probably can't make the pixels much smaller than three microns before the wavelength of light becomes a serious, serious problem. So maybe you make three-micron pixels, stacked. And one thing I like physics-wise when we talk about Ostendo: they stack the pixels, which is a real physics advantage, because it's better to have all the light coming from the same spot.
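The three-micron figure can be put in rough numbers. A display's pixel grid acts something like a diffraction grating, with first-order light spilling out at sin&nbsp;θ = λ/p for pixel pitch p. This is a back-of-the-envelope sketch, not from the interview; the specific wavelengths and pitches below are illustrative assumptions:

```python
import math

def diffraction_angle_deg(wavelength_um: float, pixel_pitch_um: float) -> float:
    """First-order diffraction angle for a pixel grid acting like a grating:
    sin(theta) = wavelength / pitch. As the pitch approaches the wavelength,
    light sprays out at ever larger angles and the projection optics can no
    longer collect it efficiently."""
    return math.degrees(math.asin(wavelength_um / pixel_pitch_um))

# Assumed green light (0.52 um) against two hypothetical pixel pitches:
print(round(diffraction_angle_deg(0.52, 3.0), 1))  # 3 um pitch -> about 10 degrees
print(round(diffraction_angle_deg(0.52, 1.0), 1))  # 1 um pitch -> about 31 degrees
```

At a pitch equal to the wavelength, the first order would leave at 90 degrees, which is one way to see why wavelength-scale pixels stop behaving like a well-mannered image source.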
At some point we're running up against how small you can make the pixel. By the way, that's another huge advantage of LCoS over some other technologies. LCoS is field-sequential color; we talked about color breakup, and that's a bad thing. But the good thing is the light all comes from one mirror covering all three colors, so the pixel size is smaller. It's not red, green, and blue emitters spread out over a larger area, which has disadvantages, particularly when we start trying to get things so small that they're approaching the wavelength of light. Stacking actually starts to make sense when you realize you're running up against the physics of light. So anyway, you take your wish, you run with it, but you've got to cover that bet: you first have to say, okay, here's how they're going to solve this, and start figuring that out. And I believe they can solve it. I believe someday we will have a microLED that is full color, all the colors coming out of the same point, kind of like what Ostendo is doing, and I honestly believe that will one day be cost-effective. But that doesn't make it a consumer product. Go back to my list of 20 problems: you've solved one, now you have 19 more to go. What are you going to do for optics? Optics is in some ways the more intractable problem. You can't change the physics of light, you don't get to have miniature black holes, and you've still got to make everything else work. And the biggest problem, I think one of the most fundamental problems for AR, is the thing I talk about: hard-edge occlusion. Blocking light is actually a deceptively hard problem.
People think, oh, I'll just put an LCD in front of your face. But it's out of focus; it's not going to block the light the way you think it is. When you turn off one pixel, that pixel is going to blur out over several hundred pixels. It's no longer blocking that one pixel; it's making a very small partial blockage of a very large area relative to the display. That's the problem. Now, somebody used to say that if I wrote software, I'd have a whole bunch of jumps without returns. So let me jump to another topic. Okay, let's look at a computer monitor.
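The occlusion-LCD point above can be sketched with simple geometry. With the eye focused at distance, an opaque dot on a lens-distance LCD casts a penumbra whose angular size is roughly pupil diameter divided by the LCD's distance from the eye. The pupil size and LCD distance below are my illustrative assumptions, not numbers from the interview:

```python
import math

# Assumed geometry: a 4 mm pupil and an occluder LCD sitting
# about 20 mm from the eye (typical eyeglass distance).
pupil_mm = 4.0
lcd_distance_mm = 20.0

# Angular blur of a single opaque pixel's shadow, in radians,
# approximated as pupil diameter over occluder distance.
blur_rad = pupil_mm / lcd_distance_mm

# Convert to arcminutes; the eye resolves roughly 1 arcminute,
# so this is how many "eye pixels" wide the smeared shadow is.
blur_arcmin = math.degrees(blur_rad) * 60
print(round(blur_arcmin))  # about 688 arcminutes across
```

One blocked pixel dims a region hundreds of eye-resolvable pixels wide instead of punching a sharp hole, which matches the "blurs out over several hundred pixels" description.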
Look at the use model; I'm very user-centric in my thought processes. How is the user actually going to use this thing? So, you're looking at a computer monitor. I bought a big ultrawide one, something like 32 inches; a very wide, very large monitor. I like it. But I know that physically you only look at about 30 degrees at a time; that's all you can see with any resolution. And it's not that your eye can't move further; your eye wobbles around, but it's not going to wobble outside of that. What you actually do is turn your head. If you were to put a camera on yourself, and anyone can do this at home: put your phone out there, record yourself using your computer monitor, and watch what you do. I put up a big monitor so I can have multiple windows, I can see stuff. But you do not sit there and read across it with just your eyes; you turn your head, and it's very, very quick to turn your head. Now, if I put the monitor on your head, I put these glasses on your head, you don't need good visual acuity through that whole display. Plus, your inborn instinct is to turn your head, not to move your eyes. You're now telling me that once I put the glasses on, I have to be retrained, because I don't know how to use my glasses; I now need to learn to move my eyes instead. But your eye muscles don't like to move that far, and if you start trying to move your eyes to make things work, your eyes are going to get sore really, really fast. The other thing everyone in this field knows: your eyes rotate down better than they rotate up. So you don't ever want to put anything above the eyes.
As a matter of fact, in the real world, most people know this ergonomically: the top of your monitor should be pretty much level with your eyes. Your monitor should always be at eye level or below. Because if you don't do that, you're going to end up tilting your head backwards, since your eyes do not like to rotate up; you'll tilt your head back and get a sore neck. That's just the way your physiology works. So there's all that. Then of course we throw in the fact that people are so different. Go around the world: nose shapes are different, the distance from people's eyes to their eye sockets is not all the same. But more fundamentally, a computer monitor, the way you use it, doesn't map onto glasses very well. It just doesn't work. You can try to simulate it. You can say, well, put head tracking on, and, like I say, I know we can tack virtual screens up everywhere, right? Okay, go try to use them. Go put a movie on one, go watch a movie. You'd only accept it versus nothing. You would buy the cheapest tablet you could find rather than watch a movie on one of those things. It's this thing I call the airplane test. I wrote about it probably ten years ago on my blog. I noticed it when I got on an airplane and passed kiosks selling what you'd call VR today, basically television glasses. I went by a Best Buy kiosk in the airport, so you could literally buy these video glasses in the airport, get on the plane, and watch a movie. And all the arguments you hear today... oh, privacy? Well, you're not going to get privacy, particularly if you've got something like HoloLens glasses that project the image forward.
You definitely don't want to be watching dirty movies on an airplane wearing typical diffractive waveguides, because everyone's going to know what you're looking at. But let's say you put these on; you're on an airplane. Think about it: if you were to imagine the most ideal application, this is it. People getting on airplanes are upwardly mobile, they probably have some money, and they've got time to kill. If you fly a lot, like I used to before COVID, you start to realize you buy stuff: people buy expensive noise-cancelling headsets. Those buyers are frequent fliers, because they realized, you know what, $300 to kill the noise is worth it. I would never do that for one or two flights, but if I'm flying a lot? A lot of times you just put them on to kill the noise; it's amazing on an airplane, and it makes it easier to sleep if you're taking an overseas flight. You pay that price in an instant. So anyway, we get on the plane. Here is an upwardly mobile base who will pay for technology; they will pay good money for it. And this happened: about a week or so after the iPad came out, I got on an airplane. To that point I had never seen anyone wearing video glasses on an airplane, never seen one, and within a week of the iPad coming out, the plane was full of people with iPads, watching movies. They weren't doing work, and these people all had laptops too. But a laptop is really hard to work on in a plane, right?
With the old ones that didn't flip into tablets, you really didn't have enough room to type and see the screen. I don't know why nobody built a laptop where the screen could move; I think they probably did and it was mechanically unstable. It's hard to type and watch the thing unless you're in business or first class. But anyway, my point is, I call that the airplane test, because there you have it: if television in your glasses would work, they'd had that solution for ten, fifteen years. Sony: go back and look. Sony has been making video glasses since the 1990s, and none of them sold worth a flip. As a matter of fact, here's something people don't realize or think about: the number one use for microdisplays, the things we use in VR and AR glasses, is camera viewfinders. Sony just dominates that market. That's how they're able to afford all this: they build those displays for viewfinders. All the video cameras used to have displays; they went from CRTs to LCDs, some of them used LCoS, but more and more today almost everything uses OLEDs, and almost always they're Sony's. So that's where the OLED microdisplay market is; that's where all the unit sales are. Add up the whole AR industry, and it doesn't buy enough units to justify the investment.
So there's a huge challenge that AR has in its advertised potential to replace the displays we use in other parts of our lives. You talked about this airplane test and whether or not these relatively wealthy people, who are in a constrained environment in which they have a lot of free time, are willing to use a head-worn device, a VR device, to watch a movie. And your example was that they would not, even in the earliest days of the iPad; the tablet was a better alternative. So to suggest that a see-through device, an AR device, is going to replace these screens is just not practical. But one of the areas where you are excited about AR is the potential within the military. You are the Chief Science Officer at a company focused on military use cases named Raven. Can you describe why that problem is more interesting and more appropriate for AR, and a bit about the specific problem that you and the team at Raven are solving?
Yeah. When I started with this, like I said, I was dealing with a former Navy SEAL. Jake came to me and he had this idea. He's a former Navy SEAL who had gone back to college and studied augmented reality and such, mostly from the software side, and he was looking for some help on the display. And when you start to look at the application space, a lot of these military and first-responder uses make a lot of sense for AR. You need hands-free; you don't want to be holding a tablet, and you generally don't have a lot of input. We are trying to take a very user-centric approach, and he had a very realistic set of notions of what he needed. We weren't trying to play a video game; we're not trying to be all-encompassing. The main thing Raven is focused on is first, do no harm. That's almost our motto as a company. My CEO is a former Navy SEAL, we've hired other Navy SEALs, we've hired a Green Beret, so we have a lot of military expertise within the company. These guys know what it's like on the front lines; they've been deployed in war situations, they know what they're going to face. I always say the CEO of this company would have been a user, so he's got the user experience knocked down. And he's seen a lot of technology deployed to the military where it just sits in boxes. Our guys, the Navy SEALs, look at IVAS, the program using HoloLens, and they just laugh at it. They say, can you imagine going out in the desert wearing that? I'm not going to wear that. These guys know their life is on the line when they put this stuff on; they are not going to wear a piece of equipment that works against them. There's tons of stuff that politicians and generals think are good ideas.
It gets shipped out and just gets left in boxes in the war zone, because nobody's going to put something on if it has a downside. You know, we talk about trade-offs here, pros and cons; their con is they die. Their con is they put this on, it's blocking their vision, they can't see a sniper, they can't see where things are coming from, and they die. So they have a different attitude toward this. Our philosophy as a company is do no harm. We're definitely going for lighter weight; we know we have a situation where a guy's life is on the line, where he's got to keep his hands free, and we can't get in his way. So we're going for something, and I don't want to get into all the details, that's much lighter weight and much more user-friendly, not nearly going for the level of capability of, say, IVAS or that kind of product. Microsoft was very big with HoloLens 1 being used throughout the military for training. If you're talking training, that's a different game. You don't care if it's a little bulky. If you get hit, okay, you died in the war game, no big deal; you reset and come back. It's a very different thing when the consequences are real and you're blocking off someone's vision. They think of their vision as their number one asset; losing their eyesight is an incredible issue. So we're looking for something much lighter weight, and within the realm of the possible. One of the first things I asked Jake was, well, how big a field of view do you need? If he had come back and said, I need beautiful color, I want to be able to watch movies, and I need a 110-degree field of view, I would never have worked with Jake, because I'd have told him he was asking for things that are impossible.
So we're definitely not going for the big field of view; we're going for a smaller field of view, and we're looking at whether or not we even need color. That's a trade-off against light output, for example. If you look at military displays, they've been using displays that show only green for 50 years or more. Night vision goggles were green forever. Now the big advancement in night vision goggles is that they produce white.
Okay, they're going from green to white. That's the improvement: from green and black to black and white. The military guys are trained, see. It's very different from the theme park people we talked about; those people have five minutes, they have to be trained and through the whole experience in five minutes. With the military guy, we can spend months of time training him. So you have a different user. It comes back to, like I say, I'm very big on user-centric: you've got to put yourself in the place of the user. Now, the great thing in our company is I've got users. I'm not dealing with guys who went to college, got their PhD, and said, oh, I think I know what military guys might need someday; those guys never had their life on the line. I'm dealing with guys who have had their life on the line, so they're constraining this problem from the perspective of: would I wear this thing? Would I have worn this in battle? So we're thinking with a much different attitude. We would trade color for brightness, for example: hey, if you've got a great color display and it's not bright enough to see in sunlight, you're out in the desert and you can't see it, you're not bringing it with you. If it only puts out 500 nits, which is what HoloLens puts out, you wouldn't wear it out in sunlight, so it's worthless to you. And this is the thing: it can't fail. You can't have something that's what I'll call undependable, and undependable is from the user's perspective. In other words, if it won't work in daylight, then it's undependable. You can't rely on it, so you're not taking it with you. It's that kind of perspective, looking at it from the user. So we're willing to sacrifice color if that's necessary. Everybody would rather have color.
There's a very famous quote from IBM back around 1980 that said color is the least necessary, most desired feature in a computer monitor. People may not actually need color — they'd be better off with, say, higher-resolution black and white than color, and you could show that in a study. IBM did the study and said yes: if you're building a computer monitor, the user will be more productive with higher-resolution black and white than with color, because back then that was the trade — you could have high-resolution black and white or low-resolution color. On a pure technical basis, they'd be better off with black and white. But if you ask the consumer, they'll always say color, no matter what the resolution is; they won't accept black and white. That's an issue on the consumer side. But this is different. I've got a trained user. I've got a guy who's not trying to watch a movie on the screen; he's not looking for pretty stuff. Now, he'd rather have color — like, red means bad and green means good; we can use visual cues to help people take things in faster. There are advantages even on the battlefield to having color, but it doesn't outweigh the fact that if you can't see it, it doesn't matter. So we're definitely thinking about the environment these guys are in. We're not going to add five pounds to the guy's kit or whatever; we're not going to put the guy in something where he's going to sweat profusely. I mean, we worry a lot about little things like perspiration — will it fog up? You don't realize it, but these guys are, I mean, these are superheroes, the guys who go out in the military — the amount of weight they carry and how much they do.
Well, the one thing they can't control is that when they're running and carrying that weight, they're going to sweat. They have to get rid of that energy, it's going to come out as perspiration, and they can fog up glasses like nobody's business. So you've got to think about that stuff. We're definitely thinking about the holistic problem — what's the real problem the soldier has — and trying to do no harm: trying to build something that only improves, and cannot in any way make life worse. And if you look at HoloLens, for example — as I said, there are some good things about it in an industrial environment or whatnot — it's insane to think that anybody's going to wear that out in the field. And it's not one or two turns of the crank; it's not just "oh, a little Moore's Law will fix it." It's got to come down humongously. And I think part of the problem is what I sometimes call "hip and edgy." It's like you have some general someplace who wants to be hip and edgy, so he thinks, "Oh, I'm going to be modern and get with it and be hip and edgy." And when I say that, "hip and edgy" is kind of a pejorative — it's like you're trying to pretend you're with it. And I think there's some of that in this program. Yes, it sounds good, but — I mean, I think we heard a video where a guy was saying, "Yeah, you've got to learn to hold your weapon differently."
Our guys were like, "This is the stupidest thing — you can't tell a guy he's got to learn to hold his weapon differently. I've got to hold my weapon down here? Like, yeah, we're just going to spray ammunition everywhere."
It's just — I think I've heard it said that some of the younger guys, the guys who have never been to war, like it better than the guys who've been to war. It's like they don't really get that it's not a video game anymore out there. And so that's kind of it. Basically, we're starting from the user experience. We're willing to sacrifice on field of view and a lot of other things. And oh, that's the other thing: ruggedness. These things have got to be ruggedized. I mean, we had a much better technology we were starting with — we liked that display and everything about it a lot more: much better image quality, brightness, it met all the display characteristics we wanted and more. It was really good. But in the end, we couldn't go with it, because we couldn't figure out how it could be ruggedized; we couldn't figure out how it would survive being out in the field. So you're willing to sacrifice for ruggedness, and for not blocking the guy's vision. Look, it's insane to me to think that a military guy is going to be looking through a diffraction grating. I don't believe a guy in the field is going to do that — not because I'm anti-anything; it's just on a practical level. But the main thing is, as a company we are led by people, and have a bunch of people in the company, who actually know what the reality of this is. So we kind of have the user base built into the company.
Yeah, that's a powerful position from which to create something that's going to have utility and actually work in the market — and in the field, saving people's lives. "Do no harm" is a wonderful mantra. Let's wrap with one lightning-round question. Normally I like to ask people, "What's a commonly held belief about spatial computing that you disagree with?" But you disagree with so much, I'm going to ask you the inverse: what's a commonly held belief about spatial computing that you do agree with?
I think there's a sector of the space where, if you have these characteristics and you can check these things off, there's something to be said for spatial computing. You need hands-free. It can't need a lot of input — so you kind of start finding these characteristics. You can't need to be typing; even HoloLens 2 is vastly better than HoloLens 1 for entering text, but you're still at like one-tenth of touch-typing speed. You can do it, but it's still slow — input is still a big problem. So: you don't need a lot of input, you need hands-free, you're willing to be trained a bit — it requires some significant training, it's not just pick-it-up-and-use-it. You're not watching a movie; you don't need great image quality. And there's usually got to be a high value to the experience. Saving a life — that's high value. Medical, I think, makes a ton of sense: being able to see things you couldn't otherwise see — like, I want to see through to your bones as if I've got an X-ray camera. That's the thing in medicine; I think there are places in surgery where it makes a ton of sense to have AR. So there are a lot of applications. If it's a guy on a factory floor and he's building something, and it's helpful to pull up some information, or to point and say, "this is the part you should work on" — there's a lot of value there. So there are areas within AR — that's why I'm working in it. It turns out the military and first responders are one of those areas where you have all those characteristics. Imagine the police; imagine firefighters — they've got to keep their hands free. So in those cases where you meet those criteria, the technology is there to do some good, where you're going to add more than you subtract.
I think where the disagreement comes is when you think this is replacing your iPhone anytime in the next 10 years. It's just not. You're asking it to do things it can't do.
Yeah, Karl, where can people go to find your blog?
Pretty simple — well, I made a really bad decision in haste. The blog is called KGOnTech — KG, as in Karl Guttag, on Technology. But the website is www.kguttag.com — Karl with a K, Guttag, G-U-T-T-A-G, dot com. I think if you search for almost anything in AR, you'll probably find me; I come up pretty highly. On a good month we get about 20,000 people, usually pretty highly technical people, but I've been a little bit soft lately. I want to get going and publish a bit more. But yeah, it's easy to find.
Great. I look forward to that. Karl, thank you very much for this conversation.
Before you go, I'm going to tell you about the next episode. The next full interview is with Kirin Sinha, co-founder and CEO of Illumix, a company building an AR-first mobile gaming platform. They've already found a lot of success with their first title, Five Nights at Freddy's AR: Special Delivery. But next week, I'll share a short bonus episode with Karl. In it, you'll hear Karl's take on Apple, a topic we started discussing after the interview. Please follow or subscribe to the podcast so you don't miss this or other great episodes. Until next time.