The AR Show: David Fattal (Leia) on the Magic of Lightfields
5:02PM Jun 30, 2023
Speakers:
Jason McDowall
David Fattal
Keywords:
3d
display
started
company
technology
electrons
hp
photons
devices
content
camera
experience
tablet
quantum
images
stanford
create
quantum computing
people
pictures
Welcome to the AR Show, where I dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is with David Fattal. David is the founder and CTO of Leia, a company dedicated to revolutionising visual experiences through light field display technologies. Light field displays are able to create 3D images without glasses; you might remember Leia for its work on the Lume Pad tablets, and a few years ago on the Hydrogen One smartphone from RED. David is a quantum physicist who got his PhD from Stanford with a focus on quantum computing and quantum communications. His research experience led him to HP Labs, which was at the forefront of quantum information processing. He co-authored numerous scientific papers at both institutions. It was a moment of serendipity while at HP Labs that led to the creation of Leia. David spun the company out of HP in 2014 and has been the chief innovator there ever since. In this conversation, David makes the case for why every screen should be capable of 3D.
Yeah, so it's funny, in my world we ask the opposite question. We say, why do people tolerate 2D when the world is 3D? Any visual medium should be 3D, the same way that visual media are color today, right? They were black and white before, but as soon as color became available, every single display on the planet went to color. And, you know, the same thing is going to happen with 3D. It's just a matter of cost, and it's a matter of, you know, readiness of the content ecosystem as well. I guess we'll talk about that. More seriously, right, I mean, we've all been in 3D environments. We perceive things in 3D, we feel in 3D, we even dream in 3D. And so, you know, whether it's for entertainment, games, movies, whether it's for productivity or remote interaction, in terms of chat and so on, you know, it's just a better medium, right?
David goes on to discuss the basics of quantum computing, and how his pursuits there led to a new type of display technology. He describes what light fields are, why consumers should care, and early use cases, as well as the challenges of bringing light fields to mobile devices and lessons learned from working with RED cameras and building a mobile phone. As a reminder, you can find the show notes for this and other episodes at our website, thearshow.com. And please support the podcast at patreon.com/theARshow. Let's dive in.
David, we hear about, or we experience, these notional fire drills at work all the time, where we have to rush to try to produce some sort of insight or response to an issue which may or may not actually be important at the moment. But you've literally lived a fire drill that has led to some great insight. Can you share that story with us?
Yeah, that's right, I love to tell the foundational story of the company, and it was a literal, actual fire drill. I think we'll go into this later, but I was working at the time at HP Labs on various topics of manipulating light at the nanoscale. And precisely here, we were working on optical interconnects, which is trying to use light instead of electricity in computer chips. And we were working in the lab one day, after many, many weeks of effort, and we had that hero device wafer, you know how it is, many, many nights working on this. And then, fire drill, right? So the siren goes off, and we're supposed to drop everything we have, you know, and run to the parking lot. But I'd spent so much damn time, you know, trying to perfect that wafer that I actually grabbed it, and I went into the parking lot with it. And it was a beautiful California day, you know, around lunchtime. And it turns out, I was just kind of talking, and people were gathering around me as I was holding the wafer. And it turns out that this wafer had nanostructures that were diffracting the light of the sun, and they were creating very directional light patterns that were really, really cool to see. And so that attracted people, and then we started to talk and say, hey, actually, you know, these structures would be great building blocks for a display. So we started just completely serendipitously like this, and then it actually led to the technology that is today powering holographic tablets. So it was pretty cool, and I like to tell that story for sure. It's so amazing.
That's an unexpected outcome, I guess an unexpected benefit, from the technology that you were focused on creating. Let's wind back the clock a little bit, and we can talk about what led you ultimately to HP Labs originally, and kind of what led you into that whole area of interest for you, studying the sort of optical interconnects at the quantum level, and kind of imprint lithography, whatever it is. But for you, what was it really that led you to the study of physics? Where did that passion for physics come from?
As far as I can remember, even as a kid, I've had this fascination for physics. You know, one of the stories is that at Christmas I wouldn't ask for toys or Legos and so on, I would ask for books, right? And I remember when I was very young, my dad bought me a quantum mechanics popularization book for kids. And I just loved it. So I don't remember what triggered it; as far as I can remember, I've always found it fascinating. I think what I like about it is probably the fact that it's what gets the closest to magic, right, that you can imagine in this world. So when you master physics, and you understand physics, you can make seemingly magical things happen, at least to the rest of the people who don't understand physics. And I think that's probably why I've always been fascinated by it.
It went beyond the physics, really, to the fundamental elements of physics. You've noted that your interest was in quantum mechanics, even as a kid. Why at that level?
I was kind of always curious, you know, how things work. And you know, it's like an onion, right? You can always have, you know, a first explanation, but then you're like, okay, but why is that? And you get deeper and deeper, right? And then, what is matter made up of? And, you know, why, when I press my finger onto my hand, is my finger not going through, even though atoms are made mostly of vacuum? These types of questions, you know. And the more you dig into it, the deeper you go, and that leads you invariably to quantum mechanics, you know, and in this case the Pauli exclusion principle: that, you know, electrons cannot be found in the same place, and that's what creates repulsion. That's why your finger is not going through your hand. And so, you know, very quickly you're led to this question. And, you know, it's just, I guess, some thirst for knowledge, right? And the more you drink, the thirstier you get, because answering one question opens two more questions. And then you're caught in that dynamic.
Out of college, when you went to Stanford to pursue your PhD, what was the specific problem you were most interested in investigating at that time?
When I started at Stanford, there was a change of plan. When I was an undergrad, back in France, I wanted to be a particle physicist. Actually, at the time, string theory was starting to be a thing. It was beautiful mathematics that, maybe up to today, and I haven't followed lately, I don't think has been confirmed to have any bearing on reality. But the math is so pretty and so interesting, and it can explain a lot of physics in a very nice unified way, you know, this is what I wanted to do. But then, and that's to tell you the level of, I would say, practicality that I was at at the time, as I was really deep in theory, I did a practical internship in a quantum computation lab at Stanford. So for me, that was as applied as it was getting at the time. After three months, I was working with Professor Yamamoto, who is one of the leading experts in the, you know, kind of quantum manipulation of light. And I was working on a really cool experiment where we made a single photon gun. So it was a device that essentially, every time you press a button, would emit a single photon, and you could press the gun twice, and it would emit two photons in a sequence. And then we had a really cool experiment: for the first time, we demonstrated that when you get two of these photons in a row, and you make them collide on a piece of mirror, they would interfere. That's where the beauty of quantum mechanics happens: somehow, when two identical photons come together, they always come out on the same side of the mirror. They either both bounce or both go through, but they never cross. And that's kind of one of the magics of quantum statistics. And so we demonstrated that in the lab for the first time and got a publication in Nature. And that was just after a few months; I was just a very, very young graduate student. So I said, okay, you know, let's explore deeper.
So, you know, I stayed in that group, and then ended up working on quantum computing with light. So that's what led me to do it.
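The two-photon interference David describes is the Hong-Ou-Mandel effect, and the cancellation can be sketched numerically. This is an illustrative model, not the actual experiment's setup: it just tracks creation-operator amplitudes for two input ports a, b through an ideal 50/50 beam splitter into output ports c, d.

```python
import math
from itertools import product

# Ideal 50/50 beam splitter transformation for creation operators:
#   a -> (c + d)/sqrt(2),   b -> (c - d)/sqrt(2)
inv_sqrt2 = 1 / math.sqrt(2)
a_out = {"c": inv_sqrt2, "d": inv_sqrt2}    # photon entering port a
b_out = {"c": inv_sqrt2, "d": -inv_sqrt2}   # photon entering port b

# Accumulate the two-photon output amplitudes
amps = {}
for (m1, amp1), (m2, amp2) in product(a_out.items(), b_out.items()):
    key = tuple(sorted((m1, m2)))  # indistinguishable photons: mode order is irrelevant
    amps[key] = amps.get(key, 0.0) + amp1 * amp2

# The amplitude for one photon in each output port cancels exactly:
print(amps[("c", "d")])  # 0.0 -> identical photons never "cross"; they bunch
print(amps[("c", "c")], amps[("d", "d")])  # the nonzero "bunching" amplitudes
```

The zero coincidence amplitude is exactly the "they either both bounce or both go through, but they never cross" behavior from the interview; for distinguishable photons the two paths would not cancel.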
Yeah. And so quantum computing with light; the alternative is quantum computing with electrons, as we have it today. With electrons, correct, right? Yeah. So here now you are, increasingly experienced with and obsessed with light at the quantum level. What was attractive about HP Labs at that time in your career?
Yeah. So firstly, at the end of the PhD, I became more and more applied. After having these ideas and concepts, you know, how to make, let's say, quantum processing and information processing with photons, I really started to think, hey, how do you build things? How do you actually make a system where you can produce these photons, make them interfere, and so on? So I started to really study nanofabrication, and Stanford at the time had, you know, a beautiful nano center with brand new machines, and I started to really enjoy it. So I stayed a little bit longer at Stanford to learn these tools and to be able to make all these structures. And, you know, again, that was another area of fascination, because not only can you predict magical things, but then you can also make them happen in the lab. So that was paradise at the time. The thing about Stanford is you had hundreds of students, you know, fighting for each tool. So the way it went is, if you wanted access to one of these nanofabrication tools, which was, you know, electron beam lithography to be exact, you had to reserve up to one week in advance, but it was always booked. So if I wanted to get access to the tool, I had to wake up at three in the morning to book for three in the morning the next week, because if I didn't do that, it would always be taken. So it was a total rat race. HP had similar and better tools, where we were basically two people in the lab, okay? So it was this unique opportunity. And I joined HP to participate in the creation of the first quantum computing lab at HP; that's what they hired me for originally. And we had access to all this beautiful equipment where, unlike Stanford, I could actually spend the whole night just, you know, writing and fabricating, and the productivity was through the roof.
And so that was very, very attractive. HP at the time was really a primary center for what we call nanofabrication. And I really thoroughly enjoyed the years there, developing all kinds of tech.
So maybe you can put that professorial hat on for a moment and just kind of walk us through this notion of using photons versus electrons for quantum information processing. What's the trade-off, the benefit of using photons? How are they different or unique from electrons in that sort of application?
Yeah, absolutely. So it's very interesting, right? And again, it comes back to the quantum statistics and the nature of particles. So photons are what you call bosons, and essentially they behave like sheep: they like to bundle with each other, they like to be in the same place. And essentially, that's the working principle of the laser, you know: in a laser, you essentially emit one photon, that then stimulates the emission of more, and then you end up with a bigger beam of light. Electrons are different; they're called fermions, and they don't like to be with each other, they repel each other. So fundamentally, they behave very differently. In practice, what that means is that light is going to be very good at carrying quantum information. And quantum information, in quantum mechanics, you describe physical states of systems by linear superposition: you know, you can be blue, you can be red, but you can also be a superposition of blue and red. And that's, you know, without doing a whole lecture on this, what gives you a kind of continuous parameter space, which gives you much more freedom to compute, and sometimes much faster than you can do classically, which is the tenet of quantum computing. And photons are very good in that, once you give them that quantum information, they will keep it. You know, they can propagate for kilometres and they will keep that information; they're very good at storing and transmitting information. But they're very bad at interacting, okay? In theory, you know, if you take two light beams, they're going to cross and they're not going to interact very much, unless you pay great care.
And that was the big deal about that experiment that we did, where if you really tune things correctly and you can align them, right, then they will, you know, get to interfere in a specific way; but it's quite hard to make them interact in a controlled way. Electrons are the exact opposite, okay? They, you know, have electrical charge, they like to interact. So it's very easy to realize operations on them; it's very easy to change their state. But it's very hard to keep the information in place. When you get an electron in a certain state, because there are so many electrons around, and charges and so on, and just the electromagnetic field fluctuations in the room as we speak, it will very quickly lose that information, right? So you have two very different kinds of systems: one is very good at keeping information but very bad at actually computing and changing the state, and the electrons are exactly the opposite, right? So the whole goal of quantum computing with light is to leverage the fact that, hey, you can keep these systems, you know, very, very stable, and you can send these quantum states, you know, to the other end of the planet if you want. But the trick is, how do you do the computation? It usually requires some type of system that involves electrons, so you will turn a photon into an electron and then back into a photon. And that's the whole trick of doing the computation.
And so you mentioned that when you were working at HP and you had the fire drill, the specific technology you were working on at the time was optical interconnects. So how do optical interconnects play into this idea of leveraging electrons for computation and photons for transmission of quantum computing information?
So optical interconnects, in a sense, are an application of nanophotonics. So again, it's still manipulating light in, you know, tiny waveguides and at the nanoscale, but you remove the quantum aspect from it, so now it's purely classical. So you think of light as, you know, maybe just a light ray or so on, and electrons as a flow of current, you know, classical electricity. The thing about electrons, as they propagate in wires, is they heat up a lot. They're susceptible to electromagnetic fields: you know, if you get a very large field in a room, in a server, for example, it will kind of, you know, mess things up, and you can pretty much kill your computer this way with an electromagnetic wave. Light instead, you know, because, as we said, it doesn't interact very much, if you propagate information with light inside the computer, you will do so very fast, it will not be perturbed, and most importantly it will not heat up very much, right? Because not interacting with your environment means that you're not heating the environment either; it's very clean. And so we were trying to leverage that. At the time there was a bigger effort, also DARPA funded, DARPA being the kind of defense program where they fund some R&D initiatives. A lot of people at HP and Intel, I mean, a bunch of big companies, were after these optical interconnects to keep Moore's Law going, right? Moore's Law is, you know, every year and a half the power of computers doubles, and so on. And every year people say it's the end of Moore's Law, but the engineers always find cool tricks to keep it going. And at that time, the bet was on these optical interconnects, like, hey, we need to replace electrons with light to make sure that we keep going, you know, fast, and transport information very efficiently and very fast inside computer chips.
So it was this work on optical interconnects that you were trying to save from the fire?
That's right, that's right. And to be specific, one of the challenges for optical interconnects was, you know, once the light is inside the chip, it's all good, but you want to be able to communicate in and out of your chip, right? And if you want to do this with light, it means you need to couple in a light signal that would propagate from the outside, and typically in these systems that's through an optical fiber. So you need to bring an optical fiber from the outside, and you need to somehow couple it: you need to use a nanostructure that will take the input from an optical fiber, the photons flowing from the fiber, and very efficiently couple them inside the photonic chip, which is planar. And the structures that we were using to do that coupling are the structures that were interacting with sunlight in that parking lot and were, you know, creating these very directional light outputs that ended up being the basis for our displays.
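As a rough illustration of why grating couplers produce such directional output, the first-order grating equation can be evaluated directly. The wavelength and pitch below are illustrative values only, not the actual parameters of the HP couplers.

```python
import math

# First-order diffraction from a grating at normal incidence (in air):
#   sin(theta_m) = m * wavelength / pitch
# Illustrative numbers only -- not the actual coupler design.
def diffraction_angle_deg(wavelength_nm: float, pitch_nm: float, order: int = 1) -> float:
    s = order * wavelength_nm / pitch_nm
    if abs(s) > 1:
        raise ValueError("no propagating diffraction order for these parameters")
    return math.degrees(math.asin(s))

# Green light on a 1-micron-pitch grating leaves at one sharply defined angle,
# which is the kind of directional glint described in the parking-lot story.
print(f"{diffraction_angle_deg(532, 1000):.1f} deg")
```

Because the exit angle depends on wavelength and pitch but not on where you stand, each grating structure sends light into a well-defined direction rather than scattering it everywhere.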
Very directional light output. So why is that helpful, ultimately, in creating some sort of display technology?
If you think of traditional displays' pixels, you know, the displays are what we call 2D, two-dimensional, because each pixel that you're looking at is emitting the same information in all directions of space, right? If you turn that pixel on your screen white, wherever you look at the pixel from, it will be white. In particular, your two eyes will see the same signal. And so it will look flat, because your two eyes see the same signal coming from a given plane, so your brain understands that the signal is flat. The reason why you see 3D in the world is because your two eyes see different images, different versions of the world, slightly shifted. And then, you know, your brain is going to take these two images from your eyes and it's going to recreate the illusion of depth, right? Actually, 3D is an illusion that is made by your brain, but the input has to be slightly shifted images perceived by your eyes. So that's the key to making a 3D display that doesn't require glasses. Now, you can always cheat: of course, if you put glasses in front of your eyes like people do with a headset, and, you know, we had a big Apple announcement, I think yesterday or two days ago, that everybody's talking about. If you put displays directly in front of your eyes, then it's very easy to send different images to your two eyes. Likewise, in the movie theater, when you put your glasses on, you know, again, the screen actually carries two types of images carried by different kinds of light that is filtered by your glasses, and your left eye and your right eye are each going to see their respective image, and that will create depth. But if you want to do this without glasses, let's say I want to see 3D from my laptop here that I'm looking at, it means that the pixels of my laptop somehow will have to send different information to my right and my left eye.
And this is where having directionality, having so-called directional pixels, comes into play, right? So the key to making a 3D, slash light field, slash holographic, however you want to call it, display, the important thing being that you don't need to wear glasses, is that you need to have structures and pixels that are going to be able to send different light in different directions of space, and with very good separation, meaning, you know, you need to be very, very directional, very, very separate in the images, you know, very low leakage or crosstalk between these different directions.
And so if you can create a display where you can selectively produce an image that looks different in different directions, then you have an opportunity to create the perception of 3D depth, because each eye sees something a little bit different.
Yeah, that's exactly right. Yep.
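The stereo geometry behind this exchange can be put into numbers. Under a simple pinhole model, the shift (disparity) between the two eyes' views of an object is inversely proportional to its depth; the baseline and focal length below are generic illustrative values, not tied to any particular display.

```python
# Binocular disparity under a pinhole model: an object at depth Z, viewed by
# two eyes separated by baseline B through a lens of focal length f (in pixels),
# shifts between the two views by d = f * B / Z. Illustrative values only.
def disparity_px(depth_m: float, baseline_m: float = 0.063, focal_px: float = 1000.0) -> float:
    return focal_px * baseline_m / depth_m

# Nearer objects shift more between the two eyes, which is why stereo depth
# perception is strongest at arm's length and fades for distant scenery.
for z in (0.3, 0.5, 1.0, 3.0):
    print(f"{z:4.1f} m -> {disparity_px(z):6.1f} px of disparity")
```

A glasses-free 3D display works by reproducing exactly this per-eye shift: each directional pixel delivers the appropriately displaced image to each eye.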
And so you recognized this as you were standing there in the parking lot of HP Labs, and you had this group of scientists and engineers excited about what they were seeing as the light was reflecting off of this wafer that you had in your hands. How do you take this sort of insight and transition it into being the company that you're now running, or now have been working on, for almost a decade?
It was very interesting. First, about the sort of, you know, light beam, the directionality, and the 3D display: at the time, light fields were actually pretty popular in the scientific community, not for displays, but for cameras. If you remember, that was the time of a company called Lytro, that was very popular, you know, a Stanford spin-off. And the light field camera would record, essentially, light in a directional way. So, you know, it's kind of the inverse of the display: when two light rays would hit the same pixel from different directions, the pixel was actually able to record both traces, and essentially just mark the fact that the light looked different from the two directions. So we were talking a lot about, and I have a lot of diagrams about, what light fields look like. And they look, essentially, like a forest of light rays, right? So you have to imagine a bunch of light rays arriving at different points on the plane, but also coming from different directions. And whether it's a camera where you record it, or a display where you aim it, you have that picture in mind of having tiny structures that are somehow associated with myriads of different directions of light rays. So this is the picture that came to mind, you know, very specifically, under the sun. It was like, hey, that light field, that field of light, this is exactly what we can recreate. And then, it was pretty funny.
I remember, just after this, that same night, I made a PowerPoint just highlighting: hey, you know, instead of having a few couplers for optical fibers on our chip, if instead we actually put couplers everywhere on the surface, right, then we would be able to do, essentially, a display, exactly what I described. And, you know, I'm skipping a little bit of the details, but at the time we were using lasers and waveguides and all kinds of complicated technologies. But within a few weeks, we had a tabletop demo. It was taking a whole optical table, you know, like you have in these labs, and we were producing a small little hologram. At the time, it was a very simple experiment: it was a letter X on top of the letter O, okay? So you had an X floating on top of an O, and it was maybe half an inch wide. And you had a big laser system, and the couplers, and everything, and it was taking the whole room, right? But it was very impressive, I think, for people. So we started to gather interest inside HP. And, you know, in these big companies, usually people are allowed like 20% of their time, or 10% of their time, to work on, you know, different projects; they have to be relevant, they have to be scientific, of course. And so, you know, a lot of people started to work so-called 20% of their time on that project. And a year later, it was like 50 people working 20% of their time, and I was working more like 150% of my time on this. And the system got better and better. And then we started to show it to different customers. And in 2013, we published our first paper and made the cover of Nature. And at that time we had a lot of press, it was the HP hologram, and we got into all the major press outlets and so on. So we started to have a lot of interest from inside HP, but from outside HP as well.
And, you know, long story short, I think HP was not equipped at the time to move fast and to say, okay, we're going to fund this research and we're going to productize it. I had a whole plan, technical and business, in mind. I said, within four years I want to be in a phone; that was 2013 or 14. And I think we agreed at the time that the best way to do this was to spin off from HP. So we had an agreement with HP, and they're very friendly, because we spun out. And, for the fun story, to this day our lab is actually located inside HP Labs. So we actually run some facilities there, and we share some tools, and even to this day we do some jobs for HP from time to time on optical interconnects, actually, using our tools.
So this is still a very close and friendly relationship. Yes, absolutely. So from a practical sense, why do consumers care? How is it that light field displays are better than traditional direct view display technology, whether it's for a smartphone or a tablet or any other sort of traditional screen that we're used to?
Yeah, so it's funny, in my world we ask the opposite question. We say, why do people tolerate 2D when the world is 3D? Any visual medium should be 3D, the same way that visual media are color today, right? They were black and white before, but as soon as color became available, every single display on the planet went to color. And, you know, the same thing is going to happen with 3D. It's just a matter of cost, and it's a matter of, you know, readiness of the content ecosystem as well. I guess we'll talk about that. More seriously, right, I mean, we have all been in 3D environments. We perceive things in 3D, we feel in 3D, we even dream in 3D. And so being able to get visuals that are 3D, immersive, you know, gets you just more engaged; you get to feel closer. You know, if we were doing this talk right now in 3D, and I know how it feels because we have a 3D chat app in collaboration with Zoom on our devices, it does feel much more engaging. And, you know, you keep the eye contact. And this is just another experience, whether it's for entertainment, games, movies, whether it's for productivity or remote interactions, you know, in terms of chat and so on. It's just a better medium, right? And again, the reason why it's not everywhere is because so far there has been no really good technology. I mean, there's a lot of good technology, but not good enough that the consumer says, hey, I love this, you know, and I'm willing to pay just a little more to get 3D, because it adds value to my life, right? So I think it's just a slow progression of technology. But we are at an inflection point, I think, this year. And you can see, for example, from the Apple announcement, you know, even Apple is getting there. So when Apple joins 3D, you know that the time is right, right? So.
Yeah, absolutely. So who benefits the most, soonest? Like, who is willing to put in that extra work to find the devices able to do this? And how does it benefit them?
There's two answers, right? You can look at business applications. And, as a disclaimer, this is not the road that we took, so I'll give you my true answer later. But some other companies have commercialized their laptops, for example, for designers. So if you're doing 3D design, or you're doing some 3D modeling, and so on, then I think there's a case to be made that, yeah, you know, you can see your creations immersively; it actually gives you a much, much better kind of representation, you get the spatial awareness better, and so on. Some people have argued, you know, if you do research in pharmaceuticals, and you want to look at protein folding, and complex structures, and so on, then, you know, 3D will help. I think a little bit differently, because again, for me, it's more about the fact that 3D is going to be better for everything, except perhaps reading text. Okay, so for reading text, you don't need 3D. But everything else, as I mentioned, whether you watch a movie, you play a game, or you do chatting, or you look at pictures, everything else is better in 3D. So I would still argue that the consumer is going to be the one that benefits first, and the OEMs that commercialize that 3D technology first will also reap the benefit: you know, they'll have the brand recognition, and then they'll have the early followers, right? So this is essentially what we're doing now, in partnership with some of the larger OEMs, to release that technology to consumers directly.
As you're saying this, I'm staring at the windows that are on my laptop screen right now, and there is an attempt in this 2D interface to project the feeling of 3D by using shadowing. So the topmost window, you know, projects a bit of a shadow against the lower ones, and so on, to create the sense of depth, to help it have a distinguished place in my brain, I guess. One of the other companies I had a chance to chat with on the podcast a little while ago was Looking Glass Factory; I spoke with Shawn there. They're also trying to solve a similar type of problem: how do we deliver a great 3D viewing experience on a display? The approach is quite different. Can you describe how the approach you're taking compares or differs from the Looking Glass?
Yeah, absolutely. So we started pretty much at the same place, with what we call static light field displays. Those are displays where the display doesn't know where you're located, and so the display has to send many different views, call them views, let's say many different points of view of a given scene, you know, from the far left to the far right, so that when you come in front of the display, wherever you are, the display attempts to provide a good 3D experience, and also to multiple people at the same time, right? And I think that that is good, but what we learned is that when we tried to put the technology on a cell phone or a tablet or a laptop, there was the degradation that this causes to resolution. So remember, we only have a finite number of pixels on the display, and if we have to split them to provide different images from left to right, let's say you want to provide 100 images, your resolution is going to drop by 100, okay? Because, by definition, from one point of view you're only seeing one out of 100 images, which means you're seeing one hundredth of the pixels. And we found that for signage or for larger displays, you know, where the goal is to show maybe some retail objects and so on, the static light field approach was good, and I think that's what Shawn and Looking Glass are focusing on. The focus in our company is really to get on, you know, mobile devices first. I'll tell you the reason why mobile: it's because mobile also contains wonderful ways to capture 3D. You can take 3D pictures, you can take 3D scans, and so on. So we've always been also kind of mindful of the ecosystem, and how do we provide ways to create 3D for everybody, not just the Hollywood studios creating 3D; we want everybody to be able to create. So that's why we focus on mobile.
But when you want to be on a mobile device, you cannot drop the resolution that much. So, fairly recently, I think we made the transition last year, we started using head tracking. The display is now aware, fairly accurately, of where your head is located, and then we don't have to render all these views; we can render just a couple, and they're updated as you move your head. You still get the look-around effect and so on, but we only render two views at a time, and that allows us to conserve the high resolution people have come to expect on mobile devices. Whether we like it or not, we are compared to an iPhone or an iPad: the screen is beautiful, and you cannot deviate too far from that quality, even in 3D.
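A minimal sketch of why tracking collapses the view count to two: once the display knows where the head is, the only views worth rendering are the ones at the viewer's two eyes. The IPD value and coordinate convention here are illustrative assumptions, not Leia's implementation:

```python
from dataclasses import dataclass

IPD_MM = 63.0  # average interpupillary distance; illustrative value


@dataclass
class HeadPose:
    x_mm: float  # lateral head position relative to screen center
    z_mm: float  # viewing distance from the screen


def eye_positions(head: HeadPose) -> tuple[tuple[float, float], tuple[float, float]]:
    """Return the two viewpoints (left eye, right eye) to render for.

    A static light field display must render dozens of views covering
    every possible position; tracking reduces that to the two positions
    the viewer actually occupies, so resolution is only halved.
    """
    half = IPD_MM / 2.0
    left = (head.x_mm - half, head.z_mm)
    right = (head.x_mm + half, head.z_mm)
    return left, right
```

Each frame, the renderer asks the tracker for a fresh `HeadPose` and draws exactly two views at these positions.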
We'll come back to the technical point you're making about the number of views and the implications for the number of pixels you have to consume in terms of resolution. But notionally, let's walk down the path you ended up taking, which you said starts with smartphones. The benefit is that you can capture 3D, and they're truly pervasive devices, with billions sold every year. And you ended up partnering with RED, which was a digital camera technology company, if I recall. They made their name as the first digital cinema-grade camera solution for Hollywood, but they were dipping their toes into consumer electronics at the time. So what was the approach there, and what was the product you ended up creating together?
Yeah, that was a wonderful experience. The reason for a camera company to look at cell phones was actually very simple: they realized that 95% of all video content at the time, I think it was 2016 or 2017 when we started to talk, was taken on smartphones. That kind of worried them. At the time you also had the very first Hollywood directors starting to shoot entire movies with their iPhones; I think Steven Soderbergh was one of them, placing iPhones around a room, recording, and shooting the entire movie that way. So I think they were interested in saying, okay, we probably need to at least get some experience in that field and see what it would look like. The founder of RED, Jim Jannard, is also passionate about 3D. He had, I think, even bought companies that made 3D rigs for cameras, had sold a lot of them to Jim Cameron, and so on. So the combination of the interest in where camera footage was being taken and the interest in 3D led him to want to commercialize what he called at the time a holographic phone. We met in 2016, and they really loved the technology, which was very early at the time. To be fair, if you remember the HP story, in a lab with a big tabletop setup, that was 2013 or 2014, and 2016 or 2017 is, in science years, not that far from the very start. But to the credit of RED, it was a formidable marketing machine. Jim got a lot of interest, including from AT&T and Verizon, who as far as I know loved the program to this day, and convinced them to launch the phone in the US, along with Telcel in Mexico and a couple of other programs. It was the very first iteration of the display, and it wasn't tracking the face.
So it had a little bit of that lower-resolution feel, but it was still very acceptable. We launched the phone with RED, I think in 2018, jointly with Verizon and AT&T. It was such a thrill to see your product in the AT&T store in San Francisco, close to where I live, and to bring in the kids and the friends and see it there. It was quite a moment.

And what did you learn from that experience?
You know, it was very early. If I had to redo it, we would redo it, but in hindsight it would have required a little more preparation. I think we kind of rushed into the project, especially on the software side. On the hardware, we were like, okay, we're going to have this 3D display hardware. And then we said, hey, by the way, we probably need a way to take pictures, right? You're RED, so we want 3D pictures, not just the 2D pictures from the phone. So while running, we had to start up a 3D camera program, and we had to get the software in place, and the UI, and so on. And then we started to build an app store. So it felt a little rushed. But to the credit of RED and our team, in the end we pulled it off. It was a little late compared to schedule, but when you turn back and see the amount of work that was done, it's incredible. You can always say we should have taken more time, polished the experience more, and so on. If we had to redo it, we'd probably take a little more of our time, more like the Apple approach, where it takes several years to really try to understand the need and the experience and to polish it. In this case it was a little rushed, but it was a tremendous experience. And what it did for us is push us to start investing in the software and the content ecosystem. Because of RED, we actually stepped up: we built an app store, for example, and we released a really good SDK for Unity and Unreal so that gaming studios could create content for this type of display.
And we started to look at computer vision, 2D-to-3D conversion in particular, so that people could take 2D content and convert it to 3D. So all of these activities that today are a very important part of our business actually started thanks to the RED collaboration, out of the need created by the RED collaboration.
So you took all that insight gained, and the focus on software, content, and enabling creators to create experiences that can be appreciated on the devices you're building, and you ended up making the strategic decision to have your own branded devices. What was the strategic thinking behind that decision to go with your own brand?
It came also out of necessity. Unfortunately, the RED program stopped, and it was a shame, because a lot of people, including the carriers, wanted more. But for various reasons, RED decided to stop. At that point we had been exclusive with RED for a while; it was a wonderful program that we had effectively been working on for a year and a half. When that stopped, we lost what I would call that pulling function. RED had pulled us: they came to get us, they wanted our tech, and they really pulled us to develop. And then suddenly we lost that. We were a little bit on our own, and we realized that if you don't have another pulling function, you need to be able to show people what the experience can be like. The Hydrogen was great for that. To this day I'm still using my Hydrogen, so I can take 3D pictures and so on; until Leia makes another phone, I'm going to keep using it, hopefully not for too long, because it's starting to get a little old. But what we realized is that unless you are able to show what the experience looks like, OEMs are going to be very hesitant. It's a big investment for an OEM to invest in a new technology, a new ecosystem, and so on. So we decided to do more of what we call a reference design, rather than a product. We decided to prototype and commercialize, in low volume, a tablet, the Lume Pad 1, which was pretty much the same display as the Hydrogen but bigger. That was very interesting; people liked the fact that it was more immersive. It also made the resolution issue worse, because it was the same resolution as the phone but two times the size, so the pixelization was twice as apparent.
That being said, it was a good experience, because unlike with RED, we had control over the operating system, all the apps, the firmware. So we were able to get the analytics from a lot of the users, and we learned a lot about the use cases. On the Lume Pad we actually had a movie streaming platform, for example; we licensed content from Warner Brothers and Universal. So we had a lot of what you would call consumer entertainment apps, including games. But we also had creative tools: a 3D camera, a way to share pictures online, an app called LeiaPix, still very popular today, where people can share 3D pictures. And we realized that the main use case for our tablet, at least for these early adopters, was more on the content creation side than content consumption. Very few people actually spent two hours watching a 3D movie on the tablet, but people were coming back and taking hundreds of thousands of 3D pictures, interacting, commenting, editing, and so on. That was really the core of the ecosystem. And this is where we realized, okay, we need to do a little more to support this creative crowd. That's the reason why, if you fast forward to the Lume Pad 2, which was actually a proper program with an OEM, we are driving that use case of creation very seriously, for example now using tools like generative AI to generate pictures in 3D directly on the pad. It's to satisfy that thirst for creation from these early adopters who want to create more than just consume passively.
Creation is king, for now, among these early adopters.

For now, for now. That's right. Yes.
There's a lot of emphasis, as you noted, on one of the key lessons learned: you really have to think through the set of software experiences, creation tools being prime among them, based on what you've learned. And you noted that you'd spent work building plugins for Unity and Unreal, to allow developers to work in their favorite 3D creation environments and have those exported and working on the Lume Pad. So what does a developer need to do? I'm just going to jump into the weeds here for a moment. From a developer perspective, what do they need to do differently with, for example, a 3D game they're creating, to make it work as a beautiful 3D experience on the Lume Pad?
Very little. If you start from a Unity or Unreal project, and most of your listeners will know how it's done, you have a 3D scene made of meshes, textures, and so on. What Unity or Unreal does is position a virtual camera in the scene, a virtual camera that the developer can move, zoom, and pan at will, and that essentially captures a 2D version of the 3D scene. What our SDK does is very simple: your scene is untouched. All the time it took you to create models, place the lights, and all of that is conserved; we don't touch anything. We're just adding a camera, a second virtual camera in the scene, and it's done automatically. So if, for example, you had a game done in Unity where you'd already set up your camera motion, our plugin will detect where the camera is located and add a second camera. It positions it so that the 3D content it creates, by having slightly different points of view for your left and right eye, has the right amount of depth, and places what we call the 3D focus, where the content is going to be maximally overlapped on the display. All of that is done automatically; of course, the developer can control it at will. But the very first time you use our SDK, it's literally a magic experience: it just turns into 3D. It's properly calibrated, and as you move around, things stay in focus; the object you're supposed to look at is in focus. We try to make it as comfortable as possible, so it's a very benign experience, and actually a pleasing one.
Because you get these tools, this kind of magic wand that autofocuses in 3D for you.
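What the plugin does can be sketched as standard off-axis stereo: duplicate the mono camera, separate the pair by a baseline, and apply a horizontal lens shift so both frusta overlap exactly at the convergence plane, the "3D focus" where content sits at screen depth. This is a generic sketch of that technique, not the Leia SDK's actual API; all names and numbers are hypothetical:

```python
def stereo_cameras(cam_x: float, baseline: float, convergence_dist: float,
                   focal_len: float, sensor_w: float):
    """Derive a left/right camera pair from a single mono camera.

    The two cameras sit baseline/2 to either side of the original, and
    each gets an opposite horizontal lens shift (asymmetric frustum) so
    both frusta converge on the plane at convergence_dist. Shifts are
    expressed as a fraction of sensor width, all units in meters.
    """
    half = baseline / 2.0
    # Lens shift that re-centers each frustum on the convergence plane
    # (standard off-axis stereo projection geometry).
    shift = (half * focal_len) / convergence_dist
    left = {"x": cam_x - half, "lens_shift_x": +shift / sensor_w}
    right = {"x": cam_x + half, "lens_shift_x": -shift / sensor_w}
    return left, right
```

Objects at `convergence_dist` land on identical pixels in both views (zero disparity); everything nearer pops out of the screen, everything farther recedes behind it.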
And what were the other pieces of software? You mentioned there was a picture-sharing application, and on the content consumption side you had licensed from a couple of key studios so people could consume 3D movies. What were the other bits of software that you found to be essential in the suite?
Yeah, no doubt it's on the computer vision side: what we call 2D-to-3D conversion. With the SDK for Unity and Unreal you're already dealing with 3D scenes, so you already have 3D content, and it's just a matter of re-expressing it on the display without changing the original content. Similarly, you can be given what we call side-by-side stereo content coming from a movie studio, from their library of 3D movies. Or, more recently, maybe you have a VR headset that shoots in stereo; going back to Apple, we're going to have a lot of that starting next year, even live streams in stereo from people wearing the headset. And that's fine. But of course, most of the content produced to date is 2D: you have just a rectangle, and it's an image. We found a way, very reliably, very fast, and on-device on a mobile device, to take these images or videos and convert them to 3D completely automatically. The process is: you train a neural network with a lot of 3D content. A lot of that 3D content comes from our social sharing app, for example, where people have uploaded hundreds of thousands or millions of these 3D pictures. We anonymize them, with people's consent, and then use these pictures to train a neural network to answer, essentially: from the left image, how do you create the right image? We actually have the ground truth for that, because we have both left and right. When you do this, you can then teach a network to take any image and produce, essentially, a movement of the virtual camera, and generate a novel view.
And we can do this very, very well today; I think we probably have one of the best 2D-to-3D conversions out there. This is the core of what we call our media SDK, which lets you convert pictures and videos, and it's embedded today in some core apps. We launched the latest device with ZTE, a Chinese OEM, last month in China. Migu, the content arm of China Mobile, has integrated our media SDK in their app, so you have the equivalent of YouTube in China that can be watched entirely in 3D. Every single piece of content on there is translated into 3D in real time, and that's a real crowd-pleaser.
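The ground truth behind that training setup is geometric: to first order, a right view is the left view with each pixel shifted by its disparity. Here is a toy one-dimensional sketch of that warp, only the intuition behind the left/right training pairs, not the neural network itself:

```python
def synthesize_right_view(left_row: list, disparity: list) -> list:
    """Forward-warp one row of a left image into an approximate right
    view: each pixel moves left by its disparity (nearer objects have
    larger disparity and shift further).

    Pixels that nothing maps onto (disocclusions) are hole-filled
    naively from the left neighbor; real systems inpaint them.
    """
    width = len(left_row)
    right_row = [None] * width
    for x, (value, d) in enumerate(zip(left_row, disparity)):
        tx = x - d  # target column in the right view
        if 0 <= tx < width:
            right_row[tx] = value
    for x in range(width):
        if right_row[x] is None:
            right_row[x] = right_row[x - 1] if x > 0 else left_row[x]
    return right_row

print(synthesize_right_view([1, 2, 3, 4], [0, 0, 1, 1]))  # [1, 3, 4, 4]
```

A network trained on real left/right pairs learns this mapping plus everything the warp gets wrong: occlusions, reflections, soft edges.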
Yeah, that's pretty amazing. That's awesome. As you've gone through this evolution, from the Hydrogen phone with RED to the Lume Pad 1 and ultimately the Lume Pad 2, what were some of the biggest hardware challenges you've had to overcome?

The first was that resolution limit. In the beginning, you try to ask people questions about the experience, and invariably, even if they don't say it in their own words, you understand that the resolution was the issue. Solving the resolution through brute force would have been impossible: it would have meant using 4K displays for mobile and tablet, which barely exist today, or are too expensive, or moving to 8K or 16K, and it just doesn't work. The challenge is that we have to work within the limitations of the current industry, because we don't want to invent a new display; our technology modifies an existing display with a piece of optics. So we had to find other ways, and this is where the head tracking came in handy; that's how we solved the resolution issue. Now, head tracking brings its own set of issues, because it's done with standard cameras that you find on a tablet or phone. They're not optimized for low latency or for the purpose of eye tracking; they're actually designed just to take beautiful pictures and selfies. So we had to work around some of these limitations and make sure that the camera recording your face can produce a head location very, very fast. If it's too slow, by the time you display the image your head will have moved to another location, and the 3D will be ruined. Right now, I would say that's one of the core technical challenges, or opportunities, we have to improve the experience: essentially predicting where your head is going to be at the time the image is rendered on the display.
So making it as low-latency and seamless as possible: doing the accurate tracking, predicting where the head is going to be, so that the whole pipeline feels truly responsive.
Yeah, totally. It's a similar problem to the one in VR. In VR, if you're not responsive enough, it's actually worse, because you're immersed in an environment: you don't have any real-world reference point to settle your vision; you're entirely in this virtual environment, and when you move your head, if the environment doesn't move almost instantly, you will feel sick. So the same problems, luckily for us, exist in VR, and the industry as a whole needs to invest in these sensing solutions, low-latency camera tracking solutions, that we for sure will benefit from. The bar for us is a little lower than in VR, but definitely the ultimate experience, especially on a phone or tablet, is: I want to be able to play a game, maybe drive a car by turning my tablet, and I want the experience to stay very solid when I do that. Or you're going to be doing 3D AR. We actually have this working on the pad: you can do video pass-through. The pad has a stereo camera; you look through the pad, and you see, in 3D, the object in front of you on the table coming out of the screen with the correct proportions, so it looks just like a 3D pass-through. And again, as you move around, you want the pass-through to be stable. You don't want it to flicker or break the illusion. So that's essentially the challenge for the next generation of devices.
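The prediction step David describes can be sketched with the simplest possible model: constant-velocity extrapolation from the last two tracker samples. Real pipelines typically use a proper filter (for example a Kalman filter), but the idea is the same: render for the predicted pose, not the last measured one.

```python
def predict_head_x(samples: list, latency_s: float) -> float:
    """Extrapolate where the head will be when the frame actually
    reaches the screen.

    samples: list of (timestamp_s, x_mm) tracker readings, oldest
    first. Uses the last two samples to estimate velocity, then
    projects forward by the pipeline latency.
    """
    (t0, x0), (t1, x1) = samples[-2], samples[-1]
    velocity = (x1 - x0) / (t1 - t0)  # mm per second
    return x1 + velocity * latency_s
```

With a head moving at 100 mm/s and 50 ms of camera-to-display latency, rendering at the last measured position would be 5 mm stale, enough to break the stereo alignment; the predictor closes that gap.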
Yeah, fascinating. One of the big challenges that many a hardware company has faced, as you go from conception to productization, is funding it. Funding for hardware technology companies is notoriously hard, but at this point you've raised hundreds of millions of dollars. As you reflect back on this fundraising journey, what was the initial approach? You were coming out of HP, so I presume there were at least a lot of blessings coming from HP; maybe one of those blessings was some cash to get you going. How did you think through that initial funding at the inception of the company, and then maybe some of the key milestones that drove the subsequent financings?

You're right. A startup is always fundraising, and a hardware startup even more so, because the number of zeros you need to line up is significantly higher than what you can do with your laptop in your living room, even if, say, you were doing a generative AI startup these days. So when we spun off from HP: my co-founder at the time came from Wall Street. I was the tech guy, he was the finance guy, and while I was still at HP he managed to gather some interest in the tech. HP wouldn't have let us go, I think, if we didn't have proof; we actually had to get some of the money in the bank first before they would give us the final blessing. So that's how we approached it. We had a non-traditional Series A: we raised a $25 million Series A directly, because we wanted access to one of these nanofabrication tools, an optical stepper, which is a tool that can produce these nanostructures so that we could actually make devices. And that costs a lot of money: just one of those tools is four to five million dollars by itself. So we had to raise a little more, but we were very successful at that. On the subsequent rounds, you always need a bit of luck. One of the natures of what we do is that a lot of people don't believe: a lot of people say 3D is a gimmick, or I don't believe in VR, I don't believe in the metaverse, and that's fine. But a lot of people on the opposite side get very emotional, and that's the case with Jim Jannard of RED, for example. As soon as he saw the technology, even at the early stage, he saw what it would become, and then he became almost obsessed with it.
Our Series B was almost entirely Jim. RED invested, I think, $50 million in Leia as a strategic, with an exclusivity clause, and that took care of the next round of funding. Then we had a similar experience in the Series C with Continental, the automotive tier one. They came and said the same thing: they saw the potential of the tech, and so we had another strategic round. We managed to fund the company almost exclusively with strategics, which to me is the best kind of investment, because everybody gets rewarded. As a startup, you're getting funding at better terms than you would with traditional VCs, and obviously the strategic investor has an interest in your succeeding, because they want to use your product. So it's a great circle, and it works very well. Then COVID hit, and like a lot of startups, we wanted to be cautious, so we raised debt financing with Aon, which is actually an IP specialist. We have a very, very strong IP portfolio, so it made sense to partner with Aon, which has a large IP practice. But you're right: until you're cash-flow positive, you're constantly on the funding tour.
Yeah, Continental, that was an amazing strategic partner for what you're doing. Congratulations. That's wonderful.
And Continental certainly brings in the rigor, which is not to be missed in a time like this. They have a very structured approach to product development, even on the R&D side, and it's been a wonderful partnership with them since 2019. It's still going strong; I keep joking when we have these meetings that we're still in the honeymoon period with Continental.
So they're bringing rigor. With this notion of bringing rigor in, you're now working with an OEM on the Lume Pad 2 and working towards broader-scale availability and, ultimately, adoption of the core technology. So what is that path? What are the major hurdles you need to overcome in order to enable large-scale adoption?
Yeah. The goal was never to have our own brand of devices; that just takes too much funding and effort. The path to scalability for us is to scale through these OEM partnerships, and even to evolve the business model from selling devices and selling displays to selling maybe just the core optics that go into the device, and perhaps even a licensing model, where we teach the supply chain how to make this. Because now we have several years of optimization behind us, the yields are high, we can reproduce it, and therefore we can also teach others. I think that's probably a path we'll consider. The company will be successful if the install base is large; there's no sense really talking about revenue from content and games and other things as long as the install base isn't in the hundreds of thousands or millions of people. So what we're going to focus on from now on is making sure the technology equips as many devices as possible: smartphones, tablets, maybe laptops and monitors. And of course the work with Continental will come online at some point, just a bit slower with automotive, along with a variety of other applications we're looking at. I think we've reached the point where the technology is cheap enough and high-quality enough that it's essentially no compromise: you don't have to compromise on the 2D quality. We probably failed to mention this, but the specificity of our tech at Leia is that you can switch it off. We mentioned that 3D was expensive in terms of resolution; you degrade the resolution of your display, and even with the eye tracking you still have to divide it by two.
But we can switch it off and maintain the pristine quality of the display. With recent advances, whether it's OLED or LCD, we're compatible, and we can guarantee that your 2D quality is the same with or without the 3D upgrade. When you think about that, there's really not much argument against it. Why wouldn't you? Looping back to what we said at the beginning, when you ask why use 3D, I challenge you back: why use only 2D, if you can have 3D for almost the same price and the same quality as 2D? I think we're there now, and we can feel it through the interest of the OEMs. They're saying, okay, I don't have to spend double or triple the price I pay for a normal display; it's only a fraction. And the ecosystem is there as well. We're trying to demonstrate that with the current Lume Pad 2 and what we just launched with ZTE: we have really cool applications like 3D chat, where we partnered with Zoom in the US and outside China, and in China we partner with Migu; there are a couple of others, I think Tencent has been announced as well, to do video chat and so on. So it's starting to be interesting from an ecosystem and content point of view. There's a lot to do: you have everything you do in 2D, and then you're starting to have this extra layer, which is very engaging.
I love it. I'm such a huge champion of this idea of having every display in your life be able to project 3D. I think it'll be a logical next step, and like you said, once we're on the other side, as with color television, we'll look back and think: why not? Of course everything should be in color. The same was true when we went from silent movies to the talkies; there was real resistance even to incorporating voice and audio into the movie experience, and now we can't imagine otherwise. So, on this notion of reflecting back: you've been working on this for over a decade, technology-wise, but Leia as a proper independent company has existed for just under nine and a half years now. You've been through an evolution. You were a pure technologist when this whole thing started, and you found a partner who knew finance, but you had your own perspective on how to set up the business side and work through the strategy. You've been on your own journey through this evolution of the company, and you have also evolved as a leader. So if you were to reflect back on that, how have you evolved? How have you changed?
Yeah, it's very interesting. If you remember the beginning, I started as a pure academic, really a physicist, about as far from an entrepreneur as you can think. I started the entrepreneurial journey almost against my own inclinations, but I grew fond of it. There's something empowering in being able, in my case, to have a very good understanding of the product and the path and the vision of where we need to go, and at the same time, through my position (I was CEO of the company until last year; we hired a wonderful CEO, so I'm now CTO, but still on the executive team), being able to influence and make the right decisions, essentially being able to follow that vision. It's quite a fascinating feeling, and I think it's also very efficient. We don't spend hours in meetings; when decisions need to be made, we have this kind of small decision team with a common understanding of where the company needs to go. So it's very efficient, and I like that fast pace, being able to have an influence. By contrast, if you work for a big company, you very rarely get to make a decision that would materially influence the future of the company or the future of the consumers using the company's products. When you're at the helm of a startup, it's the opposite: the decisions you make routinely impact people. So it's quite a thrill. That being said, as I said, by nature my own personal interest is to push technology.
And last year, with COVID, more fundraising, lots of discussion about whether to raise debt or equity, and meeting all kinds of people, it was starting to be too distracting. I wasn't able to put as much focus on what I'm good at, and on what I think the company needed, which was to drive the product from 99% to 100%. So I stepped down, and we brought in Cecilia Qvist, who is a wonderful CEO. We upgraded the management team, we recently brought in a new CFO, and from 2023 onwards we have a new management team which is very functional and much more mature, which cares about corporate governance and transparency, things that, frankly, we didn't really have time to take care of before. Having more professional management feels wonderful, and I'm able to focus on what is key: driving the 3D and AI applications for the delight of our OEM customers and the end customers pulling the product.
Amazing, congratulations. Let's wrap up with a few lightning round questions here. What commonly held belief about AR, VR, spatial computing, this whole space, do you disagree with?
So I see headset-based AR and VR as very complementary. I've always seen that technology as a blessing, because the investment that goes into it creates interest, it pushes the gaming studios to improve their 3D rendering techniques, it creates an incentive for people to develop in 3D. Today, of all the games ever made, 90% or more are made in 3D, even if they're played on a 2D screen, and I think that trend will increase. The thing I disagree with is the vision that one day you will permanently live in a helmet, that all your digital interaction will be done through these glasses. First of all, there are some physics limitations, in terms of power, in terms of bandwidth, in terms of heating. There's stuff that you cannot do. There's a reason why the Apple headset has a little tethered battery pack. I think that was disappointing for a lot of people, but again, you have physics restrictions: if you don't put the pack in your pocket, you need a bulkier headset, and there's no way around this. There's a law of optics, there's a French word for it, étendue. And étendue means that if you want to squeeze the light into a very tiny area, the angle needs to be very large. And that basically constrains what you can do in terms of the volume of this type of system.
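As a rough back-of-the-envelope illustration of the étendue constraint David mentions: étendue is roughly the product of the area light passes through and the solid angle it fills, and passive optics cannot reduce it. The sketch below computes the étendue of a hypothetical headset eyebox; every number and name here is purely illustrative, not a real product's design parameter.

```python
import math

def solid_angle_from_fov(fov_deg):
    # Solid angle (steradians) of a cone with full apex angle fov_deg.
    half = math.radians(fov_deg / 2)
    return 2 * math.pi * (1 - math.cos(half))

# Hypothetical numbers for illustration only.
eyebox_area_mm2 = 10 * 10  # 10 mm x 10 mm eyebox
fov_deg = 50               # 50-degree field of view

# Etendue ~ area x solid angle (mm^2 * sr). Conservation means the
# display engine and its optics must support at least this much,
# which is what drives the bulk of headset optics.
etendue = eyebox_area_mm2 * solid_angle_from_fov(fov_deg)
```

The takeaway matches David's point: shrinking the exit area while keeping the field of view forces a larger angle somewhere else in the optical train, so the system volume cannot shrink arbitrarily.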
So there's a fundamental reason why, until you get the contact lens, or maybe a chip in the brain to drive the imagery directly, the glasses are going to be limiting in terms of comfort and so on. And just in the context of 3D communication, even if I have a very narrow pair of glasses, I still have something between you and me. You can't really see my gaze, you can't actually see me, and I think that's always going to be limiting. Now, I think it's wonderful for a lot of applications, certainly a lot of professional applications, for guiding in the warehouse, training, tutorials and so on, I think it's great. But I'm much more a believer, I still believe of course in the 3D future, everything should be 3D, but it's more going to be through the multitude of screens that are in your life. Your phone is here to stay, tablets, computers, but there are going to be a lot more screens in your home, in public places, and so on. With facial recognition, the screen will be aware of who you are, and when you look at it, it will display contextual information, it will turn into that immersive space in front of you. So it's going to be like VR, but you don't need the entire room to be 3D. You just have that portal in front of you, from which you can disengage very quickly. You just need to turn your head and you're back in reality, versus having to unstrap and clean and all of that. At least that's our take. So there's a place for both, but we think that display-based 3D is going to be much bigger, especially in the next 10 years.
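The "portal" effect David describes, where a fixed screen acts like a window into a 3D scene, is commonly rendered with a head-tracked asymmetric (off-axis) view frustum. Here is a minimal sketch of the near-plane bounds computation; the function name and coordinate conventions are illustrative assumptions, not a description of Leia's actual renderer.

```python
# Sketch: head-tracked "window" rendering via an asymmetric view frustum.
# The screen is a fixed rectangle centered at the origin; the viewer's
# eye moves in front of it.

def offaxis_near_plane(eye_x, eye_y, eye_dist, half_w, half_h, near):
    """Left/right/bottom/top frustum bounds at the near plane, for an eye
    at lateral offset (eye_x, eye_y) and distance eye_dist from the
    screen plane. half_w/half_h are the screen's half-extents."""
    # Similar triangles: project the screen edges onto the near plane.
    scale = near / eye_dist
    left = (-half_w - eye_x) * scale
    right = (half_w - eye_x) * scale
    bottom = (-half_h - eye_y) * scale
    top = (half_h - eye_y) * scale
    return left, right, bottom, top

# A centered eye yields a symmetric frustum; an off-center eye yields an
# asymmetric one, so scene objects appear fixed in space as the head moves.
centered = offaxis_near_plane(0.0, 0.0, 0.5, 0.3, 0.2, 0.1)
shifted = offaxis_near_plane(0.1, 0.0, 0.5, 0.3, 0.2, 0.1)
```

These bounds would feed a standard frustum projection matrix (e.g. the `glFrustum`-style left/right/bottom/top/near/far parameterization), updated every frame from the tracked head position.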
Besides the one you're building, what tool or service do you wish existed in this market?
Very fast head tracking, right? So that's a call-out for all the XR practitioners that are listening. We want low-power, low-latency, it doesn't need to be high resolution, head tracking that is made available on mobile devices. I think this is the single most important thing in providing this comfortable 3D immersive experience on displays. The rest we have cracked. Yes.
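One common way to make head tracking feel lower latency on such displays is to extrapolate the head pose forward by the expected system latency before rendering. A minimal constant-velocity sketch, with illustrative names, not any specific SDK's API:

```python
# Sketch: hiding head-tracking latency with constant-velocity prediction.
# Extrapolate the last measured head position forward by the expected
# sensor-to-photon latency before computing the view for the frame.

def predict(p_prev, p_curr, dt, latency_s):
    """Predict head position latency_s seconds past the latest sample,
    given two position samples taken dt seconds apart."""
    velocity = (p_curr - p_prev) / dt
    return p_curr + velocity * latency_s
```

In practice this would be applied per axis to the full head pose, and more robust systems use filtering (e.g. Kalman-style predictors), but the basic idea of rendering for where the head will be, rather than where it was measured, is the same.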
Yeah. Fantastic. What book have you read recently that you found to be deeply insightful or profound?
You know, I just read the biography of Jennifer Doudna, The Code Breaker, on the discovery of CRISPR, which is completely different, but again very fascinating: the story of gene editing and how, in less than 10 years, you're starting to have revolutionary gene-editing kinds of medicine to cure diseases that were thought incurable before. So I find there are some similarities in terms of discovering fundamental scientific facts that look completely magical when you don't understand them. And I think that's certainly one of the other areas that is really interesting. It's not as complicated as quantum mechanics in terms of the fundamentals of the science, but the sheer complexity of biological systems makes it very, very hard to understand. And once you can master this, it's going to give really magical results, and we're starting to see that. So I enjoyed it, and I recommend it. Actually, I'm reading it with my nine-year-old daughter now, for inspiration for her.
Amazing. That's so cool. If you could sit down and have coffee with your 25-year-old self, what advice would you share with 25-year-old David?
Pace yourself, right, pace yourself. I think life in general, but a startup in particular, is a marathon. And when you're young and passionate about a technology or a product, you always want to go fast: you try to reply to all emails instantly and take all meetings and don't sleep and all of that. At the end of the day, the advice that I would have is, it's okay to say no, it's okay to take a break, it's okay to take a step back and strategize on what you're doing, who you should be talking to, and so on. If I had to redo it, I would probably strip away at least 75% of all the meetings and all the calls that I have done, and be more strategic with my time. But you know, it turned out okay so far still.
So far. So good. So far, so good. Yeah. Any closing thoughts you'd like to share?
Yes, I think you should all go and try our technology. It's called the Lume Pad 2. If you go on our website, leiainc.com, you can have a try: you can see what 3D chat looks like, take 3D pictures, play 3D games. Anybody interested in XR and headsets and so on should also try what it looks like on a regular device, and provide your feedback so that you can help us grow the product and the company. We're always very, very happy to get the constructive feedback of the community.
Yeah, I had a chance to try the latest at AWE, and as I have been with earlier versions, I was very pleasantly surprised at how delightful that 3D experience is. I really liked the dial-the-depth feature, and there was a 3D modeling app I was looking at.
Mm-hmm. Yeah, the new model viewer. Yeah, absolutely.
Where can people go to learn more about you and your efforts there?
So you can go to our website that I mentioned, www.leiainc.com, Leia like the princess, Inc like incorporated. Or otherwise you can find me on LinkedIn. I'm always happy to connect with people from the community and exchange, so don't be shy, reach out.
Amazing. David, thank you so much for the conversation.
Thank you, Jason, for having me.
Before you go, I'm going to tell you about the next episode. In it, I speak with Yacine Achiakh. Yacine is the co-founder and CEO of Wisear, a company creating neural interface devices for AR glasses and other wearables. They're packing their technology into normal-looking earphones and working to deliver the capabilities of the mouse and keyboard hands-free. We'll get into the company's journey, how the tech works, go-to-market strategy, and more. I think you'll really enjoy the conversation. And please consider contributing to this podcast: patreon.com/theARshow. Thanks for listening.