The AR Show: Mike Wiemer (Mojo Vision) on Making AR Contact Lenses a Reality
10:48PM Aug 30, 2022
Speakers:
Jason McDowall
Mike Wiemer
Keywords:
contact lens
people
solar cells
creating
product
nea
eye
solar
display
company
system
ar
category
device
thought
vision
opportunity
problem
question
power
Welcome to the AR Show, where I dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is with Mike Wiemer. Mike is the CTO and co-founder of Mojo Vision, a company enabling invisible computing with AR contact lenses. Prior to Mojo Vision, Mike was the co-founder and president of Solar Junction, a company that developed innovative multijunction solar cells that set the record for conversion efficiency twice. Previously, Mike spent eight years at Stanford working his way through a bachelor's, master's, and PhD with an emphasis on semiconductor physics and optics. In this conversation, Mike shares some insights from his experience at Solar Junction, and how he came to discover the opportunity for smart contact lenses. He breaks down the set of very hard problems the team has had to solve in making these AR contact lenses and the successes Mojo Vision has had in clearing those hurdles, as well as the recent milestone of creating a prototype that people can now wear. He talks about the challenges for Mojo Vision specifically and AR more generally.
I think what I've come to realize is that it's hard. It's hard because it's about creating systems, and it's not about a single technological innovation. And those systems span not just the physical domain, where it's like, how do I make it smaller or lower power; they span product choices. They span into the kinds of data that we can serve up, right, and the reasons people would want to buy these things. We can look at Google Glass for learnings about what these systems can really be good for. It's not one size fits all, right? I mean, different systems have different applications.
He goes on to share a broader perspective of the opportunity he sees for smart contact lenses and AR glasses. As a reminder, you can find the show notes for this and other episodes at our website, theARshow.com. Let's dive in.
One more thing before we get started. After four years of running this podcast, I've decided to open a Patreon and fund ongoing maintenance improvements that way, rather than taking on advertisers. If you've enjoyed episodes in the past, please consider becoming a patron at patreon.com/theARshow. I'm going to find ways to engage more intimately with subscribers through events such as small group conversations with me or a popular guest, or some other sort of exclusive content. patreon.com/theARshow, T-H-E-A-R-S-H-O-W. Now let's get into it with Mike.
Mike, you are a serial entrepreneur, having lived, worked, and gone to school in Silicon Valley your entire career. Will you share a story about kind of the magic of being a young entrepreneur in Silicon Valley?
Sure. So I think I was 28 years old coming out of grad school; I did grad school at Stanford. And coming out of grad school, I thought to myself, I want to start a company, and tried to put myself in a good position to do that. And we wound up starting a company that we called Solar Junction. And the whole point of this company was to do ultra-high-efficiency solar cells. For those of you who might follow the solar category, these are ultra-high-efficiency, multijunction solar cells. And something that I think is completely and utterly absurd, just absurd, about that story is that none of us on the founding team, none of us, had ever even built a solar cell before we started that company. And I remember sitting around with my co-founders, in particular Homan and Vijit, who are, you know, Silicon Valley ecosystem people now as well. And we had bought these books on solar cells, and we had them open on this big table, and we're reading them, like, to each other. We're reading them, and we're like, oh, what is fill factor, and what's Voc, and, you know, trying to learn the language of solar cells. And through the magic of Silicon Valley, somehow, miraculously, we got the opportunity to start Solar Junction. And I don't remember exactly the details, but something on the order of three years later, we'd actually set the record for the highest efficiency solar cell that anybody had ever made. And it's on the chart; you can look it up. It's an intergovernmental agency chart; different government agencies around the globe can do these standardized solar cell tests and post numbers to the chart. So Solar Junction is on there twice, I'm proud to say. So somehow the absurdity of the Silicon Valley machine, you know, sometimes allows fun things to happen like that. So there is an opener,
there is an opener. It was a reasonable bet, apparently, in the end. Yeah, yeah. In the end, right. Yeah. You were able to pull off what you had claimed to do. But what gave you the inspiration or the confidence that this was something that you wanted to devote your time to?
Yeah, yeah, that's a great question. And I don't know if it'll come across in this interview or not, but one of the things that I'm really interested in is people and their stories, and, like, this question of who are we? Who am I? Who are you? Which is, at its heart, an absurd question that has no answer. And the idea that there's truth to the answer to that question is just wrong. There's no truth to it. We are the stories that we tell ourselves. So here I am, I'm 40-something years old now, and I've had the opportunity to start a second company now, and also kind of a bizarre thing to do: augmented reality contact lenses, as you very well know. And I look back at all those experiences of my life, and I kind of put together the personal story, right, of like, who is this guy? And I'm still figuring it out. And it's not truth; it might be wrong, and it'll certainly change tomorrow. But there are a few things that I think drive me. One of them is self-doubt, to be just very blunt. I think we all have these voices that say, I'm not good enough, or, you know, you can't do this in some way. And somehow that voice for me is combined with one that says, well, I'd like to test that. I'd like to see if I'm any good at something, right? I mean, whatever it might be, right. And then there's another side, which is something that I might think of as, like, obligation. You know, I'm a very lucky person. There are many people like me, but I'm a lucky person. I have my health. You know, I've had great opportunity that I've taken advantage of in my life. And so there's this voice that says, like, well, why not me? Like, why is it somebody else's responsibility to try, you know? And then you mix that with what I've come to realize is a lot of curiosity. So I kind of watch, like, what TV shows do I want to watch? And, you know, what do I want to listen to?
What podcasts do I listen to? What books do I read? And it's almost entirely nonfiction. It's biographies and science books and documentaries. And, you know, I did a PhD for some reason. So I think when you mix all these things together, and you're in Silicon Valley, and you're, you know, late 20s, 28, 29 years old, it just seems like the right thing to do. I don't know, go for it, right? Like, why be afraid of failure? I think we did the math and we said, we're going to be stronger if we try. Like, we will grow as people and become stronger people if we try. So we thought we might as well try; we thought maybe we could do it.
For you growing up, what was the attraction to electronics and computer engineering and that sort of thing?
I actually thought I wanted to be a chemical engineer coming into school. I didn't really know. Nope, nobody knows. When you ask a high school student or someone like that, you know, what kind of person do you want to be when you grow up, or what field do you want to go into, it's, I think, relatively rare that anybody really has true direction. But I did have some direction, that I wanted to build things, and I wanted to go get an engineering degree. And as I said, I thought chemical engineering might be the right way to go, because I kind of liked my chemistry class in high school, and I enjoyed physics, and I, for some reason, thought that was the way to go. And then I took my first electronics course; it's called E40. A guy named Jawad was my TA, who I'm still friends with. And I remember, I mean, I can see the picture in my head: I was sitting in the back of the class, and Professor Kuriakose is up at the front, and he explains how a digital watch works. And, like, my mind was so blown, the picture is still etched in my brain, what my surroundings looked like. It was like an emotional experience, right? And I said, oh my gosh, electronics, that's the way to go. And by the time I got through grad school, I'd kind of walked a little bit away from pure circuit design kind of electronics and more into device physics. And by the time I really got out of school, I felt like I had more expertise in optoelectronic devices and semiconductor physics than I really did in traditional electronics and circuit design. But that's a long-winded answer to your question.
No, it's great. So optoelectronics, semiconductor physics. Yeah, I can see how those connect to solar.
Yeah. Right. It's the theory of stepping stones, right? That there's always something new that's just a step away, and if you push yourself a little, you can get there.
What was the original inspiration to pursue specifically solar? Like, why? What was the idea that you guys had, that you were going to end up being able to push the boundaries of what's possible in solar?
I think anyone who thinks that people go and plan out a startup company that they're going to do has no idea how it actually works. In my opinion, startups are pure serendipity. You happen to be kind of at the right place at the right time with some interesting problem to solve. It's less about the solution and more about the problem, and a story. And if you've got those elements, right place, right time, a story, and a problem, you have a shot at starting a company in Silicon Valley to go get after it, right? So our story, my story in particular, was: I really wanted to try to start a company out of grad school. And so I was trying to avoid getting a job, to make sure that I was in the right place at the right time, right? Because once you get a real job, there's this shift, like, I'm not going to walk away from that. So I avoided a real job; I did some consulting gigs. In grad school, you're used to living on breadcrumbs, so I could kind of piece it together. And I happened to be in the right place at the right time with some friends. And we heard that NEA in particular, which is a big venture capital fund, was interested in funding solar companies back at this time, in like 2006 or 2007. And we had, between us... Homan had done a PhD in what's called the dilute nitride system. It's a semiconductor material system that lets you tune band gaps in a certain range, if that means anything to you. And we thought, well, maybe you could combine that with some other materials that are more well known to make a really good multijunction solar cell. And there you go. So, serendipity. Homan had the materials science background, and some device physics, and Vijit and I had device physics and testing and so on. And so when we bought those books and started reading them, it wasn't a new language to us. It was a new area, a new field; there was a lot to learn. We thought it would be simple. Oh my gosh, it's hard. We had a lot to learn.
But that's kind of how it came together, a lot of serendipity, a
lot of serendipity. Ultimately, the product you created, you noted, had set two records on that solar efficiency chart. That's right. What was unique about what you'd done? Why was it so much better than what had come before?
The idea for making multijunction solar cells is not new; it's a very old idea. Fundamentally, you have different layers absorb different parts of the solar spectrum, and if you choose the layers and the spectrum portions properly, you can make an optimal-efficiency solar cell that really harvests all the energy that there is to harvest in the solar spectrum. So the concept was not new. And even the use of the dilute nitride system was not new; people had looked very extensively at that material system to make solar cells, I don't know, three to ten years before us, and they'd kind of thrown in the towel. They couldn't get the material quality of that system up high enough. And I think our realization was that Homan had just finished a PhD where he was creating the highest quality material that anybody had ever created. So we thought, gosh, well, maybe with the quality of the material now higher, maybe we have a shot at making some good devices. And that's more or less exactly what we did.
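The layer-stacking idea described here, each junction absorbing the photons at or above its band gap that the junctions above it pass through, can be sketched with a toy model. This is a minimal sketch under stated assumptions: the band gaps and photon counts are made-up illustrative numbers, not Solar Junction's actual design or real spectral data.

```python
# Toy model of multijunction absorption: junctions are stacked with the
# highest band gap on top; each junction absorbs photons whose energy is
# at or above its gap and that were not already absorbed above it.

def split_photons(photon_counts, band_gaps):
    """photon_counts: {photon_energy_eV: count}; band_gaps: top-to-bottom list."""
    absorbed = {gap: 0 for gap in band_gaps}
    for energy, count in photon_counts.items():
        for gap in band_gaps:       # the top junction sees the light first
            if energy >= gap:
                absorbed[gap] += count
                break               # photon absorbed; lower junctions never see it
    return absorbed

# Illustrative spectrum: more low-energy photons than high-energy ones.
spectrum = {0.8: 40, 1.2: 30, 1.6: 20, 2.2: 10}
print(split_photons(spectrum, [1.9, 1.4, 1.0]))  # -> {1.9: 10, 1.4: 20, 1.0: 30}
```

Note that the sub-gap photons (0.8 eV here) pass through unabsorbed; choosing the band gaps to minimize both that loss and the excess energy per absorbed photon is what the "choose the layers properly" step is about.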
And these multijunction solar cells, super high efficiency, are not the ones that we have on our roofs. They are not, exactly. Yes, yeah. Who does use them?
Okay, so the main use for them at the moment is in space power. So most of the satellites that are up there use these kinds of solar cells, because they are the most efficient. So when you start to do calculations of volume and weight, the more energy you can get out of your solar cell, the better off you are in that kind of environment. When we started the company, you may remember there was a giant solar boom in that period of time, and then it crashed later. But there was a giant solar boom, and everybody was working on this category. And one of the paths was what was called concentrated photovoltaics. So you take a big cheap lens, like a Fresnel lens, and you grab a whole bunch of light and you focus it all down to a tiny little spot, and you put a more expensive but very high-performance solar cell in that tiny little spot. So we were working on a product for space power, and we were working on a product for concentrated photovoltaics. And eventually, I think it's fair to say, the concentrated PV category kind of evaporated as the price of solar plunged in general. But the space opportunity persisted,
because it's still very much about maximum efficiency for minimum weight, not about cost. What did you learn through this experience about creating cutting edge technology that had high commercial viability?
Yeah. What did I learn? My goodness, there's so much, there's so much to learn from these experiences. I think you can put the knowledge that you take from them in a few different categories. There's a lot of technical knowledge, right, but I'll just kind of discount that and say, yeah, you know, that's just the normal food we all have to eat as engineers, learning and perfecting that category. But I think for me, what I grew to appreciate is that the difficulty of creating something new is extremely high. It's taxing, it's very taxing, to put it bluntly. Because you can't do things alone; you have to do them with a team. And creating an organization that works well, where everyone is headed in more or less the same direction, but without too much command and control that stifles creativity, and yet the right amount of rules and so on so that people know what they can do, all of these balances, right? When you start a company, it's just this blank sheet of paper from an organizational perspective, and learning how to go through the different phases that a company needs to go through as it matures was very eye-opening for me. And at this point, sometimes I'll tell people, I'll say it here, that I think the organization is maybe humanity's greatest invention. Language is something that kind of evolved, but the organization is not natural. We don't all just get together and work together smoothly. That's not what we do. And yet we've recognized collectively that our strength is so much greater when we work together, that when two people bump their heads in an organization, we're willing to work through it. So I guess, in summary, I feel like a lot of what I learned through that experience is the value of the organization, and the difficulty in growing it, and the idea, from nothing into something.
And I think Mojo Vision, for the part that has my fingerprints on it, I think I've improved in my skill set on organizational building and people leadership from the first experience to the second one. Yeah,
yeah. I love this notion that the greatest human invention is this idea of the organization. Our superpower as humans is collaboration.
That's it. And it's not natural. It is not natural. Anybody who's ever worked on a team can immediately recognize the absurdity of it. Yep.
It's hard. It is hard work. Yeah. The other thing that comes to mind as one of the greatest human inventions is the idea of faith. Yeah, this idea that you believe so deeply in something that you're willing to drag yourself through whatever is necessary, because you believe the outcome is worth it.
Yeah, that's another good one. Yeah, isn't it? Belief is a part of that organizational process. It's the glue. Yeah, common belief is an important ingredient.
Yeah, absolutely. So ultimately, you spent this time at Solar Junction, created a very successful technology company, and then you had an opportunity to move on. How is it that the Solar Junction experience led you to become a co-founder of Mojo Vision?
Yeah. So just the mechanics of that story is that after we sold Solar Junction, I stayed for a while, and then I left. And I knew that, just like seven or eight years earlier, I had tried to put myself in the right position for the serendipitous thing to happen. I didn't feel like I wanted to just kind of go out on the job hunt and see what I could do. I felt like I wanted to try again, for all those reasons I outlined a little earlier, maybe, you know, self-doubt, and testing myself, and curiosity, and so on. So I said, okay, well, let's just kind of give it some time, and let's see what I can drum up. And NEA was one of our big investors at Solar Junction, and over those eight years, I managed to make some friends there at NEA and apparently gain some respect from some of the general partners there. And one or two of them in particular offered me the opportunity to hang out there at NEA and be an entrepreneur in residence, which, I mean, talk about serendipity. You can't plan that; that just kind of happens occasionally to somebody. And it had happened to me. And I take it as one of the great gifts that anybody's ever given me, to be able to go into that environment and sit on the other side of the table, and to sit in the conversations that happen on the other side of the table, and learn directly how really great VCs think about startups and company growth and the selection process of what to invest in, and how to think about these different categories, and really bring a framework of thought to that side of the table. Right? That was something that, of course, I'd been exposed to through the process of starting Solar Junction, but not in the methodical, you know, you see one to five companies a day and you talk about them, not in that kind of boot camp way. Right. And I didn't spend all my time doing that.
I was really in search mode, but I was exposed to this kind of conversation there at NEA. And in search mode, what I tried to do is I tried to find categories and companies that I thought were interesting, either within the portfolio of NEA or outside the portfolio, with the idea that maybe I could join one, frankly, or maybe I could find one that NEA was interested in investing in, which did happen. And then ultimately, I was sitting in my desk area at NEA, and Greg Papadopoulos was walking past, and he said something about smart contact lenses. And I pushed my chair back, and I said, hey, Greg, what did you say? And that question ultimately ended up putting us co-founders of Mojo Vision together in the same room. And the rest, I guess, is the history of the last almost six and a half years.
In those original meetings that you had with the other co-founders, what was it you imagined you were going to be creating? What was the original vision back then?
It was very much like it is right now. You know, we set out to build an augmented reality contact lens, just full stop. You know, to put a display and user interface into what is today, I mean, a massive wearable, right? Contact lenses. And yet they don't have any smarts in them. So we set out to make an AR contact lens, and I think today we've substantially realized that. On an engineering level, there are practicalities of how you do that implementation that have crept in. I mean, in some of the early days, it feels like science fiction, so your imagination is let run wild, right? I mean, you can just imagine all kinds of stuff that you could do with this, and you make these big lists of all these wild ideas. And I think we've managed to implement a lot of that. But the practicalities of things have also settled in, right? You only have so much power, you only have so much space, certain kinds of problems are hard, you have to make trade-offs. The trade space is pretty unforgiving, right? So, yeah.
As you, kind of there at the beginning, looked across the set of things that went into creating a smart contact lens, AR within a contact lens, how did you begin to break down that problem?
Well, I can sit here today and try to remember how we did it then; I don't think I could do an accurate job of it. There's been so much learning from then to now. I don't think I could represent the worldview then, today; it's just what I know today. But you can break the system down into a number of functions. So there is a contact lens: it's a shape, it needs to fit the eye in a healthy way, it has to oxygenate the cornea. It turns out your cornea breathes oxygen from the air. Just think about that for a minute, for anybody who has never thought about that. We're used to our body getting its oxygen through our lungs and then delivering it through our blood supply. Your corneal tissues are not like that; they breathe oxygen from the air. So your contact lens cannot block oxygen. So there's another requirement. So there's a bunch of requirements around the contact lens itself, not to mention the fact that you have to be able to see through it. You can't obstruct the world, and it needs to correct your vision. And then you have to start to say, well, how do we put a display into it? How small can that display be? How small can we make it? How many pixels can there be? Can you do anything with that number of pixels? If you take a display, you can do this at home: you can take your cell phone, which has a display on it, and hold it up as close as you can to your eye. You've got a bunch of pixels that are within your field of view, completely unusable, right? You have to be able to focus the light from a display onto your retina, which means you need tiny little optics. Those optics have to sit between the display and your retina, to be able to grab the display light and focus it, project it, it's actually a projector, project that light onto your retina. So you have to solve all of these display and projection system problems. And then you have to say, well, okay, imagine this display is on your eye. When you move your eye to the left, the display goes with you.
When you move your eye to the right, the display goes with you. It moves everywhere you go. So your fovea, your center of vision, is always pointed at the same pixel on the display. So now imagine you put up a picture of my face on the display, and you are looking at one pixel in my face. When you look left, you don't look at my left eye; you're still looking at that same point on my face, right? So you have to solve this problem. And that means that you need a way to track the motion of the eye very rapidly, so that you can update the display content, meaning if you look to the left, the display content has to shift to the right, so that the display content feels stable in the real world. This is another category of problem: how do you do eye tracking? Then you say to yourself, well, how do I close the loop, from motion to light, in less than 10 milliseconds, in order to make sure that the imagery is stable and you don't make anybody sick? And then you have to solve for all the power problems: how do you get power into the contact lens? And then you have to solve for the data problems: how do you get data in and out of the contact lens? And I think those are most of the subsystems. And the hard part is not any one of them. They're all kind of hard. But if I told you we were going to start a company building really tiny displays, it'd be like, oh, yeah, I bet you could do that, you know, I wonder how small you could go, right? The hard part here is making a system where all of those pieces, all with their own unique requirements and constraints, can all work together simultaneously, to create an experience for a real person.
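The counter-shifting loop described here, shift the content opposite to the measured eye motion so it feels world-stable, can be sketched in a few lines. This is an illustrative sketch, not Mojo Vision's implementation; the degrees-to-pixels scale is an assumed value.

```python
# Sketch of display counter-shifting for a lens-mounted display: because the
# display moves with the eye, the renderer shifts content by the opposite of
# the measured eye rotation to keep it stable in the world. The scale factor
# is an illustrative assumption, not a real calibration.

PIXELS_PER_DEGREE = 10.0  # assumed display resolution per degree of eye rotation

def content_offset(eye_yaw_deg, eye_pitch_deg):
    """Return the (x, y) pixel offset that keeps content fixed in the world."""
    # Look left -> content must shift right on the display, hence the negation.
    return (-eye_yaw_deg * PIXELS_PER_DEGREE, -eye_pitch_deg * PIXELS_PER_DEGREE)

# A 2-degree look to the left (negative yaw) calls for a 20-pixel shift right.
offset = content_offset(-2.0, 0.0)
```

In practice this whole path, from motion measurement to updated pixels, has to close in under the roughly 10 milliseconds Mike mentions, or the content appears to swim.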
That system-level thinking is ultimately necessary for a successful device, and critical to make all of the necessary trade-offs and package it into such an environment. As you kind of reflect back on all of those subsystems, and the many challenges that you've overcome in the accomplishments you've had so far, are there one or two that you're particularly proud of?
They are all so difficult. They've all taken probably some years off my life. I don't know that there's one that rises to the top over another. But I would say that some of the interesting ones are things like, we make our own microdisplays, micro-LED displays. And when we started the company, we went out and we talked to people in the display industry and more or less described what we wanted, and got told, you know, it was impossible, or, you know, it was very, very difficult, and nobody thought they could do it or wanted to do it. So we did it; we had to do it ourselves. And that's a chip, a technology, that our company is, of course, very proud of. Another one is, can you take that light and focus it onto the retina? I remember back in 2015, I think, or 2016, when we were starting the company, DARPA, if I remember right, it was DARPA, threw a little think tank party, asking the question: if you had a display on the surface of your cornea, could you focus the light onto the retina? And they invited a bunch of professors and just smart people, right? And what really struck me there is that, of all these smart people, some people thought you could do it, but there were legitimately intelligent people who were making an argument that you couldn't do it, that it wasn't possible to do it. So I think the projection optic is another really interesting innovation, to get that thing small enough to fit into a contact lens. It's a single piece of plastic, and yet it implements, I don't remember, four-ish optical surfaces within that single piece of plastic, everything self-aligned. It's a pretty neat trick. And the last one I'll say, which sticks out, although, as I said, they all kind of stick out, is eye tracking. How do you track the eye, from the eye? When we came to this problem in 2015, if you wanted to track the eye, a video eye tracker was kind of the mainstream way to do it. But that's a camera outside your body.
Like, where does that come from in a contact lens? So we went down so many different paths. You start from first principles, and you start to think about, how do you measure motion? What are all the different ways you can measure motion? Can you put magnets in a contact lens? Can you put RF coils in a contact lens? Can you put cameras there, outward facing, to measure the motion? Like, if you can think of it, we probably had it on the list and worked on it. And I'm really proud and happy to say that today, I think we've built maybe the world's best eye tracker. Certainly up there; maybe there's one other way to do eye tracking that is as good as we can do it. But yeah, it's a really great solution to the problem. That
eye tracking solution is dependent upon being on the surface of the eye, I imagine.
That's right. We turned that kind of, oh my gosh, how do you do this, into a really strong advantage. Turns out that if you can put instrumentation directly onto the object that you want to measure, things get easier,
actually, that's very cool. So you're creating your own microdisplays, you have created a really complex and incredibly tiny optics system for the projection optic, and you've solved a really challenging problem around eye tracking. Is there a camera in the mix also?
There is a camera. We have another story, not unlike the display chip, where we went out to the camera community and we said, we want a camera that has this many pixels and it has to fit in this kind of power budget. More or less we were told, yeah, you can't really do that, that's too low power. And so we had to do it ourselves. So we designed our own image sensor. It's roughly 256 by 256 pixels; we published an ISSCC paper a year ago or something. And the chip itself, if I remember right, runs at 60 microwatts, 60 to 80 microwatts depending on the mode, which is a pretty good trick. So we did build that camera system. The first use case for that, the reason we built that camera, is for helping people with vision impairments. So we have this concept where the video coming off of the camera, that data never leaves the lens. So, you know, you're not going to take pictures of somebody in your life with this camera. The data never leaves the lens; it goes through an edge detection algorithm on the lens and is immediately shown on the display. And then, you know, the data is gone, right? It's just an instantaneous transfer over to the display. But what that lets us do is it lets us outline the edges of the things that you're looking at in the real world. So if you're a person with a visual impairment, with a problem that results in low-contrast vision, for example, seeing the edges of things in the real world, no matter where you look, no matter where you point your eyes, right, you're going to see this content instantly. This is a feature that can be very powerful for that community. So that's the reason we created that image sensor and put it in. And that's a system that we've built, in addition
to the folks that you just described, the folks that have low vision where this sort of system is life changing. Is there another group that you imagine as an early adopter of this technology?
Yeah, absolutely. I'll just say straight out, what we're thinking right now is that it's athletes, active-lifestyle-type people. I'm going to drill into that in just a moment. But if you step back from the system and you look at what we're really creating, it's a general computing platform. We're going to be able to show you your messages and your turn-by-turn directions, and anybody who's seen one of our demos can see some of that in action. Our belief is that when we approach a new market with a new product, telling people it's a general computing device, it's kind of like, well, what does that mean? What does it do for me, right? As part of our company's growth story and growth arc, we do not have to be like Apple and sell 100 million units in the first 12 months or else we've failed. So we have the opportunity to start with a little bit narrower messaging, so that we can identify a general group of people that we think we can help, and they can identify themselves as potential customers for us and people who want to use this product. And then over time, these circles of users can grow, until eventually it's just a generally accepted thing that a contact lens can be smart and it can do a lot of things for you. So when we think about those early users, we focus more on what we can do for that category of person that's unique. So a contact lens cannot jiggle off your face as you run, for example. A contact lens does not fog up or obscure your vision in the rain, for example. Our entire user interface is created and designed to use only your eye motion. Because we have such a fantastic eye tracking system, we can use the motion of your eye, and where you're looking in the digital world, to click buttons and do user interaction. So you don't need your hands.
If you're riding a bike and your hands are full, if you're walking with your groceries and your hands are full, if you're lifting weights and your hands are full, you don't need to talk to yourself, you don't need to click a button, and you don't need to wave your hands. You can interact with the system solely by using your eyes. So some of these unique opportunities that the form factor provides have allowed us to identify some of these early categories where we think we can win. If you are somebody who rides your bike a lot for exercise, this is a product that can really help you with hands free, eyes up information.
For those that don't have vision impairment, what is the impact of having the display directly in the center of their fovea? Normally that's where we get all of the high resolution input into our eye. How is it that we're able to still see the world in high resolution while also having this digital overlay?
Okay, so here's the thought experiment. Most of your audience, I think, are engineers and product managers who may be familiar with the Hubble Telescope, which is in the general category of Cassegrain telescopes. If you look at the entrance aperture, the front end of that telescope, there is a circular light blocking piece right at the entrance to the Hubble Telescope, and yet it's still the Hubble Telescope, right? I mean, up until the James Webb, it was taking the best pictures of the universe that humanity had ever created. So how does that work? Well, it turns out that you can apply the same principle to your eye. If you imagine an eye and it has a pupil, that black circle in the middle of your eye where the light goes in, and you put a small blocking feature in the middle of that entrance aperture, provided it's small enough, it will not obscure the focus. All of the information that's carried in the light rays still goes around that blocker and forms a nice image on the retina. Now, what will happen is that blocker will reduce the amount of light that gets to the retina. So it will dim the image, but only by a little bit, roughly 10% in the case of our contact lens. Now, your eye is a logarithmically sensitive device for light. So to put it in perspective, 10% is almost nothing. Uncoated pieces of glass, like the windows that you might be looking at or the glass on your computer screen, reflect or block eight to nine percent of light. So a 10% block is really not that big a deal. So yes, we don't obscure the real world; you can still see it perfectly well. It dims the image by a relatively inconsequential amount, but the focus is still there, and nothing is blocked.
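For listeners who want to check the arithmetic: the light lost to a small central occluder is just the ratio of its area to the pupil's area. Here's a back-of-the-envelope sketch in Python; the specific diameters are illustrative numbers of my own choosing, since the exact dimensions aren't given in the conversation.

```python
def blocked_fraction(occluder_diameter_mm: float, pupil_diameter_mm: float) -> float:
    """Fraction of incoming light blocked by a small circular occluder
    centered on the entrance pupil: the ratio of the two areas, which
    reduces to the square of the diameter ratio."""
    return (occluder_diameter_mm / pupil_diameter_mm) ** 2

# Assumed numbers: a ~1.3 mm central feature over a 4 mm pupil blocks
# roughly 10% of the light, in the ballpark of the figure Mike quotes.
print(f"{blocked_fraction(1.3, 4.0):.1%}")  # 10.6%
```

Because the occluder is at the pupil rather than at an image plane, it removes light uniformly instead of blotting out part of the scene, which is why the image dims slightly but nothing is hidden.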
Eyes are amazing devices in our bodies. They're really fun. So you had hinted at this ability, based on your ability to track the eye, this idea of having a completely hands free, voice free user interaction model, right? Can you describe that model in a little bit more detail? What's kind of unique and special about what you're creating for interacting with the content through the lenses?
We are not the first ones to think about using eyes as a user input method. I'm not sure who the first was, but you can think back to people who are completely paralyzed from the neck down. How do they communicate with other people? They can't use their hands; some of them can use their mouth, right? And there are systems where they can type with their eyes. So the idea of a user interface using eyes as the control process is not new. I think what we did, along with most all of the other problems, is we had a very unique requirement. Our requirement was: when you wear this product, you should be socially acceptable. The whole idea here is to take our devices and more or less make them go away, so that we as humans are in a little bit more natural environment, with fewer screens and fewer devices in our eyes, and attention is more back up into the real world. In order to do that, you need a socially acceptable way of interacting with these devices. So we had a very hard requirement from the beginning that while we would support voice interaction, for example, it could not be the primary modality for interaction. We had the same idea around buttons. I mean, how do you click a button on a contact lens? We were basically just forced into this through our own requirements and the tools that the contact lens provides. And over and over again, we found that unique requirements create new opportunities for creativity and solutions. It's not that we're smarter than anybody else. It's just that we took a different view, if you will, on some of these problems. So what our user interaction software teams have now spent years on is figuring out how to make those user interactions effortless and easy.
And we test this not just with the contact lens, but also in development environments where we have external eye trackers and VR goggles and other infrastructure that our teams can use to develop these modalities. But it turns out that it's really compelling to use your eyes. It's fast, and you do not make errors. You know, sometimes people think, oh, if I look at this thing, something's going to happen, and I didn't mean it to happen. There are methods you can use to avoid that kind of false click of a button. There are other things you can do that don't require you to stare at an object for five seconds and kind of tire yourself out. So there's tuning of all these parameters, and creativity around the user interface, that can make for a very compelling experience.
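To make the "false click" problem concrete, here is a minimal sketch of one widely used technique, dwell-based selection, where a click fires only after the gaze rests on a target for a short threshold. This is my own illustration of the general approach, not Mojo's actual implementation; the class name and threshold value are invented for the example.

```python
class DwellSelector:
    """Fire a 'click' only after gaze rests on one target for dwell_s
    seconds. Requiring a stable dwell avoids accidental 'Midas touch'
    clicks, while a short threshold avoids the tiring multi-second stare."""

    def __init__(self, dwell_s: float = 0.4):
        self.dwell_s = dwell_s   # assumed threshold; real systems tune this
        self.target = None       # target currently under the gaze
        self.since = 0.0         # timestamp when the gaze landed on it

    def update(self, gazed_target, now: float):
        """Call on every eye-tracker sample; returns the clicked target or None."""
        if gazed_target != self.target:
            self.target = gazed_target   # gaze moved: restart the dwell timer
            self.since = now
            return None
        if gazed_target is not None and now - self.since >= self.dwell_s:
            self.since = now             # fire once, then re-arm
            return gazed_target
        return None

sel = DwellSelector(dwell_s=0.4)
print(sel.update("heart_rate", 0.0))  # None: gaze just arrived
print(sel.update("heart_rate", 0.2))  # None: still dwelling
print(sel.update("heart_rate", 0.5))  # heart_rate: dwell threshold met
```

Tuning `dwell_s`, and layering on tricks like confirmation targets, is exactly the kind of parameter tuning Mike describes the team spending years on.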
What was the milestone recently that the company had a chance to announce?
Yeah, when we started working on this, we set our goals to be, I wouldn't say small, because we were so new and it was a very small team, but on the simpler end of the spectrum, if you will. We weren't trying to prove out big systems; we were trying to prove out one piece of the puzzle at a time. And what's really fun is that now, six years into it, we actually have built what we think of as the real prototype of the product. So the prototype now has battery power, and the user interface and the applications I was just telling you about, the display with the micro projection optic, the eye tracking, the radio. It's all there, and working together, not to mention all of the aspects around clinical fitting of lenses, and documentation and quality systems to enable human wear, and so on. So the milestone here recently is to take that prototype of the product and have the first human put it on their eye and use it. And we crossed that threshold in June, so roughly a month or a month and a half ago. That was a really, really, really big deal. Because if there's a question, can you do it? Is it really possible? Can you really make an AR contact lens? Does the universe allow such a thing? You can put a big check in the box: yes, the universe allows it. So from here, what we do is more people will be wearing that, and we will get more learning out of that system. And over the course of the next, you know, six months, or maybe the year, but certainly the second six months, we'll make reasonable improvements to that platform. All of that learning and building of the platform and wearing it really lets us understand what we need to do and change to spec the real product, and build the next version of this platform, which will be the first product. So yeah, a really, really, really exciting time for us to reach that milestone.
And I'm excited to get more people wearing that and get their feedback and learn.
Congratulations. That's an amazing milestone. Thank you. You'd hinted earlier in the conversation that there are practical elements of making this ultimately a commercially viable product. What are some of those hurdles? I know there's still a bunch of learning to be done over the next few months, but based on what you can see today, what are some of those major hurdles that remain in getting to a commercial product?
Yeah, it's a great question. I'm not sure I could, even if I wanted to, outline in gruesome detail all of the different things we want to do; I might steer clear of some of that. But I don't think I could outline all of them, because part of the process is to learn from this current prototype and use that process to spec the next product. But some of the things that we're of course dealing with are really tradespace kinds of questions. So they're things like: what applications do we want to support in the first product? How do the use cases and use paradigms around those applications impact things like battery life? I'll give a very obvious example. Watching a two hour movie is probably not the best use of the first product. I would like to think that there will be a day where, if you used our product, you would be able to close your eyes, because the contact lens goes under your eyelids, so you still see all the content. I would like to think that there's a day where you can close your eyes and watch a movie. But that will not be the first product; the tradespace is not the right tradespace. So we have product level application decisions to make. We also want to improve general energy efficiency across the board. For the chips and systems that we make in particular, we're looking at how to improve the energy efficiency to extend battery life. We will likely go through another cycle of battery shape and size development to better optimize the size and shape, and utilize the volume that we have in the lens. But there are other things that are looking very good. For example, our display chip: at this moment, you know, we have yet to sit down and spec everything out, but the display chip looks like that's done. We have a display chip. So there are other components and subsystems where we think we have it, and it's not going to change, or it's not going to change in a meaningful way, as we go towards product.
As you think about the minimum necessary runtime: there's this whole notion that this device is the ultimate wearable device. There's so little space to cram a battery into this thing, and there's such high demand, I guess, on making every subsystem as power efficient as possible in order to extend that battery life. And, you know, at some point the vision is to be able to close your eyes and watch a whole movie on one of these things. What do you think is the minimum necessary runtime in a commercial product?
Yeah. So to directly answer the question, and then color it in a little bit: we think two hours of continuous use, meaning if you were to engage the system, bring up content, and interact with that visual content constantly for two hours, that's probably the minimum viable product. Now, no one will use the content in that way. So imagine that you're on your bike, or you're running, and you want to look at your heart rate. You are not going to stare at your heart rate continuously for two hours straight. You know, the use modalities for this kind of system are gonna be very much like a mobile phone or a watch, where you have a session of engagement, and then it's off, and then you have another session, and then it's off. And when you think about it this way, a system that runs for two hours continuously is able to be on for 25% of the time over an eight hour period. Or you can do the math and scale it appropriately. But those kinds of numbers are actually quite high; your mobile phone is probably not on for 25% of the time in your day, right? So we think two hours of continuous use, something in that zone, is the minimum for first product viability. But when I look at our architecture and what's available, and kind of the starting point versus what's possible, we're on the right track here. The power requirements for compute come down all the time. Our first product is not accessing the absolute state of the art in low power compute, for example. We do a pretty good job, but if you look at the benchmarks out there, we're not designing chips that are at the benchmark of absolute low power compute design. So I know that we have room to improve on the power consumption side. And I also know that when we look at the total volume available in the lens, how much of that total volume contains batteries? Not very much. So I know that, as we continue to earn the right to continue,
and we can have a second product and a third product, the wind is at our back. The roadmap is there to improve energy consumption and improve total energy available in the lens, and to do all of that while maintaining safety. We have a very large safety margin on all this. We have, you know, so many requirements, right? But that's one of the requirements: this product cannot hurt you, period, end of story. Even in some crazy failure mode that we couldn't think of, there just isn't enough energy in the lens to cause that to happen. And that's been one of the requirements from the beginning. So anyway, in summary: two hours of continuous use for the first product, we think that's the threshold. And when you look at the roadmap, and what we're leaving on the table in terms of opportunity, there's a lot of opportunity to optimize both consumption and total energy in the lens. So I'm confident the future's bright for those metrics.
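The duty-cycle arithmetic Mike walks through is simple enough to sketch directly, using only the numbers he states in the conversation:

```python
# Duty-cycle math from the conversation: a battery good for 2 hours of
# continuous display use, spread as short sessions over an 8-hour day.

continuous_runtime_h = 2.0   # stated minimum viable continuous runtime
day_length_h = 8.0           # the wear period Mike uses as an example

duty_cycle = continuous_runtime_h / day_length_h
print(f"{duty_cycle:.0%} on-time over an {day_length_h:.0f}-hour day")
# 25% on-time over an 8-hour day
```

As he notes, 25% on-time is a generous budget for session-based use, since a phone-like pattern of brief engagements rarely keeps a display lit for a quarter of the day.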
Great, that's fantastic. Something I should have asked earlier, but is there a companion device that's ultimately necessary to facilitate the data transfer to the lenses?
There is. We're not taking your whole iPhone and pushing it into your eye. So currently, today, the contact lens talks to a companion device, another wearable that you have to have with you, and that's where the compute lives. That's also where the other network interfaces, like Wi-Fi and so on, live. And down the road, we certainly have the ambition to merge that with whatever the day's standard computing device is. Today, that would be a mobile phone; maybe in the future it's something else. We certainly would like to merge that. But at the moment, we need control over that compute cycle, so we have to build that device ourselves, and it's another wearable.
Got it. Can you share a best guess, maybe, at the timeline when you expect to, or hope to, expand the trials to larger scale audiences, larger groups?
You know, in the very early days of the company, we approached the FDA and outlined what we were endeavoring to do. And I think we all see the opportunity for this kind of wearable to make an impact on human health. As I said earlier in our discussion, it's a massive wearable: so many people put contact lenses on every morning, and none of them are smart contact lenses with the ability to sense anything or provide us feedback, right? So the FDA definitely recognizes the great opportunity to impact human health over the next 10 to 20 years, not only with what we can do with the AR type product for people with vision impairments, but also what is the next product, and the next product. You know, if you can fit a radio and battery and sensors in a contact lens, what else do you want to do? So that excitement led them to accept us into their Breakthrough Devices Program. And the Breakthrough Devices Program is one where we and they can exchange thoughts on different topics with each other over time, so that you avoid the typical interaction model, which is: a company goes off, spends years figuring stuff out, creates a ton of documents, and then one day they get to the FDA, they push all these documents over to the FDA, and they're like, what do you think? And then you get to find out whether the FDA will accept or reject your product, or give you feedback, and what the rest of the process will be. So the Breakthrough Devices Program is there to help build mutual understanding between us and the FDA, so that when we get to that moment of pushing all those documents across the table, we both have a pretty good idea of how this is going to go. That's great. What a great program. It really is. I'm very thankful to be a part of it.
As it relates to achieving this vision, and continuing down this path of continuous innovation and system engineering and all the trade offs and advancements that you're making: who or what in this industry causes you the most concern over, say, the next year, year and a half? A very short term timeframe.
I'm not sure it's really that short term, but we want to see other companies in this category win. So maybe we're unique in that, I'm not sure. But because our form factor is so different, and essentially everyone else who is in the category, who can put the label on themselves saying we're an AR company, or even a VR company, everyone is working on glasses. And we, to the best of our knowledge, are really the only ones giving the augmented reality contact lens platform a run. It doesn't mean that you can't view any of these companies as competitive, but we don't really look at the world through that lens. We see them as making all the boats rise together, by setting consumer and enterprise expectations for what AR can do, by creating the cloud platforms and data platforms that we're all going to want. Yeah, so I think I'm just rooting for everybody, frankly, to make winning products and to keep improving all of our technologies.
How do you think the glasses and the contact lens technologies will coexist? Or do you think that the contact lens will ultimately come to dominate?
I think there's a problem with the question, actually, I mean, very respectfully. No one asks that question about the corrective contact lens and corrective glasses worlds. It just doesn't exist, and it's because that question really doesn't make sense. People wear glasses for one set of reasons. People wear contact lenses for a different set of reasons. They are two different products; just because they both correct your vision does not mean that they're competitive. Think about all of the sports players that you can imagine. Imagine a basketball game, and think about all the people on the court. How many of them are wearing glasses? Zero. But statistically, you know that some of these athletes need vision correction. They're wearing contact lenses. Anyway, I don't think glasses win or contact lenses win; they have different value propositions. And that's exactly what we're looking to leverage in our first product and in our approach to the market: to put forward our product as being able to do unique things for our customers that glasses will very much struggle to do.
It is in its own space, something like that. Yeah. As you kind of reflect over the last, what's been six, seven years now at Mojo Vision, how has your perspective on this general category of augmented reality evolved or shifted in that time?
So just like the solar company, I came to this company very fresh. I hadn't worked in the augmented reality category before. And so the evolution over the last six and a half years of the category is, in some ways, also the evolution of my knowledge and learning of all of that. And of course, we all know that there was also a big push for AR and VR in, I think, the late 90s, and maybe there was another one before that; there have been some cyclical pushes in this category, right? I think what I've come to realize is that it's hard. It's hard because it's about creating systems, and it's not about a single technological innovation. And those systems span not just the physical domain, where it's like, how do I make it smaller or lower power? They span product choices. They span into the kinds of data that we can serve up, and the reasons people would want to buy these things. We can look at Google Glass for learning, some of the ways they positioned that in the early days, and where Google Glass is today, and those learnings about what these systems can really be good for. It's not one size fits all, right? I mean, different systems have different applications. So I guess I have a much greater appreciation today for the complexity of creating a new system that really solves problems for anyone. Pick a topic and solve a problem, right? The complexity involved in that. However, I've also come to learn just how awesome and transformative it can be when we figure it out. All of that learning about what's important, and how do you make it smaller, and how do you make it lower power, and what's it good for, and how do you approach these different market segments, what do they need? All of that learning, when it finally does come to fruition in some winning products, it is going to be a very, very big deal.
Absolutely. Now, let's wrap with a few lightning round questions. All right, picking up really on the last one here: what commonly held belief about AR, or even VR or spatial computing, do you disagree with?
Ah, okay, disagree might be too strong of a statement. But I have a belief that simple information, given to you at the right times, is immensely powerful. Immensely powerful. And that, as we again look for the starting products that are winners, the idea that we have to go into full awesome mode, where the digital world and the real world are combined, and you've got, you know, digital avatars and things you can't distinguish from reality: that is a very interesting future. And perhaps one day that future will come to pass. But I don't believe we have to achieve that in order to unlock the value of what we're all working on. Simple information at the right time is extremely powerful.
And that's enough to drive the industry forward to mass adoption? I think so. Yeah. Besides the one you're building, what tool or service do you wish existed in the AR market?
You know, everything we do is really so bespoke, because the problems we have to solve in the contact lens are just so fundamentally different. The solutions that we need, we've had to build. So I don't spend a whole lot of time thinking about those ecosystem level questions, because we just can't access most of them. I think where we touch those surfaces is in cloud systems and cloud data. The big tech companies are building incredible systems to serve up and deal with all the data that the AR systems that are coming are going to need. And yeah, we'd love to access that data too. So that could be something I wish we had. Sure.
What book have you read recently that you found to be deeply insightful or profound?
I'm reading one about fungus right now, which I think is pretty good. And there's another one; I'm sure a lot of your audience has heard of it. It's called Lifespan, by a guy named David Sinclair. I think that one has had a pretty big impact on me over the last couple of years since I read it. I enjoy reading books a lot, actually.
What's the name of the fungus book?
That book is called Entangled Life; that's the fungus book. Oh, here's another one: We Are Multitudes. Yeah, that was another one I just read, about the microbiome in your body. And that book was fabulous. Absolutely loved it. The complexity of our bodies, and how little we have understood, and yet how much is becoming open to us now, is really great. Yeah, there are a lot of good books. I mean, I don't know, I pulled up my book list here. I could just ramble on about a whole bunch of them. But
That's great. Something that I had learned very recently, maybe I'd heard it once before earlier in my life, but it came back: that fungus is not a plant, it's not an animal, it's its own category of thing that has a very unique makeup and composition, that kind of split from whatever this tree of life is that we all belong to a very, very long time ago. And it's just a fascinating other area.
Super fascinating. And one of the things that I think is really profound to think about, and I sure hope I get it right here, so if there are some fungus experts out there, please correct me in the comments or something. One of the things I think is really fascinating is: because it goes back so far in time, it has had to interact with all forms of life, from those early days to today. And as basically a cellular organism, where its boundary to the rest of the world is a cell membrane, more or less, it has had to evolve all kinds of defenses and capabilities to deal with the changing climate, landscape, and evolution of all this other life, and all of the new things that it has created. So when we look to new drugs or therapies for ourselves, part of the reason you can look to fungus for clues or compounds is because it's been there and dealt with that before, whatever that is. It had to have a solution to it, because it's been around so long and had to interact with all of us other creatures in so many different ways, in our bodies and outside of our bodies, right? Yeah. So it's learned a lot, and we can learn from that.
That's amazing. If you could sit down and have coffee with your 25 year old self, what advice would you share with 25 year old Mike?
Settle down. Settle down, man. You don't know everything, and it's okay. I don't know if I was cocky in a way of being abrasive to other people. But if I look at what I thought I knew, and what I thought I could do at 25...
Man, you could do anything. Yeah.
I mean, yeah, I had a lot to learn. I still do, right? But that's what I'd tell myself: a little humility, man.
That's good advice. Any closing thoughts you'd like to share?
I'm really, really thankful to be here. So thanks for the opportunity. And maybe a parting thought would be: smart contact lenses and AR contact lenses are a real thing, and it's coming. Eyes up.
Eyes up. Love it. Where can people go to learn more about you and your work there at Mojo Vision?
Mojo.vision. That's the website; you can hit that. There's a bunch of media articles written about us. From time to time we work with the press community to help put out our message, so you can find some articles out there on us, and they're generally all quite good, I think.
Amazing. Mike, thanks so much for this conversation.
Thanks, it was a lot of fun.
Before you go, I'm gonna tell you about the next episode. In it, Matt Miesnieks returns to the show. The former CEO and co-founder of 6D.ai is working on a new project that melds the real world with the virtual at Living Cities. In part one, Matt and I catch up, and he shares the evolution of his perspective after 6D.ai's sale to Niantic. In part two, his co-founders John Gaeta and Dennis Crowley join for a deeper dive into the new company. You may know John from his time creating special effects for The Matrix movies, and you may know Dennis as the co-founder and CEO of the location intelligence apps Dodgeball and Foursquare. I think you'll really enjoy the conversations. And please consider contributing to this podcast at patreon.com/thearshow. Thanks for listening.