The AR Show: Bobak Tavangar (Brilliant Labs) on the Power of Combining Open Source AR Hardware and Multimodal AI
5:19PM Dec 5, 2023
Speakers:
Jason McDowall
Bobak Tavangar
Cayden Pierce
Keywords:
hardware
ar
technology
device
ai
monocle
life
company
world
glasses
open
notion
open source
cases
work
folks
find
industry
team
essence
Welcome to the AR Show, where we dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is with Bobak Tavangar. Bobak is the co-founder and CEO of Brilliant Labs, a company creating an open source AR glasses hardware platform and pairing it with powerful generative AI models to help you engage with the real world.
We had a high degree of confidence placing our bets on low-power, constrained hardware that would interface with very capable AI models, that is, some mix of on your device and up in the cloud, and making that the technical vector that would ultimately carry augmented reality to become a daily useful experience in our lives.
Bobak studied at George Washington and Georgetown Universities before completing a graduate degree at Cambridge University. He's a serial entrepreneur with a core theme running through his entrepreneurial experiences: how to create better ways for people to find and utilize information based on the context of where they are and what's around them. Bobak also worked for a while as a program lead at Apple. In this conversation, Bobak covers a number of topics, including dating advice from physicists, finding the essential essence of a product, his rationale for developing open source AR hardware, the powerful combination of generative AI and augmented reality, the allure of pursuing a broad set of use cases versus being laser focused, recent fundraising success, where the company is going, and more. As a reminder, you can find the show notes for this and other episodes at our website, theARshow.com. And please support the podcast at patreon.com/theARshow. Let's dive in.
Bobak, physicists are not usually known for their dating advice, but you seem to have found an unlikely exception. What was the situation you found yourself in?
Well, I was a graduate student at Cambridge University, and the college I was attending, Caius College, has a formal dining hall tradition multiple times a week. You have to show up at formal hall in your gown; the graduates sit upstairs, the undergraduates downstairs, and when the fellows pile in, everyone has to stand up. I was fortunate that several times over my time there, Professor Stephen Hawking, the renowned physicist, wheeled in with his accompanying nurse to the portion of formal hall where the graduates were sitting upstairs, and he sat right next to me. That happened several times, and I was very, very fortunate, and a little starstruck, as you might imagine. At the time, of course, you could ask him about physics and the future of science and technology. He had this really interesting setup that would pick up micro-movements of his jaw, or something like that, and dictate what he was intending to say onto a computer screen, so you could converse with him that way. But at a certain point, I felt like, probably everyone is asking Stephen Hawking about physics and the universe. How many people are asking him about other very important areas of life, like love and dating? I had been getting to know my now wife, so that topic was top of mind, and he had a lot of experience in that department, so I decided to ask him about it. I asked him, what should I look for, Professor Hawking, in the woman that I want to marry? And his response really struck me. He said, find someone whose very being makes you smile, and once you find them, swiftly close the deal. I don't think he used those words, "close the deal," but swiftly tie the knot.
And I took his advice to heart, and that's exactly what I did. About a year and a half later, I closed that deal: my wife and I got married. And it's in part thanks to Stephen Hawking.
That's really incredible on a number of levels. As a physics fan myself, I'm in awe that you had opportunities, plural, to speak with Stephen Hawking. That's amazing. But I love that you took a different tack and asked him about dating advice, love advice. How did you interpret this idea that you should find somebody whose very being makes you smile? What did that mean to you?
Quite mystical, actually. When he said that, what popped into my mind was how someone presents themselves as they are in that moment. Because as people, we are always evolving, always becoming, but as they present themselves in that moment, seek to find beauty and contentment in who they are. That's how I interpreted it: don't seek to impose what you wish they would be or how you hope they would change, but instead try to see that beauty in that moment, and never lose sight of it. It actually helped me reorient how I thought about a lot of things in life. It's kind of this timeless advice he gave.
It's interesting. It influenced your daily life, or helped inform a relationship that you already had in place, and you continued to evolve that relationship; as you noted, you ended up marrying the woman you were dating at the time. But it may have had further-reaching impacts, not just on your personal life but also on your professional life. How did you carry this forward, maybe not just that one conversation but the experiences you had there at Cambridge, into the early stages of your career?
I mean, if we just work with that insight, there's this notion of an essential essence of something. Whether that was in future companies that I would start, or my time at Apple and appreciating Apple's philosophy on products and markets and consumers, or even the work that we're doing right now at Brilliant Labs, this notion that there is an essential essence of something, and putting in the hard work to really refine down to that essence, is a very nebulous concept, but it's something that drives a lot of my thinking. Certainly, to tie it into the present day, it's how we think about product, how we think about AR.
The core essence of the thing. So along the lines of this notion of truly understanding the essence of a product, what it really is at its core, it's such a fascinating and deep insight into any sort of product-oriented thinking. And out of school, you went off and started your first startup, then spent a number of years building a new type of search engine. How did this notion of finding the essence of a product inform what you were building at that first startup of yours?
At the time, we were really preoccupied with this very core human impulse of search: what it means to search with all of your senses, what the process of learning looks like. We tried to whittle down to that essence and look at it from that standpoint, and then understand, okay, when we human beings want to search and understand something, we don't just want to understand it in isolation, we want to understand it in context; that context really matters. We looked at the search landscape at the time, and to a degree it's still this way: there's very little context brought to bear on ten blue links. So we were really committed to applying various technologies, from computer vision to graph theory and graph search, to that problem, to try to elevate not only the thing that you might be looking for, but the context around that thing, and how you might be able to traverse knowledge, to really go on a journey through knowledge and not just find the thing. That was the essence we were homing in on at that point. And of course, that carries into the present day with Brilliant Labs. We felt like AR was searching for a reason to exist when we started the company. We thought deeply about this, and we felt that something that is about your person, seeing and hearing the world as you see and hear it, should be sort of like your copilot, not to overuse a term that we're hearing a lot, but it should be a second mind that is understanding the world as you navigate it, and elevating useful context, useful insights that would help you more effectively, more efficiently, more delightfully experience life.
So this was your interpretation of what AR needed to be. You noted that the industry was kind of searching for its own meaning in some way, for its real purpose, and you set out to begin to solve that problem. What was the motivation to go and start a company in this space? What did you see in the market? You saw this abstract opening at some level, but what was the specific thing you thought was missing, based on what was happening at the time you started the company, which was what, 2019? So about four years ago?
That's right, about four years ago. My co-founders and I got together and did the usual early-stage thing: notes on the back of a napkin, 3D-printed prototypes, sketches, a lot of great brainstorming, and even some early testing of prototypes. Where we landed was that back then, and still to a large degree today, much of what people consider to be AR was driven by a vision of graphical intensity: whales flying through the sky, or a T. rex dancing on the table in front of you. Intense, fantastical graphics. It's kind of like a game console projected in front of your eyes as you navigate the world. The problem with that, leaving the use-case piece aside for a moment, is that you need incredible compute and battery, and the complexity of electronics and optical systems to support a wide field of view, to deliver an experience that's going to last a few hours. It's forcing physics, and it just does not exist, certainly back then and still today. I think that's why, despite billions of dollars put into trying to force these physics, we still get a couple-thousand-dollar helmets that sit on people's heads that very few of us are actually using, if we're honest with ourselves as an industry. So we looked at this and we said, this cannot be why augmented reality ought to exist; this cannot be the existential purpose of these technologies in our lives. And so we committed to a different path.
And what was the different path that you committed to? Within that broader opportunity you saw, what was the problem that you chose to initially solve?
We looked laterally at another very exciting, up-and-coming area of technical research, and that's artificial intelligence. Back in 2019, we were very heavily focused on computer vision, and how even simple compute, using what they call tinyML, heavily compressed machine learning models that sit on constrained, low-power hardware, working with a camera, can help you perceive and understand the world around you and recognize things, to elevate that context and help you navigate life in a very fresh, interesting, useful way. You didn't need overly complex, bulky hardware; you didn't need hardware that would cost a couple thousand dollars; and it could, dare I say, be crafted to look like a really beautiful pair of glasses. So we set about on that path. Of course, through certain connections of ours, we were aware of work happening at OpenAI and other companies that were working on large language model technology, a lot of this generative AI that we're now all excited about. And so we had a high degree of confidence placing our bets on low-power, constrained hardware that would interface with very capable AI models, that is, some mix of on your device and up in the cloud, and making that the technical vector that would ultimately carry augmented reality to become a daily useful experience in our lives.
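The split described here, cheap always-on inference on the device with escalation to a larger cloud model, can be sketched roughly as follows. This is a toy illustration under stated assumptions, not Brilliant Labs' actual pipeline; both "models" are stand-in functions.

```python
# Toy sketch of the tinyML-plus-cloud split: a small on-device model
# scores every camera frame cheaply, and only frames that look
# interesting are escalated to a capable (slower, costlier) cloud model.

def tiny_model_score(frame):
    """Stand-in for a compressed on-device classifier (tinyML-style)."""
    return sum(frame) / len(frame)  # pretend mean intensity ~ interestingness

def cloud_model_describe(frame):
    """Stand-in for a call out to a large cloud-hosted model."""
    return f"cloud description of a frame with mean {sum(frame) / len(frame):.2f}"

def process_stream(frames, threshold=0.5):
    """Run cheap inference on everything; escalate only rare frames."""
    descriptions = []
    for frame in frames:
        if tiny_model_score(frame) >= threshold:
            descriptions.append(cloud_model_describe(frame))
    return descriptions

frames = [[0.1, 0.2, 0.1], [0.9, 0.8, 1.0], [0.0, 0.1, 0.0]]
print(process_stream(frames))  # only the bright middle frame goes to the cloud
```

The real trade-off lives in the threshold: set it low and you burn battery and bandwidth on cloud calls; set it high and the device misses useful context.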
Let's unpack this in both directions. One element is the hardware piece. As you highlighted, hardware is hard, and despite billions of dollars, we're still not quite at the ability to deliver a device that people will wear for any extended period of time. The other is that you're taking advantage of thin edge compute, thin clients that rely on something like tinyML to do something small and useful on that piece of hardware, but also take advantage of larger models and more capabilities off device. Maybe we can start on the hardware side. What is it that you've created to begin to embody the physical side of the experience you're creating?
Our first device to market, Monocle, launched back in February of 2023. We decided that from a form-factor standpoint, it should dovetail with glasses, existing hardware that many of us put on in the morning when we wake up and take off at night before we go to bed. That was sort of the half step that was important for developers, for the industry, for researchers, to begin to understand these technologies in a way that didn't feel too aggressive, like it wanted to suddenly take over our glasses and be omnipresent in our lives. So we started with Monocle very much as a dev kit. It's totally open source, from hardware schematics to the firmware that runs on device to higher-level software abstractions; all up and down the stack of technologies, we've open sourced it and put it up on GitHub. And we're engaging with a community of folks who are building, and dare I say even hacking, Monocle to apply it to a whole range of different domains. Going forward from there, we do see an opportunity to make a beeline for glasses: a device that is at once open source and at the same time highly affordable, it will not break the bank, with prescriptions baked into the AR optical experience. We do see an opportunity to deliver a device like that; it would be a first in the industry. And interfacing with a cloud-based large language model would be, I think, the use case that really makes it daily useful: to simultaneously translate whatever you hear, to make sense of what you see no matter what it is, to reconfigure and generate images based on what you see and share those with those that you love. I think it is kind of this big step toward our coexistence with very powerful artificial intelligence.
This notion of the half step, I think, is a really interesting and powerful one. I haven't really seen any other company take this sort of approach, where you put something out there that is meant to be, as you described it, hackable: everything about it is open source, including the firmware, and your initial user base, this group of developers, can open it up, play with it, experiment with it, learn it, and help extend it to the extent that's helpful for you or for them. It's a really interesting and powerful approach to building out this community of activists and tinkerers working alongside you. What sort of reception have you seen since you released this initial half step, as you call it, the Monocle device?
It's been kind of an amazing journey. What I can tell you is that when we published the buy button on our website, I told my team, look, well done. We survived the pandemic, the semiconductor shortage, and we shipped this device; we've already done very, very well no matter what happens, so let's pat ourselves on the back, because it was a hard slog just getting the device out. And look, even if we only sell a baker's dozen of these, our mothers will still love us; don't worry, we're going to be okay. So we were trying to calibrate our expectations. The response was totally unexpected. We still continue to sell out of this thing; we can't make enough Monocles to satisfy the number of inbound orders, and it's not an inconsequential number that we're shipping every single month. And that's without having done any advertising around Monocle or the company. We literally published a buy button, I wrote a LinkedIn post, and the thing went mini viral. There were a few moments: we rocketed to the top of Hacker News, Y Combinator's famous news site for the technology community, and we had a few mini viral Twitter moments. I think many people might remember the RizzGPT team at Stanford, who did this really funny take on GPT integrated with Monocle, to be used during interviews and dates, sort of a tongue-in-cheek thing. The press loved that, so we got a lot of inbound interest from journalists who wanted to write about us. There were a few things like that over the last few months, and without spending a dime on marketing, Monocle just started selling like hotcakes, and the brand started to take on a life of its own. So it's been a wild ride for us. We're a very small team of three, and we totally didn't expect this, but here we are.
Wow, congratulations. That's very impressive. What is it ultimately that these users are buying? What's in it?
With Monocle itself, we were very inspired by AirPods, in the sense that it's a little pocket-sized trinket that fits inside a charging case, all of which is soft to the touch and fits in your pocket. So Monocle comes with a charging case. The coin-cell battery that sits within Monocle lasts a little over an hour of continuous use. You've got a camera sensor, Bluetooth 5, multiple processors and an FPGA, and of course a microphone and touch sensors. There's a whole lot packed into a very tiny space. Monocle has what we call a unibody enclosure, which means that the mechanical enclosure of the device itself doubles as the AR optic: it's a geometric prism, and you get about 20 degrees diagonal field of view for the RGB micro-OLED display. You clip it onto your glasses, and it projects that holographic, semi-transparent image right in front of your eyes. To our surprise, it holds up pretty well in direct sunlight as well. I think everyone in the industry struggles with this problem, and we certainly didn't optimize the device for it, but for whatever reason, even without turning the display up to full brightness, it holds up decently well. So that's everything packed into it. And to bring this full circle, that was, on a product level, what we determined would be the essential essence of an AR device: that range of sensors, that size, that way of interacting with it. I'm not using the word MVP here, but it's that essential spirit of what an AR device ultimately would or should become, and that's what we wanted to nail with Monocle.
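As a side note on that spec, a quoted diagonal field of view like Monocle's roughly 20 degrees can be split into horizontal and vertical components once you assume a display aspect ratio. The 16:9 default below is an illustrative assumption, not a published Monocle spec:

```python
import math

# Split a diagonal field of view into horizontal and vertical FOV for a
# flat display of a given aspect ratio, working in half-angle tangent
# space so the split is geometrically exact.

def split_diagonal_fov(diag_deg, aspect=16 / 9):
    d = math.tan(math.radians(diag_deg) / 2)   # half-diagonal tangent
    denom = math.hypot(aspect, 1.0)
    h = d * aspect / denom                     # half-width tangent
    v = d / denom                              # half-height tangent
    return math.degrees(2 * math.atan(h)), math.degrees(2 * math.atan(v))

print(split_diagonal_fov(20.0))  # roughly (17.5, 9.9) degrees
```

So a 20-degree diagonal at 16:9 works out to about 17.5 degrees horizontal by 9.9 degrees vertical, which gives a feel for how small a "glanceable" display region like this really is.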
What was the biggest challenge in bringing this to market? You highlighted already that COVID happened and the semiconductor shortage happened. If we package that all together, what was the biggest challenge, or set of challenges, in bringing that thing to life?
Yeah, well, none of us expected what would happen in the world over the last few years, so we're certainly not alone in experiencing all of this, but it was pretty painful at times. We started the company with a lot of wind in the sails, very excited, and raised some early capital, and then the pandemic hit. All of us remember those early days: our supply chain shut down, and we couldn't travel to see each other as a team, so even the process of collaborating had to evolve; all of us got used to Zoom, or tools like it, to collaborate remotely. Then, of course, the following year demand picked back up, and the semiconductor industry was having trouble ramping supply to keep up with unexpected demand. All ranges of chips were in short supply, and the fastest lead times were a couple of months; regularly, we'd see lead times of 52 weeks or more, 56 weeks. As a startup, we had to figure out really creative ways to work through that. Oftentimes it meant paying four times as much for a single chip as it would have cost us a year earlier. So it was a pretty challenging period: making sure we were surviving as a business, making forward progress on product development and prototyping, even remotely, which for hardware, as you know, is incredibly challenging, and then making sure we were able to procure the supply we needed and actually build out a supply chain without being able to visit our supplier partners. It was sort of a perfect storm, but we found our way through it.
You found your way through it, you released it. What was the strategic thinking behind making it open source?
We think about open source quite broadly: not only in terms of software and firmware, but also electronics and mechanical CAD files, and we think about it in terms of supply chain as well. We're very fortunate to have strategic partnerships with suppliers that we absolutely love working with, so we very happily open up that information too, to tell you where we source from, where our suppliers are based, and who they are. We just think that's really important. Ultimately, the strategic thinking is, number one, that you can positively influence the industry if you are open about these things. How we do the Bluetooth stack on our device is less of a closed-door proprietary thing, because frankly, it's not; Bluetooth is with us everywhere. It's silly if teams of engineers need to waste months going through the black-box exercise of reverse engineering and figuring out how to get Bluetooth optimized for their device. Why don't we open this up? This is not a competitive advantage in 2023. Let's open this up, let's be more honest with each other, and let's focus on what really is competitive and makes all of us better. So it's actually this deeper faith in the notion of competing, or vying with each other, on the things that matter, and not wasting everyone's time on the things that don't. That's one philosophical bit of thinking: all of us can be better if we're more open about the rudimentary things.
And then the second is that, long term, there are certain companies that are more closed. We think that by being more open, we can engage with developer partners, and even suppliers, with a high degree of reciprocity. When we are reciprocal, and not just holding folks at arm's length and talking down at them, there will be more platform loyalty. On a very tangible level, developers will realize that they can get more out of an open platform, because they can customize their software on a deeper level, in a way that, for example, only Apple can within its own platform. For a first-party app experience, Apple can integrate their software deeply with their own hardware in a way only Apple can, and to date, that hardware-software integration has been an incredible advantage for them. What if that could be an experience every developer had access to? What if every software developer could get access to the depths of the hardware in a way that allowed them to exercise excellent user interface, fine-grained interaction design, added functionality that they couldn't exploit before, because they just didn't have that level of access to the underlying device, to the underlying electronics? We saw that as an opportunity, in this day and age, to flip the script and say: instead of exercising an advantage as a closed system relative to others, what if we radically opened the door and said our advantage should be everyone's advantage? When you do well, when you can integrate what you do deeply, that does better for our platform, and it means we have a more mutualistic relationship with our key stakeholders.
That makes a lot of sense. There's another open source project out there focused on glasses, called Team Open Smart Glasses, and they're also doing some things open source. How does your perspective, or the specifics of what you're doing, compare to that other project?
We know them, and a bunch of other hobbyist projects like that. They're pursuing a very valuable set of ideas that similarly question the need for a closed approach to this new era of computing. They develop for our hardware; as you know, Team Open Smart Glasses and others don't do their own hardware, they use others' hardware. So we welcome that; we see them as another development partner. The more quickly they and others are able to build useful applications and use cases and deploy them on our and other companies' hardware, the better the overall industry will be. They're really doing a lot of great work to iteratively understand what the killer-app use cases ought to be for this new technology.
The other angle you talked about, as you reevaluated things back in 2019, about how to do things differently: one was to focus on the essence of the hardware itself, which includes a handful of sensors and a basic display, and to make a device that can incorporate and work with the piece of hardware people are already wearing, the glasses they're already wearing. And we've just described the way you're making it open source, and the reasons and approach behind that. But the other side of that was artificial intelligence and generative AI, this idea that with the right sort of computer vision models, or the right sort of generative AI models riding alongside this hardware, you can create something more valuable. So how does this notion of generative AI really connect to, or extend, the idea of what augmented reality can be?
The way we think about it is to put things in very visceral terms. Almost all of the AR devices and experiences that are alive in the world today, for us to purchase and use, keep you centered around your couch. The ads might look different; they might show people out in the park, dancing and having fun. But the reality is that these devices, and the largely entertainment-focused experiences they are designed to deliver, largely keep you centered around your couch. They keep you at home. And especially following a pandemic, and especially following an increasing body of research about well-being and about the mental, emotional, and physical health of our species, it's bad enough that we have this cloister of screens in our homes that keeps us tethered to our couch anyway. Do we really need to spend a couple thousand dollars on a helmet that just further deepens the imprint on that particular cushion of the sofa? We didn't feel like this is why this technology should exist. For us, generative AI is incredible because it is as close as we have to the spontaneous spark of the mind, and when you can pair that with sensors that are aligned with how you see and hear the world, it's like doubling your mental and intuitive capabilities. It really does helix the notion of human-computer interaction to a whole new level. And critically, it keeps you not just centered around your home, but out and about, exploring the world around you.
This is perhaps the first time that a device like this might be crafted to be thin and beautiful, something you don't mind wearing out and about in the world, something that looks remarkably like the glasses that many of us wear every single day, but interfaces with this incredible technology that's moving forward at the speed of light in terms of its capability and will very soon be multimodal. Very soon, these large language models will not just take text as input; they'll take images, streaming video, audio, IMU data, where you are and how quickly you're moving, and they'll be able to process all of that alongside text to deliver intelligent output. So it's sort of the second mind that can be with you as you navigate life, no matter where you go. We think that, on a very human level, that's why a new paradigm shift ought to be ushered into the world, ought to have some place in our lives. It's not to have arguably impressive technology that just helps us deepen the imprint on the sofa cushion even further, because we've just been through a pandemic and, frankly, did way too much sitting on the sofa. We need to start being out and about, interacting with real people in real places, and a device like this, paired with generative AI, can help us do that in an even richer, more experiential way.
One of the challenges that a device like this faces, not just this device but the whole class of AR smart glasses, is that to be useful, it needs to solve a specific problem or deliver some useful bit of information or insight or entertainment based on the moment. The way some companies are tackling this is that they pick a very specific, narrow problem, and they try to deliver, within that constrained context, a useful bit of functionality, a feature enhancement to the end user's livelihood or experience in life. And here, you're talking about more of an open experience as you engage with the world. You're grabbing the sensor data, the microphone data, the camera data, the IMU data, and you're inputting that into a system, notionally some sort of multimodal generative AI system. My interpretation of what you're saying is that you are using that collective set of sensor data as the context-setting element, to then infer from it, to interpret from it, the useful, helpful bit of information or insight or entertainment that's appropriate in that moment. Is that fair, or are you thinking about it a little differently?
That is definitely fair. That does provide the rich context for a device like this to try its best to understand what you might be interested in, where you might be, where you're going. It's at this point that I should mention as well that, at least in our case, we do all of this in a very privacy-protecting way, on device, between the device that you're actually wearing, which sits in front of your eyes, and your phone. This is probably the holy grail of data if you run an ad business model; we don't, and that's with great intention. So all of this is processed in an extremely privacy-protecting way on device, and unless you choose to share it elsewhere, you can view it, you can delete it. That's how we think about the privacy question, at least. But yeah, coming back to your point about context: that's how these devices will understand you and the world around you better, and learn about that increasingly over time.
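The context-setting idea in this exchange can be sketched as a simple payload builder: each moment of sensor data is bundled into one structured record that a multimodal model could consume. The field names and schema below are illustrative guesses, not Brilliant Labs' actual format.

```python
import time

# Toy sketch of bundling one moment of wearable sensor context
# (camera, microphone transcript, IMU) for a multimodal model.

def build_context_payload(image_bytes, transcript, imu_sample, now=None):
    """Assemble camera, speech, and motion context into one record."""
    return {
        "timestamp": now if now is not None else time.time(),
        "image_size_bytes": len(image_bytes),  # the frame itself would ride alongside
        "heard": transcript,                   # rolling speech-to-text window
        "motion": {
            "accel": imu_sample.get("accel"),  # crude activity / orientation signal
            "gyro": imu_sample.get("gyro"),
        },
    }

payload = build_context_payload(
    image_bytes=b"\x00" * 2048,
    transcript="what can I cook with what's in this fridge?",
    imu_sample={"accel": [0.0, 0.0, 9.8], "gyro": [0.0, 0.0, 0.0]},
    now=0.0,
)
print(payload["image_size_bytes"])  # 2048
```

A record like this, assembled and filtered between the glasses and the phone, is the kind of thing that could stay on device by default and only leave when the wearer explicitly shares it.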
What do you imagine will be the first couple or handful of use cases that will motivate not a developer, but an average consumer, to buy this and use it in their day-to-day life?
We thought about this a whole lot, man, especially during the pandemic, because we were all sitting at home dreaming up, like many of us in the industry, the killer app use case. We hear that refrain so often, and I just kept thinking of a hammer and a nail. AR as the computing medium, and artificial intelligence as this really powerful horizontal technology that touches so many different domains and areas of our lives: is it too bluntly reductive to think about this as a hammer and a nail? Where we landed as a company is that, as much as we tried to pigeonhole this thing, as in, oh, this will help this type of person doing this type of job do that job very well in this specific way, and we really considered a range of use cases like that and talked to a lot of different folks in industry, we eventually determined: you know what, we should embrace the fact that, collectively, the hardware and the software are incredibly capable horizontal technologies that come around once in a generation and will transform this world. We shouldn't try to pigeonhole this. It would be to our detriment, and the detriment of our community of developers and customers. And it just wouldn't be true to the technologies that we're working with. A really lightweight, open, and affordable pair of smart glasses, paired with an incredible generative AI that is multimodal, will be as useful in the construction industry as it will when you're home in your kitchen, trying to drum up recipes based on what you've got in the fridge, and everything in between: education, medicine, everything. It'll touch every facet of our lives. The AI has that capability, and increasingly, I believe that AR smart glasses like these will be the shuttle upon which this AI finds its way into these various contexts of our lives.
So, to your question again, because we really struggled with this: we've started to move away from that way of thinking. And I know we'll probably get pilloried by people saying, oh, startups have come before who have gone too broad and failed as a result. I totally hear that. But this is once-in-a-generation technology that we're working with right now, both the smart glasses and the generative AI. We would be foolish, I think, to try to pigeonhole ourselves. We ought to go broad and do it justice, or die trying.
Among the developers who are currently utilizing the hardware and software that you've created, do you have a favorite couple of examples of what they've done with it that you can share?
There are some folks who use their Monocle to control robots that they've developed and designed. Monocle, in front of their eyes, is sort of a holographic interface that gives them real-time data about the robot's diagnostics and the decisions it's making, and they can consequently control that robot straight from Monocle. There are folks who are using Monocle in a film context as well, very interestingly: putting it in front of people's eyes to complement what a camera is taking in from a given scene, to allow the entire cast and crew, not just those behind the camera, to perceive what's going on on set. And there are folks who are using Monocle with certain generative AI use cases. They're applying GPT, or some similar large language model like LLaMA, that has been trained on their enterprise's specific set of information, internal company data, and so on, and they use that for training. So an operator, or someone on a farm, will use Monocle and query the device about how to repair something or how to handle a certain task, and it'll work off of that internal set of data, the standard operating procedure, and give them real-time feedback that helps them actually execute that task successfully in the moment. So there's a whole world there. We totally didn't expect most of how people are using this device. It actually further underscored our own suspicion that when we made it open and released it widely, we would not be able to foresee how it would be used, and that the applications would be very diverse and very broad. And that's been a beautiful thing.
How did that kind of learning from your customers play into the fundraising you did earlier this year? As you have lived and experienced, fundraising for hardware technology companies is notoriously hard, and you were able to successfully raise a $3 million round earlier this summer. How did you approach that process, and how did this early customer feedback inform it?
I think, depending on when we're able to share this recording with the world, we'll soon be announcing an additional couple million bucks, in addition to the three, three and a half million that we raised. So it's been an oversubscribed round, which is great, and definitely a breath of fresh air following the pandemic. Like you alluded to, it's hard raising for hardware. We've been very fortunate to connect with operators like Brendan Iribe, the founding CEO of Oculus; Adam Cheyer, co-founder of Siri; Eric Migicovsky, who was the deep tech partner at Y Combinator and famously built Pebble, sort of the original open source piece of hardware that kicked off the smartwatch revolution in all of our lives; and many, many others like them, who've been involved as investors and advisors. Raising money for hardware is really, really hard, and in our experience it tends to be either strategics, who understand the challenges and can help mitigate some of the pitfalls and the costs, or other operators like I just mentioned, who understand the pain of building, of recruiting, of fundraising for hardware. We're very fortunate that many of our most key suppliers, like Moving On Technologies in Singapore, which specializes in world-class, high-precision optics, are investors in our company, very proudly so, and they've been able to do a lot to give us the sort of unfair advantage that hardware companies need when they're starting out. There's a world of folks who would rather be getting paid a lot of money pushing pixels for an enterprise SaaS product, and a world of investors who would rather be investing in those enterprise SaaS products. Very, very few want to come work for and invest in an emerging hardware company like us.
So we've been fortunate to be guided to those investors. It's taken a lot of roundabout conversations and perseverance, and, as you might imagine, a ton of rejection. There are a few war stories there that I'm happy to share. But we're fortunate to have found the investor partners that we currently have, and we're excited for future conversations in that regard as well.
Maybe we can share one or two of those war stories. As you communicate to other founders about the process of fundraising in AR, at this intersection between AR and AI, what is a piece of advice you would share, or a lesson learned through some sort of war story?
I mean, lots of lessons learned, and still learning. War stories? There are comical ones: literally being shouted at by an investor because they wanted to express how well they understood hardware, their expertise in hardware. Literally telling us to get out of their office and get lost. I'm not joking: standing up, finger pointed, red in the face, shouting at us. This was in the early days, so there are stories like that. But there's even more mundane stuff that I think many hardware founders have experienced: getting dragged out for months, only to be told, you know, our fund, we like what you guys do a lot, and we hope to be customers one day, but we don't do hardware, we must stay away from hardware. Which begs the question: why were we dragged along for months? But of course, you take it in stride, and you keep those relationships alive and healthy. You never know, and you don't want to burn bridges. And then there are funds that explicitly call themselves hardware funds, but at the end of the day you look at their track record and they're investing mostly in software that serves the hardware industry, some permutation of enterprise SaaS or internal database software, yet their fund is ostensibly a hardware fund. We've had a lot of challenges with folks like that. If anything, those are the most painful, because you start talking to them like, oh my God, I'm so glad I found you, you get what we're doing and what makes this so challenging. But it turns out that although they call themselves hardware, they're not actually interested in taking bold bets on hardware. It's tended to be the funds that do not explicitly call themselves hardware funds or deep tech funds, interestingly, that have as much experience with AI.
They understand what makes AI, especially generative AI, so expensive: a lot of the upfront server costs, the infrastructure, the GPU procurement, and then of course the need to staff and train, and to procure the data, to actually build something useful. It's been those folks, in addition to, like I said, strategics and people like Brendan Iribe and Adam Cheyer, who were actually operators in this space and built and sold companies, who have tended to get it. What we've learned over time is that when we start to talk to folks, we'll just be very upfront with the fact that, look, we're a hardware company, and even more so, consumer hardware. This is what we'll tell them: hey, most investors hear that and they run in the opposite direction, and we totally get why. We wouldn't fault you if you want to end this conversation right now and run away, because there are a lot of other companies you can invest in. Why risk your money with a consumer hardware company? We get it. So we try to be as upfront about that as possible, and emphasize it as early in the conversation as possible, multiple times, to really get a sense of whether this investor is serious about the conversation, or whether it's just interesting for them to learn about a new space without any sincere intention to potentially invest, or whether they're just picking up some useful intel that they can feed to a portfolio company of theirs, again without sincere intention to invest. So we try to do as much upfront as possible, even if that means we ruffle their feathers, because we're almost trying to push them away from the deal,
push them away from the conversation, because we just find that that helps surface their intentions earlier, and it saves all of us a lot of time, them and us. But we're still learning lots of lessons there. And there are folks that you've spoken to on this podcast, like Jeri Ellsworth, who I've been fortunate enough to get to know a bit, and she's got just incredible war stories and has been able to share some wisdom. So I guess the other thing I've learned on this journey is that there is a small community, and Jeri is a part of that, Brendan's a part of that, and you can mine a lot of useful wisdom and insight from those folks. They can also make useful introductions to investors who are known good: people who are sincerely interested in investing in risky hardware plays.
I think, regardless of whether you're going after hardware or something else, there's this notion you talked about, where you really have to find investors who fit whatever it is you're trying to do. If that fit isn't there, then it's not going to work. Even if they're generally open to the idea, as you described, of investing in hardware or something hardware-adjacent, and it seems like they might be the right fit, if they haven't invested in something similar in the past, especially the recent past, it's probably not going to work. And this idea of filtering, either pre-filtering based on the research you do upfront, or on the conversations you have about them with other people who might know them, or the conversations you have directly with them early on, to filter out whether or not they might be the wrong fit, seems like a really economical way to spend your time and focus your energies.
I think that's right. Even if that means you eliminate some folks at the top of the funnel, in the long run it's worth it. It saves you the time, it saves you the headache. Fundraising is an emotional roller coaster; as much as you try to insulate yourself from those emotions, it's got its ups and downs. The further you can filter for quality at the very top, the better off you'll be all through the process.
Yeah. And so you presented this story, this narrative, about what you're doing and where you're going, which includes these hardware elements and these AI elements, and part of that is a plan that you can execute over the next couple of years. So for you, what's next on the hardware side, and what's next on the AI side, as you continue to evolve this product offering?
We see them like train tracks. They're moving, at least as we need them to move, speaking only in terms of our own roadmap; they're moving beautifully, and they're moving swiftly forward in parallel. On the hardware side, there's an increasing number of processors that are low power but have multiple cores, some of them even optimized for on-device AI, with a secure enclave to protect sensitive data. So there's a lot of good stuff happening on the constrained hardware side of things, great stuff happening in terms of wireless, really promising things happening in the display world as well, and a few other areas. So on hardware, we're excited about what we're seeing. In the coming months, we'll be launching a new device that I alluded to earlier. It will be the lightest and finest pair of AR glasses in the world, and it'll bring all the capability of a multimodal large language model while keeping battery life extended across multiple hours and sitting very comfortably on your head over the entire day. Just getting that experience right has taken a lot of painstaking effort, and we're there today. So stay tuned for that. I think another very critical step beyond that, on the hardware side, is getting prescriptions into those glasses. We're working with a partner to do that, and we'll make an announcement about it next year. But suffice to say, at launch, my personal device will have my prescription integrated seamlessly into the AR optic, and it will replace the glasses that I currently wear every day. We think about that as flicking the switch from black-and-white to color television: once we can nail that, you get an affordable, open, light, beautiful prescription AR device that interfaces with a multimodal AI in the cloud. That is just the holy grail.
And that's what starts to move things more in the direction of mass market: switching from the 2D, dare I say dumb, glasses that I currently wear every day to something that's truly smart and can actually be a bit of an oracle for us in our everyday lives. So on the hardware side, there's a lot of exciting stuff happening. It's moving in the right direction while keeping the low-power and cost-effective characteristics in place, and, critically, high yields out of the production line. Then on the software side, GPT-4 is multimodal; that's in a sort of closed-door release right now. Maybe by the time this podcast is out, it'll be open for all of us to enjoy. But you've got other multimodal models, like LLaVA and others, that have been released open source for people to have fun with. So we think the next six to nine months are going to be a story of taking a lot of the excitement around GPT and large language models generally, and really applying that to be multimodal, to actually mimic how we perceive the world, both sight and sound. That trend is not going to slow. The sophistication and the accuracy and the insight that these models are able to deliver are only going to continue, and then there's how we integrate that with the hardware you wear every single day, especially if it has your prescription baked in. That's a game changer. Dare I say it's a life changer. So that's how we see these two parallel tracks of hardware and software moving over the next couple of years.
Do you, in that time, still intend to keep this as open source as you've kept this first initial Monocle product?
Absolutely, one hundred percent. It is philosophically really important, we believe, but also strategically, it'll play an increasingly vital role in how we differentiate our platform. To come back to the first piece: philosophically, these technologies are going to be so tectonic, so important for mankind, and there is a certain justice in keeping them open source, so that anyone, anywhere, with varying levels of technical ability, can scrutinize and understand and, just like Lego blocks, even reconfigure these technologies in a way that suits them in that moment. Platforms that develop this stuff but keep it closed: we think there is something fundamentally unjust about that. Philosophically, this is very, very important for mankind; I cannot stress that enough. But strategically, yes, continuing to push that advantage of a more open and reciprocal relationship with our developer partners, let alone suppliers, strategic partners, and customers, we think is going to be a critical advantage going into the platform competitions of the next few years. What does this look like in five years? In five years, I think we're going to start to see, at least from other companies, their early smart glasses offerings. And I don't mean something with just a camera on it; I mean sort of what we're about to launch this coming year. We're going to start to see some of the larger companies release some of that stuff. It's going to take them a longer time, because obviously it's a very hard problem to solve, and because of the way they're trying to go about solving it, running their more graphically intense software developer kits. So it'll take them longer.
But in the meantime, we'll have released beautifully designed smart glasses, with prescriptions built in, interfacing with a richly capable, near-zero-latency multimodal AI that gets to know you better, almost like a pair of Birkenstocks that you break in as you wear them every single day, week on week, year after year. This AI runs securely, or at least captures the data you perceive in a privacy-protecting way on device and keeps it on device, and that cloud-based AI will get to know you better and better. So we see that continuing to deepen our first-mover advantage, in that sense, and hopefully influencing the broader industry to wake up a little bit and say, look, we've got to stop thinking about AR as a game console in front of your face, bringing fantastical sprites, a T. rex on the tabletop, whales flying through the sky, that sort of vision. AR is the shuttle for AI. The more companies that are clued into that, the better, and I hope that we can play a role in positively influencing the industry to move in that direction.
Yeah, I like that very much. I think when I started this podcast, many years ago now, that was also my perspective on where AR fit into our lives. With a set of sensors, it really was an opportunity to be the eyes of AI, to be the ears of AI, which would allow us to have this augmented extra bit of help as we engage with the world, whatever that help looks like relevant to the moment. I'm fully on board with this vision. This is fantastic. Let's shift just a little bit here before we wrap up. This is now your third startup over the years, plus you spent some time at Apple, learning from inside the genius that is that company. So how have you evolved as an entrepreneur through those experiences?
I think what I've come to appreciate, and I'm still deepening my appreciation of, is the critical importance of the team. It sounds so obvious to even say it, and so many have said it before, but I say it because I have lived it as the CEO of the company. I've just seen how a person thinks, their perception of what they're capable of, how they benefit from their formal training but at the same time are able to ingest a lot of informal training and experiences to do wildly incredible, creative things. There's so much there to unpack; you could have a whole separate conversation about that. But I've really come to appreciate how every human being is a universe in and of themselves, and how they show up and perform in their work is so wildly different based on that. So recruiting, team collaboration, management, all of that stuff: over time I've continued to be dazzled by how fundamental it is. Obviously, this is an incredibly human endeavor; we're making something for other human beings, so you would not expect it to be any different. That's been my biggest takeaway, absolutely: team. And following on that, we also think a lot about how we spend our time, so time management is critical too, and I learned a lot about this at Apple. There's this notion of peering around corners, of focusing on what matters, and once you have that focus in mind, and that collective sense as a team, a small team, of what matters, really simplifying around that focus, to unclutter how you think and free up extra bandwidth and energy to execute with excellence on your core focus, while not losing a sense of the broader opportunity.
So there are a few things there that drive how we spend our time and how we communicate internally as a team, especially a founding team. Those have been some really key takeaways. And then there's the element of cost as well. As a startup, a hardware startup in particular, where upfront costs can be so high, very different from the software world, we've come to appreciate building relationships with suppliers, and even external contractors and other people we work with, in a way that gives us a high degree of leverage over those costs. So that if a pandemic hits, or if you're suddenly faced with long lead times on certain components, and you need a way to continue making forward progress in development while surviving as a company on a cost basis, you can power those costs down and up: transferring them from fixed to variable, and having fine-grained control over that. That's also been a central area of learning. And there's going to be a point in time where the pendulum will begin to swing, things will shift, and we'll need to internalize a lot of those costs back to fixed, and our cost structure will increase. But we've noticed that, at least in the early days, that's a critical consideration. The team, who they are, how they think, how they work, how we communicate, how we spend our time, what matters, what doesn't, and then, of course, how much leverage we have over our costs to keep us alive and moving in the right direction.
It's a great set of learnings. And it explains, to some extent, how with a team of only three you've been able to accomplish so much so far. Very impressive. Let's wrap with a few lightning round questions. First up: what commonly held belief about this industry, AR, VR, spatial computing, do you disagree with?
At the risk of sounding like a broken record, it's probably this, and I'm going to go out on a limb and say it: it's a collective delusion that AR ought to be about fantastical graphical things or entertainment use cases in front of your eyes. Sadly, that is the collective delusion that has seized even the brightest minds, spending billions of dollars on R&D and building arguably great technology, and it has oriented what they're building and what they're building it for. I think that is a delusion. I don't think that's what AR ought to be about, and you don't need to take my word for it; the last number of years have pointed this out. I think the next few years are going to start showing a very different direction for what AR should be.
Besides the one you're building, what tool or service do you wish existed in the AR market?
We think about interface a lot. I think what Apple released a few months ago was incredibly impressive: gaze detection, eye tracking, picking up gestures. There are a lot of companies working in that space, some that you've spoken to on this podcast as well, so I've no doubt they do and will continue to exist. But we're actively and eagerly trying to monitor that space, because interface is the critical gap between the technology, the AI that sits in the cloud, and your daily lived experience. That's something we wish we saw a little more maturity in, in that space, but it's coming.
What book, video, podcast, or piece of media have you read or consumed recently that you found to be deeply insightful or profound?
There is a book by Rick Rubin called The Creative Act: A Way of Being. It is at once practical and at the same time mystical. I try to read and reread it every few years, and it's something that I'm going to pass on to my own kids when they're old enough. I cannot recommend that book highly enough. It puts your mind on a different path, questions assumptions, and really helps one to unpack the meta of creative thinking.
I'm going to check that one out. If you could sit down and have coffee with your 25-year-old self, what advice would you share with 25-year-old Bobak?
One word: patience. It's a lesson that I have been trying to learn over and over again in my life. I think actually having kids was a helpful forcing mechanism in this regard, because patience is a necessity. It's that fun paradox of life, especially if you're an entrepreneur: you want to move fast, you want to get big things done, you want to see results, and at the same time, there are certain processes that do require patience. They need to emerge organically, and they cannot be forced. Appreciating that distinction, and having the patience to see something through when it's characterized that way, is something that my 25-year-old self lacked, and something that, now that I'm in my 30s, I'm starting to appreciate. But I'm still on that journey.
Any closing thoughts you'd like to share?
I think it's an exciting time to be alive. I wake up feeling so much excitement, so much hope for the world. Obviously, in the technology realm there's an incredible amount happening right now, especially in the AI world, and all of our eyes have been locked there, just seeing what new stuff is released this week. But there is something else that I think we have not been focusing on enough, collectively as a species, which is human education and development. I don't mean what's formally taught in classrooms; I mean our values, how we think about life, and how we treat each other. As quickly as technology is moving forward, I'm also conscious that as these technologies become more capable, it will require us human beings, the users of these technologies, to be very intentional with how they are used, because with great capability comes an increasing capability to do good, but also harm. It will require that level of intention, and even restraint in some cases, for us to apply them in the right way, in a way that positively benefits humanity, even down to the individual interactions we might be having. So I'm very hopeful, but I do wish to see, and maybe we'll start to do some more in this regard as a company as well, more dialogue that focuses on how we human beings are getting ready for technology, as much as technology is racing forward to be applied to our lives.
I like that; it's really powerful. We all need to be taking our AI and general technology ethics classes, or refreshers if we happened to take one back at school at some point, so that we're thinking about the implications. We're thinking about how to justly, equitably, safely, and in a privacy-compliant way utilize all of this data that we're collecting about ourselves and each other, and how we engage with this technology and how it affects our lives. It's not simply about the data collection itself, but what we're sharing, how we're sharing it with each other, and how that can affect us. Absolutely. Well, where can people go to learn more about you and stay up to date with what you're doing there at Brilliant Labs?
Very simply, we are at brilliant.xyz: www dot brilliant, spelled the usual way, dot xyz. And we're on all the usual social media as Brilliant Labs AR. You can find us there; we try to repost some of what others are building in the community and some of what we find exciting in the AR and AI world. And of course, you can reach out on LinkedIn as well. I'm always excited to meet folks building in this space and have a chat.
Fantastic. Brilliant, as you might say. Bobak, thanks so much for the conversation. Thanks, Jason.
Before you go, I'll tell you about the next episode. In it, I speak with Yi Xu, the director of XR technology at OPPO, a leading Android smartphone company that's innovating in AR. In our conversation, Yi talks about OPPO's vision for the future of computing, the AR tools that consumers will use every day, and the challenges to get there. Yi stresses OPPO's focus on design, so we go in depth on comfort, aesthetics, and requirements for smart glasses that consumers will actually wear. Stay tuned for the next episode; I think you'll really enjoy the conversation.