The AR Show: Brian Vogelsang (Qualcomm) on Drawing on Lessons of the Past to Build Today’s Headworn AR Ecosystem
6:16PM Dec 6, 2021
Speakers:
Jason McDowall
Brian Vogelsang
Keywords:
qualcomm
smartphone
spatial computing
devices
ar
developers
building
glasses
xr
vr
ecosystem
technology
snapdragon
early
workloads
spaces
enable
mobile
today
evolve
Welcome to The AR Show, where I dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is with Brian Vogelsang. Brian is the Senior Director of Product at Qualcomm focused on Snapdragon Spaces, an XR developer platform for headworn augmented reality. This AR development kit is built on open standards and meant to complement the efforts of Niantic's Lightship and others. Brian started his career as the CTO and co-founder of a very early Internet Service Provider startup. He then joined Qualcomm in the mid-1990s, back when it was still a young company. There he focused initially on IT infrastructure and security before shifting his focus to the very early days of mobile app development within Qualcomm. For the last four years, Brian has been working with Hugo Swart and other members of the Qualcomm team in developing Qualcomm's strategy, products, and ecosystem around augmented and virtual reality. In this conversation, Brian shares how his long history at Qualcomm gave him a front-row seat to the development of the mobile app ecosystem in the late 1990s and early 2000s, as well as the smartphone ecosystem a decade later. He pulls some lessons forward to the AR ecosystem that Qualcomm is helping to build today. And he looks ahead to 2022.
I think that 2022 is about building the ecosystem and building momentum. I think we'll see adoption, certainly of devices like these AR viewers that we spent a little bit of time talking about. You know, we saw these viewers introduced in 2018, with glasses like the Nreal Light, and more recently Lenovo with the A3. So glasses tethered to smartphones and powered by a platform like Snapdragon Spaces, we'll see more products coming to market using this architecture. And then also VR devices that have video see-through experiences, so you're doing augmented reality on a virtual reality device. I think that's another trend that we see accelerating in 2022.
We go on to discuss the differences in standalone versus companion devices, or viewers, as Qualcomm calls them. And of course, we talk about the recently announced Snapdragon Spaces, which is built on the OpenXR standard and incorporates innovation around user and environmental understanding. As a reminder, you can find the show notes for this and other episodes at our website, thearshow.com. Let's dive in.
Brian, we were both at AWE last week, which for me was the first in-person conference I've been to since the beginning of the pandemic; I think the previous one was Photonics West, just prior to COVID. Going to AWE felt a lot like a family reunion, where you're really excited to see all the rest of the family, having not seen them in a long time. And for me, it was just one fantastic conversation after another. The struggle I had was there just wasn't enough time to have all those conversations that I wanted to have and also see all the amazing technology being demonstrated that I wanted to see. But what was the experience like for you?
Yeah, I have to say that the highlight was really just seeing people and reconnecting, catching up in person. Just so many fist bumps. It was really just great to be back with the community, and obviously I saw a bunch of really interesting technology as well. So it was fantastic.
Any favorites on the technology side that stood out for you?
Yeah, you know, I got to see the Lynx mixed reality headset for the first time and spent a little time with Stan. We've been working with them for a few years on that, including with Ultraleap on the hand tracking. So I got to experience that for the first time in headset, which was fantastic. And I got an update from Jeri at Tilt Five and saw the progress that they've been making on their headset and the tabletop platform, which was great to see. And then there were obviously our friends at Lenovo and the A3; some of the things that were announced around that were really exciting. And then, for us, Qualcomm launched Snapdragon Spaces. So that was probably the highlight for me at the event, getting that to market and really sharing what we're doing there with people.
Yeah, that's your baby. We're definitely going to talk about that. Maybe just to comment on some of the hardware that you highlighted. One of the things I really enjoyed about the Lenovo demonstration is that they showcased a use case a lot of people have been talking about, which is the glasses on your face mimicking multiple monitors. That concept has been around: an extended display, a wearable display that allows you to extend the monitors that are already in front of you. But they implemented, I think quite nicely, a solution that integrates with Windows and allowed them to have these multiple monitors in a way that worked really well. That was my general take on it. The integration with Windows was really great, and they did a really nice job of presenting these three monitors. That use case has been bandied about for many, many years as the initial kind of use case for enterprises to pick up these sorts of wearable displays, as glasses or viewers as Qualcomm calls them, and enable this sort of multiple-monitor setup within the comfort of a single desk with one monitor in front of you. Anyway, that was really cool. And I loved Jeri's Tilt Five. I hadn't seen it before, although I had a chance to chat with her about it almost two years ago now. The experience was delightful. I was really impressed with the trade-offs that she made and the experience she and her team were ultimately able to create. That was really nice. Anyway, those were some great new hardware introductions.
It was, yeah. And, of course, Niantic launched the Lightship platform, which was also great to see. I think there's just a tremendous amount of momentum right now. And that's one of the most positive takes I took away from AWE: the pace of innovation is accelerating, we're moving more rapidly as an industry. And it's a really exciting time to be in AR.
Yeah, for sure. We're going to talk about Qualcomm's Snapdragon Spaces, which I know is very near and dear to your heart. But maybe we can set the stage. I want to talk broadly about how Qualcomm is engaging on the chip side in XR, but maybe we can put this into context based on your own past experience at Qualcomm. You've been there for a number of years now, including the very dawn, really, of the mobile app ecosystem. This is way before the smartphone revolution even; there was this period of time in which phones were getting smarter. They weren't yet smartphones, per se, but they were more than just used for talking to our friends. They actually had these apps on them. And I remember being involved in that community back in the day, when Qualcomm's BREW platform was the best of the best. You could go work in Java, or you could go work in BREW. And if you worked in BREW, you knew that you were going to create a really high-quality experience. That's my memory from back in the day. Can you recount for us what it was like there at the dawn of that mobile app era? How did that early ecosystem evolve?
For sure, yeah. So I joined Qualcomm in the 90s, actually, in information technology. I joined the IT team, and one of my responsibilities in IT was the kind of infrastructure and the back end that ran Qualcomm's business: the ERP systems, the HR systems, engineering infrastructure, that sort of thing. And it's not super well known, but back then, in the 90s, Qualcomm built mobile phones; we had a mobile phone business. One of the first phones that we launched was a flip phone in 1997 called the Q Phone. And this phone was one of the first with wireless data and mobile internet on it. It was the state of the art at the time; it actually had a dedicated button on it for launching what at the time was a WAP web browser. And you could do email and stock quotes and texting. I think we called it a pager feature, because that was kind of what consumers understood. But you know, it was two lines of grayscale text. So this was kind of the really early days of enabling internet connectivity in a phone. We also produced another phone called the pdQ, which was another kind of first of its time: an integrated PDA, a Palm Pilot with Palm OS, and a mobile phone. It was huge, like nine ounces. But this was early, innovative work Qualcomm was doing around getting the internet into a phone, and really the early days of mobile computing. We sold the phone division in 1999 to Kyocera, and in 2000 some of that team came back, and they were really looking at how we could get phones to use more wireless data. Because at the time, Qualcomm had pioneered a 2G technology called CDMA. And CDMA was far superior to any other 2G technology for data; it was one of the things that differentiated it, set it apart. And WAP was starting to take off. Qualcomm was an investor in Unwired Planet, which was one of the original ways to access the internet through an early non-HTML-based browser. And then something called i-mode had launched in Japan, which was, I think, really a market signal that apps could be a thing, that people might want to use applications on their mobile phones. This was with NTT DoCoMo; they were a pioneer there. And so I think we realized that the app economy, an ecosystem of downloadable apps, was the right catalyst for mobile data; it wasn't going to be web browsing that got people into this. And I think we learned that from building phones. So in 2000, we developed this thing called BREW. And it was really a couple of things. It was a device API that developers could code to on various feature phones from different manufacturers. And then the more interesting part of it was a cloud marketplace that was two-sided: it connected developers who had applications built on BREW with go-to-market channels, operators, carriers around the world who wanted to distribute those applications to their consumers. So this marketplace connected developers with channels to market, and allowed operators to build their app catalogs and offer these first app-based data services to consumers. And it allowed everyone to get paid. So it was a pretty interesting system.
And so the BREW business enlisted my team's support in IT to build that back end: the payment systems, the settlement systems, how app discovery worked, and delivery protocols. And we also built some of the very first mobile apps on BREW, connecting the Qualcomm workforce. This was things like email or calendar or CRM systems. So we started to put these BREW phones in the hands of Qualcomm employees and wanted them to have access to mobile data, to sort of dogfood the technology ourselves. So I ended up leaving IT and joining BREW as a product manager. And it's interesting, you know, Peggy Johnson, now CEO of Magic Leap, was leading this division at Qualcomm at the time. She's an amazing, fantastic leader. And so this really got me excited about building products for mobile, building app ecosystems, and working with developers, which I've been doing in my career at Qualcomm ever since.
As you highlighted, there's a lot that goes into creating a high-quality ecosystem, right? Everything from discovery to payment to making sure that the experiences themselves are good, that they have access to the right type of data, and all the rest. And there's a lot that Qualcomm had created pre-smartphone, right, when the feature phones were getting just a little bit more feature-rich. And then the iPhone happened, this whole coming of the smartphone revolution. You mentioned WAP, which is Wireless Application Protocol. That was a stripped-down version of the web, of HTML, basically, that allowed those early mobile phones, with relatively thin, narrow data bandwidth and not a lot of processing capability, to have something that looked like a web experience. At least it had hyperlinks in it, a bit of variation on the text, and some images. But then at some point smartphones allowed what looked like normal browsers and, yeah, apps that could do more. How did that next evolution of the platform, the hardware platform itself, affect the ecosystem, affect BREW and kind of the whole app economy?
Yeah, as you mentioned earlier, there were sort of two ecosystems that formed. There was one built around Java and J2ME, and then there was the BREW ecosystem. BREW at the time was about 70 operators; these were operators who had adopted BREW services and were offering them to consumers. And, you know, in 2006, developers on BREW had earned about $700 million in app revenue, and by 2009, we had paid out about $2 billion to developers on billions of transactions. And it's interesting, because while we hadn't really reached the smartphone market yet, smartphones were not massively installed in consumers' hands at the time, there were the beginnings of an app economy and developers. Many companies that started back then became very successful as they transitioned into computing with smartphones. So I think we looked at it as a way to establish the ecosystem, get the ecosystem moving, get people building applications that consume wireless data, get those in the hands of consumers so that they could understand what it's like to personalize your phone. And then as we transitioned from feature phone to smartphone, obviously the input mechanisms changed. We went from soft keys and five-way navigation buttons and T9, the text input when you didn't have either a keyboard like we had on BlackBerrys or a touch interface, which was very primitive. And this restricted, I think, in some ways the ability of the ecosystem to grow. So obviously, with the introduction of the smartphone, and both iOS and Android, that really started to catalyze things. A lot of the investments that developers made in the ecosystem and apps and content on platforms like BREW, they were able to ride those into the smartphone ecosystem as that started to evolve. But even if you look at 2009, I remember being on a VC panel in the Valley, and one of the questions was, why have there not really been any large mobile exits? No one had made a tremendous amount of money, huge app exits, at the time, beyond JAMDAT, which was sold to EA and became part of EA Mobile. So even in 2009, two years after the iPhone and one year after the launch of Android, the app economy hadn't really taken off yet. It took a little bit more time for that to get going. And eventually consumers stopped wanting feature phones, and that was when we really sunset the BREW business as the industry transitioned. And Qualcomm benefited from that: we were the first to launch an Android-based device. The HTC Dream, or the G1, launched in 2008, was running on Snapdragon. And we were really the enabler that helped that Android ecosystem grow in those early days. So while we transitioned away from apps infrastructure and the BREW-type business, we did benefit from the growth that was happening in smartphones.
As you reflect back on all of that long history that you specifically, and Qualcomm, have experienced through those early days of the app economy, what are the patterns, or what are the lessons, that you think apply to where we are today with headsets, with the AR or XR sort of app economy?
I think one of the things that's really interesting is, we know probably a lot more than we think we know about how consumers are going to use these devices. I went back and looked at what our view on the apps ecosystem was back in 2000, right at the beginning, when we were starting BREW and we were out talking to mobile operators and OEMs and app developers about the promise of running apps on phones. And the kinds of applications that we were promoting, or saying would be possible on these devices, were of course games, but also music, turning your phone into an MP3 player, or music downloads. At the time, in the early days, we had the equivalent of a 14.4 kbps kind of data connection into the phone, so it would take 30-plus minutes to download a three-minute MP3 song. But Qualcomm was evolving the wireless data speeds with 3G, so music download was a really interesting one for us. But also books and streaming video and internet radio, location, mapping and driving directions, m-commerce, group chat, messaging, and email. These were all things that we predicted would be really interesting as mobile apps, and ultimately those things all came to fruition. So if I look at where we're at today, in 2021, and think about what the patterns are, or what things will potentially repeat, I think we probably have a pretty good sense now, we can put our finger on, how people are going to be using spatial computing devices and the kinds of applications and experiences they're going to want, maybe better than we think.
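As a quick sanity check on that download estimate (the file size here is an assumption: roughly 3 MB is typical for a three-minute MP3 encoded at 128 kbps):

$$\frac{3\,\text{MB} \times 8000\,\text{kbit/MB}}{14.4\,\text{kbit/s}} \approx 1667\,\text{s} \approx 28\,\text{minutes}$$

Protocol overhead and retransmissions would easily push that past the 30-minute mark Brian mentions.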
Yeah, that's good. To kind of rephrase: the things that we're already familiar with on our mobile devices, the things we use the mobile phone for, are going to find a natural fit, at least some subset of those, and maybe slightly different variations of those, on the glasses that we have on our face.
I think these kinds of experiences are just going to evolve. We're still going to want social experiences, we're still going to want to play games. But like back in 2000, we couldn't predict some of the things that people would be using their devices for. We thought, okay, location and mapping or friend finding would be important things, but we didn't, of course, predict Uber or some of these other types of services. So certainly there will be massive new industries that get built on spatial computing. But I think many of the things that we're familiar with today are just going to evolve and have an analogue in spatial computing. So we have a pretty good sense of it today.
Yeah. Qualcomm is now staring at the opportunity around XR; it's been several years now that you've been working with, as you noted, Lynx and Stan there at Lynx to create that headset, and many of the other players. Oculus has really been the primary consumer, in terms of volume, of the dedicated chipsets that Qualcomm has made for this opportunity around AR and VR. Maybe we can even go back a little bit. There was a decision that was made at some point at Qualcomm, as you were evaluating all the things that were outside of the mobile phone specifically, whether it was drones or IoT devices or the AR and VR opportunity. I'm really curious to go back into the thought process at the time and to understand: what was the original directive for exploring this opportunity, and ultimately choosing VR and AR as something worth investing in, worth creating dedicated silicon around?
Yeah, so I joined the XR team, Hugo's business, in 2017. Prior to that, the business had been incubating in IoT. Hugo, working in IoT, had responsibility for a number of products, of which XR was one. And we were doing early development of reference designs at the time. Reference designs are these blueprints that Qualcomm creates that assemble all the hardware and software elements that an OEM would need to bring a product to market more rapidly. And we were working early on with OEMs like Pico and Oculus in the VR space. And in 2017, the idea of slotting your phone into a VR device, I think the hype balloon on that was sort of deflating, and I think we saw the future of VR as standalone, and that standalone was going to be really important for Qualcomm to invest in. We began making those investments in 2015, 2016, and then in 2017 moved the business from IoT into mobile, and Hugo really dedicated his focus entirely to XR as the GM there. That's when I came into the business in 2017. At that point, we had been working in headworn AR for many years, going back to 2014 with ODG, one of the early AR glass pioneers. When Qualcomm had Vuforia, we were enabling Vuforia on some of the early ODG glasses. So we had experience building headworn AR prior to that, but the market was just too early: there was too much heat produced on the device, it wasn't thermally efficient, the display technologies weren't as evolved, and I think really the chips weren't capable of running the perception technology and workloads properly in the devices of that era. And in part, I think that was why, you had Jay Wright on a previous episode, and in listening to Jay and reflecting on how Qualcomm ultimately pivoted and moved away from Vuforia, we really saw the future at that point in time as VR, and that our focus and time and energy were best put into enabling standalone virtual reality. So that's what we did. And we saw the sort of birth of standalone in late 2017, early 2018: we had Pico and HTC and Lenovo launching devices, Oculus launched the Go. So at that time the market really started to inflect, and we saw standalone was going to be something pretty important.
If you go back to the 2014 era, the ODGs of that time were using the mobile chips, right, the Snapdragon class of chips that were also going into mobile phones. What were the capabilities that VR or AR needed that were above and beyond, or different from, the chip architectures that made sense for mobile phones?
Fundamentally, these devices need perception technologies. They need sensors, and these are largely cameras, for understanding the environment, or understanding the user's position and orientation in that environment. And so we need really strong computer vision capabilities to be able to process those perception workloads. I think we were investing at the time in these perception technologies, but we were using hardware that came from the mobile phone portfolio. So while we were adding software and perception technology on top, we weren't really customizing the silicon to meet the specific needs of VR and AR. It was really about adapting the chips that we had to support the use cases of virtual and augmented reality, and largely layering software investments that were specific to AR and VR on top of that.
Got it. And so you continued to evolve, right? The XR2 now has more capabilities dedicated to the sensor processing, as you noted, the computer vision, which is a mix of the included hardware but also a lot about the algorithms and the software that sit right on top of that hardware, to enable the sort of on-device, low-latency processing of understanding the real world, understanding the user in that moment, and being able to combine them into delivering a good real-time spatial computing experience.
Yeah, so it was really about leveraging our massive smartphone investment and the scale that we have, the investment in IP blocks that are critical for immersive computing. These are things like the AI and machine learning compute blocks, or computer vision, or GPU. And we added some minor hardware tweaks supporting new display architectures to reduce latency, that sort of thing. But really, I think the investment we made was around software, and having perception, and reducing the things that an OEM might need in terms of R&D to bring a product to market more rapidly.
Yep. And so, you know, back in the 2014, '15, '16 timeframe, as you were evaluating this opportunity, the opportunity you saw was primarily around VR; that was the one that had the inflection. But now, as we continue to evolve, there's this broader perspective, or this debate maybe, to some extent, about whether VR or AR will be bigger. And whichever you believe will be bigger, neither one is expected to be small. Ultimately, I'm in the camp that AR will be the larger of the two. But still, from a market-reality perspective, based on volumes being sold, VR is where it's at. But for AR there's hope, right, that this thing is going to pick up over the coming years. What's your perspective on this? Do you think 2022 is the year that we'll start to see some more inflection on these AR devices? Or do you think it'll be another year or two?
I think that 2022 is about building the ecosystem and building momentum. I think we'll see adoption, certainly of devices like these AR viewers that we spent a little bit of time talking about. We saw these viewers introduced in 2018, with glasses like the Nreal Light, and more recently Lenovo with the A3. So glasses tethered to smartphones and powered by a platform like Snapdragon Spaces, I think that's going to be something we're going to see momentum around in 2022, so we'll see more products coming to market using this architecture. And then also mixed reality devices, or VR devices that have video see-through experiences, so you're doing augmented reality on a virtual reality device. I think that's another trend that we see accelerating in 2022.
Yeah, just to clarify the two buckets here. There's the standalone: HoloLens 2 is a standalone device, the Quest is a standalone device. The Lynx R-1, their mixed reality device, that's a standalone; it's kind of a VR device with video pass-through, which enables some AR experiences. So that's the class of standalone devices. And it seems the trend on the VR side is definitely toward standalone; that's certainly where the mass, the volume is, even though there are still some really amazing tethered VR experiences and devices. And then on the AR side, it's a little bit less clear, but as you noted, we've seen this companion device, as I might classify it, that you call the XR viewer, which is one that is tethered in some way, at the moment all wired, to some other compute device, ideally a mobile phone. So the Lenovo ThinkReality A3, that's an example of this sort of device. Even the Tilt Five is a tethered device; you need some other sort of compute. The Nreal Light, NuEyes, I had a conversation with him earlier this year. These are all sort of tethered devices that are out there. As it relates to AR specifically, how do you imagine these two classes of devices will be used differently? What does tethering to a smartphone enable, or not enable, versus a standalone AR device? Do you have a sense of where the use cases will differ between the two?
Yeah, I think if we look at standalone AR, or VR for that matter, these devices are a little bit larger today. Obviously, if you're putting all this compute and battery and sensing and other things into the device, it needs to be a little bit bigger. And so, with the exception of VR in consumer, these standalone devices tend to be far more focused on enterprise: guided work or design or collaboration. With viewers, we're getting to smaller form factors. We're taking some of that processing and moving it off of the glasses and putting it into this device that you have in your pocket, the smartphone, and we're really helping those two devices work together in tandem to create a user experience that distributes the workloads between the glasses and the smartphone. What this affords us is essentially smaller glasses: the parts of the processing that really drive thermals and heat in the glasses can be moved off the glasses to the smartphone. And I think this takes us down into the 100 to 130 gram range of glasses, which is more appropriate for consumer adoption. Especially as these glasses go wireless. Today we have devices like the Nreal Light or the Lenovo A3; they're wired. In the future, this form factor will go wireless. And I think this, combined with lighter-weight glasses that work in tandem with a smartphone, is going to make it easier for consumers to embrace this technology. It doesn't mean that the standalone devices aren't going to get smaller; they're going to continue to evolve and get lighter weight and that sort of thing. However, in the enterprise space there is perhaps less of a concern, depending upon the workload or what someone's attempting to accomplish with the hardware, with the device being a little bit bigger today. But certainly for consumers, to get it into people's more everyday use, we need to reduce the size, and that's what using the glasses with a smartphone allows us to do.
Wearability being a key attribute of the glasses that consumers ultimately adopt, smaller is ultimately better, and that's what offloading, or sharing the compute challenge between the two devices, allows. Based on your own reading of the tea leaves, if we were to project five years out, or whatever the right sort of time horizon is, do you think the companion viewer type devices will be more prevalent, or the standalone devices, as it relates to AR?
I think it's going to be a mix of both, really. These devices are approaching the problem from two different angles, but ultimately they're likely to converge at some point in the future. One of the benefits of having the smartphone and the glasses working together is that smartphones upgrade frequently; they're on a one-year evolution cycle. This means that as we start to add new technology into the smartphones that would benefit augmented reality, whether dedicated machine learning blocks or enhanced computer vision processing or other things, the glasses can take advantage of that evolution on a more frequent path. Whereas in the standalone case, it takes a little bit longer between device cycles to get this more enhanced hardware capability. And so I think the pace of innovation can move a little bit faster in the case of a viewer companion tethered to a smartphone, because of this.
And so as we have this opportunity to continue to evolve the smartphone, on the cycles that we have, as you noted, and take advantage of more of the capabilities of that device, independent of the different evolution cycle of the glasses themselves, what is the need for compute on the viewers themselves? How do we split that workload between the smartphone and the viewers, and what's ultimately needed on the viewers, if the goal is to make the devices as small, as ultimately wearable, as possible?
So one of the goals is to put the really latency-sensitive processing, these perception workloads around environmental or user understanding, closer to the user. This is things like positional tracking, tracking the user's location in a map, or tracking their hands; those kinds of workloads are very latency sensitive. And so we have established an architecture where we do some of that processing inside the glasses, and then share the output of that with the applications running on the smartphone. This also allows us to take the very computer-graphics-intensive workloads and run those on a device with a little bit bigger battery, so we can run it at a little higher power and get more performance out of it. So it's about understanding how to distribute these workloads between the glasses and the smartphone, making architectural decisions about where these workloads should run, and having the ability to evolve that over time. We started with what we call a simple viewer architecture. Take a device like the Nreal Light; we consider that to be a simple viewer. It uses USB to connect to the smartphone, and all the processing runs on the smartphone: the perception workloads, the rendering and warping, and everything that's needed for enabling an AR experience. With a device like the Lenovo A3, that's what we call a smart viewer architecture. We introduce a processor into the glasses, and that can run some of these perception workloads. We still run the rendering, the graphics processing and that sort of thing, on the smartphone. This ultimately results in longer battery life and a better user experience, because we're not doing that graphics processing on the glasses themselves, but we can also use the performance the smartphone has to run these workloads. So it's really about a mix, and today that mix is a balance between the glasses and the smartphone. In the future, I think it will be a balance between the glasses, the smartphone, and the cloud; these workloads will be distributed outside of the smartphone. We've already done quite a bit of work in this area for virtual reality, where we initially tethered VR headsets wirelessly to PCs and used the GPU on the PCs to drive an experience on a standalone VR headset. And then we moved that over 5G to the mobile edge compute, running in the operator's infrastructure, doing the heavy rendering workloads there and streaming back to the headset. So this is something we've been working on for several years, and I think it's just as important in augmented reality as it has been in VR.
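To make that split concrete, here is a toy sketch in C of the two viewer architectures Brian describes. The workload names and the routing table are illustrative assumptions, not Qualcomm's actual stack; where a placement isn't stated in the conversation (warping on a smart viewer, for instance) the comment flags it as a guess.

```c
#include <stdio.h>
#include <stddef.h>

/* Illustrative only: a toy model of "simple viewer" vs "smart viewer"
 * workload placement, not the Snapdragon Spaces API. */
typedef enum { ON_GLASSES, ON_PHONE } Placement;

typedef struct {
    const char *workload;
    Placement   simple_viewer; /* e.g. Nreal Light: everything on the phone */
    Placement   smart_viewer;  /* e.g. ThinkReality A3: perception on glasses */
} WorkloadRoute;

static const WorkloadRoute routes[] = {
    /* Latency-sensitive perception moves next to the sensors. */
    { "6DoF positional tracking (SLAM)", ON_PHONE, ON_GLASSES },
    { "hand tracking",                   ON_PHONE, ON_GLASSES },
    /* Power-hungry graphics stays with the bigger battery. */
    { "3D rendering",                    ON_PHONE, ON_PHONE },
    { "warping / reprojection",          ON_PHONE, ON_PHONE }, /* guess for smart viewer */
    { "application logic",               ON_PHONE, ON_PHONE },
};

int main(void) {
    for (size_t i = 0; i < sizeof routes / sizeof routes[0]; i++)
        printf("%-34s  simple: %-7s  smart: %s\n",
               routes[i].workload,
               routes[i].simple_viewer == ON_PHONE ? "phone" : "glasses",
               routes[i].smart_viewer  == ON_PHONE ? "phone" : "glasses");
    return 0;
}
```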
Yeah, it makes a lot of sense that there's an opportunity to do more on the device itself, even when connected to the smartphone as a companion. And it's great that there's an emphasis on optimizing what can be done there, to take advantage of it and help solve the problems in a way that provides for a better user experience, while still emphasizing the challenges around wearability. As we continue to see the evolution of this market, one of the things we're looking forward to in 2022 is the evolution of the ecosystem itself, building out that ecosystem. And one of the big announcements at AWE was Snapdragon Spaces, which you and the CEO were up on stage to announce, and which is really the product that you're managing now there at Qualcomm. Can you describe what Snapdragon Spaces is?
Sure. So Snapdragon Spaces is a headworn AR developer platform that really empowers developers to create immersive experiences for AR glasses, experiences that can adapt to the space around us, that can have an environmental understanding and a user understanding. And it's an open platform. We built it from the ground up using OpenXR, which is a Khronos standard, and it includes SDKs for the development environments that developers are used to working with in VR or AR today: Unity and Unreal Engine. What this really does is allow developers to access the perception technologies developed by Qualcomm that previously we had reserved only for OEM use. So now we're bringing this technology into the hands of developers, and we're giving them familiar tools and workflows that they understand, and enabling them with access to these capabilities.
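For a flavor of what "built on OpenXR" means in practice, here is a minimal sketch of the standard Khronos OpenXR bootstrap that any conforming runtime would sit behind. This is the generic OpenXR 1.0 C API, not the Spaces-specific SDK, and error handling is trimmed for brevity:

```c
#include <stdio.h>
#include <string.h>
#include <openxr/openxr.h>

int main(void) {
    /* Describe the application to whatever OpenXR runtime is installed. */
    XrInstanceCreateInfo create_info = { XR_TYPE_INSTANCE_CREATE_INFO };
    strncpy(create_info.applicationInfo.applicationName, "SpacesStyleDemo",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    create_info.applicationInfo.applicationVersion = 1;
    create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateInstance(&create_info, &instance))) {
        fprintf(stderr, "no OpenXR runtime available\n");
        return 1;
    }

    /* Ask for a head-mounted display system; tethered AR glasses
     * present themselves to applications this way. */
    XrSystemGetInfo system_info = { XR_TYPE_SYSTEM_GET_INFO };
    system_info.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;

    XrSystemId system_id = XR_NULL_SYSTEM_ID;
    if (XR_SUCCEEDED(xrGetSystem(instance, &system_info, &system_id))) {
        XrSystemProperties props = { XR_TYPE_SYSTEM_PROPERTIES };
        xrGetSystemProperties(instance, system_id, &props);
        printf("system: %s\n", props.systemName);
    }

    /* A real app would next create a session with a graphics binding
     * (OpenGL ES, Vulkan, ...) and enter the frame loop. */
    xrDestroyInstance(instance);
    return 0;
}
```

Because engines like Unity and Unreal target this same API, an application written against it can move between conforming runtimes, which is the portability Brian describes.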
We talked a little bit at the beginning of our conversation about all the work that Qualcomm did around BREW, and that was really a software platform, a whole set of tools and capabilities around enabling mobile applications, which took full advantage of all of the wireless data and connectivity that Qualcomm was enabling. Can you talk us through the thinking, the strategy here, of going beyond hardware to better enable the XR ecosystem?
Yeah, so ultimately we really want to help accelerate this market. We want to enable developers with a cross-platform user experience, so that they can build for multiple glasses with a single SDK, a single tool stack; that will really defragment the market a little bit for them. One of the other reasons we're doing this product is that we want to be able to get feedback from the developer community on technologies like perception, and work much more closely with that community. Through that feedback loop we'll be able to make these environmental and user understanding features better, which ultimately will enable better applications, and those applications will work more consistently across different devices. These are the things we talked a little bit about: positional tracking, anchors and persistence, image and object recognition and tracking, and more sophisticated features like spatial mapping and meshing the environment, so you can enable features like occlusion or more advanced scene understanding. And then some of the inputs as well: how do you do hand tracking, input in spatial computing, using the glasses? So with Snapdragon Spaces, we really want to take these perception technologies, bring them directly to developers, and empower them to create applications that can run across these viewers, tethered to different smartphones. It's really about helping accelerate the market, defragmenting it a little bit, providing developers with a more consistent experience across devices, and doing it in an open platform, open ecosystem way. The other thing here is that we're working with a large ecosystem of collaborators, or partners, on this; it's not purely Qualcomm on its own building the stack.
As you noted, in that focus on openness, you mentioned the Khronos OpenXR spec as being the basis upon which you're building this out. Are there other chipset manufacturers who are also building to the OpenXR specification?
Yeah, pretty much all the major players in the industry, in VR and AR, are involved with OpenXR. We've been involved with it for many years, contributing to the specification, and we're really big believers in creating open platforms and reducing friction and fragmentation for the developer community. In part, that's one of the charters of OpenXR: to enable app portability, or engine portability, across devices for developers. So I think it was only natural for us to embrace OpenXR in our own platform, and ultimately we think this is just what's best for developers. Developers should have an open stack and the freedom to move applications between platforms more easily. But at its core, the technologies, the tracking technology I mentioned earlier, environmental and user understanding, those are the core technologies Qualcomm is innovating around. We're building algorithms that allow you to do SLAM, for example, which ultimately contributes to a positional tracking API for a developer, or to being able to do really good object recognition and tracking. Hugo announced at AWE that Qualcomm acquired Wikitude. Wikitude was one of the pioneers in image and object recognition and tracking, so we're incorporating some of the functionality, the core computer vision algorithms they've developed, into the Spaces platform.
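As a concrete example of what that SLAM work surfaces to a developer, positional tracking in OpenXR comes down to locating one space against another each frame. A minimal sketch, again generic OpenXR rather than the Spaces SDK, assuming a valid session and a frame time from xrWaitFrame already exist:

```c
#include <openxr/openxr.h>

/* Where is the head relative to the world this frame? The runtime's
 * SLAM / 6DoF tracker answers via xrLocateSpace. (A real app would
 * create these spaces once at startup, not every call, and would check
 * location.locationFlags for the POSITION_VALID / ORIENTATION_VALID
 * bits before trusting the pose.) */
XrPosef head_pose_in_local(XrSession session, XrTime predicted_time) {
    XrReferenceSpaceCreateInfo info = { XR_TYPE_REFERENCE_SPACE_CREATE_INFO };
    info.poseInReferenceSpace.orientation.w = 1.0f;  /* identity pose */

    info.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_VIEW;   /* the head */
    XrSpace view_space = XR_NULL_HANDLE;
    xrCreateReferenceSpace(session, &info, &view_space);

    info.referenceSpaceType = XR_REFERENCE_SPACE_TYPE_LOCAL;  /* the world */
    XrSpace local_space = XR_NULL_HANDLE;
    xrCreateReferenceSpace(session, &info, &local_space);

    XrSpaceLocation location = { XR_TYPE_SPACE_LOCATION };
    xrLocateSpace(view_space, local_space, predicted_time, &location);

    xrDestroySpace(view_space);
    xrDestroySpace(local_space);
    return location.pose;
}
```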
Wikitude was a key part of that. I remember my first AR experience way back in the day; it was on a Wikitude app. They had a world browser, I think it was called. That's right. And it was amazing. I remember thinking, wow, this is the future of mobility: having this incorporation of location, which was the key enabling technology, and then this visualization of that location through the visual experience. I just thought that was phenomenal. And of course that platform has continued to evolve, now part of Qualcomm. But you also added Clay AIR to the mix, which in my experience was one of the best at hand tracking. Their camera algorithms were really quite fantastic at hand recognition, and in particular gesture recognition.
That's right. We've been working with Clay AIR for about three years, integrating their technology stack into our XR chipsets and also working with OEMs to commercialize our platform. And I think one of the things that we really liked about that technology is that it can recognize and track hands using visible light. We don't have to infrared-illuminate the hands or have other mechanisms to track the hands using a different set of sensors or cameras; we can use the same sensors that we're using for positional tracking and SLAM, and just add another perception feature, hand tracking, on top of that. So we're pretty excited about bringing that technology into Spaces and getting it in the hands of developers in the spring.
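In OpenXR terms, hand tracking like this is typically exposed through the XR_EXT_hand_tracking extension. A minimal sketch of reading the joints, generic OpenXR extension API rather than Clay AIR's own SDK, assuming the extension was enabled when the instance was created:

```c
#include <openxr/openxr.h>

/* Sample the 26 joints of the left hand once. Extension entry points
 * must be fetched through xrGetInstanceProcAddr; a real app would keep
 * the tracker around and destroy it with xrDestroyHandTrackerEXT. */
void sample_left_hand(XrInstance instance, XrSession session,
                      XrSpace base_space, XrTime time) {
    PFN_xrCreateHandTrackerEXT create_fn = NULL;
    PFN_xrLocateHandJointsEXT  locate_fn = NULL;
    xrGetInstanceProcAddr(instance, "xrCreateHandTrackerEXT",
                          (PFN_xrVoidFunction *)&create_fn);
    xrGetInstanceProcAddr(instance, "xrLocateHandJointsEXT",
                          (PFN_xrVoidFunction *)&locate_fn);

    XrHandTrackerCreateInfoEXT create = { XR_TYPE_HAND_TRACKER_CREATE_INFO_EXT };
    create.hand = XR_HAND_LEFT_EXT;
    create.handJointSet = XR_HAND_JOINT_SET_DEFAULT_EXT;

    XrHandTrackerEXT tracker = XR_NULL_HANDLE;
    create_fn(session, &create, &tracker);

    XrHandJointLocationEXT joints[XR_HAND_JOINT_COUNT_EXT];
    XrHandJointLocationsEXT locations = { XR_TYPE_HAND_JOINT_LOCATIONS_EXT };
    locations.jointCount = XR_HAND_JOINT_COUNT_EXT;
    locations.jointLocations = joints;

    XrHandJointsLocateInfoEXT locate = { XR_TYPE_HAND_JOINTS_LOCATE_INFO_EXT };
    locate.baseSpace = base_space;
    locate.time = time;

    locate_fn(tracker, &locate, &locations);
    if (locations.isActive) {             /* cameras currently see the hand */
        XrPosef index_tip = joints[XR_HAND_JOINT_INDEX_TIP_EXT].pose;
        (void)index_tip;                  /* feed gesture logic, ray casts, ... */
    }
}
```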
Excellent. We won't go into the details here, but I have to imagine that given the first couple of moves around Snapdragon Spaces, there might be another move or two as you continue to round out that overall offering. Something we can talk about is how you reconcile what's possible through Snapdragon Spaces with the efforts Niantic is making with their Lightship platform.
So Qualcomm and Niantic have been working closely together for many years. In fact, John Hanke referenced that in his keynote at AWE. I think we really see our focus with Spaces as headworn; that's the predominant focus for us. We're not building, or intending to build, smartphone-based AR experiences with it. It's really about headworn AR experiences leveraging the smartphone. And we're very focused as well on indoor use cases. When we look at some of the technologies we're enabling, like spatial mapping, meshing, or semantic scene understanding, it's really about the room that you're in, or the space that you're mapping, indoors, whereas Niantic is very focused outdoors. So I think we'll collaborate with them, and in our press announcement for Spaces there was a quote from John and team and a reference to how we're going to work together. You mentioned Clay AIR; some of the features we have around hand tracking or user understanding, these are not things Niantic is building in Lightship today, so they could potentially leverage those in Spaces, and I think they allude to that in the announcement. So I think this is all about accelerating the innovation of the developer community, getting developers building in AR, and having them understand what's possible using this technology, whether it's smartphone-based AR, and they're doing amazing things with a platform like Lightship, or headworn in the future, or it's Qualcomm's Snapdragon Spaces today in headworn. I think it's all very good that the industry has these new platforms emerging, and developers can now take advantage of them and really dive into new types of AR experiences that weren't accessible before.
Are there lessons that Qualcomm is pulling forward from the experience of building out Vuforia that you can now apply to building out Snapdragon Spaces?
Jay talked a little bit about this on that previous episode. We started our research and development in augmented reality in 2007; those were the early projects, and we ultimately opened our first research center in Austria, focused on AR, in 2009. That early research effort became what was Vuforia. And just as we're in the early days of headworn AR now, that was the early days of smartphone AR. Wikitude was there as well, you mentioned them earlier; they were one of the first AR applications on Android, I think one of the first applications on Android at all, in the early ecosystem development efforts that Google did. Vuforia experimented with headworn AR in 2013 and 2014, but the hardware just wasn't there, as we discussed. And I think in part that was some of the reason Vuforia was spun out: headworn wasn't ready. But today the hardware is here, and we're really, I think, on the cusp of this market inflecting, and this growth will have meaningful impact on Qualcomm's core hardware business in the future. So while there are some parallels between Vuforia starting the smartphone AR ecosystem in the early days of the smartphone, and where we are in the early days of headworn AR today, I think with Spaces we're much better aligned with Qualcomm's core business and its ambitions to help grow the headworn AR market.
As you articulate it to yourselves internally, what's the role that Qualcomm wants to play within that market, within that XR ecosystem?
Yeah, I think we really want to be a catalyst. We want to accelerate the innovation in hardware, in software, in machine perception technology, and in the ecosystem. For this market to flourish, we need to invest ahead, as we have done in mobile in the past. So as we expand to support this ecosystem with Spaces, our customers are going to shift from being just OEMs to also being developers. And that hasn't been as much the case with Qualcomm in mobile in the more recent past; certainly if you go back to the Vuforia or BREW days, the developer was a customer. So in that sense, what we're doing with Spaces is a little more like the BREW and Vuforia days in mobile. But yeah, it's really exciting to work with the developer community. There's really, I think, intense excitement for what the future holds, and everyone sort of feels like we're on the cusp of this market really starting to inflect. So it's great to be working with the development community again.
Yeah. You've been through this for a couple of cycles now. Every time you go out and establish these sorts of new markets, these emerging mobile-related technologies, it's never easy, it's never smooth; there's always a bit of a bumpy ride we've got to go through. What do you see as some of the biggest headwinds that you or other members of the ecosystem need to overcome over the next 12 to 18 months?
I think we need to get the industry investing in AR content, and bringing the right kinds of experiences to AR, experiences that drive engagement and retention and sort of daily use. And we need it to be easier for developers to build with AR. We need to embrace the mobile development community with tools that can help them shift their development from mobile to incorporating augmented reality, or spatial computing, into their workflow, in their environment. In part, this is what we're doing with Snapdragon Spaces: trying to enable the development community to overcome some of these headwinds as we've anticipated them over the next 12 to 18 months. An example I'll use is something we refer to as AR as a feature. Steve Lucas on my team did a couple of talks at AWE, which I think were recorded and will be shared at some point, that talk about this concept. There are sort of three different types of AR developers today: we have headworn AR and, let's say, VR developers; we have smartphone-based AR developers; and then there's the mobile developer cohort, the people who are building for mobile today. How do we embrace that community and help them build for AR? One of the things that we're introducing in Spaces, because we have the application running on the smartphone, is that we can take advantage of multiple screens. We can take advantage of the real world in spatial computing as one screen, and we can take advantage of the smartphone as another. If we put these tools into the hands of developers and allow them to balance what parts of the experience they want to enable through the smartphone and what they want to do in 3D through spatial computing, I think that's really exciting. What we think this will enable is for a developer, let's say a mobile developer who's not necessarily doing anything in AR, to add a small AR feature to an existing mobile app. It's not like they have to go build a 3D spatial computing app from the ground up, like they might on a standalone platform; they can simply add a little bit of AR to a mobile app they already have published in Google Play. Then, if a user has that mobile app running on a smartphone compatible with the glasses, and they plug the glasses in, it unlocks some new functionality. Maybe that functionality is just a little bit of an experience, but over time, that AR-based experience can evolve and become a bigger part of the application. One of the analogies Steve used that I really liked was a pool analogy: you can get into the shallow end as a mobile developer, you can wade in and get comfortable, versus just diving into the deep end of spatial computing and building something from the ground up in 3D. So when it comes to the headwinds, we're trying to make it easier for developers to build in AR, and this idea of adding a little bit of AR as a feature is an important part of that.
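One way to picture that "shallow end" pattern in code: the mobile app always runs its normal 2D experience, and only lights up its AR layer when an XR runtime and glasses are actually present. This sketch reuses the generic OpenXR probe from earlier; it's an illustrative pattern, not the Snapdragon Spaces dual-screen API:

```c
#include <stdbool.h>
#include <string.h>
#include <openxr/openxr.h>

/* AR as a feature: probe for a runtime plus connected glasses, and fall
 * back to the normal phone UI when either is missing. Illustrative
 * pattern only, not the Spaces SDK. */
bool glasses_available(void) {
    XrInstanceCreateInfo info = { XR_TYPE_INSTANCE_CREATE_INFO };
    strncpy(info.applicationInfo.applicationName, "ArAsAFeature",
            XR_MAX_APPLICATION_NAME_SIZE - 1);
    info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance = XR_NULL_HANDLE;
    if (XR_FAILED(xrCreateInstance(&info, &instance)))
        return false;                       /* no runtime installed */

    XrSystemGetInfo get = { XR_TYPE_SYSTEM_GET_INFO };
    get.formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY;

    XrSystemId system = XR_NULL_SYSTEM_ID;
    bool present = XR_SUCCEEDED(xrGetSystem(instance, &get, &system));
    xrDestroyInstance(instance);
    return present;                         /* false: glasses unplugged */
}
```

An app would re-run this check when the USB connection state changes, keeping its 2D UI as the default and unlocking the AR feature only when it returns true.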
Yep. A good entry point, both for developers and for users, that will ultimately evolve. You mentioned that one of the ways Qualcomm's thinking about the broader XR opportunity evolved was that initially it was: VR is where it's going to be at, standalone VR, that's the near-term opportunity. And then there was recognition that there's a really interesting AR side of that opportunity as well. And you're investing a lot, right? Snapdragon Spaces is increasing the set of functionality that will enable developers to create more compelling experiences more easily. Within that, are there other ways your thinking has evolved about how this opportunity will come together?
Yeah, I think we've always, or I've always, seen this as the future of computing. Immersive computing, spatial computing is the future, and the opportunity is going to be massive; it could one day exceed mobile, and it's likely to happen. And like you, I have the belief that augmented reality is going to be the flywheel that really drives that. Ultimately, that might mean using our phones less and less, until maybe one day we don't use our phones at all. So I think my thinking hasn't necessarily evolved in that area, but my resolve has just become stronger that this is the future. When I joined the team in 2017, that was actually one of the reasons, when Hugo reached out, that I took the job with him to work in this area: I had a sense that this was the future. But the more I've worked in this space over the last four years, the more it's become crystal clear to me that this is the future of computing. I've been lucky enough in my career to have worked in the early days of the internet, and the early days of mobile computing and cloud computing, and now spatial computing. And I have to think that this is the most exciting and the most transformative that I'll see in my career, probably in my lifetime. So it's just a fantastic time to be in this space, and I'm really excited about it.
Why do you think the wearable computing, the spatial computing era will be more impactful than the mobile computing era?
Yeah, so certainly there's one element of this that is getting us to stop looking at these rectilinear screens and stop being heads-down, sort of embedded in our devices. I feel like there is at least a promise that spatial computing is going to help us be a little bit more in touch with reality, despite the fact that reality is going to be blended with augmentation. So I have a hope that we'll evolve a little bit of the culture, the way that people use their devices. And ultimately, this could be exciting in terms of our society as a whole being less dependent on that mobile phone screen for the kinds of computing that we do.
Let's dive right into the lightning round questions here, and go with: what commonly held belief about spatial computing do you disagree with?
Yeah, so this is interesting. I think the belief that we need sort of small, sleek, 30-gram, all-day wearable glasses to get utility out of spatial computing is something I disagree with. I think we've proven that there are enterprise use cases, and consumer use cases, with larger form factor devices, certainly in VR with fitness and health and wellness and gaming. We've seen a lot of these market signals in virtual reality today, in larger form factors, that I think are applicable to AR glasses. And AR glasses are already getting much smaller; we're at the 100 to 130 gram range that can deliver these kinds of experiences today. So I don't think we need an all-day, 30-gram wearable to start creating value in this space. I think we can start doing it now, with the glasses and form factors we have today.
Of the many potential uses of AR, which one are you most excited about?
I'm really excited about AR in education. I think as curriculums evolve and we can teach concepts using 3D and spatial computing, that's going to help people find more meaningful ways to learn and give them an agency in learning that they perhaps can't have today in 2D computing. And Qualcomm has done a lot of early work in VR around this, with VR campuses and connecting students. We did a project with Morehouse College, and another one with American High School, allowing students to attend together in virtual reality. Just seeing how that impacted students during the pandemic, and seeing the promise of what spatial computing could mean for education in the future, it's really a thing I'm excited about.
What book have you read recently that you found to be deeply insightful or profound?
One was a book called Deep Work by Cal Newport. He's a computer science professor at Georgetown. And it's really about developing the skills that allow you to focus without distraction on really cognitively demanding tasks. So it's lessons about how to focus in an increasingly distracted world. I think this ties back to the earlier comment about how we're attached to our smartphone screens. It's about producing better results in less time, and developing the right habits to avoid the distraction that kills productivity. So that was a good one.
Yeah, that's a great one. That's one that I would also benefit from. If you could sit down and have coffee with your 25-year-old self, what advice would you share with 25-year-old Brian?
Well, beyond not selling my Amazon stock in 2001, I think a big one would be to travel more. I've had the opportunity to visit many parts of the world; I've been lucky as a business traveler in my various roles at Qualcomm. But really, I think I'd tell myself: hey, before you have kids and before life gets too hectic, if you have the means, spend the time to see a little bit more of the world in your 20s and early 30s.
Yeah, good advice. Something I'm still trying to live out. I maybe didn't catch the deadline there by the end of the 20s, but I love to travel. Yeah, same here. Any closing thoughts you'd like to share?
I think really, if you're a developer who's working in XR, or considering jumping in, we're at an inflection point. There's never been a better time, I think, to build for VR or AR. We're still on the ground floor, so now is the time to step in.
Where can people go to learn more about you and your work there at Qualcomm?
Yeah, so I'm on LinkedIn and Twitter, but I would encourage you, to learn a little bit more about Snapdragon Spaces, to visit qualcomm.com/spaces. We've got a lot of information out there, and you can join our mailing list and we'll keep you informed about the evolution of the platform as we go into general availability in the spring of 2022.
Awesome, Brian, thanks very much for the conversation. Thank you. Before you go, I want to tell you about the next episode. In it, I speak with Kavya Pearlman. Kavya is the founder and CEO of the XR Safety Initiative, a nonprofit effort to promote privacy, security, and ethics, and to develop standards around application security within AR and VR. In this conversation, we discuss Kavya's passion for cybersecurity, how she came to focus on AR and VR technologies, and key elements of the privacy and security frameworks proposed by XRSI. I think you'll really enjoy the conversation. Please follow or subscribe to the podcast so you don't miss this or other great episodes. Until next time.