The AR Show: Maryam Sabour (Niantic) on Active, Outdoor AR Gaming
10:47PM Apr 10, 2023
Speakers:
Jason McDowall
Maryam Sabour
Keywords:
niantic
ar
device
game
building
hardware
vr
thinking
experience
headsets
people
world
gaming
started
users
design
solving
entrepreneurship
aesthetic
problem
Welcome to the AR Show, where I dive deep into augmented reality with a focus on the technology, the use cases, and the people behind them. I'm your host, Jason McDowall. Today's conversation is with Maryam Sabour. Maryam is the general manager and head of business for the AR headsets group at Niantic. Maryam studied law and business at university, going on to earn a Juris Doctor at McGill University. Early in her career, she was drawn to entrepreneurship, having founded an e-commerce company and a legal clinic for startups. Later, Maryam was drawn into the world of VR, where she spent several years working as a founder or consultant across several projects, before taking on the role of Business Development Lead for Niantic's then-nascent Lightship AR platform about five years ago. In this conversation, Maryam shares her path from law and entrepreneurship to Niantic and explains why Niantic is now becoming a hardware innovator.
I think in the industry, we're seeing this divergence of two paths in terms of the players, the types of devices they're building, and the types of use cases they're building for. On the one hand, we're seeing these indoor, entertainment-focused devices. These are your MR headsets, which means you can get away with maybe an aesthetic that's not as nice, because you don't have to take it outdoors. But you can have all of the processing power you need to be able to provide, you know, a large field of view and really immersive games and really compelling gameplay. On the other hand, we have all of these outdoor-focused devices, but because they're outdoors, they're leaning into a very small, sleek aesthetic. Everyone wants to get to that perfect pair of smart glasses. But it also means it's going to be a smaller field of view, and it's going to have requirements that are not going to enable you to do gaming. It's going to be more like your smartwatch on your face, where it's really bite-sized information. And so utility is probably what's going to be driving those types of devices, again from the consumer lens. Now, what we're trying to do is the best of both of those two paths. So we're trying to solve all of the indoor problems, but then all of the outdoor problems, because we want the gaming and we want the outdoors.
Maryam goes on to discuss the trade-offs and challenges in making a device suitable for outdoor gaming, some lessons learned from the earlier explorations, how the hardware fits into the broader Niantic strategy, and how the company thinks about privacy. As a reminder, you can find the show notes for this and other episodes at our website, thearshow.com. And please support the podcast at patreon.com/theARshow. Let's dive in.
Maryam, you're incredibly busy as an entrepreneur, as an executive working in this world of AR. Have you always been this way? How do you keep yourself busy? Did you, when you were young?
Interesting question. So I was born in Vancouver and raised in Toronto. And growing up in Toronto, I was raised by a single mother, who was naturally very busy, at work all the time, and an older sister, who was also very busy being a teenager, not wanting to hang out with her little sister. So I actually had to keep myself busy a lot. And I found that I occupied a lot of my time with gaming. And so I think there's this full circle in my life, which is I did a lot of gaming as a kid, and I think it actually really helped build a lot of my strategic thinking, problem solving, critical thinking. You know, if you think about games on Super Nintendo like Tetris, or Super Mario World, Zelda, less so Mortal Kombat, a lot of these games are all about puzzle solving. How do you get to Bowser and defeat him? How do you find the secret levels, or open passages in Zelda? So I really think that that problem solving that kept me busy as a kid ended up being kind of what brought me into the world that I'm in now in the games industry, and also, as a lawyer, using that critical thinking to solve real-world problems.
That's so cool. Just as an aside, my daughters are really into this educational game experience called Synthesis. Synthesis is an offshoot of something Elon Musk did. It seems to be this thing in Los Angeles: when very rich people decide that the school system doesn't work for their particular family lifestyle or their children, they go and make their own school. Elon Musk was one of those who had done something like that. On the SpaceX campus, he had started this school for his kids and the kids of some of the SpaceX employees and other friends, called Ad Astra. And Ad Astra was this amazing, thoughtfully well-developed school curriculum that was run by this guy, I think his name was Josh Dahn. And he had lots of cool ideas about how to educate kids to be successful in this era where creativity and group problem solving are so essential. One of the concepts he had was this game, this sort of simulation experience, where they present a problem, and the kids just have to figure out what's going on and ultimately solve it as a group. And he extracted it into this online game experience that he calls Synthesis. Basically the construct is that the kids show up, they don't tell them the rules of the game, they don't really get to know their teammates, they just have to figure it out together, figure out how to collaborate, and ultimately solve the problems, often in a competitive environment, with this mix of collaboration and competitiveness going on. And at first, the kids would come back from these gameplay sessions so frustrated, usually complaining about some other child and how they completely dominated the experience, or weren't participating, weren't being a good teammate. But over time, you could see that they were learning how to manage all the different personalities and solve these puzzles. And they love the puzzles.
They love the games themselves, as well. Anyway, it's just interesting. Gameplay is such an amazing environment to learn certain kinds of skills, and it sounds like you took this wealth of problem-solving experience from your gaming as well.
I love that. I want to play that game now. Right?
Yeah. So this focus, this early enthusiasm you had for games: how did that lead you down the path of law and entrepreneurship? How did those things come together for you?
I think law and entrepreneurship kind of have different stories. In general, I should probably start by saying that the public schools in Canada are really great, but 30 years ago, STEM was not as prevalent as it is today. So growing up, you don't really know what options exist beyond, you know, a doctor, a lawyer, business (which means something with numbers), and maybe some kind of scientist. So law and business were things I understood and felt I'd be great at, again because I love problem solving. With law in particular, there's always this kind of social justice aspect to it, which I think is often why people go into the law, because you hope that you can grow up and change the world, right? Make the world better, fight against injustice. So I felt if I wanted to make an impact in the world, I needed to equip myself with an understanding of the law. It was kind of this "if you want to beat 'em, you've got to join 'em" mentality. But startups, and tech startups in particular, and entrepreneurship, that's a little bit of a different story. It wasn't until university that I started to actually be exposed to tech as an industry. I didn't even know computer engineering was a profession that existed, or that a startup was a stage of a company. I actually remember one summer I was interning during law school with Ruben Ashtar, who was a tech lawyer at the time for many Y Combinator companies. And I didn't know what Y Combinator was. When he found out, I think it was like blasphemy. So he had me sign up for tech news in my email inbox every day so that I could actually read through it and become more tech literate. It's probably one of the most impactful daily habits I've had since. But really, that exposure didn't come until university. And then ultimately, when it comes to entrepreneurship, my passion was built through doing.
So when I was in my undergrad, I started an e-commerce jewelry business and went through the lessons of, you know, how do you incorporate? How do you make your first sale, hire your first employee, learn to be really scrappy and resourceful? Like, teach yourself Photoshop to take photos and edit inventory that we were hand-crafting, or learn HTML to build your company website. So it really came through doing. And then in law school, I had this exposure to the Center for Entrepreneurship at the university, where I began working, mentoring other startups. There were some really amazing projects and startups, and eventually inventions and innovations coming out of the university. From the medical faculty, I remember there were researchers who had invented a new way to cure pancreatic cancer, or a group of engineering students who had invented a type of cement that could absorb CO2. So this constant exposure across different disciplines, these different innovations, is what really deepened my interest and knowledge, and really pushed me further down entrepreneurship, but closer to tech and early-stage startups, and then using the law to help me be better at addressing problems in that space.
It sounds like this all led up to one of your other early entrepreneurial experiences. You noted you started the e-commerce company focused on jewelry, but you also ended up helping other entrepreneurs through a legal clinic. Can you describe that a little bit?
Yeah, that's the Compass startup legal clinic. It's a nonprofit that I started when I was in law school, actually while working at the Center for Entrepreneurship, because all of these startups had the same problems when they were starting out. You can be, you know, a PhD who knows everything about your domain expertise, but you still have to put together a shareholders agreement and navigate that prenup. And you still have to understand how to hire your first employee, or incorporate, or all those basic steps. But it's really inaccessible to founders and to entrepreneurs. Legal services are not cheap, and legal jargon is specifically crafted in a way so that it's not easy to understand, so you need someone who is there to explain it to you and decipher it for you. So the premise of the clinic is really to provide those free services and free legal information to founders. It's still running today. I'm the chair of the board, and we've just expanded our mandate to now also include investor protection, beyond just advising founders.
I don't suppose that advice includes how to protect yourself from Silicon Valley Bank or other banks closing suddenly, overnight.
You know what, I think, as of today, it will definitely be part of the work the caseworkers are going to be doing: really studying why that's the case, why the policies are that way, and how we should be protecting ourselves.
Yeah, amazing. Unfortunate, but for sure. What ultimately hooked you into this world of AR and VR? And how did you find your way from Toronto to the Bay Area?
So I had multiple touch points with AR and VR, but the one that really stands out the most came through a project I was working on in a class called Law in the Anthropocene. It was about climate change, and it was really about looking at how, among all of the social issues that exist, it's most difficult for people to change their behaviors as it relates to curbing climate change, because it's so far away. We don't often see that immediate cause and effect, right? We drive our car, and we're not thinking, oh, I don't see the pollution necessarily, I don't understand how I am creating this problem globally. And I had come across research that showed how putting users in VR experiences could actually create this real-life change in your behavior, even for these really faraway types of social issues, or ones that feel far away. In one of the studies they had done, users either watched a video of someone cutting down a tree, or were actually put in a VR experience where you were cutting down that tree yourself with a chainsaw. And they found that the users who were in VR, actually embodying that perspective of holding that virtual chainsaw, cutting down that tree, hearing it go down, feeling that effect, exhibited behaviors afterwards, for the weeks that followed, that were much more environmentally conscious, versus those who just watched a video of it happening. And for me, that clicked something: how amazing it is that you're able to really embody these other perspectives in VR in a way that enables you to change your real-world behavior. It just really opened up a whole new door for me in thinking about how these technologies could create real-world impact.
That's amazing. So you felt the impact, not only studied it; you got to experience the effects of VR yourself. And how did that ultimately connect you to the city of San Francisco and make your way there?
So there's actually a ton of Canadians in the Bay Area, something like 500,000, and a ton of alumni who come from McGill. At the Dobson Center for Entrepreneurship at McGill, where I was working, I had pitched my director: hey, I think I should go out to San Francisco and connect with all of the alumni we have out there. Let's connect with them, bring them back to invest in the startups that are coming out of the university, and kind of create this bridge between Montreal and San Francisco. And he said yes. So I spent about five months out in San Francisco doing that. And throughout that, you know, grabbing all these coffees, cold-messaging all these different alumni on LinkedIn, a ton of them naturally were in tech, and many of them were also in the AR and VR world. That eventually led me to Upload VR. I met the folks who were leading Upload VR, and at the time, they had just opened up a co-working space at, like, Mission and 11th. So I started working out of there, and then found that they were also an early-stage startup who needed help thinking about how they wanted to create this entire education arm for AR and VR. So I started to consult for them. And as I consulted for them, I then had other companies in that space, like Retina VR, who was working on analytics for VR, and then Lumiere VR, kind of in the film and entertainment space. So I kind of started to just be exposed to more and more of the companies in the space. And at that time, HoloLens had just come out, the HTC Vive was just coming out. We had these holodecks, which were like the first place you could ever experience these fully six-DoF VR headsets. So I just got so much exposure being in that space and connecting with all those companies. And I just couldn't go back.
Couldn't go back. How did you end up making your way to Niantic from that experience?
I actually came across a blog post by my colleague Jani Solheim; she's our head of social impact. She had written this blog post that talked about all sorts of things the Niantic player community was doing. And if I'm being really honest, at this time, this was in 2018, I was not playing Pokemon Go, and I actually thought Pokemon Go was a fad, a thing that happened; I didn't really know that people were still playing. But I was reading about this community, and that was not just the Pokemon Go players, but also the Ingress players that Niantic had. And I started to learn so much about the way this community is all about getting outdoors. I think the article specifically was talking about how they all came together to clean up oceans on Earth Day. And it also mentioned that there are people who are actually overcoming social anxiety through these applications, because it helps them go outside in the real world; you know, even just a simple walk helps. And then you meet real people face to face, so it helps improve depression. And people are losing weight. So this community, to me, was just so amazing. And it came back to this feeling of using immersive tech for good, and seeing that that was exactly what Niantic was doing. They were using their products and their technology to actually make real-world impact. Not just, you know, another gaming company where, although, again, I'm a gamer, you're just sitting on the couch gaming. So I looked at the jobs page, I saw an open role for business development, and I think it's been about five years since I applied for that role. So the rest is history.
I'd love to get into some history, and actually go back a little bit further in time; maybe you can help us appreciate this, the pre-story of Niantic. In the early days, as I recall, a lot of the folks from Niantic came out of the Google Earth business; they were building mapping, Google Earth, that sort of spatial-location software. And before AR was a thing, the real focus in the world of mobile at the time was the fact that the phone knew where you were, which opened up all sorts of amazing opportunities, including enabling companies like Uber to easily communicate to a driver where you are to come pick you up, and things like Niantic and Pokemon Go. So this notion of location was really the core of Niantic's ability to create these games and experiences. But in this industry, and we are very focused on augmented reality primarily here on the AR Show, in this world of AR and VR, the lens through which we often look at Niantic is the one of AR. So how is it that AR became part of the Niantic story and journey? How does that fit? And how do you think the company fits into the broader ecosystem?
I love that question, because it really lets us talk about what makes Niantic very uniquely Niantic. And I think when I first joined, I didn't even know this about Niantic. We were 200 people when I joined, and we're, I think, about 1,000 people now. At the time, I was probably one of the few people who actually came from the VR world. And in my mind, if I think VR, I think AR, XR. But to your point, most of the folks who were in the company came from mapping, came from Google. For those who don't know where Niantic started: if we look at John Hanke's story, he actually founded Keyhole, and Keyhole was what essentially became Google Earth. Google acquired Keyhole in 2004; it was the precursor that became Google Earth. So the whole Keyhole team became the executive leadership essentially leading Google's Geo division. That was Google Earth, and then that was followed by Google Maps, and then eventually Street View, among other products. And their goal was to map the world. And I think we can all agree that they impressively achieved that goal, because I'm sure you've used Street View many times in your life, or Google Maps, as have I, to even just explore other places that I have never been to. In 2010, Niantic was founded within Google as an internal startup led by John, and they wanted to build experiences leveraging the map. I think John has talked about this publicly before, but he kind of had this aha moment when they noticed all of the different things that people were actually building on top of Google Maps. So Niantic Labs was essentially doing a similar thing here. But what most people don't know is that the first experience that Niantic built was not Pokemon Go, and was also not Ingress, though some people think that was the first experience. It was actually Field Trip.
Are you familiar with Field Trip? I'm not. Field Trip was essentially a virtual tour guide, where you could pull up different information about places around you, whether it's events around you, restaurants around you, historical facts about places nearby. And the team had actually gotten all of this information about the world through physical books, encyclopedias, things like Atlas Obscura, right? Going and actually transcribing and digitizing all of this information to create this database of points of interest. And then the app just gave you the ability to have this virtual tour guide and learn about the world around you. And it was actually put on Google Glass. Google Glass was very early, and hence Field Trip was very ahead of its time. But that's really the roots in terms of where Niantic started and what they were trying to do. It was a couple of years later that the team launched Ingress, which was the first location-based game, and that had about a million people playing one year later, so a very hardcore, passionate community of players. And then they spun out in 2015 and launched Pokemon Go, and, you know, the rest is history. But to come back to your point about where Niantic fits in, now that we understand that history: I think most people, when they think about AR, are defining it as part of the spectrum of XR. So VR is on one end of the spectrum, AR is on the other end. Niantic's perspective on AR is a bit different, because of those deep roots in mapping, and an understanding that augmented reality is not just visual. I think you can augment reality and have a tie between digital and physical without needing visual AR. And this might be a little bit controversial, but I think we get outsized credit for being pioneers in the traditional definition of AR, for a game which had maybe very little, or if I dare say, no AR at the outset of Pokemon Go.
Instead, it had very strong ties to the real world because of location. When you came to catch a Pokemon, it felt like it really existed in the world, not because you could see it, but because it only existed in a very particular spot that you had to physically walk to, to engage with and then catch. And the game map is the real-world map. So it understands your world, understands what city you're in, understands the local park where you want to go catch that Pokemon, and even understands the weather: if it's raining outside, it's raining in your game. I think these connections can be even more meaningful than visual AR, especially on mobile, because it's still so limiting to interact with AR today. So it's even more important to lean into these other ways to bridge the real world and the digital world. So it comes back to mapping and location, and thinking about AR not just as visual, not just as part of the spectrum with VR and XR, but really as a way to bridge the digital and the physical.
Bridging the digital and the physical, with this perspective of the map and the real world, and having to physically move through the space, which really enhances so much about this augmentation of your real-world experience. With that perspective and that background, maybe talk a bit about the motivation for ultimately exploring hardware, alternative forms of head-worn, see-through AR hardware, pushing beyond the devices that are already out there, and how that fits within the broader Niantic strategic direction.
No one is building hardware for our specific use case. And the reason why is because our use case is outdoors, and it's gaming. And that is probably the hardest use case to build for. I think in the industry, we're seeing this divergence of two paths in terms of the players, the types of devices they're building, and the types of use cases they're building for. On the one hand, we're seeing these indoor, entertainment-focused devices. These are your MR headsets, which means you can get away with maybe an aesthetic that's not as nice, because you don't have to take it outdoors. But you can have all of the processing power you need to be able to provide, you know, a large field of view and really immersive games and really compelling gameplay. On the other hand, we have all of these outdoor-focused devices, but because they're outdoors, they're leaning into a very small, sleek aesthetic. Everyone wants to get to that perfect pair of smart glasses. But it also means it's going to be a smaller field of view, and it's going to have requirements that are not going to enable you to do gaming. It's going to be more like your smartwatch on your face, where it's really bite-sized information. And so utility is probably what's going to be driving those types of devices, again from the consumer lens. Now, what we're trying to do is the best of both of those two paths. So we're trying to solve all of the indoor problems, but then all of the outdoor problems, because we want the gaming and we want the outdoors. And that was really the reason why we felt we needed to have a hand in helping accelerate this coming to market: we also have the content, we also have the users. So why don't we work backwards from the application and from the users to really understand what is the MVP that we could get to?
And how can we help hardware makers have everything they need to be able to enter this market, so that we not only see devices available to us, but also encourage more entrants, more competition, and hopefully democratize the market.
Can you talk a little through some of the implications of having that sort of focus? It needs to be an outdoor gaming experience, with a large enough field of view, bright enough to be useful outside. To give us some perspective: how bright is bright enough? How long does a play session need to be, based on what you've learned so far?
Absolutely. Maybe I'll work backwards through some of the questions you just mentioned. In terms of how bright: we want to have the device work outdoors, and Jason, you're in LA, so you know, think about LA sun. At minimum, we want to have 2,000 nits per eye, and that is still assuming that there's some kind of dynamic dimming, whether it's an electrochromic lens or something else, that's still, you know, maybe up to 95% dimming as needed, depending on how bright it is during the day. So that's really the desired spec, or perhaps even the minimum spec, that we want to achieve as it relates to brightness. When we're thinking about outdoors, it also means thinking about weatherproofing: what happens if it's raining outside, can you still play? And what happens if it's really hot, how does it affect the thermals? Or if it's really cold, because I'm from Canada, and we've got winters. So some of that is also layered into thinking about the design. And then battery life is important. This is less for outdoor but more for the gaming portion: if you're going to take this device outside and you want to game with it, one gaming session would be about 20 to 30 minutes on average, and you want to be able to do that maybe a couple of times. So I think an hour of battery life is the minimum, and ideally we can get to more. Some other things that come to mind: of course, the aesthetic. We know that everybody wants to see sleek glasses, but that is not, in fact, what Niantic has led with in our design, and we can talk about that too. But the aesthetic still matters. So even if we're going with a gaming aesthetic or a goggles aesthetic, it needs to be something socially acceptable. I would also say that it specifically doesn't have to be all-day wear when we talk about outdoor gaming, because you're doing it in sessions. So it's really more of a session-based design in terms of how we think about how it's going to be used.
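As a rough illustration of why that pairing of high brightness and dynamic dimming matters for outdoor readability, here is a small sketch. The 2,000 nits and 95% dimming figures come from the conversation; the ~10,000-nit sunlit scene and the simple contrast model are illustrative assumptions, not actual device specs.

```python
# Sketch: virtual content only reads clearly outdoors when its luminance is
# well above the real-world background seen through the lens. Dynamic dimming
# cuts the background, not the display, so it multiplies effective contrast.

def contrast_ratio(display_nits: float, ambient_nits: float, dimming: float) -> float:
    """Contrast of virtual content against the see-through background.

    dimming: fraction of ambient light blocked by the lens (0.0 to 0.95).
    Assumes the display path is unaffected by the dimming layer.
    """
    background = ambient_nits * (1.0 - dimming)
    return (background + display_nits) / background

# A sunlit outdoor scene is assumed here to be on the order of 10,000 nits.
sunny = 10_000.0

# Without dimming, 2,000 nits is washed out (contrast barely above 1:1).
print(f"no dimming:  {contrast_ratio(2000, sunny, 0.0):.2f}")
# With 95% dimming, the same 2,000-nit display stands out clearly.
print(f"95% dimming: {contrast_ratio(2000, sunny, 0.95):.2f}")
```

The same arithmetic explains why indoor-only MR headsets can skip dimming entirely: against a few hundred nits of room light, even a modest display dominates the background.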
Given that the inspiration, the source, the native experience most of the time is on a smartphone, how does the content that you're pulling from inform the hardware choices or the gameplay experience? In part, what I'm thinking about is the orientation of the phone that you normally hold in your hand, and how that translates to what you want to show on the glasses.
That is a great question, because our teams internally have been doing so much prototyping with so many different devices to try and understand what types of experiences are possible and what types are not. Part of that is also to help drive the hardware requirements. But to your point about the phone, there is a question as to how the phone plays into the experience here. Are you using the phone as a controller? Are you putting the phone in your pocket and not even leveraging it? Is there something that you could be doing across phone and headset, where maybe some people are on the phone and you're on the headset, and you're interacting with the same content across devices? We think cross-platform and cross-device is extremely important. But from an experience standpoint, there are all different types of experiences that you would be building depending on what you have at your disposal. Even the idea of the phone as a controller is interesting, because in a lot of the prototyping that we've done, we've actually seen that there's a ton of gameplay that might actually require a real controller, something that actually has buttons, maybe a joystick, maybe some kind of six-DoF movement that you're able to do with it. The phone might not be ergonomically comfortable to use in that way. So there's so much research that has come out of all of this prototyping to really inform the types of experiences that could be possible, depending on the different types of inputs and interactions that we could enable from the hardware side. So we constantly have this balance: the creators want everything, and from the hardware side, we really just want to understand how we can give them the MVP and have some really compelling experiences. Maybe it's not everything, but start with the MVP, because we want to get something out the door.
That's also thinking about cost, right? Because we'd love to have something that can do everything, but it also means cost goes up. And often it also means size increases, so then we lose that aesthetic, for sure.
In the gameplay testing that you've done so far, and some of the conclusions that maybe you've drawn from that: do you imagine that this sort of device is something that people carry around with them all the time, so they can pull it out and do some spontaneous gaming sessions? Or do you imagine it's going to be the sort of thing that's like the sporting equipment they've got at home? I've got the soccer ball, I've got to organize a couple of friends, go home, get the equipment, and then meet them somewhere some afternoon or on the weekend, and we have a gameplay session together. Does that modality difference make sense? Where do you think users will mostly fall on that spectrum?
You know, I think both of those speak to portability as a necessity, and I think that's what we're indexing on. So if you want to play it with friends, and you bring it out only when you want to play with friends, then you can do that. But if you want to take this device and play it solo, because you want to use it more for exploration of the world, but you're still doing it in a session-based way, meaning you're doing it for an hour or two at a time (it's not meant to be worn all day as you walk around everywhere yet), then you can do that. And so part of the thinking around the design was to make sure that it wasn't so delicate that you felt like you couldn't comfortably throw it into a backpack because you're afraid something might break. But at the same time, it wasn't so rigid or so large that you couldn't throw it in a backpack, or needed to carry it in a separate case. So this idea of collapsibility was important for enabling portability, which meant that we wanted to lean into fabric materials and things that essentially allowed us to collapse the device into itself so that it could get into a very compact size.
So one of the public announcements that you had in terms of partnership in making this hardware a reality was to partner with Qualcomm. I'm losing track of time; maybe it was two years back now? How do you think about the relationship with Qualcomm, and some of the other public or private relationships you've had? How do you describe the approach that you've taken to ultimately develop this sort of new class of hardware that you think needs to exist?
So in terms of how we look at the development process, it has been extremely application driven. And I know I said this, but to talk about it a little more in terms of what the actual process looks like: we've done a lot of consumer research. Because we have an incredible community of folks that we can tap into, existing Niantic Explorers, which is what we refer to our community as, we've been able to do a lot of studies, putting different devices in front of them, putting different experiences in front of them, and just understanding what works and what doesn't, what are the types of things we should be thinking about as we design this, and what we need to get over to really make this something that reaches critical mass. And so we have developers who have helped inform a lot of the hardware requirements, which are the creators that I spoke about previously. But then the consumers are a large bucket of folks who we've learned so much about UX from. I can give a ton of different examples here. We were just talking about the controller, so that's top of mind. But thinking about, if you were to actually use a controller in public, what would that look like? Would someone potentially misconstrue that as a weapon? Would you potentially be afraid to wear this entire system outside if it looks very expensive, because you think someone might steal it from you? These are some very real questions about how people might feel about wearing something, or wearing something outdoors, and the barriers that exist that you have to think about. And if you don't actually talk to consumers to understand all the different places they might be wearing this, and how they might be using it, then we're not going to be designing a device that can really include everybody and be accessible to everyone.
And then there's also a ton that we've learned on the experience side, from doing these studies, that really speaks to the importance of field of view. Happy to also go into that if you want to talk a little bit more about it. Yes, please. So on field of view, it can be very limiting if we are under a 50-degree diagonal field of view. And we can talk very specifically about vertical versus horizontal field of view and what the minimum is there. But really, it's just that the variety of experiences that you can build increases as you get a larger field of view. And if you have a limited field of view, it is more likely to break immersion, it can create discomfort, and there's also less that creators can do, even as it relates to UI real estate. So to try and paint the breaking-immersion example a little better: imagine you want to see a large object in the distance, and it's Godzilla at the top of the Ferry Building clock tower. If you can only see half of Godzilla's body, it's really not as compelling, and you are breaking that immersion. Similarly, if you have a very small virtual Chihuahua, and you see this little dog and you want to walk up to it to pet it, and as you get closer you bend down to interact with it, but again you can't see all of it in your field of view, because it's so limiting, again you're breaking that immersion. And we've seen the same thing as you look at all the different types of characters you might interact with. Imagine some kind of NPC, or a virtual Spider-Man or virtual LeBron James, who's like six or seven feet tall. If they're virtually standing in front of you and your vertical field of view is not large enough, you won't even be able to see this virtual person from head to toe. So those things mean that the types of experiences we can build are going to be much more limited.
And so there's always this push and pull between the hardware engineers, who really want to think about, let's save battery life, let's go for a smaller field of view so we can get to a better-looking device, versus the creators, who want to create the best experiences and need the space to do that. I have one more example on another type of game that we could talk about, but I'm going to defer to you on time. Yeah, no,
let's do it. Let's do this other example. The other one
that came to mind is fast-paced games. So we specifically talk about active outdoor AR gaming, and I think active gaming is really fun because it gets you moving, and movement is one of the big pillars for Niantic. So if you think about fast-paced games, imagine you're outside in your backyard and there's a real-world pinball that ricochets off different objects in your backyard. In a game like that, with a limited field of view, you are going to be moving your head around like crazy, trying to constantly follow these fast-moving objects in this very limited space that you have. Not only does that not enable you to even play a game like that, because it can be confusing, but you have to move your head around so much that it's just uncomfortable, right? There's a discomfort that the user feels, it becomes tiresome, and if you weren't in your backyard but out in a park, it might look unnatural if someone else sees you doing that. So again, these different genres of games, the different styles of characters, and just what we're able to do is really limited if we don't have the minimum field of view.
What do you think that minimum is? If you've defined it, can you share it?
Yes. So the minimum for us, for the vertical field of view, is about 30 degrees, and the horizontal is about 35. And when I say minimum, it means that we can live with it. Is it really going to give us the ideal experiences we want? Still no; the desired field of view is larger. But that is really the absolute minimum.
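To make that vertical field-of-view constraint concrete, here is a small geometry sketch. The numbers and code are illustrative (they are not from Niantic); it just computes the angular height of a roughly two-meter virtual character, like the Spider-Man or LeBron James examples above, and checks whether it fits head to toe inside the 30-degree vertical minimum Maryam cites.

```python
import math

MIN_VERTICAL_FOV_DEG = 30.0  # the rough vertical minimum Maryam cites

def vertical_angular_size_deg(height_m: float, distance_m: float) -> float:
    """Angular height, in degrees, of an object centered in the view."""
    return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

# A roughly 2 m (about six-and-a-half-foot) virtual character
# at a few walking distances.
for d in (2.0, 3.0, 4.0, 5.0):
    size = vertical_angular_size_deg(2.0, d)
    status = "fits head to toe" if size <= MIN_VERTICAL_FOV_DEG else "clipped"
    print(f"{d:.0f} m away: {size:.1f} deg -> {status}")
```

Under these assumptions, the character overflows a 30-degree vertical field of view at two to three meters and only fits fully past roughly 3.7 meters, which matches the head-to-toe clipping described above.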
So this is one of the key lessons that you've learned as you've gone through and done all this prototyping. You were up on stage with Qualcomm back in December, I think it was, talking about the second major iteration of the hardware itself, of the headset. Can you share some of the lessons you learned from that first generation? What did you get right, and what were some of the things you were looking to improve upon in that second iteration?
You know, I wish I could say that the first generation was something that we had designed thinking about everything holistically, but that kind of wasn't the real purpose of why and how we built that first generation. The first internal prototype was really about what we could build in the fastest time. So this is something that we built in about eight months, with a very, very small team in the beginning, that could really just help us validate optics outdoors. I think the biggest focus on the very first device was doing this deep dive into understanding the minimum requirement on the optics and display side that would enable us to have the outdoor experience we want to see, because there wasn't a single device on the market that worked outdoors. Either the brightness wasn't there, or it didn't have connectivity, or the use cases weren't even there for them to really be targeting it. We even had our developers using HoloLens devices trying to build outdoor experiences, but of course that's really not what HoloLens was built with in mind. So the first generation was really about that validation of the minimum requirements on the optics side: thinking about field of view, brightness, color uniformity, and even the aesthetic. So, looking at waveguides versus freeform optics or birdbath optics, where you really have a different aesthetic and a different thickness of the lens. And then we wanted to use it as a platform to ask all the questions. So now let's just put this device in the hands of our creators and developers and understand: what do they need? Do they need a controller? Do they need audio? Do they need eye tracking? How many cameras or sensors do we need? A little bit of this we had done for that first generation, but most of it was using that first generation to answer those questions.
And then once we had all those learnings, we were able to apply them forward to the actual design. So although it was a two-step approach, the final design is really the first design, I should say, since the first generation was really just a learning prototype.
Got it, that makes sense. So now, with all these core learnings around optics and display, you have a really informed new first generation of the hardware, even though it's technically gen two. Where do you go from here? Now that you have gen two out there and you're playtesting, and the clear vision of what is necessary is embodied in this current prototype, what additional refinements do you anticipate happening in, or from, this v2 device?
Naturally, a hardware partner is going to want to do their own refinements, because it's a reference design, so they can take the design and refine it as they like, assuming the minimum requirements that we recommend are still met. But I think price is always something that needs to be refined, and in order to refine price, that means you still need to potentially make changes to the device and hope that it still meets the minimum requirements. So I think that price is always something that can be improved on. We also are using the Snapdragon AR2 chipset versus the XR2, which is what our initial internal prototype, the gen one, had. And so with the AR2, we can actually get even more improvements on the form factor, on the weight, and on some of the bulk in the back of the headset where we have the battery, because we're using quite a large battery there. But we can definitely decrease some of that size now that we're taking advantage of the AR2. And the last one, I would say, is really the interaction paradigm. So we know with the Snapdragon Spaces platform we'll have hand tracking. But beyond hand tracking, what do we want to do from a controller standpoint? Again, we talked about the phone as a controller as an option. But in a lot of the experiences, as seen in the video that we launched publicly, you'll see that our team is actually using some third-party controllers available in the market, like the Finch controller and others, and basically trying to control characters in the real world, kind of like a real-world platformer, where you're able to control these characters with the controller. So how can you do that if you don't have a controller? And if you do, what should that controller look like, to meet all of the things I mentioned before? Is it a ring?
Is it a wristband? Is it actually something that looks closer to a real controller, just somehow designed ergonomically to be a little more enabling in your everyday? So that's definitely an area that's still ripe for design.
Maybe you can be a bit more explicit, or share a bit more, about the place the hardware fits within the broader Niantic plan. You've talked about it being a reference design, and about creating a sense of competition, creating opportunities for other hardware makers. So how does this ultimately play into enabling the market? To be very explicit: is Niantic going to be selling Niantic-branded hardware that Niantic works with some ODM to manufacture?
I'm glad you asked that question, because I think despite trying to be explicit, there's still often confusion here. Niantic is not a hardware company and is not going to develop first-party hardware. We are a content and software company, which means that we are going to create experiences for headsets; we are going to extend our AR capabilities specifically for headsets to create these compelling experiences. But we ourselves are not going to create and manufacture the devices. We want to lean into the hardware partners in the ecosystem who have the expertise to do that, and just empower them: give them the entire stack, give them all the tools, give them the design, give them the software, give them the content, and then let them lead, using their expertise, to help bring that to market. So we want to have the hardware partner bring this device to market, and we support them to do that. What that actually means, with the reference design where we are today, is that we're looking for hardware partners who want to commercialize the design. And at the same time, we're going to continue to develop experiences that cater to this headset, and develop capabilities on the AR and geo side that can be leveraged on these types of devices. And ultimately, we're doing this so that we see more competition in the market, a headset that caters to our outdoor use case, and a device that can actually take advantage of all of the outdoor AR location-based features that we're building through our Lightship platform and our 8th Wall platform.
So you're really teeing this up to be an amazing opportunity for the right hardware partner, because you're bringing a set of content, an impassioned user base, a bunch of research, and a reference design that allows somebody to deliver a product and tap into an ecosystem that potentially is ready to fully embrace it.
Absolutely. And the hope is that eventually there will be more than one partner, and so we'll see many headsets out there that can compete against a lot of the walled gardens that we see.
You brought up 8th Wall. About a year ago, Niantic bought 8th Wall, which really focused on the web side of the experience, that web-first, WebXR sort of experience. How does the work that you're doing on the hardware intersect with the work that 8th Wall is doing?
Funny enough, today is the one-year anniversary of the acquisition of 8th Wall, so happy anniversary to our team. And this intersection is critical in terms of how we see the space evolving. We absolutely think that more and more experiences are going to be delivered via WebXR as the technology continues to evolve. As it relates to headsets specifically, in order to enable 8th Wall on headset devices, you need to have a browser. And so there are a few companies working on creating browsers and integrating them into different headsets, which is wonderful, and we are huge proponents of that, and of supporting, whether it's Wolvic or others, integrating with devices so that we can bring the catalogue of content and all of the powerful capabilities that 8th Wall has to any headset. 8th Wall actually has something called Metaversal Deployment, and it's essentially a way for web developers to build an experience once and then deploy it cross-device, meaning you can build an experience for mobile, but then it will automagically be formatted to work on PC, mobile, tablet, and even headsets. And we actually have some exciting updates coming to Metaversal Deployment sometime in the summer, so look out for that. But in terms of how we think about headsets, it means that we really see cross-device being important, and through 8th Wall we can already target headsets that exist in the industry. So as we think about our reference design, it would just be another device that 8th Wall would be able to power, in addition to the existing devices in the market. And what's unique is that this goes beyond AR, because with 8th Wall it also includes VR devices.
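The build-once, deploy-cross-device idea described above boils down to one experience with a per-device presentation decision made at load time. As a rough sketch of that idea only: the device classes, mode names, and function below are invented for illustration and are not 8th Wall's actual API.

```python
# Invented device classes and presentation modes, purely illustrative
# of the "build once, format per device" concept.
PRESENTATION = {
    "mobile": "handheld-ar",       # camera-based AR on a phone
    "tablet": "handheld-ar",
    "desktop": "3d-viewer",        # mouse-orbit fallback on a PC
    "ar-headset": "immersive-ar",
    "vr-headset": "immersive-vr",  # the web experience reaches VR too
}

def pick_presentation(device_class: str) -> str:
    """One experience, formatted per device class, with a safe fallback."""
    return PRESENTATION.get(device_class, "3d-viewer")

print(pick_presentation("vr-headset"))  # -> immersive-vr
print(pick_presentation("smart-tv"))    # -> 3d-viewer (fallback)
```

The design point is that the developer writes the experience once; the decision of how it is presented, handheld AR, immersive headset mode, or a flat 3D fallback, is deferred to load time on each device.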
Nice. Circling back, really, to the beginning of the conversation and the origin story of Niantic, and this notion that the team came out of mapping, understanding where users are in the real world and information about the real world the user is in at the moment: what is Niantic's approach to thinking about privacy? Because we're talking about where you are and what's around you, and often what's around you is very private, and where you are can be very private. So how do you approach this notion of protecting the privacy of users and their private spaces?
We take privacy very seriously. It is always a choice to participate, and it's always consensual. Niantic doesn't know where a specific user is at any given time. And I think sometimes it's a mistake to talk about privacy without actually understanding what we're talking about as it relates to the product, and how the product actually works. So if you'd like, we could talk through how VPS works, and then that's probably a better way to talk about and answer the question. Yeah, let's do that. Let's explore VPS. VPS, for those who don't know, stands for Visual Positioning System, and that's essentially the service that allows you to access Niantic's 3D map of the world. There are really two cohorts of folks who would be interacting with this map at any given time. The first group of people are those who are submitting AR scans to the Niantic map, and the second group are those who are consuming experiences using the map, which is using that localization service called VPS. So to really understand the user flow on the first side: as an end user or as a developer, you can submit scans through an app that Niantic has developed called Wayfarer. You can download Wayfarer from the App Store or the Play Store. And then as a user, you might want to submit a scan because you're really passionate about the community, or because you want to see really specific locations that can be lit up in a Niantic game once they become an activated Wayspot. Whereas as a developer, you may want to submit these scans because you want Wayspots that will be very specific to your own application for your end users. And for clarity, when I say the word Wayspot, this is a point of interest. So in Pokémon GO, it's essentially what a PokéStop is, where it tends to be a public place of interest, like a statue, park, library, or permanent art fixture.
So the way it works is that you would essentially go to the place you want to activate as a Wayspot, you would open up the Wayfarer app, you would take a 3D scan with your phone of the place you want to add to Niantic's map, and then you submit it. A Wayspot can only be activated through about 10 scans, and they have to span about five hours from the oldest to the newest scan. And I should mention that everyone who uses the app, when you first log in (even I've had to do this), will go through a trust-and-safety training and a test that you have to take, and it basically ensures the types of Wayspots that you're submitting are actually in line with the eligibility requirements that Niantic lays out. And we have an entire mapping operations team that curates and looks at all of the Wayspots that we have and manages the trust and safety of this community. And the last thing to mention on that side is that the AR scans are not human-readable. These are machine-readable, scrubbed of any license plates or people's faces. And then once these scans are approved, they either become meshes that developers can download, take into Unity, and overlay AR content onto, or they become activated Wayspots in games that players can access. So that's one way people might interact with the map, and in terms of where the data is actually submitted, that is something that end users, developers, and players can actively go and do on their own account through the Wayfarer app. On the other side is the actual service. So this is what happens when you want to actually go and experience something at a specific Wayspot. Let's say you are in a Niantic game, or you are in an application developed by a Lightship or an 8th Wall developer, and I'm the user: I would somehow be prompted that there's an AR experience at a specific location. Because I'm in the Ferry Building right now, let's pretend that it's at the Ferry Building.
So I walk over, I go to the Wayspot, I point my camera at the Ferry Building, I allow the app access to my camera, and then, using the phone's GPS location and the image frame that the camera sees in that specific moment, the VPS service is able to understand: oh, Maryam's phone is looking at the Ferry Building. Then it can serve me the AR content that should be overlaid on the Ferry Building for me to experience. And then that's it until the next Wayspot. When I choose to activate that AR experience, I'll bring out my phone again, it'll understand where my phone is, serve me that content, and then I'm on my way.
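The two-stage flow just described, the phone's GPS narrowing the search to nearby Wayspots and the camera frame then picking the match, can be sketched roughly like this. Everything here is a hypothetical illustration, not Niantic's actual VPS: the data shapes, the toy matching function, and the 50-meter radius are all invented for the example; only the haversine distance formula is standard.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def localize(gps, camera_frame, wayspots, radius_m=50.0, match_fn=None):
    """Stage 1: filter Wayspots near the GPS fix.
    Stage 2: match the camera frame against each candidate.
    Returns the best-matching nearby Wayspot, or None."""
    nearby = [w for w in wayspots
              if haversine_m(gps[0], gps[1], w["lat"], w["lon"]) <= radius_m]
    if not nearby or match_fn is None:
        return None
    return max(nearby, key=lambda w: match_fn(camera_frame, w))

# Toy example: two Wayspots, one at the Ferry Building, one far away.
wayspots = [
    {"name": "Ferry Building", "lat": 37.7955, "lon": -122.3937},
    {"name": "Coit Tower", "lat": 37.8024, "lon": -122.4058},
]
# Stand-in for real image-to-map matching (a string test, for the demo).
fake_match = lambda frame, w: 1.0 if w["name"] in frame else 0.0

hit = localize((37.7956, -122.3936), "frame:Ferry Building",
               wayspots, match_fn=fake_match)
print(hit["name"])  # -> Ferry Building
```

The point of the two stages is privacy-relevant as well as practical: the coarse GPS fix only has to shortlist candidate Wayspots, and the fine-grained "where exactly am I looking" answer comes from matching a single camera frame against the map on the spot.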
Very helpful, thank you. Something else I'd like to circle back to is the early career experience that you had in the legal profession. And you're still working with this legal clinic, helping startups. But I'd love your perspective, for those that haven't walked that same path, on how that sort of training and experience ends up applying to the role of being a product manager or a general manager in a product company like Niantic.
That's a great question, and it's actually something that I didn't come to myself. I would actually say it was a mentor who taught me that it is a great fit to go from law, or being a lawyer, to PM or GM. This individual, his name is Bill Mooney, and he was one of the alumni that I mentioned when I had come out to San Francisco, making those cold calls and sending those cold LinkedIn messages to connect with fellow alumni down here. He was one of them. And he himself had started in law school, he was a lawyer, and then his career changed and he became a game producer. He was a VP at major game companies like Zynga, he was Chief Product Officer at Roblox, he then went on to Skillz and on to King, and he now has his own gaming company. But he always said that when he looked to hire product managers, he'd always look to see if they had a legal background, whether law student or lawyer. And the reason why is because it's a very similar process in terms of what you do in your actual job. You start with a client or customer problem, and you have to really understand what that problem is. Then you have to be very data driven in trying to solve for that problem. It's very logical, it's very objective, it's not emotional. You have to be very proactive in thinking about what assumptions you are making about the problem, and what the risks are that you need to identify, that you might be up against. And then lastly, it comes back to this problem-solving mindset, this ability for critical thinking: being able to really think about all the possible outcomes, and then which path is the best one to execute on. And so it's very similar, whether you are a lawyer doing this for a client, in contract or tort or otherwise, or a product manager doing this for a tech product. It's the very same skill set that you are applying across both.
Sounds very logical; it makes a lot of sense. Let's wrap up with a few lightning round questions. First one here: what commonly held belief about the world of augmented reality do you disagree with?
That AR needs to be visual. I think I kind of talked about it, and I'm definitely biased because I'm at Niantic, but like I said, I think AR does not have to be visual. I think that augmenting your reality can absolutely be about finding ways to bridge the physical and the digital together. And that could even be something like spatial audio: if I could just hear that on the chair beside me there was a little bird chirping, and spatially it was accurate, I would really feel like there was this little bird chirping just on the arm of the chair beside me, without even seeing anything. I think the same is true of location, the same is true of haptic feedback, and I'm sure there are many more ways that could be true.
Nice. Besides the one you're building, what tool or service do you wish existed in the AR/VR market?
I wish I had a machine where I could just 3D print my ideal functional pair of AR glasses that cost maybe 100 bucks and does everything I need it to do. Is that a fair answer? Because if not, I can also give you a more serious one.
I love the magic wand answer. It's fantastic. What else you got?
I want a no-code AR creation tool, something to make it easy for everyone, even your daughters who you mentioned, to create AR content super fast. And I think, you know, there's a lot of talk about gen AI right now. Maybe with the proliferation of gen AI we'll start to see more of those products come to market, but I want it now.
That sounds great. What book have you read recently that you found to be deeply insightful or profound?
So the last book that I read was Man's Search for Meaning by Viktor Frankl. Viktor Frankl was a psychiatrist and neurologist who was also a prisoner in Nazi concentration camps during World War Two, and he chronicles his experience in the first half of the book. Then in the second half, he actually talks about the psychotherapeutic method he developed, called logotherapy. And he talks about what he believes are the three possible sources of how we find meaning in life: purposeful work, love, and courage in the face of difficulty.
That is very profound, very nice. If you could sit down and have coffee with your 25-year-old self, what advice would you share with 25-year-old Maryam?
Maryam, don't be afraid to fail and to make bold bets. That is absolutely what I would say, because I think nothing in life comes without risk. And if we aren't taking those risks, we're not getting out of our comfort zone, which means we're not growing to our max potential. And that's true not just personally, but also if you're thinking about a product or a project that you have: in order for it to reach maximum scale, you need to be able to feel comfortable failing and making bold bets.
So good. Any closing thoughts you'd like to share?
Well, thank you for the conversation today. It is always a pleasure to chat, and I think it will be exciting when we do this again in the future, again and again, as we both continue to push the industry forward. And my door is always open to others who want to connect on pushing the industry forward together. So for anyone listening, feel free to reach out via LinkedIn or on Twitter; my handle is @SabourMaryam.
Amazing. Thank you so much for the conversation. Thank you. Before we go, I'll tell you about the next episode. In it, I speak with Karl Guttag. Karl is an author and speaker who writes in depth about the display and optics technologies in AR glasses on his blog at kguttag.com. This is my third sit-down with Karl for the podcast, and we dig into a number of topics, including rumors about Apple's efforts in VR and AR, the potential for video passthrough VR, microLED display technology, leading optics technologies, and what Karl thinks has the best chance of winning in the medium and long term. Once again, the conversation is split into multiple parts, and I think you'll enjoy each of them. Please consider contributing to the podcast at patreon.com slash the AR show. Thanks for listening.