Editing Reality with Anjney Midha (Ubiquity6) | Disrupt SF (Day 1)
2:17AM Sep 6, 2018
We've got a brilliant session now with Brian Heater, who's the hardware editor at TechCrunch. And he's going to be interviewing Anjney Midha, who's with Ubiquity6. Now, Ubiquity6 is an early-stage startup, but its technology is right on time, obviously, because AR and VR are really picking up traction now. We're still waiting for Apple's headset to come out — I wonder when it will, if they do one even. But we're going to hear now from him in discussion about how Ubiquity6 can have an impact. Big round of applause, ladies and gentlemen, for these guys. Come on.
This is exciting. It's, like, kind of a big debut for you.

Yeah, this is fun. This is cool. I can't see too many people from up here.

So I want to actually start by giving a little bit of background on the company, because you're a very new company — you were founded in 2017. That's right. So the focus is creating massively shared, real-world AR experiences. You haven't really shown all that much so far. There is, however, right around the corner from here at SFMOMA, a Magritte exhibit, and a couple of weeks back, you guys showed something off there. We did.
I think we ran the single largest shared AR experience ever attempted — you know, 150 people playing together in real time. So the kind of stuff you'd expect with an MMO online, but all happening a few blocks from here.

So let's actually start with the video, and you can walk us through what we're looking at here. I think we've got a little bit of that exhibit and some of what you guys have been working on. Yeah, let's look at it.
So I think what's going on there is — as Brian was saying, that's a few blocks from here. People are on their own phones, looking around, and you're looking through the phone at other people interacting with the space. On the second floor that you're seeing there, we had a sort of large, Minecraft-like game, and all those blocks are people actually placing persistent AR blocks together — or their own selfies on the ceiling — so you can go back and see those later. And then the idea is, over the last year, we've been working on other things like this, where you could either play with large numbers of people in a public space, like the MoMA, or with your friends and family in private spaces — your living room, your dorm room, the office, a public park on a weekend, or — that's actually, I think, on Market Street. And the idea is to overlay one massively shared MMO universe on top of the real world that you can access with your phone.

There are a couple of examples in that video that I want to point out that I think demonstrate this pretty well. One of them is right around here — it's kind of a FarmVille experience. Somebody walks into a park and plants a seed, and everybody else who's on that layer can kind of go and water and care for the plants.

Yeah, the idea is, once you have a city mapped out — which is what we've been doing for the last year — you can fill it with all kinds of shared AR experiences. One of those is large-scale cooperative games like FarmVille. But unlike FarmVille, where you used to have to go online and play with somebody who was, you know, around the world in some kind of virtual world, now I can put down a bunch of seeds in a public park that somebody I might not even know can collaborate with me on, and compete with me on — these kinds of shared interactions that live in the real world, in the places you hang out.

It's tough, though, to really get that across when it's a video like that. Especially — and we discussed this before; I think a lot of this has to do with the fact that it's a relatively very new company — the graphics are kind of crude. I mean, they're, you know, big polygons. And when you're just looking inside the museum space and there's a bunch of blocks, it doesn't really come across the way it would in an actual AR experience.

Yeah, graphics are one of those things that improve very fast. And so when we — you know, we're a small team, there are 30 of us — the thing we really wanted to focus on was showing people that this is possible on your device today. You don't need to go spend, like, $2,000 on a piece of equipment that sits on your table; you can pull up the phone you already have. And so you can either spend time optimizing graphics performance, or you can spend time on the core stuff, which is computer vision — and it's really hard to make stuff show up in a shared, real-time manner. The reason it's not, sort of, 4K or photorealistic is that in our playtests, we realized people actually didn't care about the graphics as much as they cared about, you know: when I'm placing a block in space in front of us, this has to show up in real time. And so you can either spend time working on that massive networking infrastructure, or on graphics — and the graphics will get better very fast anyway, as, you know, GPUs improve constantly. But the thing that's really powerful, that I find humans tend to, you know, index a lot more on, is: is this actually mirroring my real-world interactions? So, speed and accuracy of spatial coordination, as opposed to graphics.

I mean, you've got a tough line to walk here. We've seen — at TechCrunch, we've seen so many super-hyped companies never actually release something into the real world. So you've got to create the best user experience possible, but you also want to get something out the door. So what kind of timeline are we looking at right now for people out there to actually be able to experience this on a wide scale?

Well, that's the reason we put this in people's hands a few weeks ago. Because, like you're saying, it can be a slippery slope to say: look, this new medium, AR, is new — let's wait till it's super perfect and polished before introducing it to people.
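The behavior he's describing — a block placed by one player has to show up for everyone in real time, and still be there for players who join later — can be sketched as a toy in code. This is an illustration only, not Ubiquity6's actual networking stack; every name here is hypothetical.

```python
# Toy sketch of "shared and persistent" AR state: placements are both
# persisted (for latecomers) and fanned out live (for connected devices).
from dataclasses import dataclass, field

@dataclass
class SharedLayer:
    """One persistent AR 'layer' tied to a physical place (hypothetical)."""
    blocks: list = field(default_factory=list)        # persisted placements
    subscribers: list = field(default_factory=list)   # live render callbacks

    def join(self, on_block_placed) -> list:
        """A device joins: it receives the full history, then live updates."""
        self.subscribers.append(on_block_placed)
        return list(self.blocks)  # replay persisted state to the newcomer

    def place_block(self, player: str, position: tuple) -> None:
        """Persist the block and fan it out to every connected device."""
        block = {"player": player, "position": position}
        self.blocks.append(block)
        for notify in self.subscribers:
            notify(block)

# Usage: two phones in the museum, a third arriving later.
layer = SharedLayer()
seen_by_b: list = []
layer.join(seen_by_b.append)              # device B is live
layer.place_block("A", (1.0, 2.0, 0.0))   # A places a block; B sees it at once
history = layer.join(lambda block: None)  # device C joins later, gets history
```

A real system would replace the in-process callbacks with networked sessions, but the join-replay-then-fan-out shape is the core of making state both shared and persistent.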
Or you can just trust that humans will, you know, learn, figure it out, and give you really great feedback, and then iterate on that. So there's a trick to finding the sweet spot. We did polish it a little bit before we put it into the hands of people at the MoMA — and these were folks who'd never done AR before, 18- to 65-year-olds who'd never even played a massively multiplayer game, or even a game — and they just got it. And I think sometimes you've just got to trust that users will figure it out and then give you feedback you can iterate on.

But obviously, the next step is having that on a massive scale. Are you looking at any — can you give me anything? Is there any kind of specific timeline?

Well, before the end of this year. We're rolling people out on a waitlist that you can sign up for on our website. The thing to get right is making sure this works everywhere in the world, on any iOS or Android device you have. Today, it works on some of those devices, which we're learning on, and the idea is to scale that out to everybody by the end of the year.

So really, really cross-platform. Yeah — mobile devices, potentially; maybe face-worn hardware? Is that in the cards for you guys?

It is. The idea is, this is a software layer — you know, it's an MMO that lives in software on top of the real world. As long as you have a camera and you have an IMU, we can position you and let you open up that world from whatever device you're on. And a ton of devices, including head-worn displays today, have those two things, so you can have that shared AR experience. But you're right, cross-platform is the starting point, because there's no point in having a shared experience that I can't actually share with you just because you're on a different phone or a different OS.

So, speaking of super-hyped companies: you were involved in Magic Leap in the early days.
You had said — you know, you did go and start this company. What is it that they weren't giving you that you needed in your own startup?

I think the world changed, right? They started working on that problem pretty close to when the first iPhone came out. So you're starting to think about this stuff in 2009 —

Like you said — as Mike said, with Rainbows End. There's a science fiction novel that really inspired me and my co-founder, Rainbows End. And I went to your office this week, and you have just a pile — you handed me a book. It's, like, required reading, basically.

And so if you start with that vision in 2009, it's certainly conceivable that you've got to build your own hardware. But in the intervening, sort of, eight or nine years, the world has changed so much that you need a really good reason to not just use this as the compute unit for the starting point. And I think that's what made us really excited about starting this at Ubiquity6 — all the missing pieces have fallen into place, where you can just give people this shared AR capability with the phone they already have. Whereas 10 years ago, that wasn't the case.
So the upside, obviously, is that everybody has a supercomputer in their pocket at this point. The downside is — is that really a good experience for AR?

I think it's a starting point, yeah. We started the company, eventually, for this notion of ambient AR that lives in a very easy way, you know, through glasses — through a visual medium that is constantly figuring out where you are and layering these AR experiences for you. But there's no reason you can't hit a lot of valuable milestones along the way just with these, because, you know, there are three billion people walking around who do have problems, or experiences they want to be able to access, without having to drop a ton of money on a new hardware device.

I mean — again, obviously you want to get it out the door, but you want to give people the best experience you can. Because if somebody's first AR experience is terrible, there's a good chance they're not going to do it again.
The first really mainstream AR-adjacent experience was Pokémon Go, and to this day, when people think of AR, they think of Pokémon Go. Now, we've talked about this a bit — you've said that that is not really a proper AR experience, and I'm wondering if that is potentially a detriment to the industry.

So, it depends on what timeline you're talking about. I think it was a fantastic starting point, right? Because it proved that 500 million people around the world wanted to be able to interact with digital things — like characters they cared about — in real-world locations. The conclusion people should not draw is that this is where we stop. Because, as you say, it wasn't a shared AR experience: if I was interacting with a digital character here, you were living in your own universe, so we were actually playing completely on our own. But that's not actually the way humans interact, right? When I'm talking to you right now, everything we're doing together is shared — we're responding to the same environment, the same reality. And as long as people understand that, stuff is going to move very fast — the laws of physics of software are very different — and the industry is fine. But I think it has been a while since people have seen an update to that experience, partly because the tools used to build it are quite old — the game engines they were using, and are still using, they started building, like, nine years ago. We're building our own authoring tools that are focused just on AR: they work super fast, they allow you to prototype much, much faster, and shared experiences and massive networking come out of the box. It's not a bolt-on next step if that's where you start from, from first principles.
So maybe a better way of phrasing it, then, is: how do you not fall into the hype trap that VR fell into recently? I mean, there was a huge upswing, and, as we were saying before, now there are all of these headsets sort of gathering dust in people's houses and offices.

Right. Well, step one is: don't force them to buy hardware. Just let them use stuff with the device already in their pocket. VR is very immersive because it's a hardware form factor we haven't had so far. But I think AR is finally here — that's what I'm saying. If we were having this conversation nine years ago, I wouldn't have had a good answer for you, other than to say we've just got to take the leap — no pun intended — and force people, the early developers, to drop a ton of money on this. And our belief was: no, you've just got to start now and acknowledge that people are walking around with supercomputers. Sure, it might not be, you know, the kind of ambient AR experience that science fiction talks about. But give people who are already walking around with these very powerful computing devices the capabilities we can deliver via this computer vision stuff we're working on, and then go from there. Listen to what they want before you ask them to invest a ton of money in hardware.

You saw the news — Snapchat Spectacles? I did. It's very difficult to say. Yeah.
What are your thoughts?
I think it's a very, very good indication of more to come in the next 12 to 24 months. I mean, they're not AR hardware proper, right? It's head-worn cameras.

Yeah — that's the secret strategy, or the Trojan horse, right? How do you get enough sensors into people's hands, or onto their faces, at a cheap price point? Exactly. That sets them up for immersive, shared AR experiences — or any kind of VR experiences — a year or two from now. And the starting point is to put a camera on people's faces that figures out where in the world they are, to usefully deliver an experience they might want — whether that's a game, a location-based guide, or indoor navigation. Really, all you need for that is dual rear-facing cameras and an IMU — which is, you know, a gyroscope and an accelerometer — feeding a device that's doing enough compute to give you useful feedback. And that might be via audio, or it might pop up a notification to say: hey, pull out your phone, there's an AR experience for you here.

Is there a disconnect there? I mean, you're recording it all here, obviously, but Google Glass, Magic Leap — all of these things have heads-up displays on them. So that's a one-to-one experience, right? It's collecting what you're seeing, and it's also projecting that information back at you.

Yeah, this is one of those traps I think we fall into as, like, techies and engineers, where you go: oh, what is the coolest thing we can deliver using the technology we were inspired by — by science fiction, you know. And visual AR is one of those starting points people just kind of assume is what AR is. But it turns out humans are very, very good at audio — audio is a very immersive way to augment your information, and the way you navigate the physical world. When you're driving and getting turn-by-turn navigation, a lot of that information is just parsed via audio. When we're on a phone call — you're in New York and I'm talking to you — your mental model is building visual images of what I'm talking about, and you're pretty immersed in what I'm saying, if I'm a good communicator. So I think AR can be just audio and still be pretty immersive, as long as you have feedback that's responding to the real-world space, using cameras.

So, obviously, Apple is getting very aggressive in the category — they made an acquisition pretty recently. Mike, before he came on stage, sort of teased the panel by mentioning these potential Apple glasses. In your mind, what do they look like?
I don't think they'll look that different from the stuff you're seeing from Snapchat today. Apple's MO has always been to try to make people excited about Apple products as fashion or luxury products — they're proud of the form factor. They did that with the AirPods: it took a while for them to get there, and they weren't the first people to do Bluetooth, you know, headphones. But when they did it, you looked at those and said, those don't look that different from the kinds of headphones I'm already wearing. And they're almost never the first to anything — except for removing ports. Yes, exactly.

They're always happy to lead the way on that one.
Oh yes, adapters are a good-margin product. But I think with glasses, the luxury-like form factor that all of us expect from a pair of Oakleys or Ray-Bans is the closest you're going to get as a starting point. And once you relax the idea that you need a display, it's quite clear that there are already products on the market today that can put a pair of cameras and IMUs on your face, in glasses that don't make you look super dorky.

So, in a sense, then, AirPods are kind of a stepping stone towards augmented-reality audio?

With the limited information I have sitting outside Apple — you know, I do think the AirPods have been a sleeper hit, from where I'm sitting. If you just look around — and I don't think San Francisco is always the best microcosm for the rest of the world, but whether you go to New York or many parts of Asia — Singapore, China, you know, Shanghai, Bombay — AirPods have been a pretty large-scale sleeper hit for people who have that affordability. And as you get larger and larger scale, prices drop. So I do think people will start to interact with AR first through audio, before having to pull out a display.

Are Apple or Google, in any sense, competitors — not with one another, but with you?

Oh, I see. Um, no. Without what they're doing, we wouldn't exist, right? The idea of being able to figure out where a device is in the world — that's called local tracking — has been around for a while. A lot of companies used to try to do that, starting in 2011 or '12, once these IMUs got good enough. And then what Apple and Google really did is say: hey, this is a first-class citizen.
We should give developers high-quality APIs they can call; they synced these IMU sensor outputs with the actual bare metal in a way that made the precision a lot higher. So when we're able to do a shared space, that's because Apple and Google have taken that stance and increased the precision these inputs have on your phone. They do what's called local tracking, and we say: well, if Brian's over there and I'm over here, can we resolve that into a single, global, shared coordinate space, so Brian and I can actually interact with each other? And we're only able to do that with the kind of quality we have today because they've pushed the industry forward on the on-device side.

You're creating something that's kind of akin to a browser — maybe that's the best way to describe it, right? You're circumventing, or at least working on circumventing, the idea that all of your content partners necessarily have to create individual apps on the phone. Does that make it difficult? Obviously, every single company wants to develop its own app, right? They want, when you turn on your phone, to have their name on there, and to have that controlled experience.

Right. I think it's akin to the internet: you have a website, and people are able to launch your website wherever they are, whenever they want. Physical space is very similar. I want an experience that's tied to this conference room, or this venue, when I'm here with you, without necessarily having to go through the process of setting up a whole new application just for that. And I do think AR starts to look more like layers — which are like websites you launch in spaces — than standalone apps. And I think Apple is doing a lot, and Google is doing a lot, to make that easier. I think people aren't used to that consumer behavior yet.
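The local-versus-shared tracking distinction he draws can be made concrete with a small sketch: each device's tracker reports poses in its own arbitrary local frame, and relocalizing against a shared map yields a local-to-world transform that lets two phones agree on where a placed object is. This is a simplified illustration with hypothetical names and numbers, not Ubiquity6's or Apple's/Google's actual API.

```python
# Illustrative sketch: resolving two devices' local tracking frames into
# one shared world frame via per-device local-to-world rigid transforms.
import numpy as np

def make_pose(rotation_deg: float, translation) -> np.ndarray:
    """4x4 rigid transform: rotation about the vertical (y) axis + translation."""
    t = np.radians(rotation_deg)
    pose = np.eye(4)
    pose[0, 0], pose[0, 2] = np.cos(t), np.sin(t)
    pose[2, 0], pose[2, 2] = -np.sin(t), np.cos(t)
    pose[:3, 3] = translation
    return pose

def to_world(local_to_world: np.ndarray, point_local) -> np.ndarray:
    """Map a point from a device's local frame into the shared world frame."""
    p = np.append(np.asarray(point_local, dtype=float), 1.0)  # homogeneous
    return (local_to_world @ p)[:3]

# Hypothetical relocalization results: device A started at the shared map's
# origin; device B started 5 m away, rotated 90 degrees.
a_to_world = make_pose(0.0, [0.0, 0.0, 0.0])
b_to_world = make_pose(90.0, [5.0, 0.0, 0.0])

# A places a block 1 m in front of itself; resolving through the shared
# frame lets B render it at the same physical spot.
block_world = to_world(a_to_world, [0.0, 0.0, -1.0])
```

Inverting `b_to_world` would then map that shared-frame point into device B's local frame for rendering on B's screen.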
But the closest you've seen to that working in the world is just websites. So do you just show them how cool the technology is, and then they jump on board?

Well, many times they're showing us, right? This notion that AR is going to make all our lives more useful, because you can finally bring physical spaces online, is not new. And I think Apple and Google are great in that they think quite far ahead — Apple in particular, because the kinds of decisions they make to bring something to scale, like ARKit, usually take two or three years of planning. So these are not new ideas; there's a pretty rich ecosystem of people, you know, giving feedback to each other. And I think as long as devs are able to do things they've already imagined without having to learn too much more tooling, then everyone ends up net-net better off.

They're going to kill me if we run over, so I'm going to ask you a very simple yes-or-no question:
Is Ubiquity6 working on hardware?
Not today.

Okay. It could be in the cards?

Absolutely, yeah.

Right — thank you so much. All right, thanks, Brian.