Everything Everywhere XR #14 New Input Paradigm 1/2
1:06AM Sep 7, 2023
Speakers:
Alessio Grancini
Keywords:
xr
hands
feel
developers
input
vr
tracking
experience
work
designer
people
tools
game
mixed reality
years
design
job
prototype
bootcamp
focusing
Our interactions are shaping themselves across the panorama of the XR industry, meaning that we now have a whole range of capabilities for spatial interactions and we need to take advantage of them. There are a lot of good ideas out there, but no standards have been decided yet. So this episode and the next one will be focused specifically on XR input. But before we go in that direction, could you guys introduce yourselves? I feel like everyone could really use some of your story and take advantage of your insights: how you got started, how you got here, and why you're so passionate about this field in particular.
Perfect. Dave, would you like to start first?
Sure. Hey, everyone, I'm Dave. I have a long background in immersive technology. I started my career at Ubisoft making games and got to meet some of the best game developers in the world, who are arguably the best developers in the world, because making interactive software out of ones and zeros that is also entertaining is very challenging. I was able to build a career working with some of those great devs. I got to build ecosystems around developers early on at Leap Motion, doing hand tracking for VR; then at the OG Meta, the original Meta, focusing on augmented reality and enterprise use cases; and then at several other companies, including the second Meta and some blockchain companies. But what I'm most excited about is joining Ferhan as an advisor at XR Bootcamp, really focusing on education in virtual reality, which makes me super passionate because I believe it's the future. It's a tool that's going to be a game changer, maybe even the end game, in terms of how we use technology to improve our lives.
Great, let me continue from that. Dave shared his background, and mine is actually similar, though I'd place myself much more on the enablement and ecosystem-building side. I started in the game industry, incubating game studios, and like Dave I also hopped into the AAA side of things, at Crytek. That gave me the opportunity to work more on the development side, especially with game engines. For the last six years we've been combining incubation and enablement, helping developers and studios and building communities and ecosystems: first with the organization called VR First, which has been running for the last six years, building labs around the world and working with top XR manufacturers to distribute headsets. I'm talking about the days when we were still using base stations, so it was much more difficult to distribute hardware and create adoption compared to today. And now here we are, for the last three and a half years, working mostly on educating developers and studios on VR and AR, and helping XR platforms decrease or remove developer friction as much as we can with different toolsets, materials, and different incubation and funding programs. So we're here to maybe give some inspiration to other developers, and if we can help them solve one of the interesting problems they have, we're happy to support them through our programs. Thank you for inviting us, Alessio.
Thanks to you guys for showing up. For me this is also an opportunity to meet and get closer to the people who are really shaping the industry, so it's an honor every time to talk to people working on this stuff. So many people are doing so many things, and I do this partly just to keep in touch with what's happening outside of the small world of what I work on. XR education is definitely needed for expanding the field. When I learned everything about Unity and related tools, it was not such a standard route. I remember being largely on my own, bringing learnings from game development and adapting them to more enterprise contexts. Unity and Unreal were not really shipping this out of the box right away; there were enterprise bundles and products that did the things you needed, and you were kind of hacking your way around to do all of it. Now there are a lot of good signals that this is becoming more structured, standardized, and open, so everyone can stop reinventing the same things, and it will keep improving in the future. So I feel like we have a great opportunity to talk more about XR Bootcamp. As I mentioned, this podcast is part of a Discord channel with more than 1,000 members, so everyone will eventually be able to go back to this video, take a look at it, and understand more about what XR Bootcamp is.
So if I want to, let's say, start developing, and say I want to get into XR as a student, or as a professional who never had a chance to be exposed to the field, what are the steps I should take to do that?
Yeah, this is a question that could take me more than an hour, because we do a lot of one-on-one mentorships as well as open days where people can ask questions, and this is of course the most common question we get: how should I start, or am I too far from being employable as an XR developer or designer? I have one piece of good news and one piece of bad news. The good news is that it's not so far away; the bad news is that you need to really work hard toward it. The time you need to commit and the effort required during this intensive period are quite high, so there will be some blood, sweat, and tears, but the end result is something you wouldn't believe yourself capable of. That's what happens with most of our students: even without much coding experience beforehand, they manage to learn these tools with us. Because at the end of the day, the tools are not so complex, and there are tons of free YouTube tutorials and resources out there that you can at least attempt to learn from. The critical thing is how you respond to a blocking issue: when I get build errors in Unity, how do I respond? Is it easy to find my blockers on the internet? These are the things that, from what we've seen, demotivate people or make them give up at some point while learning by themselves. But it's definitely doable. In terms of XR upskilling, I would say today is the perfect period in the industry, because most of the developer environments, pipelines, and workflows are quite well established. The only caveat I would add is that it's a rapidly evolving journey.
So even though you feel like you've learned a lot today, that doesn't mean that in two years' time whatever you've learned will still make sense for the industry. You always have to sync yourself with the latest tech and features. Maybe we can dive deep into that with Dave: what should we expect from Meta Connect, which will happen in about 20 days, right? Even 20 days later we'll see things that change. So keeping yourself up to date is, I think, more difficult than learning the basics, and it's something I would strongly recommend to anyone who wants to break into the industry.
Yeah, I thought that was really well said. The only thing I would add, for anyone at any level wanting to get into XR, whether you're a professional developer at a Fortune 500 company who's just super excited about all these new headsets coming to market (or, using the new term, instead of headsets or head-mounted displays people are just calling them digital eyewear, to make them feel more accessible): my advice comes from my background in game development. If you want to become really great at something, and I believe Picasso said this, you have to steal with both hands. And I'll tell you what that means in XR development or game development. When you're trying to make art, your own art, you have to experience everything. If you're into enterprise applications and want to find use cases in XR, or you're a game developer who wants to add more immersion than you've ever achieved before: play everything. Make that your daily habit, whatever level you're at. I'm constantly playing games and checking out applications to understand how other people are doing it, so I can steal with both hands by borrowing some of the best mechanics, the best user experience, the best camera controls, or the character motifs of a specific application.
That's actually a very good suggestion. Personally, in the role of prototyper, I've found myself being asked to do certain things on a very tight timeline. And sometimes people really don't understand how big certain applications are: some applications are so complex that you may not see it clearly, but in the background there is so much going on. They ask, why don't you build this part from scratch? And we're like: well, if we approach the problem that way, we can say goodbye to whatever else you want to do, because we'll be here for way more time than we've allocated. I feel it's really a duty of every developer, or anyone who wants to learn these things, not to do everything from scratch, but to take whatever has been done very successfully, give it your own touch, and understand how it works. Bringing the discussion to the input side of things: I used to check Twitter for all these amazing demos from so many developers who get a Meta Quest, or a Leap Motion, and build beautiful hand tracking demos. This is the way you want to type on a keyboard; this is the way you want to write things; this is the way you want to place your keyboard in front of you. All of these little concepts that belong to the workspace-slash-gaming world.
And at the same time I see them all and I wonder: these are all very valuable, but who's going to win? It's just whatever people find the most comfortable, whatever feels most natural. I just saw someone at Meta post this kind of virtual keyboard on the table; I've seen a lot of demos like that, and I thought: if they really nailed it, this might become the way to use this thing. If people feel it's still hard to use over time, it could be exciting at the very beginning but become very difficult long term, and we might see new iterations in the future. So it's a very exciting time, because there's an opportunity for each developer to do their own version of things. So, getting onto the input topic: I would like to know what you guys have actually liked the most so far, some example on the market that you experienced and really liked, without even naming the product if you don't want to; that's not mandatory, I'm not promoting anything here. Just your natural feeling of what you think is the most useful, natural input method, specifying maybe the purpose of that input method and why you thought it was right.
Dave is actually probably one of the first people to have championed hand tracking, at Leap Motion. So I think you have quite a bit of history with inputs right there.
Yeah, Leap Motion was really an outlier. There were two monochromatic sensors on this stick-of-gum piece of hardware that would track your hands based on the 2D images those black-and-white cameras captured, and we had an algorithm to understand where your hands were relative to 3D space. Space-age stuff. The company was founded and led by David Holz, who later created Midjourney; he's also a really good friend of mine, and you can see that this type of genius was really out there. Now imagine, nearly ten years ago, trying to convince developers to incorporate hand tracking. Ferhan, I think those are some of the early conversations we had when you were at Crytek, one of the biggest game engine companies in the world; that's how Ferhan and I met. It was my job not only to help developers onboard with hand tracking, but to take fifteen steps back just to convince them why hand tracking was even useful, what they could do with it, or why people would want to use it when there were so many challenges in tracking hands correctly. Mind you, some of those ideas, affordances, and design decisions just weren't available in the early days. For example, skin tones: we found early on that if we weren't testing with a real diversity of hands, hand tracking just didn't work, because we weren't training the model that way. We learned that. We also learned about occlusion, where things get in front of the camera and distort hand tracking; we had to discover that. And then there were a lot of other discussions about how this gets integrated into a headset.
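[Editor's note: the two-camera setup Dave describes recovers depth from the disparity between the left and right images. A minimal sketch of that classic stereo relation, with purely illustrative numbers (these are not Leap Motion's actual calibration parameters):]

```python
# Hedged sketch: how a two-camera (stereo) tracker can estimate how far
# away a hand feature is. All numeric values are illustrative assumptions.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole-stereo relation: Z = f * B / d.

    focal_px     -- camera focal length in pixels
    baseline_m   -- distance between the two cameras in metres
    disparity_px -- horizontal shift of the same feature between images
    """
    if disparity_px <= 0:
        raise ValueError("feature must appear shifted between the two images")
    return focal_px * baseline_m / disparity_px

# A fingertip seen at x=320 px in the left image and x=300 px in the right
# image has a disparity of 20 px. With a 700 px focal length and a 4 cm
# baseline, the estimated distance is:
z = depth_from_disparity(focal_px=700, baseline_m=0.04, disparity_px=320 - 300)
print(round(z, 2))  # 1.4 (metres)
```

The real pipeline Dave alludes to then fits a full articulated hand model to many such observations, but the disparity-to-depth step above is the geometric core of any two-camera approach.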
But going back to your question, Alessio, regarding inputs I'm really excited about: of course I have an affinity for hand tracking, but there's just so much happening with input today. There are even treadmills, and there are controllers that people are so used to; the demographic that's interested in XR grew up with a game controller in their hands. When Apple recently announced the Vision Pro and said it wouldn't have native controllers, that caused kind of an uproar, because people are so used to that type of input. But going back to the question, and I'm being a little long-winded here: what I'm most excited about is not only hands and the idea of better hand tracking with better camera arrays, which is coming to market with every new headset, but also, believe it or not, treadmills. We're now getting to the point where they're becoming somewhat affordable, and in terms of the use cases we've been seeing, treadmills actually make sense if you go back to the fundamentals of what people will use VR for: training and entertainment. When those kinds of virtual reality experiences come into play, you're probably going to want to move, and you're probably going to want to track your legs, because just like hands, legs and feet are important to XR experiences. They bring more presence into the experience. I'm also excited about voice. I also lead developer ecosystems for Samsung's Bixby, around voice AI, and the power of using how we naturally communicate with technology, especially with the advent of ChatGPT and being able to interact in natural language, is something I'm really excited about too. Honestly, I'm just really excited about everything going on; this space is so dynamic.
And there's just so much happening in this field.
Maybe I can add from where you left off. I think the hand tracking part is very interesting, especially nowadays, because mixed reality is becoming standard, right? Either see-through or pass-through, it doesn't matter: we have to somehow bring our environment to our eyes, whatever device we're using. During the Leap Motion times, no one was even imagining this kind of mixed reality or augmented reality on a wearable, or they didn't know how to do it back then, so hand tracking was just another input paradigm. But now, even from a human perception perspective: if I can see my hands, I don't think I want to use anything else. Back then, unfortunately, we couldn't see our real hands, only a computer vision interpretation of them. Now I see my hands almost every day as a camera feed, or see-through. So there's no better way than using your hands as the input controller, rather than having another device with you. Of course, there are haptics and other requirements that different apps might need, but for the typical input requirements of a metaverse or virtual environment, I think hands are really the first go-to input paradigm you would expect. But of course things are evolving quickly. I remember, and maybe you'll remember this as well: we have been working, and are still working, with the team behind Hand Physics Lab. That was the first hand-tracking-focused game on the Meta store.
Back then, when Hands 1.0 was available, it was very difficult for them too, because most of the game dynamics and mechanics relied heavily on the capabilities of the hand tracking hardware and the computer vision. Sometimes you don't intuitively feel the virtual objects in your hands, or the grabbed objects, and so on. But when Hands 2.0 came out, for example, it didn't immediately change anything in the game itself, yet the whole experience completely evolved. So sometimes apps, games, or use cases are ahead of what we can achieve with hardware and computer vision. And that's good: we always have this as a reality-check indicator. We can always ask: does it feel intuitive? Can I do the things I already do in real life, as closely as possible? These kinds of cross-checks were very helpful. And the point we've come to today is, I think, quite fascinating. In almost all headsets there are different kinds of gestures you can use with hand tracking, and the computer vision capabilities are much, much better. As Dave mentioned, tracking two hands together was also a huge problem for the last couple of years: you would have difficulty when you tried to bring your hands close together. But they are solving probably one of the most difficult computer vision problems here, thanks to AI and different CV capabilities. I think we're reaching a level where hand tracking has become almost a standard on any device, maybe the first tool or input paradigm before you start using any other inputs. I'm really happy we've come this far in such a short period of time. In two or three years, look at what we've achieved as an industry. It's really promising for the future, to be honest.
Yeah, it's great that you mentioned Hand Physics Lab, because that was the first experience where a game developer said: hey, hold up here, hand tracking, no one's really utilizing it. I was actually responsible for helping launch the new hand tracking program at the second Meta, and it was very important to figure out what those use cases were, because there was nothing to actually show a wide variety of ways you could use your hands to interact with things virtually. Hand Physics Lab was great because it also used a lot of psychology to make people actually feel heat with their hands, or feel haptic feedback that wasn't there. Those are all tricks you can do with lighting and with different design affordances and mechanics that you can design around hand input. And I really appreciate what you said, Ferhan, about bringing it back to how people actually use tools in their everyday lives, using their hands, because that's how most people naturally interact. I think the really important thing to communicate to this audience is that all of these inputs mirror everything we can do as humans, but input lets us use the technology for different use cases. For example, the mouse is never going to take over from the keyboard, and a controller that feels like a virtual sword will never be replicated just by holding your hand in midair. All of these inputs are going to build upon the experience so we can actually feel the metaverse, be in the metaverse: you're going to need your natural hands, you're going to need eye tracking, you're going to need your legs, and all these other inputs, from keyboards to controllers to whatever.
These are all things people naturally use in the real world, but now there's going to be a digital twin of all of these types of experiences and input controllers.
Absolutely. And I want to ask you guys: what's your preference? Since you've talked so much about hand tracking, and hand tracking is for sure the big thing that's coming; every time we see an improvement we're super excited, because we might just get rid of everything we've thought of until now as the way to interact with things, at least for some part of our daily life. Do you prefer, for example, seeing your real hands, meaning that in a passthrough experience you actually see your real hand, or do you enjoy seeing a mesh? What's your favorite? I have a lot of discussions with my friends about this, and my personal opinion is that I just want to see my hands. I don't want to see virtual hands on top of my hands. That's why it felt weird when I tried a lot of experiences that remap your hands; it gets a little strange because there's some scaling going on, and I just want to see my own hand. But the challenge is there. If you're in AR there might be some mismatch: I think my hand is there, but it's not actually there, it's maybe five centimeters shifted. Or in VR, in a lot of tests I did with Meta, I was playing games where I punch with my hand, and I'd see the hands visually punching but never actually making contact. Everyone faces these kinds of issues, and everyone has their own preference. I wonder what you guys think about that.
I actually have something interesting to share, but I'd love to hear from Dave first, if there's anything you want to add, because my answer will divert a bit from the question. So I just want to make sure you answer first if you have anything.
The thing I can add is that we spent a lot of time on this at Leap Motion: really studying the hands and the anatomy of the hands. If any of you just close your eyes and I ask you to imagine the palm of your hand, the back of your hand, or maybe specific scars you have on your hand, you can envision it just as clearly as any other body part, right? So yes, you do get a bit of uncanny valley where things just don't feel right when your hands are mapped onto a glove, because the size may not be right, or it just doesn't look the way you remember your hands looking. That takes you out of the experience, and that's the worst thing you can do in designing a great XR experience: taking someone out of the present moment and making them feel like they're not actually embodied in that virtual experience. So I do agree that having your real hands is the goal, just like having real eye tracking and your real persona, with cameras actually mirroring your actual avatar, because you want to represent yourself, and most people simulate themselves as an avatar with their own characteristics, because we know people like themselves. I think that's really important for designers. Another thing I wanted to mention, since we're talking about input and hands specifically: when you're designing experiences, if you want to present UI or information for the player to comprehend, to tell them what to do or how to act in an experience, most of the UI sits out in space, spatially, right?
Most designers, when they consider how to convey information, think in quadrants of where your eyes go in the space around you, from right in front of you, to the sides, to the bottom. And the last place designers put UI that's informative but lower in the hierarchy is actually around your body, because when people are experiencing virtual reality, or even AR, they very seldom look around their own body. But some developers have taken that to their advantage and designed experiences like virtual smartwatches that can have an endless screen or different modalities. We did a little trick like this at Leap Motion: an arm band with a UI that would be displayed on your forearm. So there are a lot of things designers and developers can exploit that people haven't done yet. But again, a lot of these frameworks for understanding how to design good XR experiences come back to just experiencing a whole bunch of different apps.
I actually have a similar perspective to Dave. From what I have observed so far, you can make this decision with a very quick check of your game or app. In my opinion, if your app's or game's core mechanic, its central interactions, happen directly with your hands, as if the hands are the protagonist of your game or app, then the hands should be virtual rather than real. The whole concept is that you're doing something with your hands; your hands are at the center of the camera and of everything. In most of the hand tracking applications and games I can think of right now, I think virtual hands make much more sense. For example, The Climb, which we had the chance to release as an Oculus launch title when I was at Crytek. It's very interesting, because you're using your hands, and keep in mind there was no hand tracking and no inside-out tracking back then; you were doing it with base stations and controllers, and you always had to look up a bit. Of course it was not a hand-tracking-focused game, but the interesting thing is that the hands were almost the epicenter of attention. When you looked at your hands, you understood how tired they were; there were bars showing that, and you needed to chalk your hands, like climbers do with that white dust, to add friction so you wouldn't slip. So the actual gameplay mechanics relied heavily on how your hands looked: you'd understand you were about to fail or fall. In my opinion that's the way to do it, and the same goes for Hand Physics Lab: your hands are at the core of the whole experience.
And you definitely need a virtual hand. In Hand Physics Lab, as you can see, it's physics as well. Most of what we've observed is that since you are literally trying to bring lifelike physics, not only hand-to-environment and hand-to-object interaction but also the physics of your whole hand's movement, any kind of glitch is not easy for the user to forget, because you're really pushing the limits of the physics there. So again, from my perspective, hands should be virtual, or mostly virtual, if the hands are really the center of attention. But one interesting use case we haven't seen much so far is what we call contextual passthrough. If you look at XR Bootcamp's YouTube channel, you can see our passthrough examples: an open lecture with one of our lecturers focusing mostly on passthrough. Billy and Daisy did a very interesting talk there; you can watch it on the XR Bootcamp YouTube. So there is real potential in using your hands, or any kind of input interaction, for this contextual passthrough. When I say contextual, I'm talking about actually changing the layer of the passthrough; think of the whole reality scope. We're on a mixed reality spectrum, so you can play with the reality level based on the environment. Just to give you an idea: if I have a cup on my desk, what if the virtual environment gives way to passthrough in that region, so I can start seeing my hand and the cup I'm reaching for, whenever the system or my game understands that I'm approaching a physical object?
So it immediately shows me the physical object, so I won't knock the cup over and I can actually grab it. It's a much more contextual passthrough that you can create. I think this is an area we haven't explored as game designers, and maybe with the help of the new mixed reality passthrough headsets and SDKs we'll have more opportunities. Imagine an environment that reacts to your interactions and intentions, understands what you're trying to do, and then plays with different regional passthrough cutouts, rather than switching between full passthrough and a fully virtual environment. I think this can be quite interesting.
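[Editor's note: the contextual passthrough idea above can be sketched as a simple proximity check each frame: when the tracked hand nears a known physical object, reveal a passthrough cutout around it. This is an illustrative sketch only; the names (`PassthroughRegion`, `update_contextual_passthrough`) are hypothetical, not a real headset SDK API.]

```python
# Hedged sketch of "contextual passthrough": reveal the camera feed around a
# physical object (e.g. a cup on the desk) only while the hand approaches it.

import math

class PassthroughRegion:
    """A spherical passthrough cutout anchored at a known physical object."""
    def __init__(self, center: tuple, radius: float):
        self.center = center      # world position of the physical object (metres)
        self.radius = radius      # radius of the revealed cutout
        self.visible = False      # whether the cutout is currently shown

def update_contextual_passthrough(hand_pos, regions, reveal_dist=0.25):
    """Run once per frame: show each cutout only while the hand is near it."""
    for region in regions:
        region.visible = math.dist(hand_pos, region.center) < reveal_dist
    return [r for r in regions if r.visible]

# A cup sits on the desk; the hand reaches toward it (positions in metres).
cup = PassthroughRegion(center=(0.0, 0.9, 0.2), radius=0.15)
visible = update_contextual_passthrough(hand_pos=(0.05, 0.9, 0.15), regions=[cup])
print(len(visible))  # 1 -> the cutout around the cup is revealed
```

In a real engine, the `visible` flag would drive a passthrough layer or stencil mask; swapping the distance check for an intent predictor (gaze plus hand velocity) is the natural next step the speakers hint at.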
I think that's super cool, considering that we've always discussed this as an industry: where is that real-world tracking for objects? Why can't you just put a sleeve with a QR code on a cup? That way, even in VR, I can always understand where my drink is, right? I predict that's going to be kind of a big thing. But actually, I'm so glad you mentioned The Climb, because I wanted to talk to you about it last week. I was like, man, I just remember The Climb in terms of the graphics and how optimized it was. If people don't know, it's basically a climbing game, and, like Ferran described, the main character is you, but you only see floating hands, right? But the visualization is dynamic; the actual hand motion feels very natural. I'm kind of switching gears, but I wanted to ask this: did you guys use photogrammetry to capture the environments, and how did you get that into the game?
I think so, I think so. Of course, it's a different engine. Especially the second one should be photogrammetry-based; the scenery is amazing, right? Yeah, it's
dynamite. Yeah, okay, cool. I'll have to pick your brain more afterwards.
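Coming back to the QR-sleeve idea from a moment ago: the reason a printed marker works for this is that a fiducial of known physical size lets you recover roughly how far away the object is from how large it appears in the headset's camera image, via the pinhole camera relation. The sketch below is purely illustrative (the function name and all the numbers are assumptions); real object tracking would use a computer-vision library such as OpenCV's ArUco module plus full 6-DoF pose estimation.

```python
# Pinhole-camera distance estimate for a marker of known size.
# Illustrative only: the focal length and marker size here are made up.

def marker_distance(focal_px: float, marker_size_m: float,
                    apparent_px: float) -> float:
    """distance = focal_length_in_pixels * real_size / apparent_size_in_pixels"""
    return focal_px * marker_size_m / apparent_px

# A 5 cm QR sleeve seen 60 px wide by a camera with a 600 px focal
# length sits about half a metre away.
print(marker_distance(600.0, 0.05, 60.0))  # 0.5
```

With the distance (and, in a full solution, the orientation) known, a virtual proxy of the drink can stay anchored to the real cup even while the user remains fully in VR.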
Yeah, you know, I was looking into The Climb. I think I saw it featured on many, many platforms; it's actually very, very beautiful. And like you said, that contextual information reveal is something that probably is not being explored enough. I've seen a lot of interesting developments in the last years. For example, when Apple released some demos of Vision Pro, which no one got to use yet (so I don't really know; if you guys did, you're lucky), I've seen that it's very mixed up, right? You feel like you're in the lobby, and I compare it with the lobby in Meta, where you are in this full, calming environment; but in Apple's demo that environment is actually just half of your space, and behind you is your real space. So, like you said, it's starting to be contextual. In the demo that I've seen, the interface tries to be translucent, and we expect everything from Apple to be translucent, because in the OS you have these panels, your dock with all the apps, that are actually translucent. But in AR it's a totally different effect, right? Because it feels like it's aware of your space: potentially, if someone passes behind your menu, your UI, they're going to be blurred out through the panel. It's the same design move, but spatially it has such a different power. So definitely, I feel like contextual features are so strong as a concept, and I can't wait to see what everyone does in that direction.
But yeah, you guys mentioned multiple times that thanks to AI we are getting to places we weren't before, especially with hand tracking and other inputs. It's been a buzzword in the last years and I don't want to fall into a cliché, but I want to try to apply what I've been seeing to your use case, which is education. There is a big movement of large language models being used in education, and it's a bummer that I'm not at school now, because I'd really be curious to see how the newer generation is taking advantage of this. There are probably a lot of loopholes: maybe the teacher says you can't use it, but then you kind of do anyway. And I'm pretty sure that everyone has a consistent usage of it, because we saw some stats. I remember there were stats saying that when school was on break over the summer, the use of ChatGPT went down, and then it started going back up when everyone went back to school. So there's probably big, big usage of it. I find it very, very helpful for learning. No one really has the patience of that machine that tells you things constantly, with no bias, wherever you are; it's such a powerful tool that it's really changing everything, at least in my daily life. So I would like to know your perspective, since you are continuously developing a school like XR Bootcamp. I don't feel like it's so applicable, I don't think it's so strong, at explaining how to master a very wide environment like AR and VR, where there are a lot of tools.
You can't explain in one text how to use Unity or how to use Unreal, but you can very much ask questions about a very specific topic, and you only arrive at those questions if you work enough in the environment. So there is still a lot that comes from you and not from the machine. So, have you guys explored the usage of large language models for XR Bootcamp, or have you envisioned something that you would like to work with in the future?
I think for us the most important part is not directly LLMs, but much more how they can change our daily workflows. Right now the biggest issue with all these gen AI tools is that they help you on one specific thing, but when you look at your whole daily workflow, how much of an efficiency increase do you really get? So what we are looking for is: can we combine some of these tools in a meaningful, consistent, and automated way that can start influencing your workflow efficiency and productivity? I think this is something we are interested in. There are very interesting tools already, for example skybox generation on Blockade Labs, or other tools, especially from a technical art perspective, with which you can literally, I would say, equip yourself with superpowers even with a very limited team. But still, we believe we need maybe a couple more years before they really influence this workflow. That doesn't mean we are not exploring, because we are also planning a few fellowship programs focusing on gen-AI-powered workflows, aimed at XR game, Unity, or Unreal developers, as well as designers. So we believe it will definitely be helpful, more so in the next couple of years. On the other side, our programs are known to be a little bit demanding. We are probably one of the most demanding programs, especially since some of our programs are skill-to-job oriented, which means that we are helping people find jobs afterwards. So they need to build a really outstanding portfolio with us, in such a short period of four months. In order to achieve that, to be honest, our assignments are quite challenging: we have easy, medium, and hard assignments, and they are quite challenging.
And most of our courses, I would say, are heavily based on a project-based format plus these assignments. Especially on the assignment side, pair programming and code-assist tools like GitHub Copilot and that sort of thing are coming right now, and we know this may affect the way people solve our assignments. So we are also considering taking precautions, I would say, for this kind of pair programming. Of course, this is not something we are trying to evade. Instead, we are trying to make our assignments and challenges even more pair-programming friendly, so you can actually do a much better job with the power of AI, or with AI-powered tools, while still using your own creativity, the algorithmic mindset, and the development mindset you are building throughout the class. Making some kind of blend or combination of all these skills, prompting on one side, an engineering and development mindset on another, creativity on another: I think it's really this unique combination of skills that we need to be able to cope with this pair programming era, I would say.
Feel free to add anything.
That was good, that was really great. Alessio, do you want to add something before I say a couple of things?
Maybe it's better you say your thing first, because I want to touch on another topic.
Sure. Well, first of all, you should check out our Discord, XR Bootcamp. It's one of the most engaged and helpful communities in XR development: you'll work with someone who works at Boeing, and then talk to someone who works at Meta or at Amazon. It's a really great group of people. And one thing I learned from a lot of people in the community is that they're using LLMs to rapidly prototype, to query different prototyping ideas. Also debugging: they run into an issue of why the code isn't working, and they're actually using ChatGPT to debug it. And also just best practices. So, like you mentioned, Alessio, this is a whole brave new world of how people are learning, but I think it's just a phenomenal tool, and for developers there's no better time than right now. It can not only give you brainstorming ideas, but also help you code. What else do you need? So I just wanted to add those two bits there.
Absolutely, I cannot agree more. I feel like it really extends your potential so much. Usually you arrive at a certain point, and from that point there is the challenge of figuring out how things work, because maybe the topic you're exploring is not one you can take for granted, and you spend that extra time on that specific problem. But here you go a level above, or two or three levels above: you're figuring out things that are actually way harder than before, and in that process you're also learning a lot. So you're like, okay, now I know this, and in discovering that, I discovered so many other things that in that period of time would never have been possible before. It's definitely incredible. And even if it's a little outside the list of questions I shared, I feel like this is something you guys could really nail as an answer. The jobs in AR and VR started to take shape just a few years ago. Before, there was a trend of calling people creative technologists, then it became some sort of interaction engineer, and then AR/VR engineer; and I feel like now we just call people product designers, because there are so many AR and VR people who are simply product designers. It's normal now; it's not "AR/VR product designer" anymore. Also my role, prototype engineer, is an almost experimental engineer, kind of a research role if you want, or a design engineer. All of these names, right?
And I feel like the roles of engineer and designer start blending so much in this process, because you want to implement things spatially, and you have a very good understanding of how things work, but you want to do it spatially in rapid prototyping, and unavoidably you enter a technical reality that makes you a little bit more than what a designer is traditionally considered to be. I really appreciate it, I really thank you. And if you want to ask questions, Samantha, please jump in anytime; this is a live event for that reason too. What I wanted to arrive at is this: given this blending of design and engineering, what I recommend to a lot of people who ask me, "I would like to start working in the field, what should I do?" is to really try to do some stuff that no one ever did before. Get one or two projects, disassemble them, and then try to, you know, make an LLM in VR that talks to you and does stuff; put things together, it's a buffet, take something from here and something from there. How do you see this job panorama shaping up in the future? What do you think the roles are going to be? What do you think the distinctions between these paths are going to be for different roles? Or, let's say, do you identify some key job titles that you could say are probably going to shape the industry?
You know, this is a topic I could also speak on for another hour. But Alessio, first of all, I would like to thank you as well, because you also joined one of our Career Navigator sessions. There is tons of amazing advice there that, again, you can watch on the XR Bootcamp channel. It was maybe targeted a little bit towards designers, but I can tell you that the advice in that talk is something you probably could not collect from 20 or 30 people in the industry; it's really valuable advice. Maybe we can share some of what we discussed there here as well. From our perspective, right now the most in-demand thing is a functionality. I don't care about titles or the name of the open role; I think the critical thing is the functionality, what kind of functions the role is responsible for. And what we have realized and seen so far is that the most in-demand job is XR prototyper. Of course, what you mentioned is similar to XR prototyper, but I think XR prototyper is really the encompassing name that can show your creative skills, your rapid prototyping skills, your level of development skills, and your skill to understand, as if you were testing your app or game as a user. I would call them product skills: understanding whether this XR app is fun, whether it is functional. Having that kind of notion, understanding the scope of the project, planning yourself accordingly: all these skills and experiences lead up to an XR prototyper role. So we have built all our skill-to-job programs around this very common functionality. To be honest, whatever background you are coming from, it doesn't matter; we still help you first become a self-capable XR prototyper before anything else.
You can always pivot or specialize in specific areas afterwards, but no matter what, you need to be a good XR prototyper, with some projects that stand out in your portfolio; that's very important for us. Another thing which is also important, especially for people trying to break into the XR industry from different backgrounds (let's say I'm an engineer, I'm an architect, I'm focusing on medical): I think it is even more powerful to become an XR prototyper if you are coming from a very specific background. If you already have subject matter expertise in a specific industry or use case, you combine that subject matter expertise with your XR prototyping skills. I think this is what the industry needs, especially on the enterprise side. And that is the exact reason why, with this formula, we got the chance to successfully place 70 to 80 percent of all our graduates in the jobs they were dreaming about. So that's very important for us: become a successful XR prototyper. And in order to show it, it's very simple: you need to build an amazing portfolio. That's why our programs usually focus on assignments, sure, but also on prototype sprints. Every week you are creating another XR prototype in a week-long sprint in supervised sessions, and at the end of the program you have created four prototypes and two MVPs. That's what we are doing, and that's why I thank you for asking this question, because this is, I would say, the one piece of advice I would give to anyone who would like to break into XR. The other roles you can always specialize into at the next stage of your career, but first you need to be a good XR prototyper, like you already are.
Ha, that's for other people to say when they see the work. But yeah, unfortunately, it's been three years that I cannot share my work. When you work for these great companies, on one side you're very happy, but on the other side you can't really share a lot, and I know a lot of other friends of mine who are in the same situation. But yeah, I totally agree with you. And I'm going to try to draft a conclusion here, because I know you need to go. Thanks so much for your time, really; this is very valuable, and it's going to be shared all around our Discord, and I will be sharing it personally too. I want to say: I think the new designer actually is the prototyper. I feel like the designer as we intended it before is a figure that is, I'm not saying losing importance, because the aspect of design is very important to make a product that works (otherwise it's not going to be a product that other people use), but at the same time there is the wish to say, okay, I want to do this, this way. And from that spark starts, at least for me, the curiosity to understand how a lot more things work behind the scenes. I think there is a very interesting career path being drafted for this field, because you can really just go deeper and deeper and deeper, and this applies to other subjects too: start with a little bit of interest and go deeper. So I just want to say thank you so much, guys, for providing all of your time and insight. Samantha, would you like to add anything before we close? Or I want to leave the last words to you guys, Ferran and Dave.
If you guys want to add anything before we close this call
Samantha did you want to say something?
No, I'm good. I'm home sick, yeah, I have a cold, so that's why I've been quiet. It's really interesting just listening to you guys.
Thanks for joining while you're sick. That's awesome, that's dedication.
The only thing I would like to add, Alessio: I think you're 100% right that this whole role of XR designer is really going to be a whole new paradigm shift in how we're communicating visually. In the early days of print, visual communication was very much a one-way street: you would get something printed to communicate information, directions, whatever. Then we got into the realm of digital, where it became a two-way street, and we got the term user interface designer, because it was you interacting with the technology and that technology giving you feedback. And then Daniel Mark QC, one of our instructors, did a really wonderful breakdown of Apple Vision Pro and all its different use cases. He said something really poignant about this whole new spatial design, or 2.5D reality: we went from the one-way street of visual communication, to the two-way street of user interfaces where you interact and get feedback, and now we're entering this third dimension, this new wave of communicating and interacting with other people, because you're actually in 3D space, which is dynamite. He says it so much better than I can, but we have a lot of courses for anyone at any skill level who wants to understand how to get into this space. We have a great community and a great network, we even place students in jobs, and I'd encourage anyone tuning in to come check us out.
Amazing thank you so much, guys. Have a good night, everyone.