For anyone who hasn't joined this space before: this is a very low-key, biweekly talk that I hold for people interested in the XR community. I tend to have speakers who actually work in the industry or who are trying to do something new, and I think we're super lucky to have Julian here. Before I introduce anyone, I just want to be clear that no one pays me to do this. I do it because I like keeping in touch with everyone in the community and discovering new things I didn't know. This started as a portfolio review channel on Discord and then became a sort of biweekly speaker event. We really like this format, and it seems like a lot of people find it insightful. So without talking more, let's give the space to Julian, who is the CEO of Bezel. Bezel just released new features. I'd like you to explain to everyone here what Bezel is and what kind of features you guys released recently. Why don't you introduce yourself and tell us what your mission is? Please go ahead.
Yeah, I'm Julian, currently CEO at Bezel, which is a collaborative 3D design platform. We help people easily create 3D experiences and 3D scenes, and allow them to interact with and experience them inside AR and VR as well. We support all sorts of headsets, and as of last week we also support mobile AR on your phone — and obviously on your computer as well. You might have seen Bezel around online, but if not, I can share some of that today too. To say a bit about myself: before Bezel I worked at Oculus for about four years, where I worked on engineering efforts around how the OS works inside the headset. You can think about all the floating panel apps around you when you put on your VR headset.
Gorilla tag,
Gorilla tag? No, it was
One of my favorite games. I was just wondering, since you said you worked at Oculus.
Yeah, I got unmuted. What are you wondering — Gorilla Tag? Oh, no, sorry, I was not part of it; I thought you were going to say something else. Okay, all right. I was part of the team behind building out the OS inside the headset: all the panel apps you see, all the buttons that go inside the apps, and the menus that kind of float around you. That's what I worked on. During that time I worked a lot with designers, and those designers would use tools like Figma to get the job done. Whenever they were bringing these designs to meetings, managers would ask, "Hey, how do I view these 2D designs in a headset?" Because obviously they're all 2D Figma frames, and we would want to be able to visualize the designs people built inside the headset as well. With that in mind, the answer to that question would be: well, we could, but we'd have to partner with a Unity developer to build a whole interactive prototype with C# scripting, and that's a multi-week process. I think that's just too heavy for a lot of people on those teams to really iterate on the designs that go inside the OS. So I was talking with one of our prototypers about this — but let me share a file that you might find helpful to visualize what we're talking about.
I can also share my screen
It's all good. So this is Bezel. If you haven't used it already, it's all browser-based, and we use WebXR under the hood. That allows you to work on your Bezel files on your computer, as well as on your headset or on your phone. That's a picture from our team a while back, and we have camera bookmarks here. These are the types of panel apps we would build at Oculus. When you think about how we take these 2D designs in Figma and bring them out into 3D, this is all part of how we get that done. And now we have these cursor events as well: if I hover over this little block, the shape reacts accordingly. If I click on this panel, this UI panel also shows up in the middle. You can try this yourself by clicking on the link I shared in the chat. What we can then see is how we used Bezel itself to build this file. If I look at this particular file, I can see how this interaction was made: there are these state controls that I can modify, just like that. Obviously I can change the colors of this tree if I wanted different greens. And these are all different 3D assets that were imported into the scene to foster an easier creation workflow. If I close this asset library and — let's see — I want to bring in a flamingo, I just drag and drop a flamingo, and now we have it inside the file. And if I bring in a quick visualization of how this works in a headset, you can see how this is a sample Bezel file that we just hooked up with click interactions, and this one can move depending on what kind of interaction we set. A click could be performed as a mouse click on your computer, but when you open this exact file inside your headset, that's when the magic happens.
Because now you're able to open this file and interact with it in real time, and listen to all the updates happening across devices. So this is happening inside the headset, but obviously you just saw how it was also happening on the computer.
Let me add some comments here, because if you don't spend a lot of time making AR/VR experiences, you might be asking yourself why we need an alternative to existing tools. The truth is that the barrier to entry for tools like Unity and Unreal is quite high — the stack it implies, basically being a developer, is quite heavy for those kinds of experiences. I think Bezel makes things very simple, snappy, and accessible, so it's definitely a step forward for anyone who wants to enter the field or just do some concepts. Of course there are some limitations, but it's a new tool, and it will be perfected in the future with a lot more things. I just wanted to give some context to people who want to enter this world now.
Yeah, thanks for that. And like the story I talked about: the idea for this came from the lack of tooling for designers, or non-game-developers, to create interactive 3D prototypes. Right now that's really difficult to do if you're not capable of building out a Unity or Unreal prototype. I wanted to ask anyone on the call: what kind of tools do you use today to work on 3D files? Feel free to speak up if you want to.
I'll just answer for my friend — my girlfriend, she's also on the call. Based on my recommendation, she's actually building an AR museum app. She was using Shapes, and now, on my recommendation, I'm asking her to try Unity with ARCore, because she was trying to use image tracking. I also asked her to attend the call so she could talk to you guys directly and give a little update about what she's trying.
Yeah, that's awesome. And speaking of mobile AR, that's one of the reasons we're meeting today: we have a new feature update that now allows you to do exactly this on your phone as well. That's a pretty big improvement for people who work on mobile AR, or for the many people out there who don't own headsets, to also try out this sort of unique hybrid XR design flow that's now really accessible. I can show that as well.
Yeah, I remember we had some discussions, and the main question everyone had when they tried Bezel was: is there an AR tool? I remember I tried it initially and it was working in AR as well — the current one, before the update — but this one seems more optimized for phone experiences, which is the device everyone uses for AR. So it's pretty cool that you guys have that.
Yeah, very helpful — exactly. When you were originally saying that, were you thinking more like Magic Leap AR, or something else?
Well, I mean, of course, this is where it gets very specific. I believe this is WebXR, so it kind of applies to both, probably. In that sense it almost doesn't matter whether it's one or the other. But I know most people are using AR on the phone rather than on headsets, so I think that's the right way to advertise it, at least for now, because it seems like a very snappy app — very quick, you can just add some stuff on the go — while on a headset it feels more like a specific use case. Where do you see this kind of AR application going? I guess the question I have is: okay, I make my own scene and then I see it in AR — is that a link? Is it a scene that I save and can get to through that link? So I could just put my link there — you know, it would be cool to have a link tree of my AR spaces or something. That really applies to everything, since it's just a link.
Right. I think we all know how easy it is to open a browser link on your phone, and that's essentially how easy it is to experience a 3D design you've created in Bezel — it's now directly experienceable on your phone. That's really the magic of it. If you're building out a scene, or working on a feature at a company working on AR products, this is probably the easiest way to build something on your computer, just as you normally would for any 2D design, but at the same time be able to visualize how the design works in an actual space, just through your phone's camera. And right now, because headsets aren't mainstream in society, I think this is the most accessible form of what AR looks like. Yeah, that's super exciting.
I'm also curious — when I was seeing this demo — how are these things spatially anchored? I know you guys mentioned the occluder material and guidelines. Right now it's on a wall: is it placed on the wall as a fixed volume, or how does it work?
We're utilizing a particular spatial-anchoring library on the phone that automatically anchors your scene to a plane. So we detect planes in the real-life scene and then plant the AR scene on top of that plane. That's why WebXR — your phone camera — is able to detect, "oh, this is a plane, the floor under me," and then put an AR scene on top of that. That's how we currently do it. We also have the occluder material, which has been really helpful for building things like portals. I think you might have seen examples of that; I can try to show something that uses the occlusion material.
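As an aside, the plane-planting Julian describes — detect a plane, then place the scene flush on it — boils down to projecting a placement point onto the detected plane. The sketch below is illustrative only (the function names are not from Bezel's code; in a real WebXR app the plane pose would come from the hit-test or plane-detection APIs).

```javascript
// Hypothetical sketch of the plane-anchoring math: given a detected plane
// (a point on it plus its unit normal), project the desired placement
// point onto the plane so the scene sits flush on it.

function dot(a, b) {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Project `point` onto the plane through `planePoint` with unit `normal`.
function projectOntoPlane(point, planePoint, normal) {
  const d = dot(
    { x: point.x - planePoint.x, y: point.y - planePoint.y, z: point.z - planePoint.z },
    normal
  );
  return {
    x: point.x - d * normal.x,
    y: point.y - d * normal.y,
    z: point.z - d * normal.z,
  };
}

// Example: a horizontal floor plane at y = 0 with an upward normal.
// The camera is looking at a point 1.4 m above the floor; the AR scene
// should be planted directly below it, on the floor.
const anchor = projectOntoPlane(
  { x: 2, y: 1.4, z: -3 },
  { x: 0, y: 0, z: 0 },
  { x: 0, y: 1, z: 0 }
);
console.log(anchor); // { x: 2, y: 0, z: -3 }
```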
So when you said the scene is anchored to a plane — do I have to say, "oh, this is going to be attached to a plane," or how does that work?
Yeah, so if you try it out today, we have an interaction where the main objects in the scene move around with your phone in the beginning, depending on where your camera is pointed. But once you tap the screen, that fixates your AR scene at that exact point. It's a bit difficult to explain in words — you kind of have to try it with your own phone — but that's how we currently decide where to plant the AR scene in the real world. After it's planted and anchored, you can move around and it looks as if it's part of the real world.
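The tap-to-place flow Julian just described can be sketched as a tiny state machine: the scene follows the camera's gaze point each frame until the first tap, then stays frozen at that pose. This is a minimal illustration under that description, not Bezel's actual implementation.

```javascript
// Minimal tap-to-place sketch: the scene trails the camera until the
// user taps, then it stays anchored where it was.

class TapToPlace {
  constructor() {
    this.anchored = false;
    this.pose = null; // { x, y, z } of the scene origin
  }

  // Called every frame with the current gaze/hit-test point.
  update(gazePoint) {
    if (!this.anchored) {
      this.pose = gazePoint; // follow the camera while unplaced
    }
    return this.pose;
  }

  // The first screen tap fixes the scene in place.
  tap() {
    this.anchored = true;
  }
}

const placer = new TapToPlace();
placer.update({ x: 0, y: 0, z: -1 }); // following the camera
placer.update({ x: 1, y: 0, z: -2 }); // still following
placer.tap();                         // user taps: freeze here
placer.update({ x: 5, y: 0, z: -9 }); // camera moves on, scene stays
console.log(placer.pose); // { x: 1, y: 0, z: -2 }
```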
So it only anchors one point, is what you're saying, right? Like the whole scene is almost a group of items, and I'm just anchoring it at the place I tap — similar to the iPhone AR stuff, where you view a 3D model.
Right. Let me actually pull up a file myself so I can show you in real time.
Yeah, that would be helpful. Also, does the click interaction still work in the mobile AR view?
Yes, yeah. You can still interact with the scene on your phone.
Again, some of AR
The one that you just dropped? Yeah. I mean, basically, wherever you are, there's not a lot of difference from any other kind of AR. It's just a link that opens kind of an app, and that app has all the capabilities you might have in another app. Since it's not running as a native application on your phone, the performance could be a little different, but with the latest developments it seems like it runs pretty well — you can do quite a lot. And the best thing is it's as easy as opening a link, and there you go. I also want to give a shout-out to Julian, because every time there are live demos — last time, I remember, you did it too.
You can see how the 3D element grows on top. We have a full AR scene embedded at this location. And then — sorry, one second, let me decrease that — so you can take a look yourself.
It's a little difficult to cast my phone to my camera, but you can probably see what's up. Alright — so once you enter the phone view here, you can position things. You can see how the AR view kind of adjusts to your surroundings.
Yeah.
Hi, I have a question. Is it possible to rotate things when you're interacting in mobile AR? Because I'm prototyping for my museum AR project, and I want to rotate a 3D model in it. Is it possible to do that?
You mean just rotate it all around? Yeah, okay. So you could specify that rotation inside Bezel, and then in your mobile AR scene it would rotate accordingly. Do you want to see an example of that, maybe?
I can try to pull up an example that might actually be helpful in this case.
Thank you.
Bezel is still one tech stack, right? Because — we talked at GDC — you're doing a lot of work making sure everything is interoperable, in a sense. There's a lot of complexity in mobile in making that happen. So currently, the interactions — tap and click — are at least universal despite the medium?
Yeah, exactly. So the tap is universal between your phone's touchscreen, your computer's click, and the trigger button on the VR controller. Across all devices, we're trying to standardize as much as possible.
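The standardization Julian describes amounts to mapping device-specific events onto one logical "tap". The sketch below is a hedged illustration of that idea, not Bezel's API: the `mousedown`/`touchstart` events are standard browser events, and `selectstart` is the WebXR primary-action event fired by a controller trigger; the `normalizeInput` function and its return shape are assumptions of mine.

```javascript
// Sketch of unifying mouse click, touchscreen tap, and VR trigger into
// one logical "tap" event. Event type names follow the DOM/WebXR specs;
// the normalization itself is illustrative.

function normalizeInput(rawEvent) {
  switch (rawEvent.type) {
    case "mousedown":   // desktop browser
    case "touchstart":  // mobile AR touchscreen
    case "selectstart": // WebXR controller trigger (primary action)
      return { kind: "tap", source: rawEvent.type };
    case "mousemove":
      // hover has no touchscreen equivalent, which is exactly the
      // cross-medium gap discussed below
      return { kind: "hover", source: rawEvent.type };
    default:
      return null;
  }
}

console.log(normalizeInput({ type: "touchstart" }));  // { kind: 'tap', source: 'touchstart' }
console.log(normalizeInput({ type: "selectstart" })); // { kind: 'tap', source: 'selectstart' }
```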
Okay, yeah. I'll be really excited to see how you guys manage to ship features where each medium is still interoperable in the future. Yeah, the challenge is exciting.
Yeah, for sure. For example, hover interactions are a bit difficult on touchscreens, right? So stuff like that will be ironed out over time. Actually, this is a rotation of a statue.
So you can actually do this with a tap
interaction? Yeah, yeah. I can also send you this link so you can try it yourself. When you enter the start view and I click on this statue, it rotates on its own. And if you visualize this file on your phone, the same thing happens.
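The statue demo — tap an object to trigger a rotation that plays the same way on desktop, phone, and headset — can be sketched as a small interaction state plus a per-frame update. This is an illustrative model of the behavior, not how Bezel's states and interactions are actually implemented.

```javascript
// Tap-triggered rotation sketch: each tap toggles a spinning state,
// and per-frame updates advance the angle while spinning.

class TapRotate {
  constructor(speedDegPerSec = 90) {
    this.speed = speedDegPerSec;
    this.spinning = false;
    this.angle = 0; // degrees
  }

  tap() {
    this.spinning = !this.spinning; // tap toggles rotation on/off
  }

  // dt is the frame time in seconds
  update(dt) {
    if (this.spinning) this.angle = (this.angle + this.speed * dt) % 360;
    return this.angle;
  }
}

const statue = new TapRotate(90);
statue.update(1);   // not spinning yet, angle stays 0
statue.tap();       // tap the statue (click, touch, or trigger)
statue.update(1);   // 90 degrees after one second
statue.update(1);   // 180 degrees
console.log(statue.angle); // 180
```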
Oh, I see. This is cool. Thank you.
Is there a way for me to control — like, maybe if I want to see the AR view, I don't want all these objects; maybe for this case it's just the center statue — but if I want to see it in VR, I can see the whole scene. Is there a way to prototype that, or fake it?
Not sure if I— oh, yeah. So if we use this example scene of the museum, and I use the same link and go onto my VR headset, I'll be in the room: I'll be able to see the walls and the statue. But let's say when I'm using the mobile AR view, I just want to see the statue and not the walls. Is there a way in Bezel to do that?
So right now, the way to control that would just be to play with the opacity slider. But realistically, if you want the scene to look nice in a mobile AR experience, you would probably delete these walls, so that just the statue is overlaid on top of the real world. That way the statue looks like it's part of the real world, without a wall overlaid on top of your actual room. I guess that makes sense.
Let me try to connect the questions here — sorry to interrupt. I think the question is: can you know what device is loading the experience in the web browser at that time? Very simply: if I enter with a Magic Leap or a HoloLens versus just the phone, do you know it's the phone versus a headset?
Oh yeah, we do detect which device is loading the file. That allows us to hook up the interactions: if it's VR, the headset and the controllers matter; if it's a phone, we trigger the whole pre-positioning, anchoring, and plane-detection type of stuff. So yes, Bezel does detect which is which. But it depends on what you want to do with that.
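For context, the device detection discussed here is something WebXR exposes to any web app: in a browser it would come from `navigator.xr.isSessionSupported("immersive-vr")` / `"immersive-ar"`. The sketch below factors the branching into a pure function so the routing logic is testable; the capability-flag object and the `pickExperience` name are my assumptions, not Bezel's internals.

```javascript
// Sketch of routing a scene to the right experience based on device
// capabilities. In a browser, the flags would be filled in from
// navigator.xr.isSessionSupported(...) plus touch detection.

function pickExperience(caps) {
  if (caps.immersiveVR) {
    return "vr"; // headset: controller input, floating panels
  }
  if (caps.immersiveAR && caps.touchscreen) {
    return "mobile-ar"; // phone: plane detection, tap-to-place
  }
  return "desktop"; // plain browser: orbit camera, mouse clicks
}

console.log(pickExperience({ immersiveVR: true }));                    // "vr"
console.log(pickExperience({ immersiveAR: true, touchscreen: true })); // "mobile-ar"
console.log(pickExperience({}));                                       // "desktop"
```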
You could use that, if you wanted, to differentiate based on that conditional eventually — so you can change the scene for every device.
Yeah, I think the question is: if I were to use interactivity, or a first-time interaction, in Bezel, I'd want to know — can I prototype on the condition that, if the person opens this on mobile, then do XYZ actions, or something? I was wondering whether that's possible for the user to customize somehow on the front end, or is it still very much back end?
I see. Actually, we currently don't have that level of conditional configuration, because we always show the same 3D scene whatever the device is. But it's possible — that's a good feature request. We could add it so that depending on what device you're looking from, you can set different states or different configurations of the scene.
Really cool. Also, one more question: is it possible for me to build a 3D website right now in Bezel? Because the camera right now, in your view, is more of a world view. Is there a way I can do a screen-space view? What I'm trying to say is, I'm curious whether I can create a website that's both 2D and 3D, where I could use 2D website elements — I can import my Figma — but somehow control the 3D model object. I'm wondering if that's possible, because right now, with this workflow, you'd use a separate framework, and importing stuff takes a little long. So I was wondering, do you guys have that feature, or is there a way to prototype it?
Yeah, I mean, it's not our current focus to have published Bezel files as a whole website that you would use in production. We do support embedding Bezel files as an iframe in whatever website you want, so it's possible, but it's not a use case we're focusing on today. Maybe in the future, but right now we're really focused on making the 3D experience useful and supporting the 3D design flow itself.
Yeah, I was just curious in the sense that there have been a lot of discussions about what it would look like if WebXR were part of everyone's life: you'd go to the same link on mobile or desktop and see a 2D website, and if you go with a VR headset, you're in the 3D version you're used to. That's why I was asking about device conditionals — I'd be curious to explore more of what it really means to have that in the future.
I see. Yeah, now it all makes sense — it all connects. I think that's a possible future: once WebXR becomes more mainstream, a site can support different versions of the website. It detects that you're looking at the website on your headset and presents a VR version, or an AR version, of the same website. It's definitely doable; I think it's really just a matter of execution and the market around it.
I see, thank you.
Yeah, yeah. So you could embed it as part of a site. In fact, we have — if I can go to
slash docs.
We have some instructions around embedding on the web. You can go to the share settings, publish the link, and then embed it into something like Notion. Oh, here we go — this is a Notion page sample, and we have a car model that's embedded inside it.
Yeah, I mean, it's something we wanted to enable for people to try out, in case it's useful for something. I think it definitely has some use cases, but the main use case we wanted to focus on is the AR/VR 3D workflow.
Yeah. I guess — for Lucas — this might be applicable to what Lucas was saying around embedding a 3D scene inside a website.
Yes, I'll probably explore this. So my question is: if I go to this website in, say, the Oculus Quest browser, and I scroll until I reach this frame — then, in the Quest browser, can I go into VR mode inside the frame? Or is it much more like just viewing?
Oh, I believe you should be able to enter VR mode from the frame on the website, just by clicking on this button if you're in VR.
That's really cool. I'll definitely
Yeah, yeah, sounds great. Sorry — honestly, I was going to say
they were bringing you places, you know. I feel like, also, about the viewer: I did some similar projects once I discovered you could export WebGL from Unity into a website. I was like, oh my gosh, my website is just going to be Unity — I created this link that goes to this link that goes to this link, and it was basically a game online. But that becomes super heavy, and it kind of takes you away from the experience of a website. So I feel like the viewer, the embedded viewer — that's why I asked that question — is kind of a halfway point for someone who really appreciates three-dimensionality but doesn't necessarily want to spend time navigating an ugly game in your website. So be careful with the extreme take on it. But it's cool that you can see these in VR and AR as well, I believe. So I have a question: for example, if you enter the scene on the web with a VR headset or an AR headset — I know it's going to go in, but in that moment, for AR, should I make a conditional that says, "if you detect this headset in AR, please remove the background," so I can have everything placed correctly but without the background?
Right, right. We actually have it so that if the Bezel file doesn't have a background, it will just use the background color of the HTML page. And then, by entering this particular file in an AR headset, it just works: no background, just the real world around you.
cool. Cool, man. Cool. Yeah.
Yeah, that's about it from my end. I just wanted to showcase the new mobile AR feature and tell you what Bezel is capable of and why it's the easiest option available.
Yeah, it's really cool.
Thanks, Julian.
Yeah, of course.
Vita, Vitor — I don't know if I know you, but if you have any questions, please go ahead. This is a safe space for everyone learning. Anyway, feel free to go right into the chat with anything. I have some questions that go more into your take on certain topics. The first one is this: Bezel is so flexible that, if I'm designing with a tool of this kind, probably most of the work I'd run into is building a model of interaction that's consistent across everything — all the devices, the web viewer. The fact is that the interaction on the phone, the interaction on the computer, and the interaction on an AR device are all so different. How would you approach something like that? For example, say we want to do a marketing app that just showcases a piece of furniture, and I want to have some variations of this furniture. We all know this is a simple scene, but when you get to building an app, you probably need a database of options to go through, and you're going to need a minimal UI. Who knows what the right UI is? Basically, that's my question: I wonder what you think is the best model.
Yeah, I mean, we think about this a lot while building out different ways to experience these 3D files depending on what hardware and device you're using. I think it really depends on how easy it is to traverse input in the file. What I mean by that is: on your phone, because your thumb usually sits around a particular area of the screen, we might want to constrain all the buttons to very simple ones you could press for this particular scene — to toggle between different color options of the furniture, say — and it might be a small arrow button at the bottom of the screen. But in VR, because you're mostly controller-based and you don't have that central spot in the middle you can just tap, maybe the menu is more centered around the controller buttons, or some sort of handheld menu you can tap on. Although realistically, even in VR, a lot of the UIs end up being floating panels in the center — kind of like what you see in typical VR headset menus today. So currently in Bezel there's no way to tailor the 3D scene like that per device, but you could add these buttons as 3D, world-space interactive buttons — I've seen people do this inside the 3D scene — which you can click on in VR or tap on your phone as well. Kind of like a volumetric button: not a typical website button, but an actual object.
There are teams brainstorming about this exact thing, because they come from 2D: they have a system that works with rays, but on 2D elements, and then a totally separate thing for interactions that are direct. I also saw that recently Meta kind of unlocked their direct-interaction tools, so it seems like direct interaction is becoming a bit more accepted and mature. So maybe these three-dimensional buttons are going to be the new thing, most likely. But speaking of interfaces, maybe one of the last topics could be kind of an update request. I remember we talked about this as well — we talked about API integrations, we talked about AI integrations in the past. I don't want to talk too broadly about it, because first of all I don't think I'm expert enough to make assumptions about all the papers coming out; I need to study more. But I can definitely see that this kind of new text-prompt interaction is something that really works well. It's so intuitive — I would claim it's almost addictive to work with. I don't know if you ever use the usual tools in your work sessions — ChatGPT and such, Codex, GitHub Copilot — but it feels very rewarding. You type one word and you see literally what you were thinking appear right after; there's some sort of adrenaline coming from this interface, in my opinion, way more than from just pressing a button. It's the satisfaction of something trying to connect with you while you're trying to do something — it's almost addictive. And in that sense, with this prompt-based interaction, it now seems like people can also host their own models, and build and fine-tune their own models, rather than using others'.
So what I could envision is that every company is going to do it, because what's the point of going to a third party when you can build your own model that has exactly what you need, based on your product? It's maybe also easier to control. So I wonder if you guys are looking into that — if you're trying to make something like a Bezel-based model — whether you find it interesting, and, if you see it happening, how?
Yeah, that reminds me: we actually posted a tweet today that shows how you can use Blockade Labs, which is a separate website where you can type in a text prompt to generate a skybox — like an HDRI environment — for your 3D scenes. Once you have them, you can add them into your Bezel scenes pretty easily. Let me try to show you that right now. So for every Bezel file there's a sort of lighting-environment setting, and if you click on that, you'll see different options for the environment skybox. Some of these actually come from this type of generation scheme, so you can see the range of diverse image generations you could use to populate your 3D background. And this is just scratching the surface of what you could be doing with AI in 3D tools. I think there's an exciting future in front of us around auto-generating 3D shapes, auto-generating scenes around you, and all that. There's always the danger of overdoing it in a way that's not really realistic or practical — things that just look cool. So at Bezel we're really focused on specific features that can actually help people make the process faster, or make it easier to create cooler stuff. That's the perspective we have right now. And, you know, it's super easy to simulate particular prototypes where you're in one space and, with one click — because we have states and interactions in Bezel — you either jump or teleport to a different space. That's all very useful when you're storytelling or storyboarding, particularly in 3D environments. Yeah, that's kind of it.
What about segmentation, in terms of —
Yeah — do you mean, would Bezel be able to classify which object is which inside the scene and segment it that way? I'm just trying to clarify what segmentation you're talking about.
I see
Yeah. I don't think that's supported today. But if you use a separate tool that takes certain things out of the background, you can use that instead and then import the components into Bezel.
of course
yeah,
Yeah — if you guys have any other questions... I think we're reaching one hour, so no worries, we can also cut it now. I just want to remind everyone that we have a new, nice way to collect all of this: it's actually in the About section, on this website — let me just share it for a sec. So this link I just shared is a link to a page where I collect all of the speakers that have been part of this space, and you can look at everything in detail. I think there's a lot of value in these small talks, and you get to know a lot of cool things. And feel free to join again tomorrow — there's actually a big event we're doing with a joint AR channel; I think the main guy at the channel is called Norman, and they take this kind of community stuff very seriously, with a lot of people. So if you guys want to join, I think it's going to be a lot of fun. And thank you again, everyone.