Hands-On with Lightform Projection Mapping!

Video Statistics and Information

Captions
Hey everybody, it's Norm from Tested, and it's Jeremy from Tested. Some of you may know that Jeremy and I do a weekly show called Projections, but one of the things we haven't covered on Tested is, well, actual projections: projection mapping. Right. So this doesn't necessarily fit into that bucket, but projection mapping is just very, very cool. About a year ago we were talking on the podcast about how people at Christmastime, or holidays in general, take these laser lights and project them on the side of their house, and it just looks kind of cheesy. It looks very warped, and not like projection mapping should, where the light actually conforms itself to the geometry of the object you're projecting onto. Exactly, exactly. So I was thinking, wouldn't it be great if somebody made a tool that allowed anybody to projection map onto their house, or in their rooms, or onto an object, or whatever they want? And somebody has: a new company called Lightform, aimed at consumers and prosumers. So we went to visit their HQ to chat with their CEO about their technology and check out their Lightform projection mapping device.

Brett, thank you so much for having us here. Yeah, of course. I'm super stoked to see Lightform. So you guys make this device, and you've been working on it for a couple of years. It is a projection mapping device? Yes, exactly. It's a hardware and software solution purpose-built for projected AR, or projection mapping. What the Lightform is is a high-resolution camera.
It's a 4K color camera, there's a computer inside of it, and you take it and actually mount it onto almost any projector. Once it's attached, we've turned that projector into a 3D scanner. So we can point it at any object in the real world, get a 3D scan of it, and then apply magical effects, or apply information onto the real world, using projection.

I've seen projection mapping in a variety of places: at amusement parks like Disneyland, in retail locations, even at eSports events, but not really in the consumer space. This device looks really small and compact. How does this compare to something that, say, Disney would use for a theme park? Yep. So we actually got into projection about 10 years ago. I was at Disney Imagineering, and Phil was touring with Skrillex doing big projected visuals. Lightform is all about taking those big, theme-park-level magical experiences and boiling them down into a solution that is accessible for any designer to create projected AR today.

So let's talk about what that scanning process is. Lightform is designed to scan static scenes and then, within a couple of minutes, create super compelling magical effects. We have a couple of minutes right now, can you walk me through it? Yes, Phil can walk us through the process. All right, so we'll set this one to the side and go over to our demo unit over here. Let me just grab a Stormtrooper helmet here, and then we actually 3D printed a Tested logo on the Form 1 printer over there. So I'm just going to set these objects down, and then we're going to scan them. Right now, on the laptop, you can see a live feed of what the camera is seeing, and then Phil is going to trigger a scan, so we'll see what that looks like.
Projection mapping is really kind of backwards: you map first, and then you project onto the computer's understanding of the shapes and sizes of the world. That's actually why we call this projected AR and not projection mapping, because we take the mapping out of projection mapping. It's more about the experience and the content that you can create, and less about that painful process of mapping pixels onto objects.

In a lot of AR devices we've seen, mapping the world, understanding the shapes and sizes of the world, is a huge part of it. How does this do it? Because here I see it's one lens, right? You're not using what a Kinect would have in terms of IR blasters or stereo. How does this understand the shape of that? Yep, so it is stereo, and it's stereo because, well, you think of stereo as left eye, right eye, and that's how we see in 3D. But here the left eye is the camera of the Lightform and the right eye is the projector, so they're actually talking to each other through light. Those patterns that we saw, those black and white patterns, were zeros and ones, so they were actually sending binary code to each other through light. At the end of the day, what we end up getting is a projector-resolution scan, so you can get a 1080p scan of whatever you point at, much higher resolution than a real-time depth sensor. And we can also use a projector with different lenses, so you can zoom in and out with the lens on that projector, or use projectors with different lensing, and we're capable of scanning any scene from three feet to infinity. So you can scan a coffee cup, a Stormtrooper mask, or a building across the street. And so you're projector-agnostic? Yep, projector-agnostic: any normal-throw projector is what the LF1 supports. Projectors have a wide range of throws. So, like a small pico projector, you could mount this on a small one, have it behind your television, and it would still work fine? Yep.
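The "zeros and ones sent through light" that Brett describes is classic binary structured light. A common variant uses Gray-code stripe patterns: the projector displays one pattern per bit, and each camera pixel decodes which projector column it is seeing. Lightform's exact scheme isn't public, so this is only a minimal sketch of the general technique:

```python
import numpy as np

def gray_code_patterns(width: int) -> np.ndarray:
    """Binary Gray-code stripe patterns for `width` projector columns.

    Returns shape (n_bits, width) with values 0/1; pattern k is the
    k-th bit (MSB first) of each column's Gray code.
    """
    n_bits = int(np.ceil(np.log2(width)))
    cols = np.arange(width)
    gray = cols ^ (cols >> 1)  # binary-reflected Gray code
    return (gray[None, :] >> np.arange(n_bits)[::-1, None]) & 1

def decode_columns(captured_bits: np.ndarray) -> np.ndarray:
    """Recover projector columns from per-pixel thresholded captures.

    `captured_bits` has shape (n_bits, n_pixels): what the camera saw
    (0 or 1) for each projected pattern. Returns the projector column
    each camera pixel corresponds to.
    """
    n_bits = captured_bits.shape[0]
    weights = 1 << np.arange(n_bits)[::-1]
    gray = (captured_bits * weights[:, None]).sum(axis=0)
    # Convert Gray code back to plain binary.
    binary = gray.copy()
    shift = 1
    while shift < n_bits:
        binary ^= binary >> shift
        shift <<= 1
    return binary
```

With the column known per pixel (and the same done for rows), the camera and projector act as the calibrated stereo pair Brett mentions, and triangulating each correspondence yields depth at projector resolution.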
Yeah, it's actually designed to be mounted with adhesive, so you literally just peel off a sticker, plop it onto any projector, top or bottom, plug in HDMI and power, and then you're good to go: you've turned your projector into a 3D scanner. What about the calibration, though, the calibration between the camera and the projector? We do that all automatically using computer vision. We're all PhDs in computer vision and design, and we spent two years building a system so you don't have to worry about any of that. We just do it for you.

All right, well, the scan seems to be done. That went much faster than a couple of minutes. What does the computer know about this world now? If we look at Phil's screen, you can actually see that we have the scan: we have the color information, and then we also have the depth behind that. So we have a projector-resolution depth and color scan, and we're going to use that to outline an object in the scene and apply a magical effect to it. You keep saying projector resolution, so the higher the resolution of the projector, the better fidelity your scan is? Correct, up to 1080p. The hardware is designed to be cheap, and so we support up to 1080p. So there's no understanding of what the scene is, that this is a helmet, or a head, or a box? You're manually masking. But what tools does the system give you to help with that masking? We have a bunch of easy things that you'd expect from the Adobe suite. We have a pen tool that we spent a bunch of time on for quickly outlining objects; you just trace an object. We also have a quick select tool, a magic wand like Photoshop has, but instead of just using color, we use depth as well. So it's not just looking for outlines or dark shapes; you could select a white coffee cup on a white table. Okay, select the coffee cup using depth instead.
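The depth-assisted magic wand Brett describes, which can separate a white cup from a white table, can be sketched as a flood fill that requires both the color and the depth of each pixel to be close to the seed's. The function and tolerances below are illustrative, not Lightform's implementation:

```python
from collections import deque
import numpy as np

def depth_quick_select(color, depth, seed, color_tol=0.1, depth_tol=0.02):
    """Magic-wand selection that uses depth as well as color.

    Flood-fills from `seed` (row, col), accepting 4-connected neighbors
    whose color AND depth both lie near the seed's values - so a white
    cup on a white table still separates, because the depths differ.
    `color` is (H, W, 3), `depth` is (H, W); both float arrays.
    Returns a boolean mask of the selection.
    """
    h, w = depth.shape
    seed_color = color[seed]
    seed_depth = depth[seed]
    mask = np.zeros((h, w), dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < h and 0 <= nc < w and not mask[nr, nc]:
                if (np.abs(color[nr, nc] - seed_color).max() <= color_tol
                        and abs(depth[nr, nc] - seed_depth) <= depth_tol):
                    mask[nr, nc] = True
                    queue.append((nr, nc))
    return mask
```

On an all-white scene, plain color flood fill would grab everything; gating on depth keeps the selection to the raised object under the seed.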
Okay, so here they're selecting the outline, picking out those shapes. You can tweak and refine your mask, I assume, and then with that mask put a photo or a video on this? You can do images or videos, so you can use your existing assets, but what we're really excited about are these projected AR effects. These are intelligent effects that actually use the color and the 3D scan of the scene. This one is actually wiping in depth through the object; we call it digital fade. So there's a library of these effects, and the effects are actually real time, running on the device, on the computer that is the Lightform LF1, not compiled and exported as video. These are shaders, right? Right, they're shaders running on the device in real time, and we can actually control them. You can go over here and control different parameters of the effects.

Then Phil will show you mapping: we have this Tested logo here, and Phil will show the pen tool, mapping the Tested logo and applying another effect. So we're publishing that right now; it's saving the project file, transmitting it to the Lightform, and then playing it back into the real world. So that's the workflow, right? You're creating your mask, you're adding an effect from your library of effects, maybe adjusting some variables, and stacking effects is possible. And then once you hit publish, that all sends to the Lightform; you don't need to have a connected computer? Correct. The coolest thing is that Phil could actually close his laptop and walk away, and we've just made permanently deployed magic. It all runs locally. So what's inside here, essentially, is a small computer? Yeah, with HDMI output. It's a quad-core A9 processor, and it's designed to be permanently installed.
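The "digital fade" wiping in depth can be pictured as a bright band sweeping through the scene's normalized depth range. The real effect is a GPU shader; this is only a guess at the per-pixel math behind the behavior shown, not Lightform's actual shader:

```python
import numpy as np

def digital_fade_frame(depth, t, band=0.05):
    """One frame of a depth-wipe effect.

    A bright band sweeps through the scene's depth range: pixels whose
    normalized depth is within `band` of the wipe position `t` (0..1)
    light up, producing a contour that crawls across the 3D object.
    `depth` is a float array; returns per-pixel intensity in [0, 1].
    """
    d_min, d_max = depth.min(), depth.max()
    norm = (depth - d_min) / max(d_max - d_min, 1e-9)
    dist = np.abs(norm - t)  # distance of each pixel from the wipe front
    return np.clip(1.0 - dist / band, 0.0, 1.0)
```

Animating `t` from 0 to 1 each cycle sweeps the band from the nearest to the farthest surface, which is why the effect appears to conform to the object's geometry: it is literally driven by the scan's depth channel.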
So, if you're an AV installer: this is actually a fanless design, and there's on-board storage, so it's not like an SD card that's going to fail over time. The idea is that it's computer vision hardware that makes it really easy to create compelling scenes in seconds, but then you can permanently deploy it, whether you're in an art gallery, in your house, or at a retail store.

Now, because you're running real-time effects on here, and potentially even combining them with static images and videos, what's the load on this machine? At some point do you lose framerate? Is there a maximum complexity? Yeah, it's very easy for an artist to throw things at a computer to make it run really slowly, because it's basically a mobile phone processor. So what we do is intelligently optimize the scene. If you create a bunch of really complex effects, like a particle system, we'll actually render that to a video file and send the video file to the device. So then you can have a mix of passive and interactive content running on the device, but you don't have to worry about optimizing your content; we do that for you.

This is super cool. I know this is something you guys built and you've run this demo before, but the whole scan-to-finish was just a couple of minutes, and it looks really neat. Yeah, and we actually had you bring in an object. Yes, so now we're going to map an object that you chose, not us, and I want to get my hands on that software. Yes, definitely. All right, check it out. Sounds good.

So we brought something from our office, of course: the ZF-1 from The Fifth Element. Nice, a static prop, pretty complex. And Phil, you guys have scanned this? Yeah, we have. We have our projector scan as our background here, so again, it's really easy to go in and use it as a reference as we're tracing the outlines. We can do a quick selection.
We just select all the gray areas and then convert that to vectors. In this case we've actually gone and created the mask ourselves; it took only a few minutes. And if we want to interactively edit this mask, I can stream the preview to the device, so I have a crosshair now that's out in the real world. If I wanted to tweak some of that barrel right there, I could go and get it really precisely aligned. And so everything that's bright here is within the mask? Correct, yes: we're projecting white to represent the mask of the area that you're going to be mapping. And then what's really handy is this interactive crosshair. As I'm editing these points in the real world, I can see exactly where they lie, so we bring this point in a little bit, bring that point out a little bit. We have sub-pixel accuracy here, so we can scroll way in and actually get down to our Bezier handles. This is when you really want to get it nice and precise, but for most applications you can just do a rough selection. And that's going to be important, because the angle of this projector is going to be different, and where the shadows kind of overlap, you want to fine-tune that. If I shift this, though, you'll need to rescan? Yeah, correct, although by vectorizing the mask outline, it's very easy for us to just select that whole shape and then do a transformation on it and move it.
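The sub-pixel Bezier handles Phil mentions control cubic curve segments of the vectorized mask outline. Evaluating one segment at sub-pixel precision is just the standard cubic Bezier formula (a generic sketch, not Lightform's code):

```python
import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=32):
    """Sample one cubic Bezier segment of a mask outline.

    `p0` and `p3` are the endpoints, `p1` and `p2` the handle control
    points, all (x, y) pairs; returns an (n, 2) array of points along
    the curve, at sub-pixel (floating-point) coordinates.
    """
    t = np.linspace(0.0, 1.0, n)[:, None]
    p0, p1, p2, p3 = map(np.asarray, (p0, p1, p2, p3))
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)
```

Because the outline is stored as curves rather than pixels, translating or scaling the whole mask (as described above when the object shifts) is a transform on a handful of control points, not a repaint.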
On our roadmap we have what we call auto fine-tune, which will account for small shifts in projector movement or object movement. Typically, when you have an installation going for a while, things heat up, cool down, buildings move a little bit, and you can notice that five-pixel offset. With computer vision, we'll actually be able to help solve for it. And that's for lateral movements, or fixed-plane scale changes? If I do a big rotation, or even a slight rotation, you'll need to tweak that mask manually? Correct. If there are significant changes to the scene, you will want to go in and tweak your Bezier handles a little bit. But what's nice about our instant effects is that you don't have to recreate the underlying content: you just apply it, and with the new scan and the new underlying projector pixels, that content works right away, exactly how you would have seen it before. Correct.

So for this one we actually have some nice depth data there. That's cool. So what I'll do is bring up our effects pane, and we'll use a depth trace effect. This is an effect that's actually using the underlying depth data. I'll go ahead and insert that, and then we can publish it. So you do see a preview: the effect being played is using that same depth data, the same visual data, the edge data? Correct, and in our software we also have a preview tab, which lets you preview a more photo-accurate representation of what the scene will look like while you're in the software. Let's also give our favorite digital fade a try, then we'll switch back to our color there and publish that. And what can you do after you've published the scene? What kind of tweaks can you make to this animation? Yeah, so I actually have this MIDI controller hooked up.
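The auto fine-tune Brett describes, recovering a few-pixel drift between the original scan and the current scene, is a roadmap feature whose method isn't public. One standard computer-vision approach to estimating such a lateral shift is phase correlation between the reference capture and a fresh one, sketched here as an illustration of the idea:

```python
import numpy as np

def estimate_shift(ref, cur):
    """Estimate the (row, col) translation from `ref` to `cur`.

    Uses phase correlation: the normalized cross-power spectrum of two
    translated images is a pure phase ramp, whose inverse FFT is a peak
    at the shift. Works well for the small lateral drifts described.
    """
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(cur)
    cross = np.conj(F1) * F2
    cross /= np.maximum(np.abs(cross), 1e-12)  # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shifts = list(peak)
    # Peaks past the midpoint correspond to negative shifts (wrap-around).
    for i, n in enumerate(corr.shape):
        if shifts[i] > n // 2:
            shifts[i] -= n
    return tuple(int(s) for s in shifts)
```

Once the offset is known, the vectorized mask can simply be translated by that amount, which is exactly the kind of correction a five-pixel thermal drift needs.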
We support OSC, as you saw before, MIDI, and other kinds of input/output devices, so I can interactively adjust the speed and color of this effect. I can change it to more of a yellowish or a reddish color, and then actually speed it up or slow it down a little bit. We could spend a fair amount of time going through and mapping the various components and portions of this device, but we'd end up with quite a few effects. We also have stock video integration. Yeah, everything here so far has been rendered, so let's see what video looks like. Yep, so I'll go ahead and import a video into our software. Cool. And so now we've got what you would expect from a traditional mapping tool: I can go ahead and adjust our mask, adjust the underlying shape or structure. And if I wanted to add a little call-out, say we're doing a museum exhibit for our collection, I can go ahead and map that. Luckily, I have this nice grid on the pegboard there that can help me align it. Otherwise, mapping text is a pretty difficult problem, just getting your alignment with the scene right, but having that underlying scan there is really handy for this. And the types of transforms you can do are beyond just the skew? Correct. Yeah, I can also scale the content within this, and we also have mesh warping. So say you had a cylindrical or spherical object: you could actually take your content and warp it around that very easily.

Seeing text there, it occurs to me that dynamic content would be really interesting. Right now it's effects, which render in real time, and imported images, but what type of dynamic content could you potentially put on this? We'll have a fair amount of social media integrations, so you could display your Twitter feed.
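OSC (Open Sound Control), which Phil uses here alongside MIDI, is a simple UDP-friendly binary format, so a controller script can tweak an effect's speed or color with a few lines of code. This minimal encoder follows the OSC 1.0 message layout; the address path in the usage comment is hypothetical, not Lightform's actual control API:

```python
import struct

def osc_message(address: str, *args) -> bytes:
    """Build a minimal OSC 1.0 message.

    Layout: null-padded address string, null-padded type-tag string
    (e.g. ",f"), then big-endian arguments. Only float and int32
    arguments are handled here, for brevity.
    """
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated, padded to a 4-byte boundary.
        return b + b"\x00" * (4 - len(b) % 4)

    tags = ","
    payload = b""
    for a in args:
        if isinstance(a, float):
            tags += "f"
            payload += struct.pack(">f", a)
        elif isinstance(a, int):
            tags += "i"
            payload += struct.pack(">i", a)
        else:
            raise TypeError(f"unsupported OSC argument: {a!r}")
    return pad(address.encode()) + pad(tags.encode()) + payload

# A MIDI fader callback could then send, over UDP, something like:
#   sock.sendto(osc_message("/effect/1/speed", 0.8), (device_ip, 9000))
# where "/effect/1/speed" is a made-up address, not Lightform's.
```

Because the wire format is this simple, almost any controller, a MIDI fader bridge, a phone app, or a cron job, can drive effect parameters live, which is what makes the published scene tweakable without reopening the project.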
Yeah, say you're a business and you wanted to display a hashtag: you could have that integrated, and it would be dynamically populated based on your hookups with your different social media accounts. And you could also have effects and video content running simultaneously. This is super cool. Do you mind if I try my hand at it and create some effects? Yeah, please. I might scoot this label up a little bit for you there so we can see it a little better, but yeah, absolutely.

So Brett, I had a chance to use the software a little bit; Phil gave me a demo of it on the ZF-1. It's definitely intuitive and fun, but I can also see that you're iterating fast on this. I know you guys launched the pre-order campaign. Where are you at in terms of the software and also the hardware? Yep, so we launched three weeks ago, and you can go online to reserve your Lightform; it ships in November. We actually already have final hardware, it's FCC certified, and we're shipping pallets of the hardware right now to a storage facility. What we're doing is working with select early adopters, through a program that's already sold out, to refine the software and get that process really smooth and completely bug free, and then we'll be shipping out 2,000 Lightform units in November. Wow. So is software where you see the iteration mostly coming? Yep, in both user experience and capability. The hardware kind of stays the same, but we have a free, a pro, and an enterprise version of the software, and there are monthly updates. We say that the hardware gets smarter every month, because we're pushing updates, and that includes the free version. What things would you say are on your feature list, or your priority list, for things that get to users?
Yep. Everything that we showed you in the demo, outside of the stock video, is actually in the free version. A hundred thousand searchable stock videos is in the pro version, and we're also adding support for multiple projectors, being able to synchronize a timeline across multiple projectors. Not just to have multiple displays, but on the same object? For example, this globe: the projector is going to hit this face, but there are a bunch of other faces I may want to project onto. Can you combine them? We're starting with separate scenes. In the office, you could have all of these demos on a unified timeline and then have triggers, with something like an iPad app, to make them all run as one experience, and then down the road, eventually, we'll look into multi-projector support. But what we want to keep is the authoring process really easy. If you noticed, it's 3D data, but the authoring is in 2D, and that's what makes it really simple.

Awesome. Well, thank you so much, Brett, for having us here. This has been super fun, and I never thought we'd see projection mapping made so easy, with a device like that. Yeah. So we just scanned this object as well, and just to show off here, we have an iPad app where I can change the size or the color of the effect. The effect is running in real time on the hardware, and we created this within, whatever, ten seconds, and now it's permanently deployed. I know a lot of your target customer base is retail and location-based experiences, but I want this in the home. Yeah, well, that's why we built it, because we want it in our own homes, too. Awesome, it was a pleasure to meet you. Really nice to meet you too.
Thanks. Okay, Jeremy, was that what you expected? It was exactly what I expected. Of course, now that I've seen it, I can think of what I want next, but what it is is exactly what I hoped for. I think this fulfills that request I had a year ago, where you could take one of these, aim it at your house at nighttime, and do some pretty interesting mapping. The thing I took away most from chatting with Brett and Phil is how difficult this would have been to do previously, and why we only really see it in places like retail and amusement parks and big setups against the sides of buildings, and why it's not in the hands of people like you and me. Yeah, traditionally, doing good projection mapping has taken multiple disciplines, shoehorning all of these technologies together in a way that's really effective but very specialized, right? I don't want to do the coding for the mesh and the geometry and bending that light around; I want to focus on the animation design, and a lot of the effects they showed us made that really simple. Now, we only tried a few things: we tried the ZF-1, we tried that globe, and it was effective even for flat surfaces with outlines. Right, and those animations are cool. We didn't get to see the entire range of animations, but I wonder what it would take to make your own. Yeah, I was really intrigued by that too. All the animations are actually shader-based, and there's a tool out there called Shadertoy, which is a website you can go to; it's basically a demonstration of what you can do with shaders, and you can contribute your own. And there's an enormous amount of possibilities when it comes to that kind of programming, right?
And so they were saying that eventually you should be able to do just about all of that on this device, maybe limited by the processing power. It is a mobile chipset, right? And, you know, the objects we saw were small. We didn't get to see what it would look like if you used a Lightform on the side of a building or on a car, for example. If you're throwing the projector a longer distance, with a bigger, wider spread, does that take more computation? Or, how many effects are you running at the same time? Yeah, we did see framerate drops with multiple effects going, and they don't cap that right now. They're trying to develop these effects with a sweet spot in mind, but it's going to be up to the users to look at the final effect and see: is it meeting the framerate that they want? And the examples they gave, even at weddings, lighting on the wedding cake or on signage, look cool; those are practical applications. I was thinking I want to put this on our set, right? I want to use it against the Tested backdrop, against those hard edges, and light it up. Or for filmmaking: I could see filmmakers using this. The cool thing about doing stuff like that is you don't have to use the whole throw of the projector. You don't have to fill in all the gaps and put effects everywhere, so you don't see the edge, the framing box that you normally would see when you project an image against a wall, right? Exactly, exactly. Just doing little subtle things, like having something move around in the background that we have on set, even that would be very effective. And if you do it right, you can't tell it's projection mapping; you look at it and wonder, is that a light that is on that surface? Yes, right, as opposed to putting an LED strip back there. Exactly, it's real light that bounces off and adds fill light when a person's standing in front of it.
So it's like watching Alien or Blade Runner, right? All these science fiction films where you see a lot of projection mapping being used in that futuristic world; this can happen in the real world. I love that these are shaders, because it means it's real time, and it means you can have external switches or sliders or any kind of input affect the effect, whether that's the actual content, if it's text, or the color or rate or anything like that. It's baby steps toward something that could be highly interactive, whether you're in a space where people are walking through, or maybe it's fed by web content, or eventually maybe a video game. Well, that's the thing: dynamic content is what they don't have yet, though they talk about adding Twitter feeds or text in the future. You want a computer interface. I want them to turn this ZF-1 into my monitor, right? That globe, any shaped device, to be a mirror of my desktop, because then I can add interactivity; then I can put content on it and have it change based on how I interact with it. And are you also then talking about wanting to move that object around? Well, that's the thing: they explicitly do static objects right now, and they talked about how, when they first started as a company, they wanted to track objects in real time. That's an incredibly difficult computational challenge, and it's not what their product is aimed at. But yeah, of course, I want that in the future. Yeah.

So, very cool: Lightform. It is available for pre-order, and they said they're shipping later this year. We'd love to get one in and use it on our set. It opens up possibilities in what you can do with art and creativity using light. So thanks for watching, and I'll see you guys next time.
Info
Channel: Adam Savage’s Tested
Views: 651,486
Keywords: tested, testedcom, projection mapping, projection map, augmented reality, ar, vr, projector, light, video, scan, slam, disney, store, hologram
Id: zKAzVr8ULF4
Length: 24min 54sec (1494 seconds)
Published: Thu Jun 28 2018