SynthEyes Essentials - 01 Introduction to 3D Matchmoving and Tracking [Boris FX]

Captions
So what even is SynthEyes? Well, SynthEyes is a 3D tracking and matchmoving application, so it goes beyond just 2D and planar tracking. It can recreate not just the motion of the camera used to shoot the footage, but also the lens of that camera: it can model that lens's distortion, and it can reverse-engineer focal length changes if it's a zoom lens. While doing that, SynthEyes will also recreate a point cloud representing the landscape or architecture depicted in the photography. So yes, SynthEyes can help you place 2D elements in your footage, but being the gold standard for matchmoving in film and TV, it goes far beyond just that. And now you have it. So let's take a look at some of what it can do.

[Music] When you open SynthEyes, you get dropped right into the Summary room. All the tabs at the top of the SynthEyes user interface are virtual rooms, or pages, with controls for those tools. But for now, we're going to stick with the Summary room. The Merriam-Webster dictionary defines the word summary as "done without delay or formality: quickly executed." And the Summary room is where we can, without delay, quickly execute an automatic 3D camera track. In the top left of the Summary room, there's the Auto button. So let's give that a push. SynthEyes is now going to ask you for an image sequence or a movie file. This is a pretty simple shot with no real challenges, so it's ideal for this demonstration. Once you select your clip, SynthEyes will pop up a window and allow you to tweak some controls, but we're just going to forge ahead and hit the OK button. At this point, SynthEyes just takes over. It will think for a bit, and it will think longer if your shot is long, but after a short wait it will finally solve the scene. Let's hit OK on the "done solving" pop-up. SynthEyes puts us in the Solver room, and here we see a cloud of points that represent the image features SynthEyes picked to create this three-dimensional point cloud.

Tracking this shot was pretty simple and easy, but SynthEyes gives us a lot more control over auto-tracking. For that, we'll head over to the Features room. I'll just click File > New in the title bar pull-down menu and reload our footage again. I still won't change any of the defaults in that first window; I'll just click OK. Now let's go to the Features room. This gives us a whole new set of controls on the left side of the screen, and all of those controls have to do with blips. So what's a blip? Let's take a look. I'll click the Blips All Frames button. SynthEyes immediately starts working, and a few seconds later we can see a bunch of red and blue dots. If we scrub the playhead, it looks like those red and blue dots are tracking features in our photography. But those dots are not trackers. They're just features that SynthEyes identified to later convert to actual trackers. Those red and blue dots are the blips. So why not convert all the blips to trackers? Well, let's take a closer look. Not all of these blips are of the same tracking quality, and those bad blips would throw off our 3D camera solve and the solved locations of our 3D point cloud. So SynthEyes will choose the best blips and convert only those to trackers. We'll do that with the Peel All button. Pay close attention when I click it, because the results may be hard to spot. I'll undo that and Peel All one more time, just to help you see the difference.
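As an aside, a rough mental model for a blip is: a candidate feature point detected independently on each frame, with weak, low-contrast candidates filtered out before any of them are promoted to trackers. The sketch below illustrates that idea with OpenCV's corner detector; it is only an analogy, since SynthEyes' actual blip detector is not documented here, and the clip name is hypothetical.

import cv2

cap = cv2.VideoCapture("drone_shot.mov")  # hypothetical clip name
blips_per_frame = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Up to 500 candidate points per frame; the quality threshold discards
    # weak candidates, much like SynthEyes discarding "bad blips".
    pts = cv2.goodFeaturesToTrack(gray, maxCorners=500,
                                  qualityLevel=0.01, minDistance=10)
    blips_per_frame.append([] if pts is None else pts.reshape(-1, 2))
cap.release()
print(f"found candidate features on {len(blips_per_frame)} frames")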
I could make the converted blip trackers easier to see if I were to click the Clear All Blips button, or I could just go to the Trackers room, where the blips aren't even displayed. Even then, the converted blip trackers are a little hard to see, so let's turn off Show Image in the floating display menu. Now you can really see the converted blip trackers, and if you scrub the playhead, you can even feel like you're moving in 3D space through a cloud of points. This is how 3D camera tracking works: by analyzing the parallax between tracked 2D features in the foreground, mid-ground, and background of motion picture photography. But right now all we have are 2D trackers. Let's turn Show Image back on, head over to the Solver room, and tell SynthEyes to solve our camera and the point cloud from all those trackers. This is where the real magic happens, where we solve our 3D camera track. Without changing any other controls, let's hit the Go button. And there's your 3D camera track, and your point cloud based on your automatically tracked 2D features. This is pretty much the same result we got in the Summary room by just hitting the Auto button, and this is where a lot of people just call the shot done and walk away. But in a real-world 3D tracking pipeline, we are definitely not done.

Let's start by looking in the lower right of our screen at the big red hpix number. Hpix stands for horizontal pixels, and it measures, in horizontal pixels, the average error between your 2D trackers and the solved points in your 3D point cloud. Let's go to the Trackers room and take a look at exactly what that means. Once again, I'll turn off Show Image, and in the View floating menu at the bottom of the screen, I'll turn on Follow 3D. Now we can select any old tracker and compare its 2D position to its 3D point cloud position. These 2D trackers do quite a bit of floating around against their solved 3D positions, and that is the error we're talking about. If we take the average of those errors across all frames for all the trackers, that basically gives us the hpix error (there's a short code sketch of that averaging just after this part of the transcript). In real production, we want that error below one if possible. Let's turn Show Image back on and go back to the Solver room.

So maybe your compositing or 3D animation team wants a denser point cloud; this point cloud is pretty low density, and it isn't going to be enough. So let's start over and fix those two things: the hpix error and the point cloud density. Let's start over with File > New in the menu bar, or we could just use the File > New hotkey, Command-N, or Ctrl-N if you're on a PC. Once again, we'll select our drone footage and just click OK. But this time, in the Features room, we'll click the Advanced button. This brings up the advanced feature control window. Let's take a closer look, starting with the camera view type. We're going to focus on Small and Big, and we'll peek at Edges and Corners, but we're going to start with Small. If we increase the small blip size, we can see that we are basically blurring the image to a greater or lesser degree. The blurrier the image, the larger the features SynthEyes will look for. The same goes for the big blip size: the higher the big blip size, the blurrier the image, and the bigger the features that SynthEyes will blip. If we look at the balloon help for corner trackers, it says they're most useful for man-made sets like movie sets, so let's leave that off. But we can take that peek I promised at what SynthEyes sees when it identifies corners and edges.
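Before that peek, here is the sketch of the hpix averaging promised above: an average, over every tracker on every frame, of the distance between where the tracker was observed in 2D and where its solved 3D point reprojects through the solved camera. This is a minimal NumPy illustration using a plain Euclidean pixel distance and a simple mean; SynthEyes' exact weighting and horizontal-pixel units differ in detail, and the data layout here is assumed.

import numpy as np

def project(P, pts3d):
    """Project Nx3 world points through a 3x4 camera matrix into pixel coordinates."""
    homog = np.hstack([pts3d, np.ones((len(pts3d), 1))])
    img = (P @ homog.T).T
    return img[:, :2] / img[:, 2:3]

def average_error(cameras, pts3d, tracks2d):
    """cameras: one 3x4 matrix per frame; pts3d: Nx3 solved tracker positions;
    tracks2d: F x N x 2 observed 2D tracker positions (NaN where a tracker is off)."""
    errors = []
    for P, observed in zip(cameras, tracks2d):
        predicted = project(P, pts3d)
        valid = ~np.isnan(observed[:, 0])
        errors.append(np.linalg.norm(predicted[valid] - observed[valid], axis=1))
    return float(np.concatenate(errors).mean())

Driving that average down, by cleaning up trackers and refining the solve, is what the rest of the tutorial works toward.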
Switching the camera view type: Edges looks basically like a Sobel filter, and Corners just looks like pure Tron-level witchcraft. We'll come back to this at some point in the future. For now, let's go back to viewing the Small preview and set the small blip size to 4. Let's look at the big blip size and set that to 8. And because we want a denser point cloud, let's set the maximum tracker count to be approximately equal to our frame count, about 700. Now we'll set the camera view type back to Normal and click Blips All Frames. After the blips are computed, I'll click Peel All. Let's go back into the Trackers room, turn off Show Image, and scrub the playhead. Back in the Features room, adjusting the small and big blip sizes and the maximum tracker count is a great way to control what size features you want to track and just how dense you want your point cloud. But let's turn Show Image back on and do another pass, this time with larger small and big blip sizes: set the small blip size to 12 and the big blip size to 16, and then click Blips All Frames again. Once SynthEyes computes those larger blips, we'll click Peel All again and get even more trackers. We can see this even more easily if we once again turn off Show Image in the viewport. Back in the Trackers room, where things are easier to see, these trackers are larger, reflecting the larger small and big blip sizes. Okay, let's turn Show Image back on in the viewport, hit that Go button again, and take a look at our 3D camera track. Our solve will take a bit longer because we are now solving 1400 trackers. Now our point cloud is much denser, but our error is still pretty high, and our world seems a little off-kilter. So let's fix all that, starting with the misaligned world.

Let's go to the 3D room. Here we see a button labeled Whole; let's click that. This allows us to transform our whole scene, point cloud and camera. In the top left of our user interface there are buttons for Move, Rotate, and Scale. Let's rotate. Now we can click and drag to rotate our whole scene; wherever you click will be the center of rotation. Let's move our scene so it's centered in our 3D world and the ground lies near the ground plane.

Now let's deal with the high error. As I said earlier, we really want this number to be below one hpix. In the menu bar, let's go to Track > Cleanup Trackers. That brings up the cleanup trackers window. Let's set the bad frames option to Disable. This turns off the trackers on their bad frames instead of just clearing their tracking history, which makes the scene look a little cleaner in the graph editor, but we'll look at all of that another time. We can leave everything else in its default state, and do note the checkbox at the bottom of the window: when we do this cleanup, we're going to clear all of our blips. Finally, we'll click the Fix button. It looks like SynthEyes just deleted a bunch of trackers but didn't reduce the hpix error of the solve. That's because we need to do one more thing: refine the solve. We need to change the solution technique to Refine and click Go one more time. Now our error really dropped, but we can do better.

[Music] Every lens has lens distortion, and that includes the shot we're working on right here. In the Solver room is a checkbox for Calc Distortion. Let's check that and refine again. Our error drops significantly, but we can do even better. There's a More button next to Calc Distortion, and that brings up the advanced lens controls window.
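As background before digging into those controls: radial lens distortion is commonly modeled as a polynomial in the distance from the optical center. Here is a minimal, generic sketch of a fourth-order radial model in that spirit; it is only an illustration with arbitrary coefficient values, not SynthEyes' exact formulation.

import numpy as np

def radial_distort(pts, k2, k4):
    """pts: Nx2 coordinates normalized so the optical center is at (0, 0).
    Applies r_d = r * (1 + k2*r^2 + k4*r^4), a generic fourth-order radial model."""
    r2 = np.sum(pts ** 2, axis=1, keepdims=True)
    return pts * (1.0 + k2 * r2 + k4 * r2 ** 2)

# Points near the center barely move; points near the edges of frame move the most.
samples = np.array([[0.0, 0.0], [0.3, 0.2], [0.8, 0.6]])
print(radial_distort(samples, k2=-0.08, k4=0.01))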
The old lens distortion model in SynthEyes is called the Classic model, but SynthEyes recently had its entire lens distortion system updated to the most modern and advanced lens distortion methods available. So we're going to switch our distortion model to Standard Radial, 4th Order, which is a bit better than Classic for modeling spherical lenses. Now we have a bunch of mysteriously named lens distortion coefficients, but you'll typically use them starting at the top and working your way down. So let's switch C2 to Calculate and then click the Go button to refine our solve. That seems to improve our hpix a bit. Now let's set U2 and V2 to Calculate and refine our solve again. Another very small improvement. Finally, let's do the rest, C4, U4, and V4, and refine one more time. A huge improvement. You really don't want to just turn everything on all at once, because each refine builds on the previous solution, and turning everything on at once can often give worse results (there's a rough pseudocode recap of this staged approach at the end of the captions). And when using these coefficients, if you don't really see an improvement in the hpix, you should just leave them off. But there you go: now our hpix is well below 1.0.

Oh, and while we're at it, let's delete any trackers that are floating around way off in the distance. Just lasso-select them and hit the Delete key. Then the A key on your keyboard will recenter your viewports, and it's always a good idea to refine the solve one more time to make sure deleting those trackers didn't blow up the whole scene. Now our scene is ready to be shipped.

We still have a few things to discuss, like building reference geometry for whoever is going to use this track, and exporting our track to different 3D animation or 2D compositing applications. But we just covered a lot of ground, and you now have some tools to get you started achieving great 3D tracks in SynthEyes. Because SynthEyes supports exporting to so many applications, delivering your tracks deserves its own tutorial. That's on its way, so stay tuned. Thanks for watching this SynthEyes intro tutorial. Subscribe to this channel to be the first to see future videos, and to learn more about SynthEyes, go to the Boris FX website at borisfx.com.
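And here is that recap of the staged coefficient strategy, in rough Python pseudocode. The refine_solve callable is a placeholder standing in for switching coefficients to Calculate and clicking Go; SynthEyes does not expose this exact API, and the stage order simply mirrors the tutorial's top-down advice.

def staged_lens_refinement(refine_solve):
    """refine_solve(coeffs) -> hpix error; a stand-in for a solver run with the
    given distortion coefficients enabled. Each stage is kept only if it
    actually lowers the error, since every refine builds on the last solution."""
    stages = [["C2"], ["U2", "V2"], ["C4", "U4", "V4"]]  # top-down, as in the video
    enabled = []
    best = refine_solve(enabled)
    for stage in stages:
        trial = enabled + stage
        hpix = refine_solve(trial)
        if hpix < best:
            enabled, best = trial, hpix   # keep the stage
        # otherwise leave these coefficients off and move on
    return enabled, best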
Info
Channel: Boris FX
Views: 17,232
Keywords: boris fx, boris fx sapphire, boris fx mocha pro, boris fx mocha, boris fx continuum, boris fx silhouette, boris fx optics, boris fx particle illusion, how to use photoshop, how to photoshop, how to use after effects, how to use adobe premiere pro, adobe photoshop, adobe after effects, adobe premiere pro, avid media composer, lens flare plugin after effects, planar tracking, rotoscoping, vfx paint, best after effects plugins, best premiere plugins, davinci resolve plugins
Id: IIF1Htbog_o
Length: 16min 4sec (964 seconds)
Published: Tue Feb 06 2024