Arnold GPU Beta Walk-through and First Impressions | Greyscalegorilla

Video Statistics and Information

Reddit Comments

Chad from Greyscalegorilla.com covers his first impressions of the beta. He has been a beta tester for some time, so I suppose it's not really his first impression, but it's still a good video to watch. It's not fully functional yet and has its ups and downs, but for a first beta it is pretty decent to use. Don't expect the Octane killer that people seem to be hoping for just yet.

👍︎︎ 2 👤︎︎ u/thesneakyD 📅︎︎ Mar 21 2019 🗫︎ replies
Captions
Hey, what's going on, everybody? This is Chad from Greyscalegorilla, and in today's video I'm going to give you my first impressions of Arnold 5.3 GPU, the new beta that was just announced. Let's jump in.

Okay, before we start, a little disclaimer: this is definitely beta software. It's not necessarily something you want to use in production yet; it's very beta, and there's still a lot of work to be done. I just felt compelled to say that, and to let you know these are my first impressions. This is going to change over time. There are some things they're working on that I'm not super excited about, and there are some things that show a lot of promise. So enjoy this video, always take it with a grain of salt, and hit me in the comments if you have any questions; I'm happy to answer them. Let's dive in.

So I've got a pretty simple scene here. I'm going to make sure that I'm running in CPU mode, which I should be. Let's turn off adaptive, good, and... 8, okay, that's fine. Yep, okay, cool. So we're in CPU mode, and I'm just going to turn on the IPR and let it go. Now, full disclosure, I am running a Threadripper 2990WX, so I've got 32 cores. I did turn off four of them so that the capture wouldn't have any issues. Let's turn on the buckets so you can see the buckets going; we're going to enable bucket corners in the IPR just so you can see what's going on here, and we're going to make this a little bit bigger too, so we can see it a little bit better. So it's going at a pretty good clip. Again, most people probably aren't going to have 32 cores; I imagine somewhere between 8 and 16, something like that, but we've got to start somewhere.

I've got to show you the switch, because that is definitely the thing that everybody wants to see. So here's CPU mode going. All right, I'm going to open up my render settings, jump into my System tab, and now, while it's going, in Device Selection I'm going to change my render device from CPU to GPU, and
you're going to see it has to send the scene over to the GPU, which takes a second. In fact, this is one of my pet peeves; I really hope they fix this and make it go faster, because ten seconds is sort of a drag. All right, so there it started. I'll turn off the corners of my buckets, since there are no buckets in here.

Okay, so you can immediately see it starts to get pretty clean, and if I jump into the Main tab you'll notice that a lot of our settings are now grayed out. That's because all of the sampling in GPU Arnold, in this beta, is done through the camera (AA) samples. So that's good and bad... well, actually, I take that back: that's mostly bad, for me anyway. I don't like the fact that all of the samples are brute-force shot through the camera AA, because what that means is you can't clean up specific aspects of your image based on indirect spec or subsurface, things like that; you just have to send more samples into the entire scene. That's not great, but you can see it's pretty quick.

Let's go ahead and adjust this, because 6 is not going to be enough; you can see some noise here. So let's jump this up to maybe 8, and then I'm also going to turn on adaptive sampling. Adaptive sampling is what they're saying is the way to control your GPU samples; it's basically like a unified sampler. If I turn this on, this becomes our min and this becomes our max, and the adaptive threshold is what essentially tells Arnold whether to use the min or the max. I'm not going to get into the adaptive threshold here, but if you're familiar with unified sampling, it's very similar to that. All right, so 10 is not going to be enough, so let's go to like 16 and 8... actually, we probably don't need to go that high; we might be able to go like 6 and 12, which is really pretty low, but we'll go ahead and let this cook and let it finish up. And of course you do have the ability to use the render region, which is great. What I often do is I'll region an area
that I think is going to be noisy, so I can see it converge and know if my min and max are going to be enough. In this case it is; it looks fine, so I'll go ahead and turn off the render region, let that cook through, and then we're going to compare it to the CPU.

Okay, so it finished in about 24 seconds and looks adequately clean. Now I'm going to pop over to CPU, and we're going to render that out at... let's see, we'll just leave it at the settings it was at, and I will turn off adaptive. So that was like 20... I think it was like 22 or 23 seconds, so we'll just see where the CPU nets out. All right, so it's going to finish in about a minute, so you can see it's well over twice the speed of the CPU version. But there are some buts here; we're going to talk about those in a little bit.

But why is switching from CPU to GPU important? Why is that a thing people should care about? Mainly because, if you have a machine with a beefy GPU, you can do all your look dev and shader work on your workstation, but then, when it's time to render, you can send it to a cheaper CPU-based cloud farm. Or, if you have your own CPU farm, it's going to be much more cost-effective right now to send it to a CPU farm than to a GPU farm. So that is a huge advantage, in my opinion, for studios and those of you out there using cloud-based rendering.

Okay, so that is switching. Let's see what I've got next; I've got a bunch of scenes queued up, and we're just going to try to bang through these. So let's talk about one of my other favorite things, which is the subsurface. There's a new random walk implementation of subsurface scattering, and it is pretty darn cool. I've got this head model here; I believe this is all set to go. Yeah, we're going to go ahead and fire off the IPR again. I really hope they get this sped up, because, after being spoiled by the CPU, waiting for this to turn on has sort of become a pet peeve of mine. All right, so you can
see we've got nice random walk subsurface happening in the IPR here. Let me give myself a little bit more room; we don't need all that space. All right, so I'm going to move this light around, and you're going to see that head react. You can see the subsurface happening in the nose, and then, if I bring it over to the ear, and maybe bring it down a little bit and put it right behind the ear, you're going to see it clean up. I'll do a region around that, but I will say this: it does take a while to converge. Random walk subsurface is not a cheap thing to render; it's computationally heavy, and it's going to take some time to get it to not be noisy. But let's go ahead and turn on some of these other lights, because it looks kind of cool, and you can see it's pretty darn interactive. In fact, I think I have, in my system settings, my initial sampling set to negative one, so I get a cleaner result, but we could knock this down to, let's say, negative three, which is going to give us even faster feedback on that lighting.

So let's take a look at what we've got next. In this next scene I'm going to show you a bunch of these heads all put together into one scene, and I wanted to show you just how powerful the new OptiX denoiser can be when working on lighting and texturing in the IPR. All right, so here we have a couple of things going on: we've got a bunch of really high-res head scans being cloned via a cloner (the new R20 multi-instance cloner), we've got some depth of field, we've got some subsurface, and it's pretty darn good, right? Let's make this a little bit bigger, though; it's hard to see. Let's go to like 65%, yeah, something like that. All right, so let's zero in on this dude right here and let that clean up... actually, I think it's this guy that's in focus. Let's find him. There we go. All right, so we're going to let that cook...
actually, that's not our focus dude. Who's our focus guy? It might be this guy down in the corner. Yeah, this guy's close enough. All right, so we've got some depth of field, we've got a lot of stuff going on, and you can see it is pretty darn noisy. This would take significantly more samples to clean up these indirect subsurface samples right in here, but I do like using the new OptiX denoiser, so I'm going to flip that on. We're going to look at the OptiX denoising, and you see it cleaned all that up. It did get rid of some of the really subtle surface detail in his head, but you're going to see that for a lot of things, certainly this type of scene, it's completely fine, and it starts to converge rather quickly and gives us a pretty clean representation of what we're going to be rendering.

So yeah, this is not too bad. I wouldn't use this to render out image sequences, though; it's not going to be super great for that. In those cases I would use their Noice denoiser, or maybe a third-party denoiser. But yeah, the new OptiX is working pretty well, especially with this head; I don't know why, but it just seems to work pretty well, and it's really interactive. So the new random walk implementation is rock solid. I can see the ear looks really great; in fact, let me turn off some of these other lights. I think if I just turn off this one and that one, you can really see it in that ear. It's really pretty.

Okay, so let's keep moving; I'm going to move on to the next scene. Actually, before I do that, let's find a cool shot here. These heads are just so weird. I wonder if this guy had any idea, when he was hired to get his head scanned, that he was going to end up in so many weird scenarios. Pretty wild. Okay, all right, let's jump into the next one, which is going to be some
atmospherics. Light atmospherics are working in the Arnold GPU beta. There it goes; around the 10-second mark is usually where the startup nets out. All right, so you can see we've got volumetric lights, which is really pretty rad, and of course they're fully interactive. Let's drop it down here if we want to do this cool backlighting on this mech; we want to look into that light a little bit. Maybe find our camera... actually, I'm already in the camera. Let's jump out of this camera and push that right above his head, and let's move this down, like 50%. There we go, and it's converging pretty quickly. There are definitely some issues on our mech, which is sort of the next thing I wanted to talk about. Let's move this into the shot. What's great, too, is that now that we can see these area lights, it looks cooler when you have these visible light fixtures in your shot. Let's take a look at the samples on this. Again, it's just a first impression, so I'm not going to dive too deep into optimizing for Arnold GPU, but that actually cleaned up pretty well; those settings did a pretty good job.

Okay, so I want to talk for a minute about some things that I don't like, and for that I'm going to step into a pretty simple scene. Let's make this a little bit smaller so we can see better. For this scene I've got these coffee pots; I absolutely love this model, and I just want to say a shout-out to Daniel Minh, who provided it for us. It's a pretty typical scene: a product render on a white cyc, you've got glass, you've got all sorts of stuff. In fact, let's jump into the render settings and switch into CPU mode, and I'm just going to quickly set up some render settings here. We'll knock these down a little bit and see where it ends up. Turn off adaptive, yep, okay, all good. Yep, yep, turn on the buckets, good. Let's hit go, and you're going to see it goes pretty darn fast on a Threadripper. All right, and
it's got a little bit of noise here; in fact, let's clean that up right now. We're going to jump in here. I do believe that's happening in my diffuse; let's just region that and make sure that's the case. Yeah, okay, that's in the diffuse. All right, cool, so we cleaned that up. Let's jump back out, let this converge, and give ourselves a little bit more room. A typical product shot, right? Clean render, got some glass, got some metal, plastic, GI, big white cyc, a very typical use case. So I'll bring this up to maybe 65%, and we're just going to isolate this guy right here, maybe bring it up even a little higher. I just want you guys to pay attention to the indirect samples on this white coffee maker and the glass, because right now we're going to switch over into GPU mode. So I'm going to jump into my System tab and switch it to GPU. I did turn off the render just to make sure there weren't going to be any issues; yep, that's all good. I am going to tweak these settings: a min of like 8, I will turn on adaptive, 16 will be my max, 0.01 is my threshold. All good.

Okay, so let's fire that off, and what you're going to see, once this picks up, is how much the Arnold GPU beta struggles with indirect samples, both indirect spec and indirect diffuse, and transmission as well, to be honest; it just seems to have a real problem. I'm not sure if that actually kicked in or not... let's see. It's definitely taking a lot longer to kick in, and I'm not entirely sure why. Well, let's hope it actually finishes here. Again, this is beta, so all of this stuff will hopefully be improved. Okay, there it goes. For some reason, I don't know why, that scene decided to take 40 seconds to load, which is a little bit concerning. Let me make sure that... yeah, okay, well, I don't have any answers for that. Okay, so that is taking now, what, like a minute, and it's still not quite clean. In fact, if we bring this up to 100% of our output size and we just region this
area around the edge of the coffee pot here, you're going to see it takes a really long time to get rid of this noise, and for us to really see it we're going to zoom in a little bit. All right, so it's going to struggle in here, and that's my biggest beef with Arnold GPU as it is today, in this beta: indirect samples are not very optimized. Everything is going through camera AA; you can do a little bit of control with the adaptive sampling, but it takes a really long time to clean up rough or glossy reflections or refractions, and even indirect GI. It's a real shame, because you can see here it's still pretty grainy. So let's render this out at full res, and then we're going to switch into CPU so that you can see that as well.

Okay, so it finished at just about two minutes, and I'm still seeing some grain here in this part of the coffee maker, and a little bit in the glass. A render like this in any other engine would have taken maybe 45 seconds to a minute on my machine (two 2080 Tis, by the way). So this is a bit concerning. I hope they can solve this and make it a bit faster, because I think the majority of people are going to run into this limitation and this sort of sampling issue.

Now, for comparison, let's jump into CPU mode and bring our camera AA down; I think I was doing it at like 4, something like that, and I'll turn off adaptive. Let's render this out in CPU mode and see where we net out. Okay, so I didn't tweak the settings much; I just kind of let it cook where it was, and you can see the CPU version took 1 minute 11 seconds and the GPU version took 1 minute 54 seconds, so the CPU version is actually quite a bit faster. Now, again, the Threadripper is a bit of a special case, but you can see, if I zoom in to maybe 400%, the CPU version, which is right here, is much cleaner than the GPU version in
terms of noise. So this GPU version probably should have cooked even longer than it did to converge on a cleaner image. Again, I hope they fix this sampling stuff and make it a little bit faster and cleaner, or at least give us controls to affect specific samples for specific areas, you know, indirect spec, things like that.

But anyway, overall it's a super strong beta. I'm excited to have Arnold GPU finally out; I've been waiting for it for a really long time, and I'm excited to finally have everybody play with it and see what everybody starts making. As they update it, be sure to check back here; I'm going to do more videos with first impressions and deeper dives into features. So I hope you enjoyed it, and I'll see you in the next one.

All right, so that was my first impression of Arnold GPU 5.3 beta. If you have any specific questions, as I said before, hit me in the comments below; I'm happy to answer them. I think overall it's got a lot of potential, and I'm definitely excited about being able to switch from GPU to CPU and send it to a CPU farm; I think that's fantastic. I do think the sampling needs some work; it's still not quite as fast as I'd like it to be, but it's a great first start, and I'm excited to see where they take it. So until next time, thanks for watching.
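For reference, the settings toggled in the video map onto parameters of the Arnold options node, so a scene exported to a `.ass` file would carry them roughly like this. The exact parameter spellings are recalled from the Arnold 5.3 documentation and should be treated as assumptions, not verified against this beta:

```
# Sketch of an Arnold .ass options block matching the settings used above:
# camera (AA) min of 8, adaptive max of 16, 0.01 threshold, GPU device.
options
{
 AA_samples 8
 AA_samples_max 16
 enable_adaptive_sampling on
 AA_adaptive_threshold 0.01
 render_device "GPU"
}
```

The per-ray-type controls that get grayed out in GPU mode (diffuse, specular, transmission, SSS sample counts) live on this same options node in CPU mode, which is why losing them stings.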
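The adaptive ("unified") sampling scheme described in the video, a per-pixel minimum and maximum camera (AA) sample count gated by a noise threshold, can be sketched in a few lines. This is a toy illustration, not Arnold's actual implementation; the function name and the standard-error noise estimate are assumptions made for the example.

```python
import random

def render_pixel(sample_fn, aa_min=6, aa_max=16, threshold=0.015):
    """Toy sketch of an adaptive ("unified") sampler: every pixel gets at
    least aa_min samples; a pixel keeps receiving samples, up to aa_max,
    while its estimated noise stays above the adaptive threshold.
    sample_fn() stands in for tracing one camera (AA) sample."""
    samples = [sample_fn() for _ in range(aa_min)]
    while len(samples) < aa_max:
        mean = sum(samples) / len(samples)
        var = sum((s - mean) ** 2 for s in samples) / len(samples)
        stderr = (var / len(samples)) ** 0.5  # crude per-pixel noise estimate
        if stderr <= threshold:
            break  # converged early: smooth pixels stop at or near the min
        samples.append(sample_fn())
    return sum(samples) / len(samples), len(samples)

random.seed(7)  # deterministic demo
_, n_smooth = render_pixel(lambda: 0.5)             # flat region of the image
_, n_noisy = render_pixel(lambda: random.random())  # high-variance region
print(n_smooth, n_noisy)  # smooth pixel stops at the min, noisy runs to the max
```

This is why raising the threshold speeds up the render at the cost of residual grain: more pixels are declared converged at low sample counts.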
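The cost argument for the CPU/GPU switch (look dev on a local GPU, final frames on a cheaper CPU farm) comes down to simple arithmetic. Here is a small sketch; the prices and per-frame times are hypothetical and do not come from the video:

```python
def farm_cost(frames, minutes_per_frame, nodes, cost_per_node_hour):
    """Total spend and wall-clock hours for a sequence on a render farm,
    assuming perfect frame-level parallelism (frames render independently)."""
    node_hours = frames * minutes_per_frame / 60.0
    wall_hours = node_hours / nodes
    return node_hours * cost_per_node_hour, wall_hours

# Hypothetical numbers: GPU nodes render this scene twice as fast but cost
# four times as much per hour, so the CPU farm still wins on total spend.
cpu_cost, cpu_wall = farm_cost(300, 2.0, 20, 0.50)  # $0.50/hr CPU node
gpu_cost, gpu_wall = farm_cost(300, 1.0, 20, 2.00)  # $2.00/hr GPU node
print(cpu_cost, gpu_cost)  # 5.0 10.0
```

The break-even point is just (GPU price per hour / CPU price per hour) versus the speedup: whenever the farm's GPU hourly premium exceeds the GPU speedup on your scene, the CPU farm is the cheaper render target.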
Info
Channel: Greyscalegorilla
Views: 64,732
Keywords: Cinema 4D, c4d, tuts, tutorials, motion graphics, greyscalegorilla, autodesk arnold, arnold gpu, beta, c4dtoa, walkthrough, first impressions, render, nvidia, threadripper, cpu vs gpu
Id: Ua1jrkDxbdc
Length: 19min 7sec (1147 seconds)
Published: Wed Mar 20 2019