Can it automatically fix the messy hands you generate? Depth Hand Refiner! #soylab

Video Statistics and Information

Captions
Hello, I'm Choi Don-hyeon of Stable Diffusion Korea. Nice to meet you. Today I'm going to talk about a ControlNet update called the Hand Refiner. It starts with someone I'm very grateful to, who appears often in our content: Sivan Okcuoglu posted about this. ControlNet was updated, adding a function that automatically sends over the detected map, and that's how I found out ControlNet had been updated once more. Looking into it, I found depth_hand_refiner. As the name says, this refiner automatically fixes hands. There was a depth-related extension description before, and this is very similar, but what's been added here is that the hands are shaped automatically by a model trained on hand data. After finding this, I tested it at soy.lab and posted the results on Stable Diffusion Korea; today's look at the Depth Hand Refiner is based on that post.

To check the news on this, you can click the link here and you'll reach this page right away. Let me summarize its contents. After updating to this version, you can see that depth_hand_refiner has already been added to the preprocessor list. This preprocessor takes a hand like this one and produces a depth map in the shape of the hand we actually want to express. If we had wanted to fix this before, we would have had to use tools like the 3D OpenPose extension: it has a hand-editing section, and when you press Generate, each part comes out as a depth map, which we could then use to fix the hand shape. The point of depth_hand_refiner is that, even without posing all of that by hand, the map is produced automatically, and that's what makes it so valuable.
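As described above, the depth_hand_refiner preprocessor turns an image of a broken hand into a depth map of a corrected hand. A minimal sketch of requesting such a preview programmatically, assuming the sd-webui-controlnet extension's `/controlnet/detect` endpoint and its field names (verify both against your installed version):

```python
# Sketch: build the request body for previewing the depth_hand_refiner map
# via the sd-webui-controlnet extension's /controlnet/detect endpoint.
# Endpoint and field names are assumptions based on the extension's API;
# check your installed version.

def hand_depth_preview_payload(image_b64: str, res: int = 512) -> dict:
    """Request body asking the preprocessor to render a corrected hand-depth map."""
    return {
        "controlnet_module": "depth_hand_refiner",  # the new preprocessor
        "controlnet_input_images": [image_b64],     # base64-encoded source image
        "controlnet_processor_res": res,            # preprocessor resolution
    }

payload = hand_depth_preview_payload("<base64-encoded image>")
# POST to http://127.0.0.1:7860/controlnet/detect, e.g. requests.post(url, json=payload)
```

This does the same thing as pressing Allow Preview in the UI, just over the API.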
When I first saw this, I thought all my hand problems were over, but once I opened the lid, there was a bit more that needed explaining. The important thing is that some things don't work. First, if the hands cross each other, only one of them is recognized. Second, in actual use, reproducing hands showed a very high success rate on realistic images, but on 2D images I confirmed it doesn't work very well. Of course, if you have the time and the graphics card, generations don't take that long, so in a sense it's still workable; I think it's best to go in understanding this.

Now, let's move on to the first method and look at the img2img workflow. What we'll do is take two images, one realistic and one toon, and fix them with the hand refiner. Yes, now I'm going to fix an image: I'll load an image with a finger error here, and set the resolution to match this image's resolution. I'll raise the Denoising Strength a bit; basically you can use 0.75 or higher. For the seed, I'll enter the seed value I tested with before. Enable ControlNet, press Depth, and change the preprocessor here to depth_hand_refiner. Now the preparation is done. In this state you mask the hand: put the image into Inpaint here and simply mask the hand area, covering the erroneous parts like this. I've already made a mask for this erroneous image, so I'll apply it like this and generate. Zooming in, you can see it's been completed. It's neat, right? Comparing like this: before, a finger was missing here, like this.
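The inpaint workflow above (mask the hand, Denoising Strength of 0.75 or higher, one ControlNet unit running depth_hand_refiner) can be sketched as an `/sdapi/v1/img2img` request body. The field names follow the A1111 WebUI and ControlNet APIs as I understand them, and the ControlNet model name is a placeholder for whatever hand-depth model your installation lists:

```python
# Sketch: img2img-inpaint payload with a depth_hand_refiner ControlNet unit.
# Parameter names are assumptions based on the A1111 WebUI API; the model
# name below is a placeholder -- use the name shown in your own dropdown.

def hand_fix_payload(image_b64: str, mask_b64: str, seed: int) -> dict:
    return {
        "init_images": [image_b64],   # image with the broken hand
        "mask": mask_b64,             # white over the erroneous hand area
        "denoising_strength": 0.75,   # 0.75 or higher worked in testing
        "seed": seed,                 # a previously tested seed, or -1 for random
        "alwayson_scripts": {
            "controlnet": {
                "args": [{
                    "enabled": True,
                    "module": "depth_hand_refiner",              # preprocessor
                    "model": "control_sd15_inpaint_depth_hand",  # placeholder name
                    "weight": 1.0,
                }]
            }
        },
    }

payload = hand_fix_payload("<image b64>", "<mask b64>", seed=1234567890)
```

POSTing this to `/sdapi/v1/img2img` mirrors the UI steps in this section.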
You can see those parts are resolved automatically. In the process I just showed, I didn't change the settings much: I changed the Denoising Strength, enabled ControlNet, and generated, and the hand came out correctly. It's really amazing.

Next, I'll change the image like this, and this time I'm also going to use inpaint_global_harmonious. This is actually a 2D image, where hands don't work as well, but I'll show you how it goes. The image here is on the site we visited earlier: you can drag and drop it in, and apply the prompt that comes with it as you work. Now, to generate, I put the original image into Inpaint; the settings are the same. Actually, the important thing here comes later, so first let me show you the result with Depth alone. Activate ControlNet: press Enable, change it to Depth, and select the refiner. If you run this with Allow Preview, the hand shape is picked out automatically, and you can see it's very well formed; it's quite interesting. The model deduced this from the picture alone. It would be fine to generate right in this state, but as I explained earlier, the hand should be masked, so I'll mask it once and proceed. To show you this quickly, I'll apply my saved settings right away. What's changed a little is that, compared to before, I've raised the Denoising Strength to 1. And if you look here, I'm going to use both Depth and inpaint_global_harmonious, but for now I've turned the latter off. I'll generate in this state. If you look now, there's nothing in front, and this is how the picture came out: of the finger errors from earlier, you can see four of them have been resolved.
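The two-unit setup used in this section (depth_hand_refiner plus a second unit for inpaint_global_harmonious, initially disabled) could be expressed like this; again, the field and model names are assumptions to verify against your installation:

```python
# Sketch: two ControlNet units -- hand depth plus inpaint_global_harmonious.
# Field names are assumptions based on the A1111 ControlNet API; model names
# are placeholders. The second unit starts disabled, matching the demo.

def two_unit_controlnet(harmonious_enabled: bool = False) -> dict:
    return {
        "controlnet": {
            "args": [
                {
                    "enabled": True,
                    "module": "depth_hand_refiner",
                    "model": "control_sd15_inpaint_depth_hand",  # placeholder
                },
                {
                    "enabled": harmonious_enabled,  # off for the first run
                    "module": "inpaint_global_harmonious",
                    "model": "control_v11p_sd15_inpaint",  # placeholder
                },
            ]
        }
    }
```

Passing this as `alwayson_scripts` lets you toggle the second unit with one flag instead of re-clicking the UI.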
But there's a problem: you can see the picture around the hand is broken. The original wasn't like that; it had a building here, and now the building is gone. This is exactly the situation I mentioned earlier: activate the Inpaint unit here and select the inpaint_global_harmonious model. When it generates now, the surroundings that were broken are recreated as close to the original as possible; you can see it fills in the background like this. Good. So for 2D images like this, where the background is very complicated, using the model I just showed you solves it easily. It's very simple.

But there's a process I didn't show you: finding an appropriate seed value. I raised the batch count here to 9, set the seed to -1, and kept running it until something good came out; I ran it about ten times, and the one I like most is the one I'm showing you. In the end, a well-formed hand doesn't come out in one shot: you change the seed value and search for the most suitable hand shape. That's how to think about it. Used this way, you can change the hand shape very quickly while keeping the background quality as intact as possible.

All right. The second method: the hand refiner using ADetailer. Let's check it out. ADetailer automates exactly the Inpaint work we did earlier: the masking I did by hand, it does automatically. It's very similar to Segment Anything. A detection model checks whether there's a hand in the image, boxes that region, and delivers a mask saying "here's a hand"; based on that mask, it runs diffusion. That's the function ADetailer provides, and I'm going to use it. What you're seeing now is how it selects these parts automatically, and I'll show you how to modify them. But, as I keep telling you, this is 2D.
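The seed search described above (random seeds, batch count 9, repeat until a good hand appears) amounts to a small payload tweak; `seed` and `n_iter` are, as I understand it, the A1111 API's names for the seed and the batch count:

```python
# Sketch: turn any generation payload into a seed sweep -- random seed,
# several candidates per run -- then pick the best hand by eye.

def seed_sweep(payload: dict, candidates: int = 9) -> dict:
    swept = dict(payload)          # shallow copy; don't mutate the original
    swept["seed"] = -1             # -1 = new random seed for each image
    swept["n_iter"] = candidates   # batch count: candidates per run
    return swept

base = {"denoising_strength": 1.0}
print(seed_sweep(base))
```

Run the sweep a few times and reuse the seed of the winner for further refinement.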
For 2D, really, as with the success rate I showed earlier, you need to run a lot of seeds and keep the good results from among them; don't forget that. So, in conclusion, does it work or not? It works well.

Before explaining this, I'll go over some ADetailer options, since this may be the first time you've seen ADetailer. To learn more about it, check out its Git page, or refer to the Fast Campus lecture produced by soy.lab. The important settings: first, check the Skip img2img option here, and select the hand model here. You also need to uncheck one of the options that's checked by default, and set the size here differently. The rest is the ControlNet model: I'll show you how to select the depth hand refiner model there. Now we're ready.

What I'm going to do this time is put this image into img2img and use ADetailer; I won't use an inpaint mask. This is the original image; you can find it on the site we saw at the beginning and grab it via drag and drop. The image contains a prompt, and I'll use that too. First I'll load my settings. Unlike before, I applied a prompt like this; I didn't write it myself, I brought over the prompt that was used to create the image. Going down a little: I'll set the sampler to Euler, change the steps to 30, and set the resolution to 512×768 to fit the image. The rest can stay as they are; a seed value other than this one could also solve it. Coming down here, ControlNet is enabled; that's a small error from loading my settings, so I'll turn it off. I enabled ADetailer like this. You must check Skip img2img: with it checked, the plain img2img pass on top doesn't run. In this state, I'll also adjust the inpainting settings.
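Collected as API arguments, the ADetailer setup described here (skip the plain img2img pass, hand-detection model, tweaked inpaint settings, and depth_hand_refiner wired in through ADetailer's own ControlNet slot) might look like the sketch below. The `ad_*` field names follow the ADetailer extension's API as I understand it, and the model names are placeholders; verify both against your installation:

```python
# Sketch: ADetailer args for automatic hand masking + depth_hand_refiner.
# Field names are assumptions from the ADetailer extension's API; model
# names are placeholders -- use the ones your UI actually lists.

def adetailer_hand_args() -> dict:
    return {
        "ADetailer": {
            "args": [
                True,   # enable ADetailer
                True,   # the "Skip img2img" checkbox
                {
                    "ad_model": "hand_yolov8n.pt",    # hand-detection model
                    "ad_mask_blur": 20,               # raised from the default 4
                    "ad_inpaint_only_masked": False,  # unchecked, per the demo
                    "ad_inpaint_width": 768,
                    "ad_inpaint_height": 768,
                    "ad_denoising_strength": 0.4,     # 0.4-0.7 range
                    "ad_controlnet_module": "depth_hand_refiner",
                    "ad_controlnet_model": "control_sd15_inpaint_depth_hand",  # placeholder
                },
            ]
        }
    }
```

Merged into an img2img payload's `alwayson_scripts`, this reproduces the UI setup in this section.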
The mask blur was originally set to 4, but I set it to 20. I turned off the Inpaint only masked option, which is on by default, and changed this size to 768; the denoising is set somewhere between 0.4 and 0.7. That's the basic preparation. This slot is originally set to None; change it to depth, and there's a place to select the depth_hand_refiner model. With that, all the settings are complete, so I'll press Generate right away. All right, you can see the hand has been fixed. Simple, isn't it?

Third, I'll briefly explain how to apply the hand refiner in txt2img. As you can see, I simply generated an image, and it came out with an error like this. The workflow is: drag and drop the image to create the hand-shape depth map, then generate again with that map applied, and the results come out. The important thing here is the Starting Control Step. It's usually set to 0, but if the control starts from step 0 you get a completely different picture. So we apply it from 0.5 instead: the sampling steps proceed as usual, and from the midpoint the hand-depth control intervenes to form the shape. Let me try it. The prompt is very simple, and I'll generate right here. The image comes out like this, with the hands quite broken. In this state: enable ControlNet, select Depth, change the preprocessor to depth_hand_refiner, turn on Allow Preview, and put the erroneous image in. In the preview, you can see the fingers are now fine. Instead of applying it right away, lower the Starting Control Step to 0.5, then generate. Yes, it's almost done, and the hand shape is a little better on this side too. From this state, you can raise the picture with Hires.fix or send the result to img2img.
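The txt2img variant above hinges on one parameter: the ControlNet unit's Starting Control Step, raised from 0 to 0.5 so the hand-depth map only intervenes from the middle of sampling. In the API this is, as I understand it, the unit's `guidance_start` field; a sketch with a placeholder model name:

```python
# Sketch: txt2img ControlNet unit that starts intervening at 50% of the
# sampling steps (Starting Control Step = 0.5). Field names are assumptions
# based on the A1111 ControlNet API; the model name is a placeholder.

def late_start_hand_unit(start: float = 0.5) -> dict:
    return {
        "enabled": True,
        "module": "depth_hand_refiner",
        "model": "control_sd15_inpaint_depth_hand",  # placeholder
        "guidance_start": start,  # starting at 0 changes the whole picture;
        "guidance_end": 1.0,      # 0.5 preserves the composition
    }

unit = late_start_hand_unit()
```

Drop this unit into a txt2img payload's `alwayson_scripts` to keep the composition while still correcting the hand.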
In img2img, you can inpaint just the hand area to paint it fresh. If you fix it like that, you get this result: upscale the areas where the hands and nails were wrong and attention wasn't properly applied using Hires.fix, then inpaint the hands in img2img, and proceed like this. This part was shown earlier.

Lastly, I'll briefly explain one more thing and finish. Some of you may run into this error; Park Sang-jun of Stable Diffusion Korea shared it. Normally, after you update ControlNet and restart, the model is downloaded automatically and things should just work, but for some people it doesn't, or it won't run at all. In that case, click this link and download the models here. After you get the models, create this folder here: if the initial download didn't complete correctly, create it yourself and simply copy the downloaded files into it. That solves it.

That's all I have for you today: we've looked at the hand refiner functions. Don't forget to subscribe and like. I'll be back with more good content next time. Thank you.
Info
Channel: soy_lab
Views: 4,341
Keywords: #stablediffusion, #aiart, #handrefine, #controlnet, #animal, #soylab, #stablediffusionkorea, #A1111, #COMFYUI, #a1111, #comfyui, #hand, #refine, #ai, #aihand
Id: 0Rg3STOv3lU
Length: 14min 14sec (854 seconds)
Published: Mon Jan 08 2024