Stable Diffusion in Docker | Open Source AI Image Generation Self-Hosted

Video Statistics and Information

Captions
My favorite way to run Stable Diffusion, or really any software to be honest, is in a container, and I found a repo that sets everything up nicely for you for running Stable Diffusion in Docker. It also has Podman support, but I'll get into that a bit later. So I'm going to walk you through the setup for this repo. Again, it's awesome; it handles everything for you. What's really cool about this project is that it doesn't just have one UI to use, it's got a selection of a few you can take advantage of. You've probably heard of AUTOMATIC1111's UI for Stable Diffusion: very popular, very feature-complete. This repo has support for that. It also has support for Invoke, which is a somewhat more polished UI, and for ComfyUI, which is a node-based graphical builder for tuning Stable Diffusion and related models such as derivatives.

To get started, we're going to follow the setup process, and I'm going to go through it with you. Before we do that, though, I want to go through a couple of other sections. First, Usage: this tells you how to use the project, and the reason I bring it up is that, as mentioned before, it does support Podman, so check that out if you'd rather use it instead of Docker. It also shows how to use the project once it's set up and configured, for example how to add custom models. It's mostly the same as running the UIs by themselves, but there are some slight caveats to running in Docker, so before getting too deep I'd give it a once-over and see if there's anything you're interested in, so you at least know which features you can leverage.

The other section I want to show you before we set this up is the FAQ, because I actually ran into several of these issues while setting it up myself. For example, for this one right here I had to install the NVIDIA Container Toolkit. On a related note, this project does not support AMD or Mac yet; they are looking for help with that, but as of the time of recording those platforms aren't supported, at least not natively or easily. Anyway, back to the NVIDIA Container Toolkit: I did have to install it and restart the Docker service, as it says here, and it's a pretty common issue people run into. On Arch Linux, the package is literally just called nvidia-container-toolkit, so go ahead and install it on whatever platform you're on.

All right, the setup is straightforward. As it says at the top, make sure you have the latest versions of Docker and Docker Compose installed; going forward I'm going to assume you have those set up on your system. From there, the first step is just to clone the repo: git clone, and I'll grab the URL. A tip with the git CLI, if you don't know it: if you specify an argument after the remote URL, git clones into that directory on your local system, so as you can see it's cloning into sd-docker, which is awesome. Then we just cd into it. The next step is to run the compose command that downloads all the base files. Okay, that took a good bit of time to download, but now that it's done we can continue.
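As a rough recap of the steps just described, here's a minimal command sketch. It assumes the Arch Linux package name mentioned above, uses <repo-url> as a placeholder for the repo's clone URL (the URL isn't shown in the captions), treats sd-docker as an example target directory, and assumes the repo exposes the download step as a Docker Compose profile named download; the repo's README is the authority on the exact commands.

    # Install the NVIDIA Container Toolkit (Arch Linux shown; use your distro's
    # package manager or NVIDIA's install docs elsewhere), then restart Docker.
    sudo pacman -S nvidia-container-toolkit
    sudo systemctl restart docker

    # Clone the repo; the extra argument after the URL sets the target directory.
    git clone <repo-url> sd-docker
    cd sd-docker

    # Download the base files (assumed "download" compose profile).
    docker compose --profile download up --build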
The next compose command to run is the one I was talking about earlier, where you pick from the multiple user interfaces. Again, Invoke is the polished, user-friendly option. Auto is feature-complete: it's got a lot of features and a huge community around it; all of these have big communities, but Auto's is massive. auto-cpu is the same thing except running on the CPU (again, AMD is not supported, keep that in mind). comfy is the node-based builder, and comfy-cpu, like auto-cpu, is ComfyUI running on the CPU. As it suggests here, I'd also recommend running Invoke or Auto. Personally I'd do Auto, just because a lot of the tutorials you'll find on YouTube and elsewhere are based on it; however, if you want what is, in my opinion, a cleaner, more polished user interface, Invoke is fantastic. For the sake of this demo I'm going to do Auto, so we'll copy this command (see the sketch below) and all you have to do is replace [ui] in the brackets with the UI you want, so here that's auto.

Once that's done, you'll get logs that look something like this. Notice the web server URL right here; port 7860 is the default. I will say you can also run this in the background, as with any other containers managed by Docker Compose, with the -d flag, which stands for detached; it simply runs the containers in the background so you can keep using the open terminal.

Now that everything is set up, we can go to that URL and take a look. There we go: this is the AUTOMATIC1111 web UI, exactly how you'd see it anywhere else, it's just running in Docker, which is really cool. By default it comes with two checkpoint models. This one is Stable Diffusion 1.5 pruned EMA-only, which is probably what you'd want for the majority of use cases unless you bring in your own custom models; there are lots of models out there, many of them derivatives of Stable Diffusion, including of v1.5 in particular. The other, the inpainting model selected right now, is designed for inpainting, which is a bit more specialized than the base model. So let's give it a test once the model loads: "octopus hopping a fence". Not bad, not bad at all.

Like I said, you can download your own models. Pretty much everything you'd want to do is outlined in the docs I showed earlier, including installing models, customizing the launch parameters, and so on. You can even use extensions; everything you'd expect from AUTOMATIC1111, or whichever UI you choose, is available, so feel free to do as you wish at this point. This is super cool because it's almost entirely open source software; the reason I say almost entirely is that you're using NVIDIA, which is, for the most part, not open source. The other thing that's cool is that since this all runs in containers, you can easily run it on Kubernetes or pretty much anything else you can think of.
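And to recap the launch step, here's a minimal sketch assuming the UIs are exposed as Docker Compose profiles named auto, auto-cpu, comfy, comfy-cpu, and invoke, as the narration suggests; again, check the repo's README for the exact profile names.

    # Launch the AUTOMATIC1111 UI; swap "auto" for invoke, comfy, auto-cpu, or
    # comfy-cpu to pick a different interface. The web UI defaults to port 7860.
    docker compose --profile auto up --build

    # The same launch, detached: -d keeps the containers running in the
    # background so the terminal stays free.
    docker compose --profile auto up --build -d

Once the containers are up, the auto UI should be reachable in a browser on port 7860 (for example http://localhost:7860 when running locally).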
Info
Channel: Brian Cooper
Views: 1,196
Id: mvJJ5L7n384
Length: 7min 16sec (436 seconds)
Published: Tue Nov 14 2023