Deploy Docker Containers with Docker Cloud

Captions
In this video we're going to deploy two Docker images to a server and create a publicly accessible website. I'm going to use the same example as in the previous video, Docker Compose in 12 Minutes. To quickly recap: we had a really tiny website written in PHP which just lists some products. The product information, however, is retrieved from the product service via a simple API call. The product service is written in Python and runs in a separate Docker container. So we've got the website, which is just a bit of HTML, and it makes an API request to the product service, which runs in a different container. The product service is equally simple: when people hit the URL, we return some JSON, which is just an array of products. We also have a docker-compose file which defines the two services, with the website at the bottom and the product service above it; it includes information about the Docker images, their volumes, and the ports they run on. The code is available on GitHub if you need it; the link is in the description. To deploy these containers to the cloud we're going to use Docker Cloud. This is a free service by Docker that takes Docker images and automates their deployment. The Docker images have to exist somewhere; this is called the repository. Then you need somewhere to deploy them to: a server that the containers will run on. Docker Cloud doesn't provide these things for you; it just sits in the middle and automates the process from image in a repository to a running container on a server. Locally you have Docker Compose, which takes those images and uses the docker-compose file to spin up your containers; Docker Cloud does a very similar thing, but in the cloud. Go ahead and sign up at cloud.docker.com. Once you verify your email address you'll be able to log in. At the time of making this video, Docker Cloud has a Swarm mode which is in beta; if you want to follow along with this video, turn Swarm mode off.

Docker Cloud is made up of a few things. At the bottom, in the infrastructure section, we have nodes. Nodes are servers that the containers will run on; nodes can be put into clusters to group them together if you need that. On top of that we have containers, the things that will run on the nodes. This view just lists running containers; you don't create them directly here. Instead, you create a service. A service is an abstraction for a container: when you create one, you define an image that you want to run, give it a name, and optionally specify other things like ports and environment variables, much like when you use the docker run command in your terminal. When you start a service, it spins up the Docker container in the background. Services make scaling really easy: if you want to scale, you tell Docker Cloud to increase the number of containers and it'll automatically spin up more for you. The next thing you'll find in the sidebar is stacks. A stack is like a docker-compose file: it lets you define services in a YAML file instead of creating them manually through the interface we just saw. This is useful if you've got an application, like a website, that is made up of multiple services: you can define them all in one stack file and deploy that, as opposed to creating services individually and then trying to link them together. So we're not going to do anything in the services tab; we're just going to create a stack, and the services will be created automatically. And finally, repositories. This is where Docker images live; just like nodes, these are external to Docker Cloud. Docker has a repository service you can use called Docker Hub. You can host one image for free; any more cost seven dollars a month. You can also import images from external repositories, for example I have some hosted on GitHub, but I'm not going to go into that in this video. All right, let's start by creating a node. You can "bring your own node", as they describe it, which involves installing some software on a server; whether that's a dedicated server or a VPS or whatever is up to you. Or you can use one of the integrations that they offer: when you create a node you've got these providers to choose from, and you can link your Docker Cloud account to one of them. I'm going to use DigitalOcean. To connect the cloud provider, go to Cloud Settings and you'll find the options there; you just need to click Connect Provider next to the one you want to use. I was ready to give you my referral link for DigitalOcean, which I'll put in the description anyway, to get ten dollars of free credit, but I've just noticed they give you twenty dollars here, so use that one. I've just switched to a different Docker Cloud account because I already have my DigitalOcean account connected to the other one. I'll create a new node: go to Create, and it'll actually create it as a cluster with just one node in it. Give it whatever name you want; it really doesn't matter. You can use labels; these are useful if you want to give a cluster a label like "production" and then deploy to all of your production clusters. None of these are useful for us, though. Now we can choose DigitalOcean. Depending on the provider you'll get more settings: for DigitalOcean you can choose a region, and then you can choose which droplet, as their VPSes are called, you want. You will be charged for creating these VPSes with DigitalOcean, possibly with other providers as well; it's worth checking what the prices are before you do it. And if you put the node count up to 5, you'll be charged five times the amount, so it's really cheap, but don't go too crazy. So I can launch that, and what it'll do in DigitalOcean is create a new droplet, which is a VPS, with the right software and the right configuration such that Docker Cloud can just throw containers at it and they'll run and magically work. You never have to go into DigitalOcean to configure anything, and you don't have to SSH into any servers.
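To make the earlier recap concrete: the product service is nothing more than a tiny HTTP endpoint that returns a JSON array of products. As a rough illustration, not the video's actual code, something equivalent can be written with nothing but Python's standard library (the product data and port here are made up):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical product data; the real service returns something similar.
PRODUCTS = [
    {"id": 1, "name": "Widget"},
    {"id": 2, "name": "Gadget"},
]

class ProductHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Every GET request just returns the product list as JSON.
        body = json.dumps(PRODUCTS).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # The website container reaches this over the Docker network.
    HTTPServer(("0.0.0.0", 80), ProductHandler).serve_forever()
```

The website's PHP page simply requests this service and renders the array, which is why the hostname the containers use to reach each other has to match what the PHP code expects, as comes up again later with link aliases.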
Docker Cloud handles all of that server configuration for you. The node takes a few minutes to deploy; while that's happening, let's look at repositories, because that's the other external thing we need to link. Since Docker Cloud is very well integrated with Docker Hub, as they're both services from Docker, let's use Docker Hub. You can create a repository in Docker Hub from the Docker Cloud interface, but I think it's a little clearer if we sign in to Docker Hub and do it there; you can sign in with the same account as Docker Cloud. In Docker Hub you have the choice between creating a repository and creating an automated build. A repository is something you can push a Docker image to: you'd build the Docker image locally using the docker build command, and then run a different command to push it to a repository in Docker Hub. An automated build, on the other hand, will pull your code directly from GitHub or Bitbucket and build the image for you, which can be very convenient, so let's choose that. Of course, this means we need to push our code to Bitbucket or GitHub first. I'm going to use Bitbucket because it's free for private repositories. We also need to make sure we've got a suitable Dockerfile in each repository, so let's take a look at ours. We're going to create two repositories in Bitbucket, which will correspond to two automated builds in Docker Hub: one for the product service and one for the website. In the previous tutorial we made a Dockerfile for the product service; it uses the Python image as the base image and then just copies our code inside. For the website, however, we didn't make a Dockerfile, because we could just use the PHP image directly and mount a volume. This was great in development, but when you're deploying images to the cloud and trying to run containers on remote servers, mounting volumes isn't really a thing. Mounting a volume works in development because the code is on your computer and you can plug it straight into the Docker container when it's
running. But the server we've got in DigitalOcean running our container doesn't have our code, so the volume mount doesn't make any sense; we need the code to be baked into the image. Just like the product service's Dockerfile copies the code into the image, we need to do the same thing for the website. So I'm going to create a new file called Dockerfile inside the website directory. All we need is FROM php with the tag apache, which is what we were using before in the docker-compose file, and then a COPY of the current directory into /var/www/html; as you can see, that is where we were mounting the volume to. The reason this one uses the current directory, just a dot, whereas the compose file says ./website, is that the docker-compose file is one directory higher than the Dockerfile. We don't need to do anything else, and we know that because this image worked directly without any modifications; the only change we want is that same image with our code already inside. That should be enough to run the website. We can test this out locally in a terminal by going to the website directory, where you can see we've got the Dockerfile, and running docker build with -t for the tag (I'll call it example-shop-website, it doesn't really matter) and a dot at the end to say "use the Dockerfile in the current directory". Obviously make sure you've got Docker running first. Once that's done, we can do docker run with -p to set the port, just the same as what we had in the docker-compose file: we want port 5000 locally mapped to port 80 inside the container, and then the name of the image we just built, which was example-shop-website. If we go to localhost:5000 we can see that something loads. Now, it doesn't show the products, because the product service is not running, so the API call that it tries to make just fails. But this is fine; this is what we would expect. What we wanted to know was: does the PHP application work at all, and does this Dockerfile do what we expect? I think it does: it's loading and it's trying to make an API call to the product service, so that's good enough for me. All right, next I'm going to go to Bitbucket and create a repository for the website. I'm just going to stick with example-shop-website, and make sure it's a private Git repository. I need to make this website directory into a Git repository, which you can do with git init, and then we can copy and paste the commands from Bitbucket to add the origin. If we do git status we'll see that we have some untracked files; git add . adds them all, so now they are staged and ready to be committed, and we can do git commit -m to give it a message of "initial commit". Then we do the git push just like Bitbucket tells us to. Let me know if a Git tutorial would be useful and I can do that in the future. If we refresh this page we should see some different stuff: go to Source and we can see our files there. Great. Back in Docker Hub, we were midway through creating an automated build. We can now choose Bitbucket (you'll have to link your account if you haven't already; it's really easy to do), and there's our example-shop-website repository in Bitbucket that we just created. I'm going to make this private for now; feel free to choose whatever visibility you want, but be aware that even though it's only the image people will see on Docker Hub, anyone can look inside an image and see all of the code in there, so you are exposing everything if you make this public.
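For reference, the website Dockerfile described above amounts to just two lines; treat this as a sketch of what the video builds rather than the exact file:

```dockerfile
# Start from the official PHP image with Apache built in —
# the same image the docker-compose file was using directly.
FROM php:apache

# Bake the website's code into the image, at the directory Apache
# serves from (previously this path was a mounted volume).
COPY . /var/www/html
```

It can then be built and run locally with `docker build -t example-shop-website .` followed by `docker run -p 5000:80 example-shop-website`, as demonstrated in the video.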
If you've got no secrets, passwords, or anything confidential in your code, then you're probably fine making it public. I'm going to create this; the descriptions here don't matter because it's private and nobody else is going to see it. Build Settings is an interesting tab: this is where you can make the build happen automatically when you push some code. So when you make some changes to your website and push to Bitbucket, Docker Hub will see that change and rebuild the image, so the image is always up to date and you never have to worry about building it yourself. You can choose a branch that triggers the build when pushed to; master is the only branch we've got, but if you wanted to work on different feature branches and have them build different versions of the image, you could do that. You can have different Dockerfiles, and you can choose what tag the resulting image gets. When you see other images on Docker Hub with loads of different versions, like PHP version 7, PHP version 5, PHP version 5 with Apache, this is how they're set up: you'd have a different Dockerfile for each one and tell Docker Hub to build them all when you push to master. The default settings are fine for us, though, and we can try it out by clicking Trigger; this is what would happen if we pushed some new code. In Build Details you'll see the new job is queued; it's usually quite quick, so give it a few minutes and it'll start building your image. All right, I left this for a while and it's finished building; it didn't actually take four hours, and it has worked. Now we can go back to Docker Cloud, and when we refresh we see the new image available to use. We need to follow the same process for the other container that we ultimately want to run, the product service. This one already has a Dockerfile, and it copies the files in, so that looks good to me. It's a case of creating another repository, initializing a Git repository inside the product service directory, adding the remote, pushing to
Bitbucket, and creating a new automated build. You can see that if I try to make another private repository it won't let me; you can go and buy more for seven dollars a month, but I'm just going to make this one public. I obviously wouldn't expect anybody to use this, but it is safe for me to put it online; there's nothing confidential inside this image. Again, this one will build automatically when the repository is pushed to, but I'm just going to click Trigger to get it going. While we wait for that to build, let's take a look at stacks. We're going to create a new one, and like I said earlier, a stack is where you define all of the services that make up your application; it's very similar to a docker-compose file. If we take a look at the docker-compose file we have for this application, you can see we define services there, and we're going to be doing the same thing but for the cloud. If you click the link at the bottom of the creation page, it will show you how stack files should look, and you can see it is like the services section of the docker-compose file: you just skip the version and the services key at the top and start directly with the service names. So, using the docker-compose file for reference, we can write our stack file. The first thing we need is the product service, and we're going to specify the image. It's really easy to refer to an image hosted on Docker Hub: it's just your username and then the name of the image we created earlier, so it's example-product-service. You can also specify the tag; if you don't, one will be added automatically and assumed to mean latest, which is the tag that it's got here. The next thing we can do is set restart to always, so if for whatever reason this service goes down, it'll be detected and the container automatically restarted, and hopefully things will be a bit more stable this way. We obviously don't need volumes, because the code is already baked into the image, and we could specify ports if we wanted this to be publicly
accessible, but we probably don't want the API to be accessible directly over the internet; we only want the website to be able to access the product service, so we're not going to specify any public-facing ports. Instead, we go straight on to the website. Again we specify the image (we're using two spaces for indentation, by the way); this one is example-shop-website, and again the tag is just latest. What we need here is to specify links: this is an array of other services that it links to. With Docker Compose this happens automatically, as everything in a docker-compose file has access to everything else; in a stack file you need to be a bit more explicit about it. So we just need to say that it links to the product service; that's just the name of the service. You can optionally specify an alias: if you put a colon and then a different name, that would be the name the website service could use to refer to the product service. But if we look in the index.php file for the website, you can see that it makes a request to something called product-service, so it needs to have that name; if you don't specify an alias, the service name is used as the alias. For the website we do need to specify ports, or rather one port: we can make it available on port 80, so that's port 80 externally mapped to port 80 inside the container. And that's all we need; we should be able to create and deploy this stack now. If we go to Timeline we can see what it's doing: you get the console output as it pulls the images and spins up the containers. Once it's finished, you can go back to the general page and you get these endpoints. If we had multiple containers that were publicly accessible (you might do this if you want to scale your service up and have multiple containers running the same thing), they would be listed here, and the service endpoint would load-balance between them. But for us, both of these endpoints are going to be the same thing, so if we click one of them we
see that the website works, out on the internet. Anybody could access this, and it's happening in real time: it's making that API request to the other service, which is running in a different container. If we have a look at the containers section, we can see website-1 and product-service-1 were created automatically; similarly, the services will have been created, product-service and website, and we only have to worry about the stack. So we can see the whole application: both of the services that we need, the product service and the PHP website, are encapsulated inside this stack, and we can see the whole thing running. You can stop it, restart it, terminate it (which means stop it and then delete it), or go and make changes if you want to. Obviously the endpoint URL we've got for this service is not very friendly. What you can do is go into DigitalOcean, or wherever the server actually exists, and find out its IP address. If we go to that IP address, you can see the website exists there, and you can use that IP address to set up DNS records for a domain name if you own one. I have a few websites running this way; I've got six containers running on one DigitalOcean VPS for five dollars a month. Instead of exposing port 80 on each application, because that wouldn't really work if you've got multiple things that want to be on port 80, I have a proxy that sits in front of them. So I have another stack file which uses HAProxy; it links to the services in the other stacks that are running the websites I want to make public, and it's the one that's exposed on port 80. The other services, in their own stack files, just have to specify the domain name that they want to listen on, and when that domain name is pointed at the IP address of the VPS, the droplet, it just magically works. So if you want to run multiple websites on one node, then look into proxies, specifically HAProxy; it's really easy to
use. But that's pretty much how you would take the Docker images that you made locally, get them deployed to the actual internet, and make them publicly available. We're hosting our code in Bitbucket, but it could be in GitHub. We're using Docker Hub to build the images automatically when the code changes; of course, you could skip this step and just push your pre-built images straight to a repository created on Docker Hub, or to another Docker registry, like I mentioned at the beginning of the video; GitHub offers something similar to Docker Hub. And then we're using Docker Cloud to take those images and spin up containers on a real VPS that exists on the internet. This is of course not the only way you can deploy Docker containers; there are many different techniques for orchestration and getting your containers running somewhere, but if you're building small applications then this is a really easy way to get started. Like I said, this is how I run multiple websites, and they've been running for nine months this way with no problems. I hope this was helpful. Good luck with your Docker applications and your deployment of them. I'll see you next time. Thanks for watching.
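Putting it all together, the stack file assembled in this video looks roughly like this; the username is a placeholder and the image names follow the ones used above, so check them against your own Docker Hub repositories:

```yaml
product-service:
  image: yourusername/example-product-service:latest
  restart: always        # restart the container automatically if it goes down

website:
  image: yourusername/example-shop-website:latest
  links:
    - product-service    # no alias: index.php requests the host "product-service"
  ports:
    - "80:80"            # the only publicly exposed port in the stack
```

Unlike a docker-compose file, there is no version or services wrapper and links must be explicit; the product service deliberately exposes no ports, so it is reachable only from the website container.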
Info
Channel: Jake Wright
Views: 116,245
Keywords: jake, wright, computer, science, software, developer, docker, tutorial, beginners, internet, website, php, python, docker cloud, images, containers, digital ocean, bitbucket
Id: F82K07NmRpk
Length: 20min 19sec (1219 seconds)
Published: Tue Jan 16 2018