How To Use Docker To Make Local Development A Breeze

Video Statistics and Information

Captions
If you're developing software for the cloud, you've certainly run into Docker. Docker has a lot of capabilities, but to get the most out of it, you have to make sure that the way you run your code locally matches how it runs in the cloud. If you don't, you're going to run into incompatibility issues: the software might crash in the cloud but work perfectly on your machine, and that makes testing and fixing those issues a real pain. So today I'm going to show you how to go all in on Docker and avoid all of those problems, using a simple API server in Python as an example. The code for this episode is on GitHub, including all the settings, so you have no excuse not to do this. Well, maybe one: you might not have thought through all the necessary steps to design your software. For that I have a free guide for you at arjan.codes/designguide. It contains the seven steps I take whenever I design a new piece of software; I think it'll help you avoid some of the mistakes I made in the past and put you on the right track. The link is also in the description.

This is the code example we're going to use today: a simple API that retrieves YouTube channel information. I'm not even accessing the YouTube API; I'm just reading that data from a JSON file to keep things simple. Later on, I'll also show you how to sync data between local folders and your Docker container, which is actually really simple. What I have here is pretty basic, built on FastAPI. I define a class called Channel that represents a YouTube channel, with information like the id, the name, some tags, and a description; what exactly is in it doesn't really matter. Then I read the channel information from a JSON file. Normally you would use a database, but to keep this example simple I just use a file, which contains information about a couple of channels: codeSTACKr, Jack Herrington, and myself. Those two channels are really nice, by the way; check them out if you haven't already.

There are basically only two types of requests you can make to this very simple server. One is a GET request at the root, which acts as a health check: it just returns a response saying the server is running. The other is a GET request that retrieves a channel by the id you provide. All it does is check whether the channel id exists; if not, it raises an exception with status code 404, a not-found error, and some detail information (well, not that much detail: "Channel not found"). If the id does exist, it simply returns the content of the channel at that id. Again, this is really simplified; you wouldn't do it this way in a production API, because you'd want to separate these concerns more. I have a whole section in my online course where I cover this topic in detail, but here I want to keep things simple, so this is how I did it in this example.

In principle, that's all you need to set up a really simple API. One thing worth mentioning: I'm using a dataclass to represent the channel. You could also use Pydantic if you want; FastAPI supports Pydantic, and Pydantic is compatible with dataclasses, which is why a dataclass works here too. This example is also very similar to the basic example FastAPI shows on its own homepage. So how do we run this server? For that we'll use uvicorn, which runs a server like this and exposes it on a particular port. Here's the command you need to start the server: uvicorn is pointed at the main file and the app object inside it, main:app.
The host is set to localhost, and the server accepts requests on port 8080. There's also a --reload flag, which means that if we change the files, the server automatically reloads so we can see the effect of our changes; the full command is `uvicorn main:app --host localhost --port 8080 --reload`. When I run this, we get some status messages: the server has started, and there's a link. When I make the simple GET request to localhost:8080, we get the response that the server is running, which is what we expect. We can also use the channels endpoint to get YouTube channel information; for example, I can request Jack Herrington's channel and get JSON data back. If I make a mistake and write the id the wrong way, I get the 404 error with the detail that the channel wasn't found. And because of the reload option, changing the code automatically reloads the server. For example, if I change "the server is running" to end with an exclamation mark and save the file, the terminal shows a reload message: it detected a change in main.py and is reloading. If I now load the root endpoint again, I get the changed response, which is great.

Docker has completely changed the game for cloud development. It lets us run applications in an isolated environment: a container running your preferred Linux distribution. Moreover, you can create an image in which you pre-install software and change anything you like; treat it like a virtual machine. You can run these images locally on your machine, and they also integrate neatly with cloud technology such as Kubernetes. If you're developing a cloud-based app, you're going to have different environments in which the app runs. The classic sequence of environments is the DTAP street: development, testing, acceptance, and production.
But there are alternatives to DTAP as well. Instead of a sequence, a street, you could create parallel environments, for example a parallel testing and performance-analysis environment. You could do progressive deployment with feature flags, or ring-based deployments that open up a new release to groups of users in sequence. Many possibilities, but I won't talk much about deployment today; if you'd like a separate video about that, let me know in the comments. And while you're there, why not hit that like button as well? It really helps me reach more people.

Now, back to Docker. Because the same Docker image runs in each of your environments, you reduce the chance that code that runs locally on your machine breaks when you run it in another environment. Having the same environment everywhere also simplifies things like testing your code, security checks, and audits, and it ensures that the developers on your team work in the same environment, reducing the potential for errors.

The way I'm running this code locally at the moment is not ideal. I'm running it on a Mac, and our server is going to run on Linux. If I deployed this API as is, the server might break for some incompatibility between macOS and Linux; I don't know, and I can't test it, because locally I'm on a Mac and not on a Linux machine. And it's not just the difference in OS. For example, I'm running a particular version of Python on my Mac; I'd have to verify that the server runs the same version, and also make sure that the dependencies needed to run this API are actually installed. If you're a solo developer on a project nobody else will ever run on their machine, that might be fine, but as soon as you're working in a team, it makes a lot of sense to make the environments that the different developers on your team work in as similar as possible. Basically, anything you can do to homogenize... homogene... to make things the same between developers is going to make your life a lot easier. (I should use less expensive words on this channel.)

So instead of running this locally, let's try it with Docker. First, of course, you have to install Docker. Once you do, a Docker daemon runs in the background and takes care of running and maintaining your containers. Docker is a really nice tool because it provides a complete virtual environment in which your code runs, and you can precisely define that environment: which version of Linux, which commands to run to install things on that machine, and so on. Docker is also smart about handling this: it caches the things that don't change while you update the code. The way it works is that you create a Docker image according to the settings you want, and then you run that image as a container. The fun part is that you can do this locally or on your server; it doesn't matter, it's the same environment, which is what makes Docker so powerful.

After you've installed Docker, you supply a definition, a specification of what you want the Docker image to look like, your environment settings, basically. For that you define a Dockerfile. Here you see the Dockerfile I'm using for this project. The first line defines the base image this image is built on: the latest version of the Python image, a Docker image that has Python already installed. It might not always be a good idea to use the latest version like I'm doing here; you might want to pin a specific version.
Pinning protects you from future incompatibilities, but for simplicity I'm using the latest Python version here. Next, I set a working directory. The container has its own file system, and within it I'll have an /app folder that holds all the files I need for this API; that's the working directory. Then I install the dependencies: I copy requirements.txt, which lists the dependencies this project uses, to the /app folder and run pip install (I'll come back to this in a minute). After that, I copy the code to the folder: everything in the root folder, which contains main.py and everything else I need, goes into /app. Finally, the command starts the server with uvicorn; inside the Docker container it runs on port 80, the default port. Notice that I don't have a --reload option here, because there's no need: when you deploy to the cloud, you won't change the files on the fly. You will when running locally, and I'll show you how to handle that differently in a few minutes.

You might wonder why I copy requirements.txt separately and then run the install command. I could have copied everything, run pip install, and started the server, right? That would be the same thing. True, but the problem is that Docker caches each of these steps. If I copied all the code first and then ran pip install, every change to the code would force Docker to redo that step and reinstall the dependencies, which takes time. Dependencies don't change that often; you might add a few as you develop your app, but they won't normally change as often as the code.
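A Dockerfile along the lines described above might look like this. This is reconstructed from the description; the exact file is in the linked repo, and binding uvicorn to 0.0.0.0 is my assumption (a container usually needs that so the mapped port is reachable from outside).

```dockerfile
# Base image with Python preinstalled. Pinning a version (e.g. python:3.10)
# is safer than latest, but the video uses latest for simplicity.
FROM python:latest

# All app files live in /app inside the container's own file system.
WORKDIR /app

# Copy and install the dependencies first, so Docker caches this layer
# and skips reinstalling when only the code changes.
COPY requirements.txt .
RUN pip install -r requirements.txt

# Copy the rest of the code into the working directory.
COPY . .

# Start the server; inside the container it listens on port 80.
# No --reload here: deployed code doesn't change on the fly.
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "80"]
```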
So it makes sense to install the dependencies before copying the rest of the code into the folder. Then, when you change the code, Docker has the dependency-installation step cached and only needs to copy the code and start the server, which is just way faster.

Once you have this Dockerfile, you can instruct Docker to build an image from it, using the docker command. We build it and supply a tag so we can easily find the image later; without a tag, Docker invents some weird random name. I'll call it channel-api, and we build the image from the folder we're currently in. Now you see Docker retrieve the Python image and build the image for me.

With the image built, I can create a local container to run it on my machine, again with docker, but now with run and a few options. First, I specify the port to publish the container on: 8080, just like when I ran the code locally, mapped to the port running internally in the container, port 80, which is what we specified in the Dockerfile. I also supply the image name, channel-api. One more thing: I want to run this detached, so that after starting the container I get my terminal back and can type commands. When I run this, Docker returns an id, the container id, which we can use later to stop or kill the container or do anything else we want. So now our channel API is running in a container, and you can also see this in the Docker Desktop application, once you've installed it.
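The build and run steps described above come down to two commands, something like this (the channel-api tag is the name chosen in the video):

```shell
# Build the image from the current folder and tag it channel-api.
docker build -t channel-api .

# Run the container detached (-d), mapping local port 8080 to the
# container's internal port 80.
docker run -d -p 8080:80 channel-api

# The printed container id can later be used to stop the container:
# docker stop <container-id>
```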
There we see the channel-api container running. I can open it and also watch what's happening inside the container right there. Back in the browser, I can perform the same request: the server is running, but now the data comes from the Docker container, and just like before, when I ran the code directly on my machine, I can access all the endpoints of the API.

What happens if I change the code, for example replacing the exclamation mark with a dot? You'll see that the server is not automatically updated. With my current setup, I'd have to go to the dashboard and delete the running container, then go back to the terminal and rebuild the channel-api image (which goes pretty fast, because the requirements-copying step is cached, as you can see), then run the image again, and only then does the browser show the new version of the code. That's not ideal, but fortunately there's a better option.

When you deploy your code to the cloud, you build the Docker image just like I did before and tell Kubernetes, or whatever other container orchestration system you're using, to update the running image to the new version. This is all part of your continuous deployment pipeline: on GitHub you can set this up with a GitHub workflow, on Bitbucket with a Bitbucket pipeline, and most alternatives have some mechanism to start and run a deployment after you commit and push a change.

But the way I built and ran the container locally is not great: every time I change something in the code, I have to stop the server manually, rebuild the image, and restart the container. This is where Docker Compose comes in. It has two important features that make running your code locally in a Docker container really easy. First, you can supply a custom run command that restarts the server automatically when a file changes.
Second, you can sync a folder on your machine, called a volume in Docker Compose terms, to a folder inside the running container. You still need the Dockerfile I talked about before, but with Docker Compose you also supply a YAML file containing the information Compose needs to run. Here you see an example of a docker-compose YAML file. It's very simple: it contains a single app service with a couple of properties. The first is the folder used to build the Docker image. You don't strictly need this, but if you want Compose to build the image before starting the container, which you generally do, it's useful to supply a build folder. The second is the container name, given to the container when it's created so we can easily find it in Docker Desktop. Then I supply a custom command: the same uvicorn command on port 80 as before, but now with the --reload option, because we want to restart the server when the code changes. This replaces the CMD in the Dockerfile: Docker Compose still uses the Dockerfile but overrides some of its settings with its own. In this case we override the command because, locally (which is the only place we'll use Compose), we want the server to restart when the code changes; when we deploy to the cloud this doesn't matter, because a code change means a new deploy there. We also do the port mapping: outside the Docker container we expose port 8080, which is what we have at the moment, and internally that maps to port 80, which is used in the Docker container and also appears in the command.
And here's the bit of magic that makes everything come together really nicely: we define a volume, a mapping between the current folder and the /app folder inside the Docker container. What this does is really neat: if we change anything in our project folder, those changes are synced automatically to the /app folder, the files are updated, and the uvicorn command detects the change and restarts the server because of the --reload option. Put together, you now have a system where you run the API in a Docker container, but when you change the code and save the file, the server inside the container restarts automatically: a custom command that overrides the CMD in the Dockerfile, and a volume that effectively overrides the copy behavior.

Let's try it. We run Docker Compose, instructing it to do the syncing and automatic restarting of the server, and also to build the image for the first time. Now you see it watching for changes in the /app directory, and uvicorn running locally inside Docker on its port. In the browser, the server is running, and fetching channel information also works again, which is nice. Now, in the main file, let's replace the dot with an exclamation mark again, so I'm changing the code in the server; uvicorn detects the change and reloads, and back in the browser we get the result we expect: an exclamation mark again.

A final thing I want to show you is how to deal with changes in data.
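A docker-compose.yml along these lines would match the description above. This is reconstructed from the video's walkthrough, not copied from the repo, and the 0.0.0.0 host is my assumption:

```yaml
services:
  app:
    # Folder containing the Dockerfile, so Compose can build the image
    # before starting the container.
    build: .
    # Name the container so it's easy to find in Docker Desktop.
    container_name: channel-api
    # Override the Dockerfile CMD: locally we add --reload so the
    # server restarts whenever the code changes.
    command: uvicorn main:app --host 0.0.0.0 --port 80 --reload
    # Map local port 8080 to port 80 inside the container.
    ports:
      - "8080:80"
    # Sync the project folder with /app inside the container.
    volumes:
      - .:/app
```

You would then start everything with `docker compose up --build`.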
Currently, uvicorn only restarts the server when I change a Python file. For example, if I go into channels.json and change "python" to use a capital P, then request my channel information from the server, nothing has changed, because uvicorn only detects changes in .py files. How do you fix this? Let's stop the running server. What you need to do is tell uvicorn more precisely which kinds of files to watch. Just make sure the watchfiles package is installed, because that's what uvicorn relies on for this; I've put it in requirements.txt, so if you install that, it won't be an issue, but you do need to make sure it's there. Then, in your custom command, don't just say the server should reload when something changes, but specify what it should look at, with a reload-include option. Let's include everything for the time being; you can add more precision later if you want. I save this and restart the server, and when I retrieve my own channel information again, I get the updated value, which makes sense, because the image was rebuilt with the change I had already made. Now let's change channels.json again and add a keyword, "software at scale". When I save the file, you see the server restart, and the response now includes that keyword.

So what we've ended up with is a really nice, flexible, and completely separate virtual environment that we can run locally and that's exactly the same as when we run it in the cloud. We can change the code, we can change the data, and everything syncs automatically with the server running inside the isolated environment, which restarts automatically whenever that's needed.
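With the reload-include option added, the custom command in the compose file becomes something like this (the '*' pattern is the catch-all used in the video; you can narrow it later, e.g. to '*.json'), and it requires the watchfiles package to be installed:

```yaml
command: uvicorn main:app --host 0.0.0.0 --port 80 --reload --reload-include '*'
```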
Again, as I said at the beginning, the API code in this video is pretty basic. If you want to learn more about API development and the difference between GraphQL and REST APIs, take a look at this video. Thanks for watching, take care, and see you soon!
Info
Channel: ArjanCodes
Views: 275,761
Keywords: docker, docker tutorial, docker basics, docker compose, docker tutorial for beginners, software development, docker introduction, docker container, docker compose tutorial, docker development environment, docker container tutorial, docker images, docker containers, web development projects, docker tutorial 2022, docker tutorial python, docker compose yml, docker compose volumes, docker compose vs dockerfile, docker compose file, docker compose up, docker compose python
Id: zkMRWDQV4Tg
Length: 21min 53sec (1313 seconds)
Published: Fri Jul 08 2022