Using docker in unusual ways

Video Statistics and Information

Captions
Docker is one of my most used tools when it comes to software development, with the most obvious use case being the containerization of application deployments. Whilst Docker is amazing for this, it's not the only thing it's good at; in fact, there are many different ways I use Docker to improve my software development workflow. So when the Docker team reached out and asked if I'd do a video on some of their newer features, I jumped at the opportunity and thought it'd be a good chance to talk about some of the more unusual ways that I use it. For disclosure, this video is sponsored by Docker, which means we'll also talk about these three commands on screen throughout the video. The unusual stuff, however, that's all me.

My first unusual use of Docker is to use it as a time machine. Since Docker Hub's release in 2014, we've had an append-only library of operating systems, programming languages and other software that we can download and run at any time. By running the following command, I'm able to easily run an instance of Ubuntu that was released back in 2012, and whilst it's always nice to take a trip down software memory lane, there are some big benefits to being able to do this so easily. One such benefit is backwards compatibility. For example, let's say my code needs to compile on the last three major versions of Go. Rather than managing multiple versions of the language in my environment, I can instead just use a Dockerfile to test that my code compiles on these older versions, saving me and my environment a bunch of configuration. As well as programming languages and operating systems, this can also be applied to general applications. For example, let's say I want to test my fancy new website on an older version of Firefox. Again, I can easily do this by using Docker, without having to install multiple versions of Firefox on my system.

Not only is Docker great for ensuring your code has backwards compatibility, it's also rather useful when your code has no forward compatibility, which brings me on to my next use. When it comes to dealing with legacy code, things don't always operate like they used to, and whilst the code itself might not change, the environment often does. For example, at my last place of work we had a core component of our data pipeline that was written in Python 2 and hadn't been touched since 2019. Occasionally this code needed to be run manually, but as the years went by this became harder and harder to achieve. This left us with two options: either we would have to find a way for anyone to run this code, or we'd rewrite it in Rust. And despite how excited we were to perpetuate a stereotype, we decided to use Docker instead. By containerizing this component, we were able to run it on our local machines without having to make any code changes.

The process of doing this is now even easier than before, thanks to one of the new features from Docker: the docker init command. This command helps you to interactively set up a project with the files needed to both containerize your application and deploy it locally. As an example, here is an old Node.js project I have from 2018 which I didn't dockerize. I can rectify that using the docker init command. Upon doing so, Docker will automatically detect the platform that my application is using. The command then suggests which version of Node.js I should use, in this case Node 18, being the oldest LTS version that's still supported. docker init will also detect the package manager I'm using, in this case npm, and my start command. I'm also able to modify any of these if I want to, although I do have to specify which port my server runs on manually. Once it's done, it generates three files for me. The first is the Dockerfile, which we can use to build our application image. Taking a look at this file, there are a number of best practices that have been automatically added by the docker init command, which is pretty great, as it's sometimes difficult to remember to add these best practices when setting up a new project. The next file it generates is the .dockerignore, which is used to exclude files from being added to the Docker context. The one that's generated is rather comprehensive, handling files from a number of different editors and project types. The last file it generates is the compose.yaml, which we can use to run our application locally. To do so, I just use the docker compose up --build command, which will both build and run my app with the correct port exposed, allowing me to interact with my application as if it were running locally. All in all, the docker init command helps to speed up a lot of the tedious work that goes into dockerizing your application, which actually segues nicely into my next use case.

As well as dealing with legacy code, Docker is rather useful for writing new code too, because apps don't live in a bubble. Local developer environments need access to the dependencies and tooling required to build and deploy the application, and usually the versions of these tools don't align with what's provided by my package manager. Fortunately, we can use Docker to solve this. One way that I've seen is to provide an image that contains all of the versions of tooling that a project uses; then, to use any of the tools, you call the docker run command, mounting your source directory as a volume, and either run your command inside of the container or invoke it through docker run. Whilst this works pretty well, it's not my preferred approach. Instead, I prefer to use Docker Compose, which provides the run command. Using this is pretty simple: inside of a compose.yaml you can define the image of any tooling and its desired version, in this case Terraform 1.4. Then, using the docker compose run command, I'm able to execute this tool as if I were inside of that container. What's great about this is that it's much easier to add or update a single tool and commit this to your repository for other developers on your team to make use of. This also elevates the compose.yaml into a source of truth for all of the tooling that your team uses.
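To give a rough idea of what that looks like in practice (the exact file isn't shown in these captions, so the service name, volume paths and image tag below are assumptions), a tooling entry in the compose.yaml might look something like this:

    services:
      terraform:
        # Assumed image and tag; pin whichever version your team has agreed on.
        image: hashicorp/terraform:1.4
        working_dir: /workspace
        volumes:
          - ./:/workspace

With that in place, running docker compose run --rm terraform plan executes Terraform 1.4 inside the container against the mounted project directory, and the --rm flag removes the container once the command finishes.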
As well as tooling, this can also be set up as a local environment for the application stack. By defining the application and any dependencies, such as databases or other services, in the compose.yaml, it's then a simple command to start these services to both develop and test against. In the past, however, this approach has been somewhat tedious, as any time a change was made to the application code it required manually rebuilding the application's image. Fortunately, the team at Docker have improved this with one of their new features: docker compose watch. By using this new feature, you can enable Docker Compose to watch for file events in your current working directory; then, when a file is updated, your application will be automatically redeployed into your stack.

To see this in action, let's try it out on the Chatly project from my Rust Socket.IO video. Here I've created a compose.yaml that contains all of the services within my application stack: I have my front-end app, written in React and running on Vite, and I have my Socket.IO server, written in Rust. Let's go ahead and add the watch feature, first to my front end. To do so, I need to add the develop subsection, with a watch field inside, to the front-end service entry. The value for this field is a list of rules that control the automatic service updates based on local file changes. The first entry will have the action of sync, which will keep any files in the path directory, the Chatly web app, synchronized with the target directory of the Docker container. This action is great for any files that can be moved across without needing to be recompiled, and it works really well for our web app. Lastly, we just want to make sure we ignore any changes to our node_modules; instead, we'll add another watch entry, which will cause our Docker image to be rebuilt if anything changes in our package.json. With the changes made, if I go ahead and run docker compose watch and open up my application, you'll see that it's running as expected. If I make a change to the color scheme and save my file, the changes will be synced to the Docker container and Vite will hot reload. Pretty cool. Let's add this to our server as well. Heading back into the compose.yaml, I first need to add the develop and watch fields to my server service. As the Rust code is fully compiled, we only need to add a rebuild action here, pointing it at the correct path, and finally let's add an ignore for the target directory. Now if I rerun docker compose watch and add a quick new feature to my Rust server, when I save my file the image will start rebuilding, and once it's complete we'll see the new feature appear. This shows the power of using Docker as a local environment when it comes to writing code.
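As a sketch of what those watch rules look like in the compose.yaml (the paths, targets and service names here are guesses based on the description above, not the exact file from the video):

    services:
      web:
        build: ./chatly-web
        develop:
          watch:
            # Copy source changes straight into the running container.
            - action: sync
              path: ./chatly-web
              target: /app
              ignore:
                - node_modules/
            # Dependency changes need a full image rebuild.
            - action: rebuild
              path: ./chatly-web/package.json
      server:
        build: ./server
        develop:
          watch:
            # Compiled Rust code always needs a rebuild.
            - action: rebuild
              path: ./server
              ignore:
                - target/

With this in place, docker compose watch syncs front-end file changes into the running container and rebuilds the relevant image whenever package.json or the Rust sources change.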
And even though this is pretty great, I think it's nothing compared to what Docker can do for writing tests. When building components that interface with other dependencies, for example code that interacts with a database, I prefer to write tests that work against a real instance instead of mocking one out. This is called integration testing, and whilst it provides a number of benefits, it also presents some challenges. The biggest challenge is ensuring a consistent test environment no matter where your tests are running. Fortunately, we can achieve this using Docker, but not in the way you might think. One approach is to create a test environment using Docker Compose, which we would then start before running our tests. However, this presents a couple of problems. Because the environment is being set up outside of the test context, there's no way for the tests to know whether the containers are ready and in the correct state. Additionally, any connection information has to be provided to our tests, either through environment variables or by hardcoding it. This approach also doesn't handle port conflicts gracefully, especially if there are other versions of the dependencies running on a user's system. All of these issues can lead to the integration tests being inconsistent, earning them the title of a flaky test.

For me, a better option is to use a package called Testcontainers. The Testcontainers framework allows you to define any dependency container images within your test code. These containers will then be started, and checked that they're healthy, before your test code is allowed to continue. Additionally, the package ensures containers will run consistently across different systems by exposing any internal ports on unused ones on the host system. This host port can then be obtained using a helper method, which we can then use in our connection logic. Once the test case is completed, the containers are stopped and deleted. All of this enables your integration tests to be consistent, self-contained and isolated from one another. The package is available for a number of different languages, as you can see on screen, and all it requires is that Docker be installed on the system you want to run your tests on. I could talk about Testcontainers for a whole other video, so if that's something you'd like to see then let me know in the comments down below.
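To make that concrete, here's a minimal sketch using the Go flavour of the library, testcontainers-go, with Postgres standing in as the dependency (none of this code is from the video, and the other language flavours follow the same pattern):

    package db_test

    import (
        "context"
        "testing"

        "github.com/testcontainers/testcontainers-go"
        "github.com/testcontainers/testcontainers-go/wait"
    )

    func TestAgainstRealPostgres(t *testing.T) {
        ctx := context.Background()

        // The dependency is declared inside the test itself.
        req := testcontainers.ContainerRequest{
            Image:        "postgres:16",
            Env:          map[string]string{"POSTGRES_PASSWORD": "secret"},
            ExposedPorts: []string{"5432/tcp"},
            // Block until the database is actually listening before the test continues.
            WaitingFor: wait.ForListeningPort("5432/tcp"),
        }
        pg, err := testcontainers.GenericContainer(ctx, testcontainers.GenericContainerRequest{
            ContainerRequest: req,
            Started:          true,
        })
        if err != nil {
            t.Fatal(err)
        }
        // The container is stopped and removed when the test finishes.
        t.Cleanup(func() { _ = pg.Terminate(ctx) })

        // The internal port is mapped onto a free host port; ask the library which one.
        host, _ := pg.Host(ctx)
        port, _ := pg.MappedPort(ctx, "5432/tcp")

        // ...build a connection string from host and port.Port() and run the real queries under test...
        _ = host
        _ = port
    }

Because the port mapping and container lifecycle are handled inside the test, the same test runs unchanged on a laptop or in CI, with no shared state between test cases.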
Using Testcontainers is probably my favorite use case in this video; with that being said, the last item is still rather interesting when it comes to containerization. As I mentioned at the start, my most common use case for Docker is deploying application code. This is not only for production code in my day job, but also for running software and services in my own home lab. Whilst these are different use cases, they do share one common concern: vulnerabilities. Fortunately, Docker Desktop comes with a built-in container scanning tool called Docker Scout. Docker Scout is one of the new features from Docker, and it's really great for allowing you to easily perform container scanning on your local images. However, it can do more than just that, and can also be used to scan your local file system. To see this in action, let's jump back over to that code from 2018 that I showed earlier. To scan this project, I can use the docker scout quickview command, passing in the file system scheme to scan the current directory. Very quickly, this shows I have a number of critical and high vulnerabilities. Not great. It also gives a little hint on what to do next, which is to scan the file system for CVEs, so let's go ahead and do that. This gives a lot of output, so let's start from the top and scroll through. The first critical vulnerability we can see is for mongoose 5.2.3. This view gives a list of all of the CVEs that affect the mongoose package, as well as the version each CVE was fixed in. Now, this is pretty great, but it's not unique; most container scanning tools provide this information. The feature that I think sets Docker Scout apart, however, is the compare command, which allows you to compare two images or file systems and display the differences. This feature is experimental, but I find it works pretty well. Let's see it in action by comparing the file system to a Docker image that I created earlier. We can see the number of known vulnerabilities is exactly the same. On first thought that's kind of obvious, as they're both using the exact same code base, but when you consider that the image has an operating system embedded as well, we can determine that all of the vulnerabilities are caused by my application code. We also get some other nice stats here: we can see that the image contains 256 more packages and weighs an additional 43 MB. As well as comparing a file system to an image, you can also compare two images to each other. This can help you to detect whether any new changes introduce new vulnerabilities into your containers, or whether the issue existed already. The last Docker Scout command that I think is worth mentioning is the recommendations command, which gives suggestions on optimizations you can make to your image. These suggestions go beyond just the vulnerabilities, and also take into consideration the size, number of packages and popularity of the image.
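For reference, the rough shape of the commands used in this section is below; the image name is a placeholder, and since compare is still marked experimental its flags may differ between Docker Scout versions:

    docker scout quickview fs://.
    docker scout cves fs://.
    docker scout compare fs://. --to my-app:latest
    docker scout recommendations my-app:latest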
All of this makes Docker Scout a great tool for optimizing your images, and it can help you make sure that you're not the one introducing that vulnerability into your production code.

Using Docker for more than just application deployments has improved my workflow substantially. However, I'd love to hear about you: what are some of the unusual ways that you use Docker yourself? Let me know in the comments down below and I'll do a follow-up video on some of the more interesting ones. I want to give a big thank you to Docker for sponsoring this video; their new features are pretty great and I hope you can find some use for them in your day-to-day. Otherwise, a big thank you for watching, and I'll see you on the next one.
Info
Channel: Dreams of Code
Views: 415,269
Id: zfNqp85g5JM
Length: 12min 58sec (778 seconds)
Published: Wed Jan 17 2024