How To Set Up GitHub Actions For CI/CD

Video Statistics and Information

Captions
We've all heard about CI/CD, but what does it actually stand for? You might think CI/CD is a sort of follow-up to the CD: first we had the CD, which was great, but then we got CI/CD, which is an awesome music listening experience or something. Well, no, that's actually not it. CI/CD stands for continuous integration and then continuous delivery or deployment, which are two different things. I'm going to talk about those things today and show you how it works. What I've done for this video is create a basic setup that contains all the ingredients of CI/CD, and I'm going to show you how to set that up, because if you set it up in the wrong way, it's going to be a huge pain to do product releases and you're basically going to feel like you're in hell whenever you have to do a product release, which we don't want. As the technology I'm going to be using GitHub Actions, which allows you to run pipeline scripts after you've committed code to a repository, on GitHub obviously, and I'm also going to be using Pulumi, who is the sponsor of today's video.

Cloud infrastructure can get quite complicated because there are many different settings to deal with: you may use YAML files to define those resources, change those settings directly from a command line interface, or do it from the web interface of your cloud provider. What Pulumi does is provide a way to define your cloud resources in code, using the programming language and IDE that you like. So you're not writing YAML configurations per se for your workflows, and you don't have to learn some domain-specific language; you just write everything in the programming language that you like, which can be Python for example, and then Pulumi takes care of provisioning resources to achieve the state of the cloud infrastructure that you want. What's also nice is that Pulumi supports over 80 clouds and cloud-native technologies like Kubernetes. A recently added product is Pulumi Deployments, which allows you to update your cloud infrastructure automatically, for example when you commit to a git branch. You can also run this from within the Pulumi dashboard, or you can even deploy directly by using a REST API interface. You can get started for free with Pulumi using this link, and I've also put the link in the description of this video.

Now let's take a look at CI/CD. The example I'm going to use today is one that I've used before: a simple API for retrieving YouTube channel information, and this API is going to be deployed as a Google Cloud Function. The way this works is actually pretty simple: Google Cloud Functions expects you to set up a simple Flask app, and then you can build that out and use it as the cloud function. If you take a look at the code, there are actually three files that are important. One is the main file that contains the main entry point for the Flask application, and that's main.py. There's a single function in there called channel_api that gets a request and sends a response. In order to handle several API endpoints, I've created a route layer below this function, and that's actually the code that you're seeing here: I'm creating a new app context for the, let's say, internal app, and then I'm dispatching the request and returning the response. So this is purely a wrapper around an internal Flask application that does the actual routing, and that's in this file. Here you see I define this internal Flask app and then I define the internal paths: there is a root route that simply returns a response saying that the channel API is running.
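For reference, here is a minimal sketch of what this wrapper pattern can look like. The module names, route paths, database path, and the exact dispatch code are assumptions based on the description above, not the code from the video.

```python
# routes.py - the internal Flask app that does the actual routing (names assumed).
import flask

import operations  # the database layer, sketched further below

app = flask.Flask("internal")


@app.get("/")
def index() -> flask.Response:
    return flask.make_response("The channel API is running.")


@app.get("/channels/<channel_id>")
def channel(channel_id: str) -> flask.Response:
    # db path is an assumption; in the real project it would point at the bundled database
    return flask.jsonify(operations.get_channel(channel_id, db_path="channels.db"))


# main.py - the single entry point that Google Cloud Functions calls
# (in the real split, main.py would import `app` from routes.py).
def channel_api(request: flask.Request) -> flask.Response:
    # Replay the incoming request against the internal app so its routes handle it.
    with app.request_context(request.environ):
        return app.full_dispatch_request()
```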
There's also a route to get channel information depending on a channel ID. Now, the routes file in turn uses operations, which is the actual interaction with the database, and this is what that file looks like. There's a function get_channel that gets a channel ID and the path to a database; why I set it up this way, I'll explain in a minute. Then I'm printing a few things here just for logging purposes, but you could remove those lines. Then I connect to that database using SQLite, and what I do here is get the channel with this ID, so I construct a SQL query that injects the channel ID into the query. I fetch one row and print it, again just for logging. If the channel is not found, I raise a ChannelNotFoundError, which is a simple class that inherits from Exception; you could add more information here if you wanted to. Finally, I return a dictionary with the ID, name, tags, and description of the channel, and that's all there is to it (see the sketch below). I've already deployed this app as a cloud function. This is what happens when I go to the root URL: I get the simple message that the channel API is running. But I can change this to /channels/arjancodes, for example, and then I get a JSON response with channel information about ArjanCodes.

So what is CI/CD? CI stands for continuous integration. It's basically a DevOps best practice where you regularly merge changes into a repository and then run builds and tests automatically. This is really nice, because whenever you commit a new version of your code, it's automatically going to run builds, set up configurations, whatever you like, and run a bunch of tests, and you don't have to do this on your local machine. This can just happen on a server somewhere; you can even do it at night, so that when you're done for the day, the next morning you come back to a whole bunch of bugs, which is always good news. So you grab your extra strong espresso and just dive right in. It's really practical.

The CD part of CI/CD stands for either continuous delivery or continuous deployment. Continuous delivery basically means that you're producing your software in short, frequent cycles and that you can release the software at any time. With continuous delivery the idea is that the process of deployment is simple and repeatable, so the process is clear, but often it's still done manually. Continuous deployment goes one step further: every change that passes all of the stages of production, so the building, the testing, et cetera, is released automatically to the customers. This means that you can release faster, because any change you make is by default going to be released if it passes all the tests, and it's also less risky, because the changes that you make while you're working on the product are going to be small, and that saves you a lot of trouble in the long term. I remember that in my company we had a process in the beginning where we basically worked for months on a feature, then we released it, and then it basically broke everything and we had to solve a bunch of bugs, which was really stressful. What we do now is release smaller and smaller things, and we're starting to rely more on feature flags to enable or disable those features separately from the actual deployment. So the code is already there, but the customers simply don't use that code yet, and that allows us to move more quickly and have more stability and less stress for our developer team.
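Coming back to the code for a moment: here is a minimal sketch of the operations layer described above. The table and column names are assumptions, and this sketch uses a parameterized query rather than injecting the channel ID into the SQL string, which is the safer habit.

```python
# operations.py - a sketch of the database layer (table/column names assumed).
import sqlite3


class ChannelNotFoundError(Exception):
    """Raised when no channel with the given id exists in the database."""


def get_channel(channel_id: str, db_path: str) -> dict:
    # The database path is a parameter so tests can pass in a mock database.
    connection = sqlite3.connect(db_path)
    try:
        cursor = connection.execute(
            # A parameterized query; safer than formatting the id into the string.
            "SELECT id, name, tags, description FROM channels WHERE id = ?",
            (channel_id,),
        )
        row = cursor.fetchone()
    finally:
        connection.close()

    if row is None:
        raise ChannelNotFoundError(f"Channel '{channel_id}' not found.")

    return {"id": row[0], "name": row[1], "tags": row[2], "description": row[3]}
```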
But you can't just start doing continuous deployment. You're going to need a great testing culture in your company, because if you deploy changes directly to your customers, you need to make sure that you have processes in place that check your code for errors and mistakes. That's why testing is so important. And like I mentioned before, what's really helpful is to have a mechanism for feature flags and possibly even phased rollouts, to incrementally make a feature available to your customers without you having to redeploy your code every time.

So now let's go to the example and set up a basic CI/CD pipeline. If you want to do continuous integration, you need some way of at least building and testing your code. In the case of Python, of course, we're not really building code, but you might want to do other things like collecting configuration files, moving assets to a different place, things like that, and then of course you want to actually run the tests when you deploy your code. The way that I like to do this is via GitHub Actions. All of the repository providers, whether that's GitHub, GitLab, or Bitbucket, have some mechanism for you to run code and scripts after you've committed your code to a branch in the repository, and that's exactly what GitHub Actions does. The way it works is that you define a workflow file where you specify what should happen after you've committed your code to a certain branch.

Before I show you how that works, let's first look at my test setup. The reason I split up the application into a main file, routes, and operations is that the operations are now actually pretty easy to test: I have a database path parameter, and when I run a test I can just supply a different database, which is very useful. So here I have a tests folder where I've created a mock channels database that just contains some random data, and then I have the test file, which is actually very simple. I import the operations from the operations file, I define where my mock database is, and then I have a few tests to check whether the channel information is actually correctly retrieved. I have two tests: one is a success test and one is a fail test. I'm using pytest, and on top of that, because I'm calling get_channel multiple times with the mock database, I'm using partial function application to already supply it with the path to this mock database, so I can simply call it here and I don't have to supply this constant value all the time (see the test sketch below). Now I can simply run the tests locally on my machine: pytest is going to test these operations for me, and of course both of them pass, because that's how I set it up.

If you want to run these tests after you've committed your code to a branch, you need to add a workflow. How do you do that? In your repository you need to add a .github folder, which is the folder you see here, and within that you can add YAML files to describe what should happen after committing. Here, for example, is a YAML file that says that if I push to the main branch, I'm going to run these jobs. There's one job called update that runs on Ubuntu, and these are the steps: this is a standard GitHub Actions thing, we want to check out the repository, then I'm going to set up Python, specifically Python 3.10.8, and I'm going to install the requirements, which installs things like Flask and any other dependencies that I'm going to need. Then finally I'm running the tests with python -B -m pytest. The -B flag is there because I don't want the test run to create cache files, because that's going to mess up the deployment later on.
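Here is a rough sketch of what such a test file can look like, assuming the get_channel signature and file layout sketched above; the import path, mock database path, and channel IDs are made up for illustration.

```python
# tests/test_operations.py - a sketch of the test setup (paths and ids assumed).
from functools import partial

import pytest

from operations import ChannelNotFoundError, get_channel

MOCK_DB = "tests/channels_mock.db"

# Partial application: bind the mock database path once,
# so each test only has to pass the channel id.
get_channel_mock = partial(get_channel, db_path=MOCK_DB)


def test_get_channel_success() -> None:
    channel = get_channel_mock("existing-channel-id")
    assert channel["id"] == "existing-channel-id"


def test_get_channel_not_found() -> None:
    with pytest.raises(ChannelNotFoundError):
        get_channel_mock("does-not-exist")
```

And a minimal version of the CI workflow described above could look roughly like this; the file name, action versions, and requirements file name are assumptions, not the exact file from the video.

```yaml
# .github/workflows/test.yaml - run the tests on every push to main (a sketch).
name: Run tests
on:
  push:
    branches: [main]

jobs:
  update:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.10.8"
      - name: Install requirements
        run: pip install -r requirements.txt
      - name: Run tests
        # -B stops Python from writing __pycache__ files, which would otherwise
        # end up in the deployment package later on.
        run: python -B -m pytest
```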
So with this workflow, whenever I push new code to the main branch, it's going to do these things; it's going to run the tests for me, which is really nice. You can actually see the results of those tests in GitHub. This is an example of a workflow that I've recently run, and there you see we have a run tests task, and we can see that it actually ran the tests and they both passed, just like I showed you before on my local machine. All the other steps are here as well, like setting up Python and installing the requirements, so that's pretty good. By the way, if you're enjoying this video so far, give it a like, it's much appreciated.

If you want to go beyond continuous integration and also do continuous deployment, you need to add more things to this workflow file, because then we actually need to deploy the code. What's nice is that since we're also running tests in this workflow file, we can add the task of deploying the code after running the tests, so if the tests fail, the code is not deployed, which is exactly what we want. Here I have another version of this workflow file. It looks almost the same, except we have some extra configuration for setting up Google Cloud and getting the credentials in, and at the end, after running the tests (let me make this a bit larger), you see that we have a deployment step that calls gcloud functions deploy, deploying the channel API with the given entry point and the other settings that we need (a sketch of these extra steps follows below). This means that if the tests pass, we deploy any updates as a new cloud function and we don't have to worry about anything anymore, assuming that we wrote enough tests, which at the moment I didn't really do, so you should probably do more testing there.

There are a couple of things that are interesting in this particular workflow. Here you see we have a task that configures the Google Cloud credentials, and that refers to a secret. Where are these secrets actually stored? That's in the GitHub settings for this particular repository. Here I am in the settings of this repository, and as you can see, under Security I have a number of secrets. You can pick various areas where you want those secrets to be available, but we're using GitHub Actions, so here you see we have a couple of secrets like the Google Cloud project and the Google Cloud service account key, and if you look back into the actual workflow, you see that I'm referring to those things here in the workflow. This is normally how you would set it up. The nice thing is that in GitHub there's actually no way to read those values once you've set them: I can edit a secret, but as you see it just shows an empty value, so I can update it but I can't see the previous value that I put in, which is pretty secure. In any case, what you shouldn't do is put those secrets in your repository, because then anybody who has access to your repository also has access to those secrets, and that can be dangerous. You want to put them here as secrets and then use them in the workflow.
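The deployment version of the workflow adds steps along these lines after the test step. The action names, secret names, and gcloud flags here are assumptions meant to illustrate the idea, not the exact file from the video.

```yaml
# Extra steps appended to the job sketched earlier, run only if the tests pass.
      - uses: google-github-actions/auth@v1
        with:
          credentials_json: ${{ secrets.GOOGLE_CLOUD_SA_KEY }}
      - uses: google-github-actions/setup-gcloud@v1
      - name: Deploy cloud function
        run: |
          gcloud functions deploy channel_api \
            --entry-point channel_api \
            --runtime python310 \
            --region us-central1 \
            --trigger-http \
            --allow-unauthenticated \
            --project ${{ secrets.GOOGLE_CLOUD_PROJECT }}
```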
The final version of this workflow file that I want to show you today uses Pulumi, and here it's set up slightly differently. We still need the Google Cloud credentials and the Google Cloud setup, because Pulumi ultimately relies on those things, but what's nice with Pulumi is that in the workflow file we're not defining anything related to the cloud architecture at all: you're simply telling Pulumi to update the system, and I'm also supplying another secret here, which is the access token for Pulumi.

So where do you then define this cloud infrastructure? You do that as part of your own code base. Here you see an example of a file that uses Pulumi to specify the cloud infrastructure. There's quite a bit of code in there, and I'm going to go through it in a minute, but there's also an easy way to get started with this, by using Pulumi templates. That basically sets up an entire cloud architecture project for you, where you just supply a couple of basic settings. For example, I can create a directory for a new project, go to that directory, and then create a serverless Google Cloud Platform application in Python using a Pulumi template for that. I simply select this template and now it's going to set up everything. I can enter a project name, so I'm just going to call this "my project"; oh, that's not valid, let's try something else, "my-project" without spaces, like so. Then we can also add a description; I'll just leave that for the moment. You can create a stack, which is basically a deployment environment: you can have a dev stack, a production stack, any type of stack that you would like to have, so I'm just going to use the default here. You can add a path, and there are a couple of settings specific to this template, like which file to use for error pages (this is for deploying a static website), what the index document is, what the site path is, and which Google Cloud project to deploy into, which is arjancodes-pulumi in my case, and then the region, us-central1, that's fine. Now it's going to create everything that is needed for this particular project. You see it has now completed the project setup, and the only thing I need to do is run pulumi up and it's going to deploy this to the cloud.

This has created a basic setup for a project. The most important file here is the main Python file that contains the actual specification of the architecture. There are a couple of things in there, like configuration settings, and then it creates a storage bucket that's configured as a website, it creates an IAM binding (that's a Google access mechanism) to allow read access to that particular storage bucket, there's a synced folder that syncs the local files to the folder in Google Cloud, then it creates another storage bucket that actually contains the app and uploads that to the storage bucket, then it creates a cloud function that returns some data, and finally there is an IAM permission to invoke the function from the website. That's basically it, and you can take this kind of file as a basis and edit it.

That's exactly what I did. Here is a simplified version where I simply define a path to the source code, which in my particular project is the functions folder; this is where my main file is. Then I'm storing the source code (let me make this a bit larger) in a Google Cloud Storage bucket located in the US. Then I create an archive object, so I take the files that should go there and put them into an archive, and that's what we have here. Then I create a cloud function that takes this archive as the source to be used, and finally we have the invoker, which is again the permission to be able to call this cloud function. And that simply deploys everything, so this defines the cloud architecture.
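As a sketch, a simplified Pulumi program like the one described could look roughly like this in Python. The resource names, the functions folder, the runtime, the entry point, and the region are assumptions based on the description above.

```python
# __main__.py - a sketch of a Pulumi program for the cloud function (values assumed).
import pulumi
import pulumi_gcp as gcp

# Bucket that will hold the zipped source code of the cloud function.
source_bucket = gcp.storage.Bucket("source-bucket", location="US")

# Archive the local ./functions folder (where main.py lives) and upload it.
source_archive = gcp.storage.BucketObject(
    "source-archive",
    bucket=source_bucket.name,
    source=pulumi.FileArchive("./functions"),
)

# The cloud function itself, pointing at the uploaded archive.
channel_api = gcp.cloudfunctions.Function(
    "channel-api",
    entry_point="channel_api",
    runtime="python310",
    region="us-central1",
    source_archive_bucket=source_bucket.name,
    source_archive_object=source_archive.name,
    trigger_http=True,
)

# Allow the function to be invoked without authentication.
invoker = gcp.cloudfunctions.FunctionIamMember(
    "invoker",
    project=channel_api.project,
    region=channel_api.region,
    cloud_function=channel_api.name,
    role="roles/cloudfunctions.invoker",
    member="allUsers",
)

pulumi.export("url", channel_api.https_trigger_url)
```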
Then you see there are a couple of other files here as well, in particular the Pulumi stack configuration, where I define the project into which I'm going to deploy this cloud function, which is arjancodes-pulumi, and the region in which I want these things to be deployed. This is the definition that's used for the dev stack; you can add other stacks as well, and then you can use other projects and other settings if you want to. There is also a general project configuration that contains things like: this is a Python app, we're using a virtual environment, and there's a description and a name, and you can add more things here as well (a rough sketch of these files and the Pulumi workflow step follows below). What's nice about this approach is that you have all your configuration and architecture in one code base, so it's really easy to manage and you don't have to go through different sorts of files, and the YAML file that is the GitHub workflow doesn't really contain any architecture-specific things: it just tells Pulumi to run an update, using the dev stack name that we created before. I think this is a nice way to approach cloud architecture. I personally don't really like YAML files, so the less I have to work with them, the happier I am, and I prefer to just work with Python, because I think that simply works better.

One thing you need to be aware of is that you need to use this -B flag, otherwise the test run is going to create cache files and Pulumi doesn't know what to do with those files. Another thing is that currently I'm not running those tests in the virtual environment that Pulumi creates, so there might be some discrepancy if we have different versions of Python or different versions of the requirements that we use. And finally, like I said, it's really important that if you use this kind of CI/CD pipeline, your secrets are defined in GitHub in the repository settings. If you don't do that, then of course it's not going to work, but if you store them in another place, then chances are that somebody who you don't want to have access actually has access to those secrets, and that may mean that somebody can do things with your cloud architecture that you definitely don't want them to do.

So we've now seen a more or less complete CI/CD setup. It's still very simple, but it has the main ingredients: it runs tests before it deploys, and then it actually deploys the code as a cloud function. You probably don't want to use only cloud functions in your cloud architecture; you probably want to do more complex things, and if you want to do that, you should use Docker. If you want to learn more about Docker, I did a video about that recently; you can watch it right here. Thanks again to Pulumi for sponsoring this video. I hope you enjoyed it, take care, and see you next week.
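For completeness, the stack configuration files and the Pulumi step in the workflow could look roughly like this. The project name, stack name, GCP project, region, and action version are assumptions for illustration.

```yaml
# Pulumi.yaml - the project-level configuration (a sketch, values assumed)
name: channel-api
description: Channel API deployed as a Google Cloud Function
runtime:
  name: python
  options:
    virtualenv: venv

# Pulumi.dev.yaml - settings for the "dev" stack
config:
  gcp:project: arjancodes-pulumi
  gcp:region: us-central1
```

```yaml
# Workflow step that tells Pulumi to update the dev stack (a sketch).
      - uses: pulumi/actions@v4
        with:
          command: up
          stack-name: dev
        env:
          PULUMI_ACCESS_TOKEN: ${{ secrets.PULUMI_ACCESS_TOKEN }}
```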
Info
Channel: ArjanCodes
Views: 11,916
Keywords: cicd, ci cd, github ci cd tutorial, github ci cd, github continuous integration, github continuous delivery, github continuous deployment, github continuous integration vs continuous deployment, github ci vs cd, ci vs cd, continuous integration and continuous deployment, continuous integration and continuous deployment cicd, github actions tutorial, github actions ci/cd, github actions workflow, continuous integration, github actions ci/cd tutorial, pulumi tutorial, github ci
Id: 9flcoQ1R0Y4
Length: 20min 27sec (1227 seconds)
Published: Fri Dec 16 2022