Azure DevOps: Provision API Infrastructure using Terraform - Full Course

Captions
[Applause] I've had a really tough day. I've spent most of it manually provisioning infrastructure on my cloud provider: creating resources, deleting resources, recreating them. I think I've got RSI. If only there was a way you could represent infrastructure as code and then automatically deploy it via a CI/CD pipeline or something like that — that would be a utopia to me, if only such a thing existed. Wait a minute... such a thing does exist, and in the video coming up I'm going to show you exactly how to do all of that, so stay tuned.

[Music] Well hello, wherever you are, whenever you are. Where am I? I'm in Melbourne, Australia. And when is it? It's May 2020. Well, I hope you enjoyed that dramatic reimagining at the start of the video. I genuinely think I'm wasted here — I should have become an actor, and maybe with a performance like that I would have won many, many Oscars. I don't know, what do you think? Comments below. Anyway, it's great to have you back, great to have you with me. Quite a lot to get through, so I'm going to jump straight into it. Before we do, though: if you like the video — maybe it's a bit early to make that judgment call — but if you like it, please do give it a like, and if you haven't done so already, please subscribe. And to my wonderful Patreon supporters, who go that little bit extra by way of support: thank you so much — your names will be appearing at the end of the video. For now, let's take a look at what we're going to be doing.

All right, so today it's an Azure DevOps and Azure special, and we're going to be provisioning API infrastructure using something called Terraform. We'll be building a fully automated CI/CD pipeline, all the way through from coding, not just to deployment, but to the provisioning of cloud infrastructure as well. This is what you're going to learn in today's course — though I'll say this is quite a long video. In the description below you'll find chapter links, and you can jump to any of the parts I'm about to go through if you want to go straight to a specific section of interest. I'd recommend you watch the whole video, but if you just want a specific point you can do that, and it also means you can revisit the video and come back to specific bits.

So this is what we're going to go through. I'll give you an overview of today's course, I'll take you through shortly what we're going to build, we'll talk briefly about the value of infrastructure as code, and then I'll take you through what tools you'll need for the course. We're then going to move into the course proper: we'll scaffold up our API using .NET Core, we'll package it with Docker, push the image to Docker Hub, and push our code to GitHub. Now, I've covered a lot of this stuff before in my previous videos; I'm going to go through it all step by step and I'm not going to skip anything, but I'm not going to dwell on it either. So if you've not used Docker before, or you're not that familiar with APIs, check out some of my other content and then come back here; otherwise, we're still covering it completely step by step, so don't worry about that. The code is going to be on GitHub — I'll put the link in the description text below — so if you want to look at the code as I take you through the video, it's all there for you.

Once we've done that, we'll move on to Terraform, which is the central concept of today's video along with Azure DevOps. We'll talk about the benefits of Terraform — what it is and why you would use it — we'll write our first infrastructure as code file, provision that on Azure, and then talk about some security considerations. We'll do all of that at the command line, just to set ourselves up for what comes next: moving over to Azure DevOps and creating a fully automated CI/CD pipeline. So yes, we'll set up a project, we'll set up some service connections (which are basically the connections to the other systems Azure DevOps needs to talk to), we'll set up our pipeline, and we'll trigger our first build all the way from our desktop to kick off the pipeline in Azure DevOps. In part 2 we'll talk about variable groups, we'll introduce Terraform into our pipeline, and we'll talk a bit about Terraform state. Once we've added Terraform to our pipeline and run it, we'll talk about something called idempotency — it's not an issue, but something we need to cover off — and we'll also talk about build versioning, which is very important too. Then, finally, we'll wrap up. Yes, I'm going to give you homework — you heard correctly — and a bit of wrap-up and my supporter credits to round it off. Cool, so I think I've mentioned everything I wanted to there. Chapter links are below, and the code is on GitHub as well, so you can follow along.

Awesome, so let's talk about what we're going to build today. You've probably seen something like this before when we're talking about DevOps — I've repurposed it for my own purposes. The one thing I want to call out straight away: you would usually have a testing step in your DevOps pipeline; it's one of the most important parts of it. For this video, though, it's not the focus, and I feel adding in unit testing and all that kind of stuff isn't really going to add benefit to today's central topic, so I've taken it out of our pipeline. Just be aware that, of course, best practice is that you would always have a testing step in your pipeline.

It all starts off down on our desktop using VS Code, where we'll scaffold up our API code. We'll have our Dockerfile, which will take that code and make a Docker image from it.
We'll write our first Terraform infrastructure as code file, which will specify the resources we want to provision on Azure, and we'll have our azure-pipelines file in our project as well, which describes how our pipeline should operate and run. Now, a really important takeaway from today's video is that this is all considered code: one unit of code that we'll commit to GitHub. You'll notice we have our application code in there, we have our infrastructure as code, and we have our azure-pipelines code in there — all considered code. Where I'm going with that is: as an application developer, sometimes you forget about infrastructure, and as an operations or infrastructure person you might not be that involved with the application code. With this approach you're basically bringing the two worlds together into the one place and getting each to think about the other's domain, and really you're breaking down and dissolving those boundaries — you no longer have dev and ops as separate silos; it's really just a DevOps team. So, very important concept: our code is considered one unit that we'll push up to GitHub.

GitHub will then trigger our build on Azure DevOps. What Azure DevOps is then going to do is build our Docker image, push that image up to Docker Hub, and then use Terraform — we'll come on to what Terraform is in a bit — to provision infrastructure and deploy that image to Azure. Terraform is going to do that in conjunction with Azure DevOps: it will talk to Azure and provision the resources for us. That's all going to be completely automated; the only manual step, at the end of the day, is the push up to GitHub. You're going to build all of that today, using all of these tools, fully step by step. But just before we move on: the main thrust of today's video is Terraform, Azure DevOps, Azure, and the provisioning of infrastructure — those two files specifically — so while we cover everything else, it's not a deep dive into those other areas. Just be aware of that.

All right, so just very quickly, some benefits of infrastructure as code. As I said, it truly brings dev and ops together up front, and I would consider it a DevOps-based practice. You end up with consistent infrastructure configurations, rather than stuff that's maybe somewhat ad hoc. You'll certainly end up with faster deployments — I can attest to that. Even if you're not deploying or provisioning stuff in production — even if you're just setting up test environments or dev environments — I can tell you using this is so much quicker than manually having to set stuff up, so you definitely save a lot of time. There's less documentation, because the code effectively replaces it, and you have traceability: because you're checking code into a repository, it's fully traceable. You're not going to end up with instances where you have critical bits of server infrastructure sitting under people's desks and nobody knows why they're there or how they got there — and I'm not joking, that happens even in some of the largest companies out there, which is really shocking. Infrastructure as code gets rid of all that, and you have traceability of where everything comes from.

Cool. Finally, the ingredients before we move on to starting the course. A text editor — I highly recommend VS Code, which is free. The .NET Core SDK, which is free; you can swap that out for another framework if you want — the course isn't really wedded to it — but there will be some differences in the rest of the project, so if you want to follow along it's easiest to use .NET Core. Terraform, which is free. The Azure CLI, which is free. Docker, which is free. And then you'll need accounts on Azure, Azure DevOps, GitHub and Docker Hub. All of those are free, with the exception of Azure, where you'll need to supply a credit card if you want to sign up.
If you've not signed up to Azure before, you'll get 12 months' worth of free services, and I'm fairly confident the services we're using are included within that free tier; even if they're not, they are very, very low impact from a cost perspective. So as long as you tear the stuff down once you've finished with it, it's going to cost you pennies, or cents, or whatever currency you're using — don't worry about the expense. But yes, you will need to provide a credit card for Azure; everything else is completely free and you just sign up. Cool, so let's get coding — or rather, let's get coding our infrastructure. What are we waiting for?

OK, so the first thing we want to do is create a very simple API that we'll use for the rest of the project. Let's open a new Visual Studio Code window and bring it over onto the screen — Ctrl+` to bring up a shell — and I'm going to move into my working directory. I'll change into Season 3 and then into Episode 3, where we are now. First thing we'll do is dotnet --version, just to make sure we have the .NET Core SDK installed — and indeed we do: version 3.1. If you get an error message there, just Google ".NET Core SDK" and follow the instructions to download it.

Awesome. The next thing we want to do is get a list of the templates we can use to create our project, and that's dotnet new. That gives us a list of templated projects we can scaffold up, and the one we want is this one here: the ASP.NET Core Web API, which is called webapi. Now, this isn't a tutorial on Web APIs — I've done plenty on that already — so if you're not that familiar with them, please check those out. Otherwise, let's get cracking. So again: dotnet new webapi, and we give it a name of WeatherAPI, for reasons that will become very obvious in a second, and that should create our project folder. Let's open that in VS Code — code -r WeatherAPI — and it will reopen with the project in Visual Studio Code. Perfect. Now, we'll get the usual pop-up here in a few seconds, asking to add the required build and debug assets — click yes; you only have to do that once. This is our project; again, I'm not going to go through it in massive detail. What I am going to do — and this is the only bit of, well, not even really coding, that we'll do at this point — is click on our Startup class and remove this HTTPS redirection call here. I don't want that at this point in time, so take it out. If you don't, it's probably okay, but I would take it out anyway, just to be on the safe side and keep things nice and simple.

That's basically it for the coding. Let's just test that it runs — I mean, it should run; I'd be very surprised if it doesn't — and I can show you what it does. So dotnet run will run it up. Let's bring a web browser over here — it's probably got the address cached already from me playing with this before. So yes: localhost, port 5000, and we want to hit the WeatherForecast controller — and indeed we have. It just brings back some JSON data representing weather information. It's somewhat randomized, so if we run it again it'll bring back slightly different data — yes, I think that's different. And that's it — that's all we need in order to deploy out to Azure. We might make some changes later just to test our pipeline, but for now that's all we need by way of code in our project. What we do need to do now, because we're using Docker, is add a Dockerfile to our project. So let's move back over to our application project, kill the running web server, and create it: right-click anywhere in the root of your project and choose New File (or use the little icon up here), and call it Dockerfile — no surprises there.
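In the next step the file's contents get pasted in from a blog article rather than typed out, and the captions don't show the exact file. A typical multi-stage Dockerfile for a .NET Core 3.1 Web API looks roughly like this — the project name WeatherAPI matches the one scaffolded above, but treat the rest as an illustrative sketch, not the video's exact file:

```dockerfile
# Build stage: restore and publish using the full .NET Core SDK image
FROM mcr.microsoft.com/dotnet/core/sdk:3.1 AS build
WORKDIR /app
COPY *.csproj ./
RUN dotnet restore
COPY . ./
RUN dotnet publish -c Release -o out

# Runtime stage: copy the published output into the smaller ASP.NET runtime image
FROM mcr.microsoft.com/dotnet/core/aspnet:3.1
WORKDIR /app
COPY --from=build /app/out .
EXPOSE 80
ENTRYPOINT ["dotnet", "WeatherAPI.dll"]
```

The two-stage layout keeps the final image small: the SDK image is only used to compile, and the runtime image ships just the published output.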
So what is a Dockerfile? It's just a set of instructions that takes our application code and packages it into a Docker image that we can then run and publish elsewhere. Now, I never usually like cutting and pasting code, but I'm going to do so in this instance, as Docker and Dockerfiles are not really the core thrust of what this video is about. If you're interested, or you've not used Docker before, I've made some other videos on that, and I also have a fairly detailed blog article that goes through how it works and what it does. So: paste in the set of instructions, save it off, and let's run it — that will take our code and create an image. Now, I strongly recommend, if you're using VS Code, that you install the Docker extension; it gives you a nice view of your images and your containers and all that stuff. So, to build: docker build at the CLI, and we're going to tag the build with a name. The convention is your Docker login ID — binarythistle in my case — then a forward slash, then whatever you want to call the image, and then the build context, which is our current working directory. That will go off and build our Docker image, and you'll see some activity here as base images are pulled down as part of the build — I don't have any images yet, which is why it's pulling them down. Once that's happened, we'll run it just to test that it works, and then finally we'll manually push it up to Docker Hub.

Docker Hub is basically an image repository where we can publish our own images and also download images from vendors such as MySQL, Oracle and Postgres and all those kinds of people, getting their pre-packaged images for use in our projects — very, very powerful, very useful. We're doing this manually now, but of course we'll replicate all of this in an automated fashion in a bit. So, that looks done: you can see here we now have three images — the two Microsoft ones we were referencing, and our binarythistle/weatherapi image. Let's just test that it runs. The way we run it is docker run with the port specification: we map an external port — it could be anything; I'm using 8080 — to the internal port exposed by our image, and you can see that exposure came from this little declaration here in our Dockerfile. Then we just need to tell it what image we want to run — spelled correctly: binarythistle/weatherapi — and it should just run up. Fantastic.

Now, if we pop over to the browser: previously it was running natively on our desktop on port 5000, so if we change the port to 8080 we should get a very similar output — and indeed we do. So that's all working; let's kill it. You'll notice that Ctrl+C — on Windows, anyway — doesn't stop the container from running, so you can either right-click the container here, if you're using the Docker extension, and click Stop, or in the terminal run docker ps to get a list of running containers and then docker stop with either the ID or the name. It's probably better to use the ID, actually, because the name may not be unique. And there it's stopped. Then the last thing we want to do is publish it up to Docker Hub, because we'll be making use of it shortly: docker push binarythistle/weatherapi, and that will push our image up to Docker Hub. We'll use that in a minute when we come on to using Terraform at the command line — we'll need it in the repository. OK, that has finally finished, and if we move back over to Docker Hub and refresh My Repositories, we should hopefully see weatherapi — and we do. We've done this manually; we'll be setting it all up to happen automatically as we move through the video.

Cool, so the next thing we want to do is add our project to GitHub — and, before that, have it committed to Git locally. Just before we do any of that, I want to add a .gitignore file to the root of our project.
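To recap the Docker commands from this section as one sequence — the image name binarythistle/weatherapi and the 8080-to-80 port mapping follow the conventions described above; substitute your own Docker ID and ports:

```shell
# Build the image, tagged <docker-id>/<name>, using the current directory as context
docker build -t binarythistle/weatherapi .

# Run it, mapping external port 8080 to the port exposed by the image
docker run -p 8080:80 binarythistle/weatherapi

# List running containers, then stop ours by container ID
docker ps
docker stop <container-id>

# Push the image up to Docker Hub (you'll need to have run docker login)
docker push binarythistle/weatherapi
```

Once the container is running, hitting http://localhost:8080/weatherforecast in a browser should return the same JSON the API served natively on port 5000.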
So: New File, .gitignore. That just means we don't commit a lot of stuff we don't really want to include — mainly compiled assets and that kind of thing. I'm just copying the contents over as well; I wouldn't dream of typing it out, that would be ridiculous — I'm copying it from another project. Again, this is available in the article I referenced previously, or you can just Google ".gitignore file"; there are some good examples out there. Cool, so that's done. We then just want to initialize a local Git repository, so: git init — and you can see we have some stuff we need to add to the repository — then git add everything, then a git commit, which we'll call the initial commit. Commit that, and if we do a git status we should be all clean on our local branch. Fantastic.

What we want to do next is push all that up to GitHub, so let's get GitHub up. I've already logged in, so it takes me to my repositories; let's create a new repository to house all this. New repository — I'm going to call it season zero three, episode zero three, Azure DevOps and Terraform; I'd be very doubtful that that existed elsewhere. We'll leave it as a public repository for now, and I'm not going to bother giving it a description. Click Create, and you can see here that if we hadn't already created our local Git repository we'd have to follow all of these steps, but we only have to do the last two. We could probably copy both lines at the same time, but I've picked up the habit of doing it one by one — I don't know why. The last command is git push origin master, and that will take our local Git repository and push it up to GitHub. Now this activity — the push — is the activity that, going back to our PowerPoint, will trigger our build pipeline. Cool, so if we move back over and do a quick refresh, let's just make sure it all got pushed up to GitHub — and indeed it did. Cool.
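The Git steps above, collected as one command sequence — the remote URL is a placeholder for your own repository:

```shell
# Initialize a local repository, stage everything, and make the first commit
git init
git add .
git commit -m "initial commit"
git status   # should report a clean working tree

# Wire up the newly created GitHub repository and push
git remote add origin https://github.com/<your-user>/<your-repo>.git
git push origin master
```

That final push is the manual trigger point: everything downstream of it in the pipeline we're building is automated.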
So we've done the groundwork we need to get started with Terraform at the command line — let's move on to that now. Actually, just before we start using it, I thought it would be worth spending a couple of minutes covering what Terraform is exactly and, more importantly, why we're choosing to use it here. On screen, along the bottom, I've depicted three of the bigger cloud providers out there. The important takeaway is that Terraform works not just with those three but with many, many other providers as well; I put these three on screen because I felt most people watching this video will have some degree of familiarity with at least one of them.

So, what is Terraform? Before we answer that, let's talk about provisioning resources on your chosen cloud provider. In my case I'm most familiar with Azure, and if I wanted to provision resources there I wouldn't need Terraform: I could just use Azure Resource Manager templates, a Microsoft-specific technology, and that would work. However, if I then needed to move everything onto AWS or Google Cloud, I would have to learn their specific equivalents. I don't work with those clouds, so I don't know exactly what those would be, but I'm fairly certain there would be some major learning involved in order to switch my delivery pipeline over to one of those providers. This is where Terraform adds value and comes in really useful: it provides a layer of abstraction away from provider-specific implementation detail and allows you to specify your infrastructure as code using a common framework. The way we define infrastructure as code in Terraform is using something called HCL — HashiCorp Configuration Language — or you can actually use JSON as well, though HCL is the preferred approach. Either way, you only have to learn that one framework: write your infrastructure as code in it, and Terraform will do the heavy lifting and provision across whatever cloud providers you need.

What that means for me as a developer is that if I do need to pivot to another cloud provider, yes, I'll have to rewrite my infrastructure as code definitions to a certain degree — you still have to understand some of the nuances of specific cloud providers — but generally speaking the resistance to moving is greatly reduced. The other benefit: you may be in an organization that uses several cloud providers at the same time — a number of organizations do — and again, Terraform provides this layer of abstraction so you don't have to deal with each provider's specifics. That's really the benefit it brings to the table, and why I wanted to labor the point rather than just use the technology for its own sake: for me as a developer it's about future-proofing and ease of movement.

How does it work? It makes use of the respective cloud providers' APIs, via its own "providers" that talk to those APIs, as you can see on screen. Basically, it translates the definition files you write in HCL or JSON into a format usable by the respective API for each cloud, thus provisioning the resources as required. Cool — and yes, it works with multiple clouds; I just want to labor that point as well. A couple of other points before we actually move on to using it. It's open source, so it's free, which is cool. Almost any infrastructure type can be represented — by which I mean things like virtual machines, containers, databases, switches, all that stuff. It works with all the major cloud providers, using its own providers that talk to their APIs. And we can express our Terraform infrastructure code as HCL — HashiCorp Configuration Language, the preferred approach — or in JSON. There are a number of other things we'll cover off as we move through the video, but I think it's time to actually start using it.

All right, so to use Terraform we first of all have to download and install it — though you don't really even install it, to be honest. If you just Google "terraform download" and select your environment — Windows, Linux, macOS, whatever it happens to be — in all cases it comes as a single executable file. That's all. If you want to use it from your command line globally, you'll have to update your environment variables — your Path variable in the case of Windows, and it's much the same idea on Linux. Just to show you: in Environment Variables, under my system variables, I've updated my Path to include the path to the Terraform executable. If you're not sure how to do that, just Google it — I'm not going to show it here; it's pretty straightforward. To check that it works, type terraform at the command line and you'll get some advice on how to use it.

The other thing you'll need to follow along is the Azure CLI. There are a number of different ways you can use Azure from your command line, but I advise you to use the Azure CLI, and you can see here that you can get and install it for your various platforms as well. I definitely recommend it — it's much more straightforward than the other methods, in my humble opinion. OK, so I'm just going to clear down our working environment a little bit, and we're going to come on to actually creating our infrastructure as code file.
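Assuming the terraform executable has been unzipped to a folder that's on your Path, these quick checks confirm both tools are reachable from a shell:

```shell
# Prints the installed Terraform version (also confirms the Path entry works)
terraform -version

# Prints the installed Azure CLI version and its components
az --version
```

If either command isn't found, the relevant folder isn't on your Path yet — re-open the shell after updating the environment variables so the change is picked up.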
Just before we go on, I'll draw your attention to the fact that I've installed this Visual Studio Code plugin, which gives some degree of auto-completion for the Terraform files we're going to produce. There are other ones out there — I don't know if this one is the best I've ever worked with — but I've installed it and it seems to work okay; have a look at some others if they work better for you.

So, to create our infrastructure as code file, we create a file in the project root: New File — you can call it anything you like, but convention dictates we call it main, and the extension has to be .tf. You can see, very helpfully, Visual Studio Code puts a little Terraform icon there for us. The first thing we specify in the file is the provider we want to work with — in this case Azure. So: a provider block, then the name of the provider, azurerm, then an opening and closing brace. Now, this is the HCL language: it looks a little bit like JSON, but it doesn't seem to employ the use of commas, which I don't actually have a problem with — it's quite nice. This provider name, azurerm, is Terraform's name for its Azure provider; all of this is on the Terraform website if you want to look up the providers for different clouds. We then specify the version of the provider we want to use — and again, this is why I'm saying this particular plugin that's trying to autocomplete isn't perfect; it's not picking up what we need here, but that's okay. The other thing we need to specify is this features block. I thought I could get away with not supplying it, but the azurerm provider complains if you don't put it in, even empty.

So, the next thing we want to do is specify the resources we want to create on Azure. For those of you who have used Azure before, you'll be familiar with resource groups: before you can create containers or virtual machines or anything else, you have to put them into an Azure resource group, so I'm going to create one for our purposes. You won't be surprised to hear that we define that by specifying a resource block, and then the type of resource we want to create — in our case, an azurerm_resource_group. Now, you're probably wondering where that came from and how you'd know what these types are — that's all documented on the Terraform website; if you want to create a specific type of resource, go to the website and look up the type you have to specify here. Then we need to give it a name — let's say terraformtest; it doesn't matter what it's called. The only thing to bear in mind is that this is the name of the resource as far as this file is concerned — it's not the name of the resource we're going to create on Azure — and these two bits of information, taken in combination, should be unique. We can actually use these tags to reference this resource from other resources, and you'll see what I mean when we come on to creating our next resource. Anyway, that's a good segue into saying we do need to actually name our resource on Azure, and we do that with this name attribute: name equals whatever we choose to call it, so I'll call it terraformresourcegroup — doesn't really matter. Then we have to specify a location as well — that's the Azure region where you want to provision your resource, and in my case that's Australia East. Obviously, depending on where you are in the world, you'll probably want to provision somewhere else; just go to Azure and have a look at the available locations. You can actually list them from the command line too — in fact, let's not do that yet, because I want to demonstrate something else first; I'll show you that in a bit if I remember. Cool, so let's save the file — that's the first bit done, basically — and let's see if we can apply it to Azure. The way you do that is with the terraform command line.
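Putting the pieces together, the main.tf built up in this section looks roughly like this. The names terraformtest and terraformresourcegroup are the throwaway examples used above, and the version constraint is illustrative — check the Terraform registry for the current azurerm provider version:

```hcl
# Provider block: tells Terraform we're targeting Azure via the azurerm provider
provider "azurerm" {
  version = "~> 2.0"
  features {}   # required by azurerm 2.x, even when empty
}

# A resource group to hold everything we provision later.
# "terraformtest" is Terraform's internal handle; "name" is what appears in Azure.
resource "azurerm_resource_group" "terraformtest" {
  name     = "terraformresourcegroup"
  location = "Australia East"
}
```

Other resources can later reference this group via its internal handle, e.g. azurerm_resource_group.terraformtest.name, rather than repeating the Azure-side name.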
number of different commands over the course of this tutorial let me just show you what the art here on have to remember to delete them the first one is in it and that's going to basically initialize our terraform system to make sure that's got the right provider to be able to talk to Azure so we'll show you that in a second next command is plan and that's more for us as a developer to see what terraform has in mind what it's actually going to do when you execute the plan so running terraform plan just generates a plan doesn't actually execute the plan and then the next command will be apply there's a name suggest so that's applying the plan whatever that happens to be and we will eventually use this command at some point we have the destroy command and sure you can imagine what that does so let's kick this off by doing terraform in it and you can see here it's initializing something called the back end and we'll detail that bit more when we come on to doing our pipeline but for now it's seeing it's successfully been initialized and it's downloaded the plug-in provider as you are in fantastic and you may have seen up here that we have this dot terraform folder and that's where that plug-in though sits cool so let's try and have a look at the plan so see what happens now actually I wanted this to fail to be honest with you and it doesn't look like it's going to fail it looks like it's going to work and I'm going to tell you why in a minute so I didn't want that to happen so let's take a look at what it's done it's gone off to Azure and you're gonna ask how did it go off to Azure but let me come back to that in a minute it's gone off to Azure and it's gone to check to see if we have a resource group called there and I don't at the moment so it comes up with this plan to see Terra Terra forum will perform the following actions and it's going to create the resources that we want but I don't actually want that to happen because I wanted it to fail and the reason I 
And the reason I wanted it to fail was this: you're going to say, well, hang on, can anybody just run commands at a command prompt and start creating resources on my Azure subscription? How does that work? The reason it works is that I'm actually logged in to Azure using the Azure CLI. So let's log out, let's do that, and let's try it again; this should fail now. Yeah, there we go, cool, that's what I wanted to happen initially: we try and run the plan, Terraform goes off to Azure to generate the plan, and of course it fails because we've logged out, whereas it worked before because I'd previously been logged in. I just wanted to demonstrate that and put your mind at rest that Terraform doesn't have some special privileges into your Azure account. So we need to log in again, interactively at the command line, and in this case that generates a pop-up here that asks me to log in to my Azure subscription. Now, cool, this is fine for working at the command line and doing stuff interactively, but when we come to implementing this in our pipeline we of course don't want any degree of interactivity at all; we just want things to authenticate straight through without that interaction. So I'll come on to showing you how to do that: in fact we'll set it up for our command line first, and then we can apply the same principle when we come to setting up our pipeline. Nonetheless, we've logged in, that's all cool. I kind of ruined the surprise by running terraform plan already, but let's run it again; it'll come back with exactly the same result, as it should, because we've not applied the plan. Cool, so it's going to create our resource group, all good. Then to execute the plan we type terraform apply. Now, again it will challenge us here: are you sure you want to apply this? Because you're creating resources, and you know that could potentially incur cost, so it wants to make sure we really want to do it. So again it generates the plan and asks us if we want to move forward, and we do, yes.
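The Azure CLI session juggling I just did boils down to a few commands (assuming you have the az CLI installed):

```shell
# Drop any existing CLI session; terraform plan will now fail
az logout

# Interactive login: opens a browser pop-up to authenticate
az login

# Optional: confirm which account/subscription the CLI is using
az account show
```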
So, that's a good point, actually, about the plan command, and forgive me if I've said this already: the plan command is really for us as developers just to check what Terraform is going to do. You don't need to run it in the sequence I've shown you there; you do need to do the init if you haven't done that already, but once you've done the init you can jump straight to apply. Plan is more for us. Cool, and it says apply complete: one resource added, zero changed, zero destroyed. So if we come over to Azure (here we are, I'm over in my resource groups already) and refresh, there we go, our new resource group is there. It's probably the least exciting resource you can create, but nonetheless it has been provisioned on Azure for us, fantastic. OK, so we now want to create our second resource, which is the container to host our weather API Docker image. Just before we do that, I want to take you back over to Azure and show you the resource that we're going to create, just to give you a bit of context. It's a container instance. I'm not actually going to create it here; I just want to show you some of the attributes that we'll be filling in, so they'll be familiar when you see them in the code that we write. So: what subscription, what's our resource group (that would be the one we created down here, cool), the location, and where we're getting our image from; we're not going to use Azure, we're going to use Docker Hub, and we'll specify our image name, which will be something like binarythistle/weatherapi, the OS that we want, and the size of our container instance. And if you go into networking you can see that we want it publicly available, we can give it a DNS name label so we can, you know, access it with a more friendly name rather than using an IP address, and we specify a port. So again, I'm not going to create it here; I just wanted to give you a quick overview of what we're going to create, to contextualise it for you,
but we're going to do it in code. So it won't come as any surprise that we want to create another resource block, and in this case the type we need to create is an azurerm_container_group. As before, we give it a name, so terraform_container_group_test will do; the name is up to you, and again these two things have to be unique within this file, and again this is not the name of the container that we're going to create; this is just the name of this resource within this file. So next we do need to specify the actual name of this container group, and I'm just going to call it weatherapi, and we need to specify a location for this as well. Now, we could basically just do what we've done up here and specify Australia East, but one of the nice things about this markup is that you can actually reference other resources, and this hopefully makes it a bit clearer what this naming convention is for: it allows us to reference this resource from other resources within this file. If you're used to object-oriented notation, it won't come as any surprise to see that this will give us the location of that resource. Makes sense, hopefully; it really is straightforward. I'm going to tab this out just to make it a bit more readable, excellent. Then we need to specify the resource group name (I didn't tab that enough), which we want to be this resource group up here; so we can point at where our resource group is, put this container group into it, and access it the same way, except this time we specify the name attribute. Cool. Now, we've got a few other bits and pieces to specify, and I'm just going to copy them across, because I'm sure you don't want to watch me typing all that out, and probably making mistakes as well. So this should look familiar: the IP address type is public, the DNS name label, which we'll give a unique, or semi-unique, value, binarythistle-weatherapi, and the OS type is Linux. Now, the next thing we actually want to specify is a container
object, and I'm going to copy these attributes across again, because it would be a bit laborious for me to type them out in great detail and you'd probably go crazy. So let me just take you through what they are: the name of our actual container, the image we want to use, the CPU size, the memory size (one gigabyte), and then the ports that we want to expose. So they should look fairly familiar to you from what I just showed you before. Let's save that off. Now let's run terraform plan and see what Terraform thinks we need to do in this case. Cool, so it comes back with our plan, and it's basically saying we need to create this container group resource with all the various bits and pieces that we've specified; it has not said that we need to create our original resource group, because that already exists. So you know what's coming next: let's do a terraform apply, and as before it prints our plan again, and we just enter a value of yes to proceed. Now, you can override that prompt with a switch; we can do that later, I'll show you how. Cool, and you can see here it has added one resource, not changed any, and not destroyed any either. This one's more exciting, though, because we've actually provisioned something useful. There we go, weatherapi; click on that and have a look, and you can see that it's up and running. Hopefully it is; it's only just started, obviously, but we have the option to stop it, so I'm going to assume it's running, and you can see here it's got the friendly DNS name that we provided, along with the other details as well. So why don't we test it and see if it's working? Let's copy the URL to the clipboard, come back over here, and paste the URL in, making sure we retain the controller route that we want to go to, and there we go: we are out on the interweb with a live container running our application, which it pulled from Docker Hub. So very, very cool, very simple, very quick.
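Pulling the last couple of sections together, the container group block I copied in looks roughly like this. Again a sketch, not the definitive file: the names, image, DNS label, and sizes are the example values from my setup (and the schema assumes a 2.x azurerm provider), so swap in your own.

```hcl
resource "azurerm_container_group" "terraform_container_group_test" {
  name                = "weatherapi"
  # Reference the resource group defined earlier instead of hard-coding
  location            = azurerm_resource_group.terraform_test.location
  resource_group_name = azurerm_resource_group.terraform_test.name

  ip_address_type = "Public"
  dns_name_label  = "binarythistle-weatherapi"
  os_type         = "Linux"

  container {
    name   = "weatherapi"
    image  = "binarythistle/weatherapi:latest"
    cpu    = "1"
    memory = "1" # gigabytes

    ports {
      port     = 80
      protocol = "TCP"
    }
  }
}
```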
Now, that's all good, but there were a lot of manual steps there: there was running plans, there was logging in, all sorts of things like that. So we now want to take this concept and, as per our plan of attack, let me just bring up my beloved PowerPoint, here we go. OK, so you've already seen this, so I'm not going to labour it in any more detail; we've kind of done this first bit, but here we do still have to do some work on our Terraform file, and of course our azure-pipelines file, which is what we're coming on to now. But yes, what we really want to do is use Azure DevOps to automate everything we've already done at the command line. So it's now time to move on to Azure DevOps. Just before we do that, what I want to cover off is this little interaction here between Terraform and Azure. We've already done that at the command line, and in order for it to work I had to issue the az login command using the Azure CLI so that Terraform was able to create resources. We don't want that interactive login when we come to automating it, so we're going to have to do something about that, and the way we're going to address it is to set up something called a service principal, which is basically almost like an automated user that Terraform can use to create resources on Azure. So we'll set it up and get it working at our command line to begin with, and then we can use that as a basis to use it in Azure DevOps. So let's go on and do that. In order to create a service principal we need to go over to Azure and create it there, and really, just think of a service principal as an automated user: a user account, almost, that an application can use to do things on Azure. That's basically what it is, a system account, an application account, basically. Now, when we set this up on Azure we're going to get four bits of information: we're going to get the client ID, we
will get a client secret, which is basically the password for this user account, we will get the tenant ID, which is the ID of the Azure Active Directory that we created this service principal in, and the subscription ID, which is the ID of my Azure subscription, or your Azure subscription. We'll configure Terraform so it can use these values to talk to Azure and authenticate straight through without any interaction; that's basically how this is going to work. So at the command line we'll set these up as user-defined environment variables, specific to me, and then when we come to configuring it in Azure DevOps we'll configure the equivalent there, and you'll see how we do that in a bit. Before we move on to creating the service principal and obtaining the values, I cannot stress this enough, and I can't stress it enough, I'll put flashing red lights up or something: this information, as I'm sure you can understand, is highly sensitive, and if anybody other than yourself gets access to these values they can absolutely use your Azure subscription and start creating resources on it, and I'm sure you can understand just how problematic that could be on a whole number of levels. So keep this information under wraps: do not publish it, do not commit it to GitHub or anything like that. As we flow through this tutorial we're not going to do any of that stuff; we will keep it secret, so I can't stress that enough. Now, I'm going to redact some information on screen, specifically my tenant ID and subscription ID, because that's not going to change for me; I'm not going to tear down my Azure subscription, so I'll keep that kind of secret. The client ID and client secret I don't mind being on screen, because at the end of the video I'll tear that stuff down anyway, so it doesn't matter if you see it; it won't exist by the time you're watching this video. But yes, absolutely keep it secret from your own perspective. So I think with that
we're ready to move over to Azure and get started. So in order to create the service principal I'm actually just going to follow the instructions provided on the Terraform website; they've gone through a number of ways you can authenticate, and we've already used this one here, using the Azure CLI, but we are now going to configure this one, as we don't want that interaction with the Azure command line when we come to using our CI/CD pipeline. So let's just click on that, and I'm just going to scroll to the bottom. There are a number of steps that we're going to follow: number one, create an application in Azure Active Directory; we'll then generate a client secret, which again is basically the password for this application, and then we'll grant that application access rights to do stuff on Azure. So again, please be careful with this stuff, it's very powerful. Let's jump over to Azure, and the first thing I want to do is create our app registration. So, Azure Active Directory, click on that, and then move down to App registrations, and you can see here I already have one that I set up when I was preparing this video, but we're going to create a new one so you can see how it's done. Let's give it a name; call it terraform-service-principal-for-cicd, something like that, and I'm just going to add a 2 at the end of it, just so we know we're going to use this new one I've created, not the old one. The supported account type, single tenant, is fine; let's click Register. Now, this then takes you into this particular app registration, and you can see here on screen there are a number of bits of information that I'm going to take a quick note of. I think I've got Notepad over here; yes, I've just got a Notepad with the four environment variables that we're going to need, and I'm just going to copy these values over to it. So, first one, client ID: copy it, paste it over here. Next one is the tenant ID; we need that, paste that in here. Now,
again, this client ID I will destroy at the end of the day, and the tenant ID I've redacted, or should have redacted, in the edit; it's just the ID of my Active Directory. The client secret we're going to generate next, and we'll get the subscription ID in a little bit. So, over to Certificates & secrets: down in the client secrets section we just want to create a new client secret, and we'll give it a name, terraform-client-secret, and you can see an expiration; one year is fine, and it places it down here. So copy that over to my file as well. Now, I'm not saving this file, and I'm not going to do anything with it other than use it as a placeholder for these values; we will then plug them into our environment variables in the next step. One comment I'll make about the client secret: if we revisit this particular section in five minutes' time and you try to look at what the client secret is, you won't be able to, as Azure will actually redact it, and you can't even copy it, I don't think. It's that sensitive: they give you a few minutes' grace to copy it away, like I'm doing here, so you can use it, but after that you can't even see it, it's still there but you can't do anything with it. So it's a security feature, and it just labours the point of how important this stuff is. So, cool, that's steps one and two done: we've created an app registration and created a client secret, and the last thing we want to do is give this application access to Azure to do stuff, create resources, basically. So let's go home, let's go to our subscription; I'm going to copy this subscription ID, which is the fourth and final piece of information, over to my Notepad here. Cool. We then need to go into our subscription, click on that, and we want to select Access control (IAM); my screen's running at quite a high magnification level, so it's not looking that great, but anyway, we want to click Add a role assignment, and we will select a role, and we want Contributor, as that gives us enough rights to be able to
create stuff. Then we assign access rights: to what? To a user or service principal; we're using a service principal, and then we need to select the app registration we just created, so it was terraform... there we go, terraform-service-principal-for-cicd-2. Click that and it puts it down here; it's a bit of a clunky interface, to be honest with you, but nonetheless it works. So click Save, and that should grant that application rights to create stuff on our subscription, awesome. So that's the Active Directory side set up from our service principal perspective; we now want to take those values and plug them into our environment variables so that we can authenticate straight through from our command line. So I want to test that out now, to make sure we've set it up correctly. First of all, let me just bring up the right project, and I'm just going to do an az logout; I think I'm logged in anyway, but I just want to labour the point, so log out everybody. And if you run terraform plan now, it will fail, cool; so bear that in mind, it will fail. What we want to do now is add these environment variables to our system environment variables. In Windows, if you just type env it brings these up, and we want to add them as user variables, just for me; we don't want to make them system-wide, because if somebody else logs into this machine we don't want them to have access to them, so: user variables. Now, we can create new ones here, or we can do it at the command line, which is probably a bit quicker; there are a number of ways you can do it in Windows. So let's just cancel out of that. The command I'm going to use, and I need to make sure I spell it correctly, is setx; you can probably understand why I'm saying I need to be careful with how I spell that. Then I'm just going to go over to my file here, and I'm going to issue that command for each of these values. So let's do the first one and paste it in.
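As an aside, the portal steps above (app registration, client secret, Contributor role assignment) can also be collapsed into a single Azure CLI command. This is a sketch; the display name is just an example, and the subscription ID placeholder is yours to fill in:

```shell
# Creates an AAD application plus service principal, generates a
# client secret, and assigns the Contributor role on the subscription.
az ad sp create-for-rbac \
  --name "terraform-service-principal-for-cicd" \
  --role Contributor \
  --scopes "/subscriptions/<your-subscription-id>"
```

The output includes the appId (client ID), password (client secret), and tenant, which map onto three of the four values noted down above.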
Well, I think I need to... oh, there you go, I think I had a space in there; let's try that again: setx ARM_CLIENT_ID, OK, cool. Let me just check that that was added to the environment variables, and yes, we have ARM_CLIENT_ID there, cool. All right, so we just need to do that for the other three: setx (I don't know what happened there), setx, and let me just copy the others across, the client secret, OK, cool. So that should be our four environment variables set up; system, sorry, user environment variables set up, the ARM_ ones, cool. So theoretically we should be able to run our terraform plan command at the command line and it should authenticate straight through. But one thing I would advise you to do, and I'm going to do it here, is close down your terminal and refresh it. I'm running my terminal within Visual Studio Code, so I'm just going to close that down and restart it; that's probably the simplest way, though you can also refresh your environment variables by running a command, which I can't remember right now. So let me just bring up Visual Studio Code again, here we go. All right, so I'm in the right project, yes, cool. I'm just going to do an az logout just to make sure we are definitely logged out of the Azure CLI; it says no active account, so we should theoretically be able to run terraform plan, and yeah, cool, it's going to generate our plan for us. So it's picked up those four environment variables and it's authenticated straight through. Personally, as a developer, I don't mind using the az login interactivity, and in fact that's what I would use as a developer just doing my own bits and pieces; obviously, though, when we come to pushing this into a pipeline we need this automated method. So I just wanted to check that the groundwork is done and the service principal is set up, so that when we come to doing it in the pipeline we shouldn't have any problems. All right, so with that little bit of groundwork done, we're now ready to move on to Azure DevOps.
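For completeness, here are the four setx commands in one place. The azurerm provider reads exactly these ARM_-prefixed variable names; the angle-bracket values are placeholders for your own secrets (run in a Windows shell, then open a new terminal so the variables are picked up):

```shell
# Persist the service principal credentials as user environment
# variables on Windows; setx writes them for future sessions only.
setx ARM_CLIENT_ID "<client-id>"
setx ARM_CLIENT_SECRET "<client-secret>"
setx ARM_TENANT_ID "<tenant-id>"
setx ARM_SUBSCRIPTION_ID "<subscription-id>"
```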
All right, so we're moving into some exciting territory now: we're going to use Azure DevOps. But just before that, and this is super quick, I promise you, I want to delete the Docker image that we previously pushed up to Docker Hub, just because we're going to test pushing from Azure DevOps and I want to make sure that actually works. So: Settings, and then Delete repository, and type the name, weatherapi, delete, OK. And you know what, let's just delete this one as well while we're at it; type the repository name, commandapi, delete. So it's super clear: we have no images, no repositories in my namespace, perfect. All right, so let's move over to Azure DevOps. Now, this is Azure DevOps; does anybody else think that's a rubbish name? I find it quite confusing, anyway. So, Azure DevOps: this is where we're going to set up our automated pipeline. We need to create a new project; let's call it terraform-test-2, as I already have a previous one that I was using to test this video. We'll leave the visibility setting as it is; our GitHub repository at this point in time is public, if you remember, and way back when we set version control to Git, which is what we're using, and the work item process to Agile; we're not too worried about either of those. Click Create, and off it goes to create our project for us. Now, just while it's thinking about that: the next thing we're going to do, before we even configure our pipeline, is set up a couple of service connections, and a service connection is basically a connection to the other services that Azure DevOps will need to use. Thinking back to our diagram, which I won't bring up, it will need to use Docker Hub, where we push the image, so it needs access to that, and it also needs access to Azure as well. So, over to Project settings; let me just step back if you didn't see what I did there. So here we
are in our project: Project settings, click that, and on the left-hand side here you'll see Service connections, and we want to create our first service connection. Now, I'm running my screen magnified at the moment so you can read the code more easily, but I'm going to put it back to 100%, because I've found that when you're setting up these service connections you can't quite see all the config properly; I'll switch it back after I've done this. So the first service connection we want to set up is to a Docker registry, there we go, next, and it'll ask what type of registry. You could have Azure Container Registry, which we don't want, as that costs money I believe; we want Docker Hub. So we need to tell it how to connect: I'll just enter my credentials, hopefully I can remember the right ones, and if we click Verify, verification succeeded. We now need to give it a service connection name, so let's call it, I don't know, Docker Hub, or say Binary Thistle Docker Hub, cool. Grant access permission to all pipelines: yeah, why not, no issue with that. So we'll verify and save that service connection, cool; we've created our service connection to Docker Hub. The next one we want to connect to is Azure Resource Manager, so next, we want to select service principal (automatic), next, it pulls back our subscription, and we don't need to supply a resource group at this point in time. So let's just call this one something like Azure Resource Manager, and grant access to all pipelines again; you can call it anything you like, and there's a double space in there, let's fix that, and let's save it. Cool, so we've got service connections for both of those things that we need. Now let me just put my screen back to the lower resolution; hopefully it doesn't mess up the screen recording, let's try that. OK, cool, so we're back to a larger display, and what are we doing? Right, let's go back to Azure DevOps. Cool, so now what we want to do is create our DevOps
pipeline. So click on Pipelines, Create pipeline, and you'll remember from my diagram that the thing that triggers our pipeline is code commits to GitHub, so we'll select GitHub, and it's going to ask us to sign in to GitHub again; let me just make sure I've got the right password, as I may have changed it. So yes, it's connected to GitHub, and it's brought back a list of our repositories, and the one we want to work with is the one right at the top here, so select that. What it will do then is bring back a number of suggestions as to what it thinks you'll want to do with your repository: it's obviously parsed it and seen that we've got a Dockerfile, so it's suggesting that we maybe want to do some Docker stuff. We do, but the options we want aren't shown; let me click Show more, and I don't know if the one we want is in here, so let's just select Docker anyway, validate and configure, and this will basically produce our azure-pipelines.yaml file, which is the main file we're going to be working in as we move through the rest of the project. I'm going to be editing it for the most part in VS Code, and you'll see how that all works in a second. Now, this isn't quite what I wanted: it's put in a task that's going to build our Docker image, which is OK, but not quite what I wanted, so let me just position my cursor. Now, if you've not worked with YAML before, it's whitespace sensitive; I hate it, to be honest with you, I really don't like it. The inline editor that Microsoft provides is actually quite good, but you really need to be careful with your spacing and all that kind of stuff. Anyway, what I want to do is add a different task: if you search for Docker, the one we actually want is this one here, build or push Docker images. They may have the same name, actually; let me put the screen back up to the higher magnification. I'm giving away all my secrets now;
you can see how I set things up. That's better. So, what is our container registry? This is where our service connections come in, and we can see here that the container registry we want to use is the Binary Thistle Docker Hub, so we select that. The container repository, a slightly misleading name, is basically the name of the Docker image that we want to push up, so binarythistle/weatherapi, I think; you know, if you're doing this for the first time and you see container registry and container repository it may be a wee bit confusing, but nonetheless those are the values. Now, this is what I'm talking about: for the command you can just select from the list, build, push, but we want build and push. The Dockerfile setting is just going to pick up the Dockerfile from our project, the build context is plain, and the tags: pay a bit of attention to this, we'll make use of it later on. So let's add that task, and you can see it's added another task under our previous one. Now, I'm going to take this previous build step out; we don't need it, so I think we can safely remove it, and what we will do is copy this task and put it up here, hopefully in the right place, and I'm hoping that should work OK. So, basically, just quickly stepping through it: it sets up some pre-flight config, including this build ID variable, which is basically the build number that we'll make use of later. Azure DevOps works with a kind of hierarchy: it starts with stages, then jobs, then steps. So we're defining a build stage here (we'll define a separate stage later on) with some display text; we then have a build job, which defines the virtual machine that we want to perform the build on, and then we can define a number of steps. We're just going to have one step, which basically goes to our Docker Hub container registry and pushes this image once it's been built; the command it issues is buildAndPush.
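The stages/jobs/steps hierarchy just described ends up looking something like this in azure-pipelines.yaml. A sketch only: the service connection name, repository, and variable name are the example values from this walkthrough, not required names.

```yaml
trigger:
- master

variables:
  buildId: '$(Build.BuildId)'   # build number, used later as the image tag

stages:
- stage: Build
  displayName: Build and push Docker image
  jobs:
  - job: Build
    pool:
      vmImage: 'ubuntu-latest'   # the VM the build runs on
    steps:
    - task: Docker@2
      inputs:
        containerRegistry: 'Binary Thistle Docker Hub'  # service connection
        repository: 'binarythistle/weatherapi'
        command: 'buildAndPush'
        Dockerfile: '**/Dockerfile'
        tags: '$(Build.BuildId)'
```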
Then there are just some specifications about the Dockerfile to use and the tag that we want on our image. So I think we're ready to go; let's save and run this. We're just going to run it manually from here, we're not triggering it yet from our GitHub repository, so let's just see it run. Now, it's going to ask us for a commit message and whether we want to commit it directly into our GitHub repository, which we do, so let's save and run that, and away it goes to create the pipeline. You can then see here that it's starting to kick off; let me step back into Pipelines, I wanted this view here, yeah, OK. Yep, so it's basically starting our build: it's queued, the agent is waiting to run it, basically, so we just have to sit back, relax, maybe grab a coffee, and we'll come back and see how it's doing; it shouldn't take that long, and in fact you can see here it's actually started to build now. Hopefully, all things being equal, we should get a successful build, and it should have built our weather API image and pushed it to Docker Hub. Cool, so, you can obviously see how long that took; I've chopped the video, because it's a bit much to expect you to sit there and watch me watching a screen. So we've got a little green tick, the build was successful; you can dive into the build steps and look and see if there were any errors or successes. Here it's obviously all gone well, hopefully; it looks OK. So let me put my screen back to the lower resolution. Most people (this is the feedback I've got) like it running in this kind of magnified layout: they can see things better, especially when it comes to code, and I do get that, but sometimes I feel it can crowd the screen in a way that makes it harder to read. Nonetheless, let's try and stick with it; yeah, that's a bit better. So let's go over to Docker Hub and do a
quick refresh, and there we go, our weather API has been pushed up. If you just click on it, I want to give you a bit of a heads-up: there is a tag in here with a particular number, so just bear that in mind, and we can talk about versioning and all that kind of stuff a bit later. For now, that's pretty good: we've set up an initial pipeline, we've run it manually, and we've committed our azure-pipelines.yaml file to our GitHub repository. Let me just show you that: over to GitHub, and if we select my repositories and pick the first one, the one we're working with, here we are (horrible-looking picture of me there), you'll see that azure-pipelines.yaml has actually been added to our repository. Bear that in mind: when we come back down to VS Code we'll get, not an error, but a little message saying we need to synchronize the repository on GitHub with our local repository, and from that point on we'll work on our azure-pipelines.yaml file in VS Code. Cool, so we're getting there, but we now need to move on and introduce Terraform into our pipeline so it can actually provision resources on Azure. So let's do that next. Actually, just before we bring Terraform into the mix, I think it's worth triggering our build pipeline from our desktop: we'll commit a small, insignificant code change to GitHub, and that should trigger a build. What I will just remind you of is that the current build number tagged on our image is a particular value, so if we trigger another build from our desktop you should see that change if it's working correctly. All right, so hopefully that made sense. So, back over in our project, let's just do a git status to see where we're at. OK, so it thinks, excuse me, that we need to add some files to our repository. Now, I just really want to add our main.tf; I don't really want to add these other files (I'll come on to what they are in a bit), and since I don't want to add them to our repository we really should put them into our .gitignore
file. So let's just do a git add main.tf, and git status, and yeah, OK, git commit, added main.tf, OK, and then we'll do a git push origin master. Now, you can see here that something was rejected: basically, that's because the azure-pipelines file had been added to our remote Git repository and that's not reflected down here. So I think if we do a git pull that should rectify it... I think... hmm, OK, maybe it's git pull origin master; we need to do that. OK, that looks better: the azure-pipelines file has come down, fantastic, and let's see if we can push it up. OK, that's better. Now, I will deal with those other files later; I'll leave them there for now. But what we should see, if we move over to our pipeline and do a refresh, yeah: you can see we have a second build kicking off, and it's going to start running, excellent. So basically that part of our pipeline is working OK: we've committed code back up to GitHub, we've had to pull down our azure-pipelines.yaml file, which we can now work on within VS Code, and you can see that the push has kicked off our pipeline, fantastic. So once this is completed we'll check our Docker Hub repository to make sure it has published a later version, and then we're ready to move on to Terraform. Cool, so that looks like it's completed, and again, I obviously didn't make you wait to watch that happening, so if we come back over to our Docker Hub repository and refresh, there we go: you can see we have a new build version, a new tag on our image, and that will again be relevant later, fantastic. OK, so now we are truly ready to move on to the next part of our DevOps pipeline, and that is adding Terraform into the mix. OK, so in order to bring Terraform into the mix, you'll recall that we want to make use of the service principal that we created in Azure, and in order for Terraform to be able to use the service principal it needs access to those environment variables.
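The Git round-trip I just fumbled through, in order (the rejection and the pull are the expected behaviour when the remote gained a file, here azure-pipelines.yaml, that your local clone doesn't have):

```shell
git status                     # see what's changed locally
git add main.tf                # stage only the Terraform config
git commit -m "Added main.tf"
git push origin master         # rejected: remote is ahead of us
git pull origin master         # bring the remote YAML file down first
git push origin master         # now the push succeeds and triggers a build
```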
variables. We previously set those up as user environment variables on our local desktop; we now need to add those into Azure DevOps. Now, the way you do that is by making sure you have your pipeline selected, going into Library, and then we're going to create something called a variable group, and we can give it any name you like. I'm going to name this "terraform service principal"; I think that name is fairly self-explanatory. "Allow access to all pipelines", yes, why not. "Link secrets from an Azure key vault as variables", we don't need to do that. So let's add our first variable. Now, I should have my notepad from before; I do indeed, and after this I'm going to destroy this notepad, I'm going to, you know, kill it, because it has some very sensitive information in it. So the first one is our client ID, and again, please be careful when you add these values in; this just means that Terraform has access to these environment variables when it comes to run. So double, double check. Now, these GUIDs are quite difficult to read, so I'm not expecting you to check the whole thing, but what I usually do is check the last two or three digits and the first two or three digits, just to make sure they are the same. I've copied the whole thing a few times and found I'd not actually copied the entire thing, and again, it's small stuff like that that drives you crazy, and when you find the issue you've wasted like an hour of your time or something, and it's just not very good. Cool. Now, the one thing we want to do with our client secret is select this little padlock on it, and that just kind of obscures it. Now, theoretically we could do it with all of them, there's no reason why not, but let's just do it with that one; in fact, let's do it with our client ID as well, because I think that's somewhat more sensitive. Cool, so that's our environment variables set up, so when Terraform runs within our YAML file, it runs on an agent that has access to these environment variables. Now, as I
said before, I had this notepad file over here; I never saved it, and I am now going to delete it, and not save it. It's gone. Okay, so again, the client secret and the client ID, and the other two as well actually, are really sensitive, so please do not have them kicking around anywhere, or publish them; I'm not going to be held responsible for it, and I won't say anything more about that ever again, I've made that point enough times now. Now, one of the things you'll have noticed, actually, when we tried to commit our code to GitHub: I mentioned we had two files here that hadn't been committed to our repository, and I'll draw your attention to this one here more than anything, terraform.tfstate. Now, what is Terraform state? Well, before we answer that, let's just pop over to my PowerPoint and find the right page. Here we go. So there are a number of files that Terraform uses. main.tf, we're already working with that, and as you know, that just holds our configuration code. Terraform variables: we are not making use of a variables file, but basically you can use variables within main.tf, and if you want you can split them out into a separate file, and that probably is good practice, best practice; we're not doing that in this video, we're just going to put them in main.tf for now. And then finally, the file that I really want to talk about: terraform.tfstate. This is a JSON file managed by Terraform, and it's used to map the resources in our main file to our actual running resources in Azure. So basically, it's the way that Terraform keeps track of what it possibly needs to do, based on what it has done. So it's quite an important file, in fact it's a very important file, and it is used when we are applying our plan to our provider, in this case Azure. Now, when we run Terraform at our command line, as you can see here, it just creates that state file locally, and that's all cool, and Terraform needs it to operate going forward, so it needs to be there
continually; you can't get rid of it, it needs to be there in order for everything to work correctly. Now, that's fine when we are running at our command line, but when we run on our pipeline, the way Azure DevOps pipelines work, and indeed the way most continuous integration platforms work, is that when you run your pipeline config it will provision an agent, run your config, run your plan, and then it'll tear that agent down. Now, as part of that, this state file that Terraform needs will be created, sure, but then it will be torn down and forgotten about. Now, we can't have that situation; we need to persist this state file throughout our use of Azure DevOps. So we need to store that file off in what Terraform refers to as a backend, so it's persisted, so that when the pipeline runs again the next day it's still there. So that is the next bit of config we're going to do before we actually start using Terraform: set up some storage on Azure so that we can store our state file. So let's do that now. Okay, so we want to set up some storage for our state file. Now, I'm actually just going to bring up a notepad, because I'm going to make a few notes, and just put it off to the side for the moment. So we need to record a few bits of information, and we need to create a couple of things for this all to work. We need a resource group; within the resource group we need a storage account. Now, a storage account has different types of storage that you can use with it; we're going to use blob storage, and as part of that we'll need to create a container. And then finally, we just need to specify the file name, but we don't do that in Azure, we do that in our main.tf file. In fact, we actually have to supply all these bits of information to our main.tf file, but we need to create this stuff in Azure first, so let's do that now. So the first thing we'll do is create a new resource group, so let's add one.
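If you prefer the command line to clicking through the portal, the same three resources can be created with the Azure CLI. This is a hedged sketch: the resource names here are placeholders matching the ones chosen in the walkthrough, and the storage account name must be globally unique, so substitute your own:

```shell
# Sketch only -- names are placeholders; pick your own unique ones.
az group create --name tfrgblobstore --location australiaeast

az storage account create \
  --name tfstorageyourname \
  --resource-group tfrgblobstore \
  --sku Standard_LRS \
  --kind StorageV2 \
  --access-tier Cool

az storage container create \
  --name tfstate \
  --account-name tfstorageyourname
```

Either route gets you to the same place: a private blob container ready to hold the Terraform state file.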
We'll call it something like tfrgblobstore, for Terraform resource group, blob store. I'm just going to copy that name over to my notepad, just so I remember what I called it. Location, we want to put that in Australia East, there we go, and we'll review and create that. Validation passed, create, fantastic. Now, you may ask, why aren't we using Terraform to do this? Well, we're only doing this once; it's not something we're doing as part of our build pipeline, this is just config. You could potentially script it if you're maybe setting up development environments and stuff, but it's probably a bit of overkill for us to use it here for what we are doing, so I'm just going to create it manually. We'll probably have to wait for it to create; yeah, go through to the resource groups, okay, it says created, cool. I just want to double-check that, because sometimes I find that with Azure it doesn't actually refresh. Yeah, still not showing it there, but I'm fairly sure that has been created; it will complain, it will complain if not. Okay, so we've created our resource group, and the next thing we want to do is actually create a storage account. So into resources, let's just type "storage account", and you can see here it has four different types: blob, file, table and queue. We're making use of the blob storage. Cool, so let's create that, and the first thing it's going to ask for is a resource group, so you can see here, here's the one we just created, tfrgblobstore, fantastic. Storage account name: I'll call it terraformstorageaccount. Okay, it obviously needs to be globally unique, so I'll put my name at the end of it, binarythistle, and I'd be very surprised if... ah, it's over 24 characters. Okay, so let's take out "account", that's better, okay, tfstoragebinarythistle. So yes, you'll have to make sure that is unique. Let me just copy that over to my notepad and paste it in, so I have a record of it. Let's set up our location, Australia East; for performance you want Standard; account kind, yeah, let's go with
general purpose. Replication, we just want LRS, locally redundant storage, we don't want anything too fancy, and we just want the Cool access tier, we don't need Hot access; we want something very, very basic. So let's review and create that. Now, if you can hear that, it's very windy outside here, and we're moving into winter here, which I quite like personally, but it's getting cold and very windy today. So, validation passed, click on create. Okay, cool, so it's deploying that, and then we need to do one more thing: we need to create a container for our file, and then we're pretty much set up with this, and then we need to move over to our TF main, or main.tf, sorry, file, and add these details in to configure what Terraform refers to as a backend. Cool, so that has been created; let's go to our resource, and you can see down here you have different types of service, you know, queue service, table service, file service. We are using the blob service. Click on containers, and we want to add a new container. I think a container is almost just like a folder, I guess, I suppose you would say, and we're going to store our Terraform state file in here, and yeah, we want it to be private, I think that's correct, yeah, okay, and any advanced settings, no, that's cool. All right, let's create that. Cool, so we have a tfstate container, wonderful. So what we want to do now is move back over to our code in VS Code. Cool, I'm just going to get rid of the terminal, we don't need that for the moment, and move back over into our main.tf file, let's revisit that. So we want to basically, as I say, use these details to configure our backend, so that Terraform, rather than using local file storage in this case, goes off to Azure to store the state file. So if you don't mind, I'm just going to paste this in; I do have one here, but I need to update the values. So, terraform backend "azurerm": that's our provider, and then we need to fill these out; these are the values I used for my previous example. Now, the one thing we can leave
is this key. The key basically just refers to the file name that we want to store. So our resource group, if I bring my notepad back up, was tfrgblobstore, and our storage account was this, tfstoragebinarythistle, and our container name, it was tfstate, wasn't it? Let's just double check that. tfstate, fantastic, okay, cool. Now, that's all we need to do there; so that's basically configuring Terraform to use that blob storage to store our state file, so it persists throughout multiple runs. Cool. So we'll find out whether that all works or not when we come to actually adding Terraform into our Azure Pipelines YAML file, so let's do that now. Okay, so we are getting tantalizingly close to bringing Terraform into our azure-pipelines file. So back down in VS Code, let's open up our azure-pipelines file. So here we are, this wonderful YAML file; let me just clear that down. Now, as I said before, Azure DevOps works on a kind of hierarchy of stages, jobs and steps, so we're now going to add a new stage to our pipeline, and we need to be really careful, I can't stress this enough, about how you indent. Now, thankfully, Visual Studio Code does provide these, I think we call them indent guides or something, markers to show you how stuff is indented. Now, I'm going to type this in; I feel that this is kind of the most important part of this video, right near the end, and I feel I should type it in and explain it as I go, rather than just cutting and pasting it in. So we want to define a stage; you can call it anything you like, but I'm going to call it Provision, and then, next, displayName, and this is just, when you're running the pipeline and you look at how it's running in Azure Pipelines, this is just what it displays as it's stepping through, so you don't have to put this in, but it just helps with fault-finding and all that kind of stuff. So we'll say, oh I don't know, "Terraforming on Azure".
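The backend block just described would look something like this in main.tf. This is a hedged sketch: the resource group, storage account and container names are the ones created above, so substitute your own, and the key is simply the blob name the state will be stored under:

```hcl
terraform {
  backend "azurerm" {
    resource_group_name  = "tfrgblobstore"
    storage_account_name = "tfstoragebinarythistle"
    container_name       = "tfstate"
    key                  = "terraform.tfstate"
  }
}
```

With this block in place, `terraform init` will configure the remote backend, and no local terraform.tfstate file is written on subsequent runs.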
Now then, the next thing is this dependsOn clause, and basically all that is saying is: this can't run until whatever we specify here has run. And the stage we are specifying here is Build, so we're saying the Provision stage depends on the Build stage; if the Build stage fails, then there's no point in provisioning anything. Awesome. So we then move into our jobs, to specify our jobs, don't forget the colon, and then we can have any number of jobs, but we're just going to have one job in this case, and we'll call it Provision as well. And again, displayName, and again, we're graduating down in the hierarchy, so we want to put a bit more detail in here, so we'll say "Provisioning Container Instance". And then we want to specify an image that we want to use; we do that using the pool directive, and we specify the image here, and let's just copy it from up here, ubuntu-latest. You can specify a version if you want, but I always just go with latest, that seems to work well. And then we want to say, actually, we have some environment variables for you here, so we want to specify what those are. Now, I actually can't remember what we called our variable group, so let's go back over to our pipelines, and here we are. Now, actually, this is a very good point, I'm glad I did this: you will notice I cannot see the values in here, because I didn't click Save, so make sure you click Save. And before we do that, I might actually just change this name, and just remove the spaces from it, so it's just terraform. Cool, so let's just copy that, this is the value that we need, but let's save it first, I didn't do that. Hopefully it doesn't error out, hopefully I haven't left it too long. Cool, let's go back here and see if it saved that. Okay, "terrafrom", I've spelt it wrong, so let me correct that, I always do that: t-e-r-r-a-f-o-r-m, there you go, terraform. Okay, copy that, save it again, that looks
like it's saved. Let's go back up a level, let me just check it has changed, and indeed it has, that's cool. So let's move back over to our file, and we just need to specify that here, and the way we do that is by specifying variables, then a group, and we put that value there. I absolutely despise YAML; I just need to make sure I have all this in the right place. Cool. All right, so that's our job set up, and then we want to set up a steps element, okay, so steps, and again, we can have, you know, more than one step, but let's start with a script step. So this is just a bash script; we put the pipe in there, and then we basically start to define our script, which will make use of Terraform, okay, and I will make sure I indent correctly after that, under the script keyword. So we just want to set the -e flag, and that just means if we error out then we bomb out of this, we don't continue. Ah, it has to auto-correct it, which I don't want, that's incredibly annoying, let me fix that up in a second; just something VS Code does, maybe because I didn't put in the shebang, maybe not. Okay, okay, fixed it up, that was quite annoying. And then basically all we want to do is run our Terraform commands at this point. So terraform init, to initialize the backend, and we will specify the input flag, -input=false; make sure you say equals false, otherwise you'll have some issues. And all the -input=false option means is we don't require interactive input at this stage. So that's just terraform init, fair enough. And then terraform apply; note that we don't need to run the plan command separately, because it's just the pipeline that's running it. Now again, we also supply -input=false, and also, importantly, we need to specify -auto-approve, which means it doesn't require us to type "yes" to, you know, approve the changes. Now, I'm just making sure I've typed that in correctly; make sure it's equals signs here, and the dashes there. That tripped me up when I was testing
this; it took me a while to find the issue, it drove me a bit crazy. So, cool, that's that done. Okay, and then we just really want to make sure I've lined this up correctly, and we want to give the script a name; you can call it anything, let's call it RunTerraform. And then we'll give it a displayName, and I think it doesn't like the fact there's a space there, the name can't have a space, so our displayName can of course be "Run Terraform". Cool. And then finally, in our script we have to specify the environment variables that we had stored away in our variable group; you have to specify them here so they are actually passed through to Terraform, and we do that by specifying an env directive, and then we specify our environment variables. Now, I'm sure you won't mind, I'm going to copy this over, because it would just be way too laborious to watch me typing all that in, but basically we are just specifying the environment variables that we want here, and this is getting them passed from our variable group through to Terraform, which will populate these values. So again, it's secure: there's no visibility of what these values are, other than going into that variable group in Azure DevOps. All right, so there is every chance that I've made some kind of typo in this, but let's run it anyway, let's be brave and run it and see how we get on. Now, just before we do run it, what I want to do is clean up all our resources on Azure, take them all down so we've got a clean slate, and remove our image from Docker Hub, again, just to start from an empty playing field. Now, the way I want to remove resources from Azure is actually to use terraform destroy to get rid of everything, rather than going into Azure and taking all that stuff down manually. So back over in our project, we have a terraform.tfstate file and a terraform.tfstate.backup file that we should technically no longer need, because our main.tf file is now making use of an Azure backend. So let's get rid of them, let's delete these files from our solution.
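Putting the pieces described above together, the new provision stage looks roughly like this. This is a hedged sketch, not the exact course file: the stage, job and variable group names are the ones chosen above, and the env mappings assume the variable group stores the service principal values under the standard ARM_* names the Terraform azurerm provider reads (secret variables must be mapped into env explicitly like this, they are not exposed to scripts automatically):

```yaml
- stage: Provision
  displayName: 'Terraforming on Azure'
  dependsOn: Build
  jobs:
  - job: Provision
    displayName: 'Provisioning Container Instance'
    pool:
      vmImage: 'ubuntu-latest'
    variables:
    - group: terraform   # the variable group created earlier
    steps:
    - script: |
        set -e
        terraform init -input=false
        terraform apply -input=false -auto-approve
      name: RunTerraform
      displayName: 'Run Terraform'
      env:
        ARM_CLIENT_ID: $(ARM_CLIENT_ID)
        ARM_CLIENT_SECRET: $(ARM_CLIENT_SECRET)
        ARM_TENANT_ID: $(ARM_TENANT_ID)
        ARM_SUBSCRIPTION_ID: $(ARM_SUBSCRIPTION_ID)
```

The indentation matters a great deal here, as the narration stresses: stage, jobs, job, steps and script each sit one level further in.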
They shouldn't really be there; we definitely don't need them. Cool. So I just want to do a quick terraform init, just to initialize our backend; that's all looking good. And then let's do a terraform destroy, and what that will do is go off to Azure, look at what's running, and we do have some stuff running that we had set up manually, and it will tell us what it's going to delete. So it's going to destroy two things, which is correct: a resource group and our container. And if you want to look in detail, you'll see all the stuff it's going to do. So we'll say yes. So again, this is a great tool if you're just setting up development environments and tearing them down. I know when I'm preparing for these videos I'm often having to go into Azure and set things up and take them down, and this is great: rather than me going in there and manually deleting stuff, and possibly deleting the wrong thing, Terraform does it for you. So that's cool, and you'll also notice that there's no state file being generated in our local file system any more; that's because we're using the blob storage up on Azure, which is also cool. Now, while it's doing that, let's just pop over to Docker Hub and delete the image over there. So I'll bring over my web browser so you can see what we're doing. We had our image up here; let's go to settings and delete that repository; you have to click it first, and then it asks you to type in the repository name, weatherapi, okay. All right, cool, so we should have no repositories. Hmm, that was weird, I thought that would have deleted; it was asking me to sign in, I don't know what's going on with this today, it's just signing me back in. Yeah, okay, that is very, very weird, never seen that before. Let's try this again; possibly my session timed out or something like that, I don't know. Try again, that's better. Okay, I think maybe our session timed out or something of that nature. So cool, we've got no repositories on Docker Hub, and our resources have been destroyed on Azure.
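The local clean-up sequence just performed can be sketched as commands (assuming the azurerm backend is already configured in main.tf):

```shell
# Remove the now-redundant local state files (state lives in Azure blob storage)
rm terraform.tfstate terraform.tfstate.backup

# Re-initialize against the remote backend, then tear everything down
terraform init
terraform destroy    # review the listed resources, then confirm with "yes"
```

Because the state is remote, `terraform destroy` still knows exactly which resources it manages even though the local files are gone.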
So we have a completely clean slate. So now what we want to do is simply push our new azure-pipelines file up to GitHub, as well as our edited main.tf file. So let's just do a quick git status; we can see that yes, we've deleted those state files from our repository, I had actually committed them earlier just to try and clean up, so we'll get rid of them anyway, and we have modified azure-pipelines and main.tf, which is what we want to push up, basically. So, git add, then, as I always think, git status just to check it, yep, cool, and then git commit, "pipeline file", okay, and let's kick this off, and that should trigger a build on Azure DevOps, and hopefully we have some degree of success. So let's go back into our project. But wait, we've not pushed it up, I've just committed it, so git push origin master. Okay, there we go, that's better; I don't know what I was thinking, I think I'm losing my mind. So yes, we're pushing the repository up to GitHub, and that should trigger a build in a couple of seconds, there we go. Cool, so we'll just sit back, relax, and see how that goes; grab a cup of coffee and come back when it's done. You'll see here, actually, that we now have this two-stage build process: this first little blue blob that's now building is the build of our image, and the second one is the provisioning stage that we added into our new azure-pipelines file. So, you know, that's been pushed up, it's been taken by Azure DevOps, and it's now working with it, that's looking positive, and if you click over here you can just see a sub-view of the two stages, and we'll see how they run. This should still work, we'll keep our fingers crossed for this one and see how we go. All right, so our image has been built, we've got green ticks there, so if we move over to Docker Hub and refresh, you should see our image there, and indeed we do. And while our provisioning step is building, we can have a look
and see what's there. So we have, yeah, our image, and we have it tagged with our build number, 236, cool. So let's see how this provisioning step goes; you can actually drill into... well, that's right, oh dear. So we have some kind of issue on our Run Terraform step here, so let's take a quick look in and see what's going on; let me make the colors a bit better, get rid of that. Okay, so we can see here that Terraform has initialized successfully, so that's basically the terraform init step, cool, that's all good, and you can see here it's actually started to attempt to create our resource group, and that looks like that was successful, and then it's moving on to attempting to create our container group, which is going to contain our image, and that's where it errors out. So, hmm, let's have a look over on Azure and see what's happened there. So let's do a quick refresh, and I'll just remind myself what the name of the resource group we were wanting to create was; this is the name here, from our main.tf. So I just want to verify that we have that resource group created, and we do, so it's kind of half worked, but we've got to the point where we wanted to provision our container, and that's failed. Let's have a look in a bit more detail at what the error was: "error creating/updating container weatherapi: failed sending request... inaccessible image: the image binarythistle/weatherapi in container weatherapi is not accessible". Okay, so I actually know why this has failed, and it may surprise you to hear that it failed deliberately, just to kind of make a point about versioning our Docker images. So let me just take you through why this has failed, and then we'll fix it up. So yes, as I said, the issue that we're having is all about the way we are tagging, or creating, or naming, our Docker images. So just to review again, the standard convention for people like me or you who are creating Docker images and pushing them up to Docker Hub will be: your Docker user ID, so in my case binary
thistle, then a slash, and please note there are no spaces, I've just put spaces in here on the slide to make it very clear, then the name of the image, so in our case weatherapi, and then a colon and a potential version. So in our pipeline, in our azure-pipelines.yaml file, this is what would be getting pushed up to Docker Hub, something like that; I've just put in an actual value of 123, but that would be the build value, the build number, which changes every time we run the pipeline. So that's what our azure-pipelines file is creating, and we saw that just in the last step: the Docker image has been pushed up with a build number. In our Terraform main.tf file, we are specifying that we want Azure to run a container with the name of this image here, okay, and we're not specifying any version, and when that request goes to Azure, Azure interprets it as this, and I'll actually pop over to Azure in a moment and have a quick look at what I mean there: if you don't supply an actual value for the version that you want, then it will just default to the latest version, and as a result it will throw an error. Now, the one question you might have is: we actually used Terraform in its current form, the main.tf file, to provision resources on Azure before, and it all worked; how did that work? Well, it worked because when I ran Terraform at the command line, I had previously pushed a Docker image up, not to GitHub, but to Docker Hub, without specifying a version. And so, basically, and you can maybe recall, you can even go back in the video, you would see that in the versions list there was actually a "latest" version. So that's why in that instance it succeeded for me manually, all because I didn't specify a version at the command line when I pushed the Docker image up, and now it's throwing an error. So yes, it's worth remembering that when we manually pushed, we didn't specify a version, and it actually worked. Okay, so let me just pop over to Azure and I'll quickly show you what I'm talking about.
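For reference, the naming convention just described looks like this when tagging and pushing manually. This is a hedged sketch; binarythistle is my Docker ID, so substitute your own:

```shell
# Full image reference: <docker-user-id>/<image-name>:<version>
docker build -t binarythistle/weatherapi:123 .
docker push binarythistle/weatherapi:123

# Omitting the version implicitly means :latest
docker build -t binarythistle/weatherapi .
docker push binarythistle/weatherapi    # pushes binarythistle/weatherapi:latest
```

That implicit ":latest" behavior is exactly why the earlier manual run succeeded and the pipeline run, which only pushed a numbered tag, did not.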
Okay, so let's pretend we're going to create a new container instance resource. I'm going to... ah, I cleared the source code, didn't want to do that, come back out of that, and go to create a resource, my apologies. So let's pretend we're going to add a new container instance; I just wanted to show you, just to be super clear, what's happening. Let's go create, and if we scroll down and select Docker Hub as the registry that we want, what you can see here: "if not specified, Docker Hub will be used for the container registry and the latest version of the image will be pulled". So basically, what's happening is: the only image that exists on Docker Hub is specifically tagged with a version number, and when we try to create with the default, which is using latest, it fails. So hopefully that makes sense. So how do we rectify that? Let's go back to the PowerPoint. How do we rectify that? So my initial go-to response was: well, in our azure-pipelines file, why don't we just take this value out, why don't we just not specify the tag value? And I'll just show you where that actually happens, again just to be clear, so you know what's happening. In our azure-pipelines file, in our build step, we apply this tag to the image here, and that tag is basically the build number, okay, so that's where that's coming from. So I thought initially, why don't we just get rid of that, not bother tagging the image, and that means the image that we push up to Docker Hub will be this latest image; it wouldn't actually have a build number, and it would all work. Sounds like an acceptable solution, but it won't work. Why won't it work? Well, this is about an interesting concept, if you've not heard of it before: Terraform generally, and the terraform apply command specifically, is what we call idempotent, be careful how you pronounce that. What does that mean? Well, idempotence is a concept in mathematics and computer science, and basically, very simply, it means when we
perform the same operation again, we will get the same result. So terraform apply is idempotent. Now, what does that mean for us? Well, if we didn't specify a build version in our azure-pipelines file, and we just pushed up an image to Docker Hub that was tagged as latest, the first time we run our pipeline everything will work as expected, because what will happen is: our image will get pushed up to Docker Hub, Terraform will run, it will see we want to provision a container with this image name, which would just be binarythistle/weatherapi, cool, and that would all work perfectly. However, we then make some changes to our code, we rerun our CI/CD pipeline, and what will happen every time after that is Terraform will look and go: has anything changed? No. From a Terraform perspective, all it's looking at is the plan that it needs to create, and in the plan, nothing has changed. It's not really aware that code has changed; it's only aware of the plan it has been given, and according to the plan nothing has changed, and so nothing will happen. It won't actually deploy a new image down to Azure. So that's where idempotence comes in: you could run it again and again and nothing will change, because as far as Terraform is concerned, it's the same operation. So specifying a build number will actually remedy this. So what we'll do next, to finish this whole thing off and rectify our issue, is we'll leave our azure-pipelines file as is, still tagging the image, but what we'll do is we will pass that tag, that build number tag, through into our main.tf file, and that means every time it runs, when Terraform looks at what it has to do, because it's got a new build number every time it runs, it will go: ah, something has changed, I actually need to do something. The idempotency clause, or principle, does not kick in, and we then get our code flushed down to Azure. So hopefully that made sense. Let's put versioning into our main.tf file, and we'll rerun our pipeline.
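The idempotence argument above can be illustrated with a tiny toy "apply" function. This is a hypothetical sketch, not Terraform's real engine: reconciling toward the same desired state twice produces actions the first time and nothing the second, which is exactly why an unchanged `latest` tag never triggers a redeploy.

```python
def apply(desired, current):
    """Reconcile `current` state toward `desired`; return (new_state, actions taken)."""
    actions = []
    new_state = dict(current)
    for name, config in desired.items():
        if new_state.get(name) != config:
            actions.append(f"create/update {name}")
            new_state[name] = config
    return new_state, actions

desired = {"weatherapi": "binarythistle/weatherapi:latest"}

state, actions = apply(desired, {})
print(actions)   # first run: ['create/update weatherapi']

state, actions = apply(desired, state)
print(actions)   # second run: [] -- idempotent, nothing to do
```

Changing the image tag to the build number changes `desired` on every pipeline run, which is what forces the "apply" to act again.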
We'll also make some code changes to make sure they actually get flushed through, and we'll see how we go. Okay, so what we want to do, back over in our azure-pipelines file, is to kind of replicate what we're doing with our build image and pass that tag value through to Terraform so it can use it. Number one, that will resolve the issue we're currently having with the pipeline, and it will also solve the issue we would have potentially had in the future had we just reverted to using the latest image. So let's do that now. Now, the way you do that is by providing a user-defined environment variable through to Terraform. Now, if you want Terraform to pick them up as variables, they're all prefixed with TF_VAR_, and then you just give the variable the name that you want. Now, you can call it anything you like; let's call it imagebuild, that seems as good a name as any. And what is the value? We simply just pass over this tag value here that's defined further up in our azure-pipelines file, which is just getting the build ID. So from an azure-pipelines perspective, that's all we need to do: we just need to introduce this new custom environment variable and pass over the tag value. Then we move over to our main.tf file. Now, what we really want to do is, here, when we're requesting the creation of our container, we want to basically have that value available in here, and you won't be surprised to hear that the way we do that is by actually declaring a variable in our main.tf file. And you may recall I was talking about the types of file in Terraform: our main.tf, our state file, and you can also push out your variables, if you've got a lot of them, into a separate variables file. We're only going to create one variable here, so I don't think we need to do that; we're just going to create our variable within our main file. So the way you do that: you basically declare a variable and give it a name. Now, this
is important: the name is going to be just this section of the environment variable that you defined, not the whole thing, just the bit after the second underscore. Which, I don't know, you know, this is all documented, but I had to play around with it a bit before I got it to work; I put the full thing in initially and it didn't like that, so anyway, just my little complaint. So that's the variable that you declare. We don't give it a default value, as that value will be passed in from Azure Pipelines, and we can give it a description, "latest image build", which helps if things aren't working. Okay, and then all we need to do is just append it into here, into our image name, so it takes on a new build number every time, and the way you do that is dollar sign and curly brackets, then you've got to specify var dot, and then you just give it the name of your variable here. So that's the same pattern you can use time and time again: you can pass over custom environment variables from your command line, or from your azure-pipelines file, they will get passed in here, you have to declare a variable up here, and then you can just use your variables within the creation of your resources. So it gives us a degree of flexibility, which is very, very nice, and best practice would typically be to pull these variables out into a separate file if you have maybe more than one of them; we just have one, so I think it would be a bit of overkill to do that. So I think that all looks pretty good, hopefully that makes sense to everybody. So when we request the creation of the container, we're going to be asking for the specific build version, and everything should work. Okay, so I think all that's left to do is to commit this code back up to GitHub and rerun our pipeline, and see how we go. All right, so let's bring up a command window, a terminal, and just do a git status to see what... oh, actually, before we do that, the one thing I do want to do, and I said we would do it, I remembered: let's update our WeatherForecast controller.
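Sketched out, the two changes just described look roughly like this. This is a hedged sketch: imagebuild is the variable name chosen above, the resource label and the omitted container arguments are placeholders, and `$(tag)` is assumed to be the build-number variable defined earlier in the pipeline file. First, on the Terraform script step in azure-pipelines.yaml:

```yaml
      env:
        TF_VAR_imagebuild: $(tag)   # TF_VAR_ prefix exposes it as a Terraform variable
```

And in main.tf, the matching declaration and its use in the image name:

```hcl
variable "imagebuild" {
  type        = string
  description = "Latest image build"
}

resource "azurerm_container_group" "tfcg" {
  # ...other arguments omitted for brevity...
  container {
    name  = "weatherapi"
    image = "binarythistle/weatherapi:${var.imagebuild}"
    # ...
  }
}
```

Because the variable has no default, Terraform will fail fast if the pipeline forgets to pass the tag through, which is a useful safety net.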
Let's update our WeatherForecastController. These are the, I guess you'd call them, weather descriptors that come out of the box with the template; let's change them to something else just to test that we're actually getting the new code flushed through as well. I'm not going to type this all in, because it'd be very boring to watch me do it; since the data is somewhat randomized and we don't necessarily know which one of these we're going to get when it runs, all I'm going to do is replace all of them with triple X's. Okay, so we should see "XXX" when we run our API, if this all works. Cool.

So let's do a git status, and yep, our three files have changed: we've changed our controller, we've changed our azure-pipelines.yaml, and we've changed our main.tf. Now again, this is a great demonstration of code being deployed as a unit: we've got some infrastructure code, we've got pipeline code, and we've got actual application code, all being pushed up to GitHub at the same time. Awesome. So let's do a git add on everything, then a git commit with the message "fixed our pipeline", and that's all good. A git status confirms our local branch is clean, so let's do a git push origin master to kick off our build on Azure DevOps, and we'll see how we go; hopefully this time we have some success.

So let's move back over to Azure Pipelines and click on Pipelines. I think that's already kicked off; there it is, here's our second build coming through. We just have to sit back patiently with our fingers crossed and see how we go. Okay, so our push to Docker Hub has worked, and we've moved on into our provisioning step. I'm fairly confident that if you went over to Docker Hub it would be okay; I'm going to leave that off the screen just so we can see what's going on here. Here we go: the job finished initializing, we're running Terraform apply, let's see how it goes. It's initialized the backend, it's initializing provider plugins, cool. We should start to see our resources being created.
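The git sequence used above follows the usual add/commit/push pattern. Here it is as a self-contained sketch you can run in a throwaway directory; the file name and commit message are just stand-ins for the real changes, and the push is left commented out because it targets the real remote.

```shell
# Self-contained sketch of the commit flow: init a throwaway repo,
# stage a change, commit, and confirm the working tree is clean.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git config user.email "demo@example.com"
git config user.name  "Demo"

echo "XXX" > WeatherForecastController.cs   # stand-in for the three changed files

git status --short             # shows the pending change
git add .
git commit -q -m "fixed our pipeline"
git status --short             # prints nothing: the working tree is clean
# git push origin master       # in the real repo, this kicks off the Azure DevOps build
```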
Last time, we did get as far as the resource group being created, and because we already have that resource group there, it shouldn't recreate it; it should just know that it's already there based on the state file. So that's cool. It looks like it's now going on to creating our container group, and this is a very good sign, because we're seeing the elapsed seconds ticking up, which basically means it's kicked off, it's starting to provision it, and it's just waiting for Azure to finish doing that. So I'm fairly confident that this has worked. We just need to let this do its thing, let it run, and then we will quickly test our API in Firefox, have a look at the payload, and make sure the new code has been flushed through.

Look at that: all green, all clean. There is nothing, in my view, nicer than two green ticks on a multi-stage build pipeline; it's one of the most satisfying sights to behold. So that all looks good. Let me just remind myself what we called our container image; let's go back to our main.tf file. We called it "weatherapi", so let's go over to Azure to make sure it's been created (I'm fairly confident it has been). Go to All Resources, do a refresh, and there we go: our weather API has been created and it all looks like it's up and running. Okay, so let's copy the URL, pop over to Firefox, and type it in here, not forgetting to add the controller route, /weatherforecast. And there we go: it's up and running, and as you can see, the summary has been replaced with all X's, so our code has been flushed through.

Now, the real way to double, double test that would be to go back into our WeatherForecast file, change it back to the way we had it, change nothing else, push back up to GitHub, and rerun our pipeline. I'm not going to do this, because I'm absolutely 100% confident that change would be flushed through as well.

So we got over the idempotency issue, and we also fixed up our pipeline. With that, I don't think there's much else to go through, so I'm just going to move on to our final bit of wrap-up: I've got a bit of homework for you, a couple of improvements you could possibly make to this solution. Let's do that next.

Okay, so I wasn't joking when I said I had homework for you. Now, I know that for those of you who left school and college many years ago, you'll be quite surprised that some random dude on the internet is giving you homework, but life is weird like that, isn't it? And I do expect it on my desk first thing Monday morning, no excuses. All joking aside, though, I do have a couple of things I think could be improved; well, there's probably more than a couple, but two questions spring to my mind. One: if you were to run Terraform at the command line now, would that work? And if not, how would you fix it? The other one is: our image name, binarythistle/weatherapi, is hard-coded in two separate places, in our azure-pipelines file and also in our main.tf Terraform file. Is this a good approach? Could it be improved? I'll leave that with you.

Other than that, all I have to say is thank you once again for watching. If you liked the video, give it a like; if you want to see more of the same stuff, why don't you subscribe, if you've not done so already, and you'll be notified when I post new stuff up. And to my wonderful followers on Patreon, again, thank you so much for the little bit of extra support that you give me; I really appreciate it, and your names are going to be coming up next. Other than that, to you all: stay safe, stay happy and well, and I will see you again next time. [Music]
Info
Channel: Les Jackson
Views: 36,627
Rating: 4.9761295 out of 5
Keywords: dotnet playbook, .net, .net core, 3.1, c#, azure, azure devops, terraform, iac, infrastructure as code, hashicorp, cicd, pipelines, docker, docker hub, step by step, course, tutorial, les jackson
Id: Ff0DoAmpv6w
Channel Id: undefined
Length: 125min 1sec (7501 seconds)
Published: Sun May 10 2020