AWS Nordics Office Hours - Build and deploy .NET 5 web apps using AWS CodePipeline and AWS CDK

Captions
Welcome everyone to the AWS Nordics Office Hours! Things can happen when we are live, so this is the weekly show from the Nordics where I bring on guests to talk about techy stuff, and this time we started out by talking about audio, apparently. Every week I have an AWS expert on to talk about a specific topic, and you, our viewers, are able to ask questions around that topic, and hopefully by the end of the hour you've learned something new and also had a lot of questions answered. So feel free to start using the chat, let us know that you're here and watching. This week I am joined by our old friend Alexander Draganov, so welcome back, Alexander. Thank you. Great to have you with us again. We've talked about .NET, we've talked about CDK, and this time we're going to combine the two and even add some additional things. But first, tell people a bit about yourself, if they haven't watched you on the show previously. Yes, good afternoon, my name is Alexander Draganov and I work at AWS as a Partner Solutions Architect. I specialize in Microsoft workloads on AWS, and especially in .NET on AWS, so again, the topic of today's session will be something around .NET, as before. Exactly. Well, tell us about the topic of this week then, what are we going to spend this next hour on? Yes, so what we are going to do today is to build an end-to-end CI/CD pipeline for a .NET 5 application. We will build it from scratch, and we will do it in less than an hour, obviously. We will use CodeCommit, CodeBuild, CodePipeline and CDK to create a .NET 5 web application and deploy it to Elastic Beanstalk, and the Elastic Beanstalk environment will itself be created by CDK. Again, it will be an end-to-end demo, from scratch, in less than an hour, so this is our goal for today. Very cool. And the entire suite of Code tools, I guess we'll go through them one by one as we get to each section of the demo, when you're building the application.
And CDK is something that we've talked quite a lot about on the AWS Nordics Office Hours, but let's cover some basics there as well when we get to that point. So, where do you want to start, Alexander? As I said, we'll start from scratch, so we need a place to store our source code, so we'll start with CodeCommit. All right, your screen share is up, let's go, tell us about CodeCommit to begin with. So, CodeCommit is a way to store your source code; it's just a Git-compatible repository on AWS, so everything that you can do with your Git repository, all your Git commands, just works with CodeCommit. A convenient way for you to work with your Git repository on AWS, I would say. And that means, compared to using other Git providers, that you're storing your source code within your AWS account? Yes, exactly. Right, so let's get started then. Yes, as you see I don't have anything in my account, so let's start with creating a repository, and to make the name fancy, it will be "live demo". That's it, the repository is created, now we need to access it, so I'm just copying the clone URL, and it looks like this: git-codecommit, then the region, and the name of my repository. I will switch to my development environment, and this is the standard command to clone a repository: git clone, plus the URL which I copied. Do you ever see anyone using SVN these days instead of Git? Ah, it was a long time ago that I used SVN, so I think nowadays everyone is using Git. Yep, it seems to be the de facto standard today. Okay, so I just cloned an empty repository, so I need to add some file to it. I will switch to the main branch and just touch a README file, and then standard commands: git add, git commit. As you see, nothing proprietary, just standard Git commands. Okay, so now my README should be in our Git repository; let's switch and look at the repository code, and yes, you see we have our README in our CodeCommit repository.
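The CodeCommit workflow shown here is nothing but standard Git. As a minimal sketch (using a local bare repository as a stand-in for the real CodeCommit clone URL, and with placeholder author details):

```shell
# Stand-in for the CodeCommit remote; with a real repository you would use
# the HTTPS or SSH clone URL copied from the CodeCommit console instead.
rm -rf /tmp/live-demo-remote.git /tmp/live-demo
git init --bare /tmp/live-demo-remote.git
git clone /tmp/live-demo-remote.git /tmp/live-demo
cd /tmp/live-demo

# Placeholder identity for the demo commit
git config user.email "demo@example.com"
git config user.name "Demo User"

# Switch to the main branch, add a README, and push it
git checkout -b main
touch README.md
git add -A
git commit -m "Add README"
git push origin main
```

Nothing above is CodeCommit-specific, which is exactly the point made in the session: any Git client and any Git workflow carries over unchanged.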
But the goal of the session is to create an ASP.NET web application, so let's create it, and let me grab a piece of code so I don't have to type it. I'm just scaffolding a standard web application using the dotnet new command, and the name and location will be sample-app. Very easy. Now dotnet will restore some packages, and to add a touch of live demo to this, I will just change the name of the application: where is my page... so instead of "Welcome" it will be "AWS Nordics Office Hours", so you see that it's a real demo. Let's run this application just to see how it looks; it's a standard scaffolded ASP.NET application. And this is using .NET 5? Yes, it's .NET 5. Everyone, I guess, is waiting for .NET 6 to be released; it's at preview 1, but because it's a preview, we will work with .NET 5, the released version. Let me show how our application looks: a standard application, with "AWS Nordics Office Hours". Okay, so this is the application which we are going to deploy to AWS. But right now this application is a local one, so we need to push it to the repository, and before we push it, we need to tell AWS how to build this application. In order to do that, I need to create a file called buildspec.yaml and add some content to it, and I will show it in a second. So, this is my buildspec.yaml, and this is the first version; this file will evolve during today's session. Here we are saying that I need the .NET 5 runtime, then I specify the build command, and for that I'm just using dotnet build; then I'm specifying a post-build command, dotnet publish, which will publish to the output folder, and this is the name of the output folder which I'm specifying as the output artifact. So this is the simplest version of a buildspec.yaml for a .NET application. And this is used by CodeBuild? It will be used by CodeBuild, and I will show it in a second, once we introduce CodeBuild into the pipeline.
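As a sketch, the first version of the buildspec.yaml described here might look roughly like this (the folder and project names are assumptions for illustration, not taken from the actual demo files):

```yaml
version: 0.2
phases:
  install:
    runtime-versions:
      dotnet: 5.0          # ask CodeBuild for the .NET 5 runtime
  build:
    commands:
      - dotnet build sample-app
  post_build:
    commands:
      - dotnet publish sample-app -c Release -o ./output
artifacts:
  files:
    - '**/*'
  base-directory: output   # everything under ./output becomes the build artifact
```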
All right. Then I need to also add a .gitignore file, because I don't want binaries to be included in my project. And the .gitignore file is used to choose, basically, what you're able to commit? Yes, or what I want to exclude, actually: I want to commit everything except the files which are generated by the build. You see, now it's going to commit a hundred files, but once I save this file it should be... yes, 54. So I don't want any generated binaries to be committed. A question from the chat was: what does the file buildspec.yaml do? So, the buildspec will tell the build agent how to build this application: first it will compile it, dotnet build, then it will publish it to the output folder, and then we specify what artifacts to take from this output folder. This is the result of the build process: take the sources, compile them, and put the binaries into the output folder, and this is what you specify in the buildspec.yaml. Later you will see how this file evolves, when we add CDK and when we add unit tests, so you will see more and more lines in this file, but for now: build, publish, and specify where my artifacts are. Okay, so now we need to commit all our changes, so again git add with all files, git commit, and push to the main branch. And there is one more thing which I need to do: create an S3 bucket to store the artifacts, because when we create our CodeBuild project we need to specify where to put the binaries. For that we're going to create an S3 bucket using the AWS CLI, and as you know, the bucket name should be globally unique, so let me name it like this, and I hope this name is unique... yes, okay. So we created the bucket, and we have all our sources in CodeCommit: the sources, the buildspec, the .gitignore. So the next step is to create a build project, and in order to build the project we are going to use CodeBuild. Let's get started: create build project, and it will be named sample-app. And how can we simply explain what CodeBuild does?
So: we have source code, which is just C# files in our example, and we need some agent to compile this code, some environment for the compilation process to run, and this is what we get with CodeBuild. Yep, simple. So, for sources you can use S3, GitHub, GitHub Enterprise, Bitbucket, but in our case we're using a CodeCommit repository, the repository which I just created, picking the main branch, because this is what I want to build. Next I need to specify the build environment, meaning how this build machine should look. You can use your own custom image if you have a very specific build environment, but I'm using a managed image: Ubuntu with the standard runtime, version 5. Then it's going to create a new service role for me, and for our builds to be a bit faster, since we will probably spend a couple of minutes, I'm just using a bigger compute instance. Tell us why you chose to use Ubuntu in that case. It's because .NET 5 is not a long-term support version, so it's only available on the Ubuntu platform, and let me find the link where the runtimes are specified, just a second. I can paste that in the chat. Yes, but I will show it here as well. So, these are the available runtimes, and for .NET, if I want to use .NET 5, I need to use Ubuntu standard 5.0. If I use .NET 3.1, which is the long-term support version, I can use either Ubuntu or Amazon Linux 2.
But for .NET 5, this is what I'm using, and this is what I configured: Ubuntu standard 5.0. And instead of two vCPUs, I'm using four vCPUs. Then, for how to build it, I said that I'm going to use a buildspec file; instead, I could have the build commands specified here as well, one by one, but we are going to rely on the buildspec file, which will be part of our source code. For artifacts we are going to use S3, and this is the bucket which I just created, so once the build process is done, all the binaries will be put into this folder. And I think that's it: create build project. There's a question in the chat around prod or dev, different environments or stages: whether you can have an automatic trigger for the dev environment and a manual trigger for prod; perhaps this will be shown better at a later stage, when we get to the pipeline. Yes, CodePipeline is a service which helps you orchestrate the build process, and in CodePipeline you can have different branches for different environments, so you can orchestrate your build process with CodePipeline, and we will come back to it in a few minutes. Okay, but now the build has started, as you see the build is ongoing. So what it's doing now is setting up the runtime based on the buildspec, fetching the sources... Yes, and then doing the compilation, and you see the whole output here: restoring NuGet packages and then uploading artifacts to S3. The upload succeeded, so I suppose in a second the build will be marked succeeded, and we can see that our S3 bucket is not empty anymore. Any moment now... well, it's always the case. So, what you entered into the buildspec file is what's being used by this CodeBuild project to do the different steps that you've defined. Okay, so: succeeded. Build details, and this is my artifact folder location, and yes, this is the application, and this is the binary folder with all the published binaries required for this application to run. Okay, so we have our source code,
we have our Git repository set up. Now we want to trigger a build every time we push changes to this source code repository, because as you saw before, it was a manual process: I just clicked the button, triggered the build, and then my artifacts were created. To do that, we are going to use CodePipeline. So we're going to create a pipeline, name it sample-app as well, and create a new service role. Okay, so we're specifying CodeCommit again, and how can I explain why we're specifying it again: here we are specifying when we want to build. We want to build when something is changed in this repository, in this branch. Before, when we configured the CodeBuild project, we specified what we want to build, and here we specify when we want to build. So we are using CodeCommit, this repository with this branch, and it will be triggered based on CloudWatch Events, so every time there is a commit to your CodeCommit repository, there is a CloudWatch event. And that in turn will then trigger... Yes, it will trigger this pipeline. Okay, so we specified what the source is, or rather what the trigger is, and now we specify the build provider; we can use either Jenkins or CodeBuild, so we're using CodeBuild, and this is our project name, which we created five minutes ago. For deployment, we will skip the deploy stage, because we need to create the infrastructure first, so let's skip it, and that's it, let's create the pipeline. There we go. Yes, and it's automatically triggered, because from the pipeline's perspective there is a new commit. And a pipeline, if people are unaware of what that actually means, how would you explain what a pipeline is? It's a kind of ordered set of steps that you want to perform: the first step, get the sources; the second step, build them. Pretty much the easiest pipeline there is: two steps only. And going back to the question from earlier, you would then add
additional steps based on your different environments, for instance, and just define how you want those steps to proceed. Yes, exactly. Okay, so while it's building, let's add a unit test to our project, and I will switch to the console. There we go. Again, it will be a simple xUnit test project, which we name sample-test, and it scaffolds, I think, one unit test. So if we run dotnet test, you will see that one test should pass... yes, one test passed, all green, this is how it should look. Now we need to update the buildspec, because we need to specify that we want to run these tests as part of our build process. This is just a basic test, but what do you usually include in unit tests for different kinds of projects? It's just based on your needs: a unit test is a test for your project, for you to know that everything is good and you should proceed to the next step in your pipeline. Okay, so I'm just modifying the buildspec, and I will show what has changed; it's only two changes. There is a second command after dotnet build: we are running dotnet test, so if the build succeeds, we run the tests; we specify the format of the logger, trx, and we specify the output folder for the test results. So that is the one line which was added to the buildspec, and second, we added a reports section where we specify that we want test results from this folder, in the Visual Studio trx file format. And I think this shows even more clearly what the buildspec file actually does: you're adding an additional command to run that unit test that you just defined, and you could basically just copy-paste that command and run it yourself as well, right? Yes, exactly, it's a standard dotnet command, nothing buildspec-specific, nothing AWS-specific. Okay, so let's commit it. And again, if we switch back, our previous build succeeded two minutes ago.
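The two buildspec changes described here might look roughly like this (the project folder, results directory and report group name are assumptions for illustration):

```yaml
  build:
    commands:
      - dotnet build sample-app
      - dotnet test sample-test --logger trx --results-directory ./testresults
reports:
  unit-tests:
    files:
      - '**/*.trx'
    base-directory: testresults
    file-format: VISUALSTUDIOTRX   # CodeBuild understands the Visual Studio trx format
```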
And now, because the sources changed, it should automatically trigger the next build, with unit tests, so let's see... any second now... yes, it's triggered, with the commit which we just did, with the unit tests, and now it's building; but instead of only building the application, it will also run the unit tests, so let's wait, I guess, about one minute. Yep. A question in the meantime: basically, the cost for CodeBuild, how does that work? It's a pay-as-you-go model, as with most AWS services, so you pay per minute for the build agent while the build runs. In our case, if our build ran for two minutes, I pay for two minutes of build agent time. And there are different build agents with different compute power, from 2 vCPUs to, I think, 36 vCPUs, and there is even a build agent with a GPU, so based on the instance size of this build agent, the price per minute will differ, but you pay per minute for the build agent. And that's really one of the great things about a service like CodeBuild: it spins up the instance, or the container, that you're running your build project on, and then it spins down, so you only pay for the time that it is active. Yes, and another good thing about CodeBuild is that it runs in a new Docker container every time, meaning that your build environment is immutable. So if you build something and the build succeeded, and next time the build failed, it means it's not something with your build environment, it's something with your source code. Okay, so the build succeeded; if we go to details, to reports, we should see our one unit test passed, with a 100% pass rate. Okay. So now we have our source code, we have our build process, the application is successfully compiled and tested, we have artifacts; now we need to deploy it somewhere. So we need to define our Elastic Beanstalk environment, which means we need to create the infrastructure project which we will use to deploy this code, and to do that we are
going to use CDK. Do a quick intro to CDK, what it is. So, CDK is a way for you to build your infrastructure, not using YAML or JSON files but using your favorite programming language, and in my case we're going to use C# to create the infrastructure. Under the hood it will be compiled into a CloudFormation template, which will be deployed using the CloudFormation provisioning engine. But if you don't prefer .NET, you could use Python, TypeScript, Go, I think... Yes, F#, Java. So most languages are covered. Okay, so I'm running the cdk init command, and I'm specifying C# as the language, because we are talking about .NET today. Then I need to install a couple of NuGet packages for Elastic Beanstalk: the first package is for Elastic Beanstalk, and the second one is for IAM, because we need to create some policies and instance profiles. Let me paste it, and here is IAM. Then I will copy a piece of code which I will explain in a moment. And the packages you installed, they contain the constructs that CDK uses to define the infrastructure? Yes, exactly, and because we're going to create an environment for Beanstalk, I installed the constructs for Beanstalk and for Identity and Access Management. Let me switch to Visual Studio Code; as you see, we have our project structure: we have our source code, next to it our tests, and next to that we have our infrastructure, so we are really working with infrastructure as code, side by side with our application. This is what was generated by CDK, but I'm going to paste these 50 lines of code, which I will explain. First we are creating an instance role: we need to specify a role for the instances which will be used by Elastic Beanstalk, and we are attaching the managed policy AWSElasticBeanstalkWebTier. Then we create an instance profile with the role which we just created, and then we specify some options for Elastic Beanstalk: first we specify an instance type, so we are going to use
t3.small instances, and we are specifying the instance profile which we just created, which is based on the role which we just created, so you see, it's always references. Then we create an application, and we create an environment with this application name, providing the settings. We also want to ensure that the application is created before the environment, so we're saying the environment depends on the application. Basically, that's it: this is how we create our Elastic Beanstalk environment using C# code, and it's much easier to read and understand compared to a few hundred lines of YAML. And this was how many lines, did you say? It's 51, but that includes some usings, spaces, classes, so the actual code is, yes, 40 lines of code, even less. Let's take a look at the output of CDK later, to see how many lines of CloudFormation are created as well. Yes, let's see: cdk synth needs to be run in the folder which contains cdk.json, and cdk.json is the file which says how to build this application. In the meantime, we also need to make changes to our buildspec, to build this CDK project, so let me switch to the buildspec, and I will add a few more lines, which I will explain. Okay, now it's a bit longer, so what do we do here: first we are running npm install, because CDK itself is written in TypeScript, so you install it using npm; we're installing aws-cdk because it's not available on our build agent, that's why we're installing it. Then, in order to build, we are using the cdk synth command: first we build our .NET project, then we test it, and then we synthesize the CloudFormation template. And then we have secondary artifacts: instead of just our application, we now have both the application and our CloudFormation template, so we define two sets of artifacts, one called CloudFormation, which has the template that was generated by CDK, and then the application itself.
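Sketching the buildspec additions described here (the exact folder layout and project names are assumptions for illustration), the install, build and secondary-artifacts sections might look like:

```yaml
phases:
  install:
    runtime-versions:
      dotnet: 5.0
    commands:
      - npm install -g aws-cdk     # the CDK CLI is not preinstalled on the build agent
  build:
    commands:
      - dotnet build sample-app
      - dotnet test sample-test --logger trx --results-directory ./testresults
      - cd infra && cdk synth && cd ..
artifacts:
  secondary-artifacts:
    CloudFormation:                # the synthesized CloudFormation template(s)
      files:
        - '*.template.json'
      base-directory: infra/cdk.out
    Application:                   # the published application binaries
      files:
        - '**/*'
      base-directory: output
```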
And if we look at the CDK output, our 40 lines of code ended up being 300 lines of generated JSON template; again, this is the power of CDK: you write the code in your familiar programming language, and it ends up as 300 lines of JSON. Okay, so we created our CDK project and updated our buildspec file, so now let's push the changes to CodeCommit. So now you're pushing this CDK part to the same repository? In this case, the same repository. And we haven't added any new pipeline steps; the output of this stage is just that, after the build, we have created both the binaries for the application and the CloudFormation template generated by CDK. The next step will be to deploy this CloudFormation template to AWS. Right, and while that's building: if you just joined us, this is the AWS Nordics Office Hours, and I'm joined by Alexander Draganov today, to of course talk .NET and to talk CDK. Through this hour, I'm saying "we", but I'm just watching, to be honest: Alexander is building an application from scratch, taking us through the build process and how to actually deploy it into AWS as well. If you have any questions, post them in the chat; Alexander is here to answer them, and I'm here to just read them out loud. Yes. And so what's happening now: our build agent is just working through our buildspec file, so first it compiles the application, it runs the unit tests, it runs the cdk synth command to compile our CDK project and generate the CloudFormation template, and it puts everything into two sets of artifacts, one with the CloudFormation template, and the second one with the application binaries. So when you work with CDK locally, you would perhaps run the cdk deploy command as well, to then start creating the infrastructure, but in this case that hasn't happened yet? No, what's done here is cdk synth, so it's creating the template but not building the infrastructure. Yes, and what cdk synth does is based on the cdk.json file: for .NET applications, it will say
that it should just run dotnet run, and that is my C# CDK project; this is what cdk synth actually does under the hood. Okay, let's see... the build succeeded, so now we have our CloudFormation template generated from our code, and it's time to deploy it to AWS. We're going to edit our pipeline, and first we need to specify that our CodeBuild step actually has two sets of artifacts: it has Application and CloudFormation, the names which I specified in the buildspec; I hope I haven't misspelled them, that's the most dangerous thing in a live demo, when you misspell something. Okay, done. Then we're going to add a stage, and let's call it Infra. What will this stage do: we are going to use CloudFormation as the action provider; input artifacts we take from CloudFormation; the action will be create or update stack; this is the name of the stack; and this is the template which will be used by CloudFormation, the file which was generated by CDK: artifact name CloudFormation, file InfraStack.template.json. Then I need these capabilities, and then there is a role; this is the only thing which I created beforehand, an IAM role which gives CloudFormation permission to deploy the stack. Okay, I hope that's it: in the CloudFormation action, create or update the stack using this file and this role. Done, save. And because we haven't changed the source code, we will just click release to rerun this pipeline, because it reruns the pipeline with the same source artifacts as last time. Yes: same source, then infra, so it will run again, and this is the step we are interested in, deploying our CloudFormation template. This will take maybe three or four minutes, so let's see if we have any questions. There was a question around Java; that's answered already.
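For reference, the Infra stage configured here corresponds roughly to a CodePipeline action declaration like the following (the stack name, template file name and role ARN are placeholders, not the exact values from the demo):

```json
{
  "name": "Infra",
  "actions": [
    {
      "name": "DeployInfra",
      "actionTypeId": {
        "category": "Deploy",
        "owner": "AWS",
        "provider": "CloudFormation",
        "version": "1"
      },
      "inputArtifacts": [{ "name": "CloudFormation" }],
      "configuration": {
        "ActionMode": "CREATE_UPDATE",
        "StackName": "InfraStack",
        "TemplatePath": "CloudFormation::InfraStack.template.json",
        "Capabilities": "CAPABILITY_IAM",
        "RoleArn": "arn:aws:iam::111111111111:role/cloudformation-deploy-role"
      }
    }
  ]
}
```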
I also posted a link to the CDK documentation, so you can see which languages are supported as of now, and also how to get started using AWS CDK. Next, there was a question around how many AWS services we're using in this case, and all of these services are part of the developer tools, but they all have their very specific use case and do just that; in the end there are quite a lot of services involved, but they all work together. We're using CodeCommit, CodeBuild and CodePipeline so far. Yes, and as you can see, there are also CodeArtifact and CodeDeploy. If you want to publish packages inside your organization, if you want to have your private NuGet repository for example, you can use CodeArtifact to host that private NuGet repository. And CodeDeploy is the service which is used to deploy your applications to Elastic Container Service, to Lambda, or to EC2 instances. We're not going to use CodeDeploy, because we're using Elastic Beanstalk as our target and it's a managed service, but if you want to deploy to EC2, you can use CodeDeploy, or if you want to create, you know, canary deployments to Lambda or to ECS, again this is something you can easily configure in CodeDeploy. You can even use CodeDeploy with on-premises instances as well. Yes. Okay, so now you see that this CloudFormation template is being deployed; if we switch quickly to CloudFormation, we will see that this sample-app stack is now being deployed, and if we go into it and look at the template, this is the CloudFormation template which was generated by CDK and which is now being deployed. It will take a couple of minutes, I think, and I think this is the longest step in the pipeline, deploying the CloudFormation for the first time. Yes, but next time, as we are not going to make any changes to our infrastructure, this step will be very fast. Yes, in those cases it's either create or update your stack, and we haven't done any updates to the template, right? Right, so right now it's
creating everything that you defined in your CDK application, with the instance roles and the Elastic Beanstalk application. Yes, and we can watch the progress here in CloudFormation as well, because under the hood it's just deploying the generated template to CloudFormation. Well, yes, CloudFormation is deploying it. Right. All right, in the meantime, if you have any questions, post them in the chat; Alexander is happy to answer any questions around .NET, even if they don't have anything to do with this specific demo. Question: what happens on deployment failure, does the complete stack roll back, or only the failed step? And we are now talking about CloudFormation, I think. Yes, I would say so; I think it's the default behavior that it will fail and roll back, as far as I remember. And regarding the build step: it will be as it was before. That's right, so it doesn't affect the previous steps in the pipeline; this is handled separately. Okay, so let's check the output. This is the default Elastic Beanstalk environment, because we haven't deployed our application yet; we only created the infrastructure, so this is the target Elastic Beanstalk environment which we're going to update with our application. So now it's deployed, and if we go to CodePipeline... it's already deployed, CodePipeline just hasn't realized it yet. Let's mark it as deployed... okay, now it's successful. So what did we do: we have our application, we have our infrastructure, we deployed our infrastructure, we have our application binaries, so the last step in the pipeline is to deploy our binaries to this environment which we just created. Let's edit the pipeline and add another stage, and let's call it Deploy. I just added a link in the chat about a piece of news from CloudFormation, from a month or two ago, about how you can retry from the point of failure in a stack, so have a look at that. Yes. So what we're going to do is a deploy stage, and the
action provider will be Elastic Beanstalk; input artifacts, this is our Application artifact which we defined as a secondary artifact before; the application name, this is the application which was deployed by the previous step; and this is the environment, and actually that's it. Done, save. Again, because we haven't made any changes to the source code, we just release it manually. This time it should be faster, because the infra is already deployed: it will check, see that there are no changes in the CloudFormation template, and go on to the deploy stage. So again, we have a couple of minutes, I think. Yes, so now it will compare the new CloudFormation template to the one that was deployed previously, but since there aren't any changes, that part is going to be fairly quick. Yes, and the URL of the load balancer will not change, because again, we haven't changed the CloudFormation template, so this magic random number will not change; once everything is deployed, we will just refresh this page and hopefully see the application working, but again, let's wait. Yep, the build time is usually the one that takes time during these demos. All right, so if things go as we hope, when we reload that page after this pipeline is done, we should now see your .NET application running from that Elastic Beanstalk environment. Yes, exactly. And we're still on time, so as I promised, it's a live demo, end to end, in less than one hour. Yes, that's good. Come on, CodeBuild... Yes, when I do it myself, I think CodeBuild is super fast; when I do it in a live demo environment, it's like, okay, why is it so slow? Time flows very differently depending on what environment you're in. Someone says that your clock shows a strange time; I think it's the virtual machine, so this is just a dev box with a clean environment which I'm using, so I think it's somewhere... yes, I don't know where it is. So we just proved that this isn't a recording, it's just his virtual
machine. Yes. But you see this infra step, we already passed it, so now the deployment is ongoing, because we had no changes to the infrastructure. So what happens if you press "in progress", where does that send us? No, I cannot press it. All right, so if we were using CodeDeploy for deployment to EC2 instances, for instance, we would be able to watch the progress, but in this case I suppose we can't, because it's hidden by Elastic Beanstalk. And that's kind of the point with Elastic Beanstalk: yes, it's a managed environment, so you only need to care about your application and not about the underlying infrastructure; this is what I think is great about Elastic Beanstalk. Yes. Question: could you be using Cloud9? You were using Visual Studio, was that right? I was using Visual Studio Code, because, yes, I have both Visual Studio Code and full Visual Studio in my development environment, that's why; but Cloud9 is perfectly fine, because here I'm just dealing with text files, so it should work with Cloud9 as well. Okay, so the build succeeded; let's refresh it, and voilà, our application is deployed. And we can make one small change to deploy version two of this application: for example, if I go to the sources, we trigger the full pipeline again... and my page, let it be v2. All right, so committing that to the repository, which will trigger the entire pipeline. While that is happening, let's walk through the steps that we've done to get to this point. So, first we created our CodeCommit repository, this is where we store our source code; then we created our CodeBuild project, where we specified what sources to build and how to build them, and the how we specify using the buildspec.yaml file, which is also stored in the source code repository; and then we orchestrated the build using CodePipeline. So this is our source stage, this is what triggers the build, and the build is triggered every
time there is a change in our CodeCommit repository. We then build using the CodeBuild project which we created before, and as an output of this build project we have two sets of artifacts: one is the binaries for our .NET application, and the second one is the CloudFormation template created by the CDK. So we have this infra step, which deploys the CloudFormation template to the AWS account to create the target Elastic Beanstalk environment, and then finally we created the deploy stage, which deploys the binaries which we created as part of the build stage to the target Elastic Beanstalk environment. So, end to end, from sources to running environment. And based on this setup, right now we could quite easily replace different parts. For instance, not everyone is a .NET C# developer, but you could pretty much do the same using any other language. Exactly. So if we check how the buildspec.yaml file works, here, instead of dotnet build, it would be your javac compilation step, for example. Yeah. And... it's still building. And now it's deploying. All right, so it should hopefully be done fairly quickly then. And in this case you were using CodeCommit, for instance, but the source could be, as you showed when you added that in our pipeline, GitHub, it could be S3 or Bitbucket and so on. So yeah, our version v2 is deployed. So if we refresh the page... okay, v2 is deployed. An end-to-end pipeline. Very cool, and built in under an hour. So now we built it with a very simple unit test that is green; it's okay, it works. What if a unit test fails? So if a unit test fails, then this build will fail, meaning that we are not going to proceed to the infra and deploy stages. So let me quickly break it, because that's the most exciting part, to break something. Yes. So this is our test project, and this is our test. Let's add another test, test2, where we just throw an exception, meaning that it will fail. And let's quickly see what happens. And since the test is part of the
build step, it will be quite quick to get to that point. So this build step should fail in this case. Yeah, and this is the purpose of a unit test: not to proceed further if something wrong happens here. Right, right. And let's see... one minute... still on time. Yeah, no worries. I have another question I want to ask, but I want to see this fail first. It's not something that you want to hear from a host: I want to see this fail. Also, I want to see... when you make it fail, succeed. So yeah, that was better. Come on, CodeBuild. Someone wants to see a similar session deploying to EKS, and yeah, it would be quite easy to change this into deploying to EKS instead, with some different CDK setup creating the EKS cluster. And I think with EKS it would be even more interesting, because you need to set up the cluster itself with CDK, but you also need to manage the application on the EKS cluster, maybe using CDK for Kubernetes. Yeah, so it could be an interesting session. Okay, the build failed. This is what we wanted to see. And if you go into the details, to reports, you see that it failed, and we see that we have one test which passed and one test which failed, and that's why our build hasn't proceeded further. So, exactly what we expected. Very cool. I like that the report duration is super exact about how long it took. Right, so now a unit test failed, and that means it will stop the pipeline, and the developers will notice. They will fix whatever is wrong and then commit new code. If you go to the pipeline, here's this failed test, but infra and our deployment are still on the previous commit, which succeeded. Yeah, very cool. So the question I wanted to ask before, or rather bring to the discussion: there was a comment early on in the chat about different environments, with a staging or testing environment and then prod, and you could easily add those different deployment options in this pipeline. Yeah, exactly. So set up the different environments using CloudFormation or CDK, and then deploy to those one
by one. And if you wanted to do it manually, which I think the question was early on, you could add a manual approval step in your pipeline as well, so that it won't deploy into production until you manually press that approval button. All right, one minute to go. You made it under the hour, great, Alexander, as promised. Yeah, you keep your promises. Now, this was great, Alexander, very interesting to see. Alexander has joined us and built and deployed a .NET 5 application to AWS in under an hour, using the Code suite of tools and CDK, and deploying it to Elastic Beanstalk. So thank you very much for joining us today, Alexander, and thanks to all of the viewers for all of the questions and comments we got throughout this hour. Do follow Alexander on Twitter as well; his handle is shown on screen, and if you wish, follow me too. Thank you all, and hope to see you again in two weeks; there's no show next week. Bye bye. Bye. Thank you.
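For readers who want to reproduce the build step described in the walkthrough, a CodeBuild buildspec along these lines could drive it: build and test the .NET application, publish the binaries, synthesize the CDK app, and emit the two artifact sets (application binaries and CloudFormation template) as secondary artifacts. This is a sketch, not the file from the demo; the solution name `WebApp`, the test project path, and the artifact identifiers `AppArtifact`/`InfraArtifact` are placeholders.

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      dotnet: 5.0
  build:
    commands:
      - dotnet build WebApp.sln
      # A failing unit test makes this command (and the build) fail,
      # which stops the pipeline before the infra and deploy stages.
      - dotnet test WebApp.Tests
      - dotnet publish WebApp -c Release -o ./publish
      # Emit the CloudFormation template produced by the CDK app.
      - npx cdk synth > template.yaml

artifacts:
  secondary-artifacts:
    AppArtifact:
      base-directory: publish
      files:
        - '**/*'
    InfraArtifact:
      files:
        - template.yaml
```

The two secondary artifacts are what the later pipeline stages consume: `InfraArtifact` feeds the CloudFormation deployment of the Elastic Beanstalk environment, and `AppArtifact` feeds the Elastic Beanstalk deploy action.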
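The Elastic Beanstalk deploy action that Alexander configures in the console (provider, input artifact, application name, environment) corresponds to a pipeline stage like the following in CodePipeline's CloudFormation schema. The application and environment names are placeholders, and the input artifact name assumes the secondary artifact sketched above.

```yaml
- Name: Deploy
  Actions:
    - Name: DeployToBeanstalk
      ActionTypeId:
        Category: Deploy
        Owner: AWS
        Provider: ElasticBeanstalk
        Version: '1'
      InputArtifacts:
        - Name: AppArtifact
      Configuration:
        ApplicationName: MyWebApp       # created by the infra (CDK) stage
        EnvironmentName: MyWebApp-env
```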
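The manual approval step mentioned at the end can be added as its own stage before a production deployment; in CodePipeline's CloudFormation schema it looks roughly like this. The pipeline then pauses at this stage until someone approves or rejects it in the console.

```yaml
- Name: ApproveToProd
  Actions:
    - Name: ManualApproval
      ActionTypeId:
        Category: Approval
        Owner: AWS
        Provider: Manual
        Version: '1'
```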
Info
Channel: Gunnar Grosch
Views: 19
Id: YKlcrMs4-zI
Length: 59min 59sec (3599 seconds)
Published: Mon Sep 27 2021