Deployments with Docker and Elastic Beanstalk

Captions
One of the most common questions I hear is: after spending days, if not weeks, writing software, how do I make that software accessible to people all around the world? In other words, how do I build a seamless continuous deployment pipeline through something like GitHub and host the software on Google Cloud Platform, Amazon Web Services, or Microsoft Azure? In this video we're going to do exactly that: an end-to-end walkthrough of how to write and deploy software with a continuous deployment pipeline. Let's get right into it.

From code to production. Typically we spend hours, days, weeks, or months writing code on a laptop, and we push it up to a version control provider like GitHub, GitLab, or Bitbucket. Whether we're on a team or working alone, we're essentially saying: I want to merge this code into some main branch. In a professional environment people review that code, nitpick it, and add comments, and once everything looks good and you have enough approvals, you merge it into the master branch. Master is the central source of truth, the reflection of the live system; hypothetically speaking, master is what you see when you literally go to google.com. When you push code up for review, you're adding a new feature that isn't on any server yet. It's just additions and deletions against master (the current state of truth) or some other branch, and you're asking everyone: please review my code, approve it if it's good, and let's merge it in so it becomes the new state of master.

That's where the continuous deployment (CD) pipeline kicks in. Typically all the tests run before the code is merged, to ensure you haven't broken anything, but many teams also rerun all of the unit and integration tests as an automated pipeline once the merge to master happens. That's the first step of our pipeline. The next step is packaging the code. This can be done in a variety of ways; I'm a big fan of Docker, and that's what we'll use in this video. You write a Dockerfile that packages up all your code and exposes the ports you need, and you build a new image, the new state of your software, and push that image to a registry, private or public. Finally, you deploy the changes: you could tell Amazon Web Services or Google Cloud Platform to pull down the new image and run it, or you can use something like Elastic Beanstalk, which is what we'll use. With Beanstalk you upload your Docker configuration, and it knows how to interpret it and automatically spin up a new instance of your application in the cloud. It's a very nice, cheap way to do this. That's the code-to-production workflow we'll be building.

Development versus production. On the dev server we have a web container running a development server along with index.html and the main JavaScript file; that's the current state of our React application. When the browser makes a request to the dev server, say localhost:3000, the request is proxied through the dev server, which sends back a response, and that's how we see the icon and text in the browser. Production is almost identical, except the development server is replaced with something like nginx or Apache, which simply serves index.html and main.js back to the client. That's what we're going to build in this example.

Let's get started; we have a lot to cover. I'll open a new terminal window, because the first thing I need is something to deploy. In this example I'm using React and Node.js, but if you're writing in Java or anything else, feel free to follow along in the language or framework of your choice. I've already generated the project on my desktop, but if you're following along you can scaffold it with npx create-react-app followed by whatever name you want; I named mine deployment-tutorial. Create React App is auto-generated boilerplate: a development server with a bunch of webpack configuration and hot module reloading, which makes it really easy to prototype React applications on the fly. If I cd into deployment-tutorial and run npm run start, the application boots up after a second. And if I open the source, edit a file to say "this is another update", and save, the browser automatically reloads. That's part of the benefit of the dev server, as we noted before: updates are really quick.

In this tutorial we're going to be utilizing Docker, and if you're not familiar with Docker, I have some videos on it. It's great for a number of reasons, but the main one is this: if you swap between deployment pipelines or run your code on different machines, Docker doesn't care. As long as the Docker daemon is installed on the target machine, it will create a container, an instance of the image you built of your code, and it will just work. You don't have to worry about dependencies, libraries, or specific versions on a different machine. With that philosophy in mind, we're going to build a development-grade workflow in Docker for the Create React App code, plus a production-grade workflow that handles the deployment pipeline we'll use when we merge commits on GitHub and deploy to AWS.

First of all, I don't care about any of the boilerplate code; I'll use it at face value. I'm going to build a Dockerfile used just for development. The Dockerfile describes how to take the code in the repository and package it into a container so it can actually run there, and it only needs a few super simple commands. First, specify the base image: FROM node:alpine. Alpine is a super bare-bones Linux distribution, and this image has Node installed on top of it, so it's very lightweight. Next, set the working directory to /app; this is where we'll store all of our files in the container. Then copy package.json into the working directory (the dot is the current working directory in the container); package.json is specific to this project, and it's where all the dependencies are defined. Then run npm install, which scans package.json and installs all the dependencies. The final things I do are copy everything from the current working directory on the host machine into the container (again, into /app) and set the startup command, so that when the container boots it runs npm run start. Very nice and very simple.

One thing you'll notice is that requests come into the development server, which interprets each request as it arrives and gives back some sort of response. But we've added another layer here, the container, so we need to make sure we expose the container's port to allow requests in; there's another layer of network abstraction over our software now, and we need to ensure those requests can actually proxy through. You can do that in Dockerfiles, but one tool I really enjoy is Docker Compose, a utility that facilitates Docker image creation along with things like volume mounting (which we'll look at) and exposing ports. So let's build a compose file for development; I'll name it docker-compose-development.yml. First I specify the version of the Compose file format that I'm using, version 3, and then start defining the services.
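Collected together, the development Dockerfile from the steps above might look like this (a sketch; the node:alpine tag, the /app directory, and the npm scripts are the choices from the walkthrough):

```dockerfile
# Dockerfile.development -- image used only for local development
FROM node:alpine

# Everything lives under /app inside the container
WORKDIR /app

# Copy the dependency manifest first so the install layer can be cached
COPY package.json .
RUN npm install

# Copy the rest of the source from the host into /app
COPY . .

# Boot the create-react-app dev server when the container starts
CMD ["npm", "run", "start"]
```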
The first service I'll define I'll just call deployment; the name is arbitrary, it could just as well be my-cool-website or whatever you want. Under it I specify the build: when docker-compose actually builds this service, which Dockerfile does it reference? I set the build context, which is where the Dockerfile exists, to the current working directory (the dot), and, to be very verbose about this, the dockerfile to Dockerfile.development, matching the file we just wrote. Now I want to expose some ports so requests can actually come in. Ports is an array, so with a dash I say that port 4500 on my host machine gets mapped to port 3000 in the container. The host port is pretty much arbitrary: if you were hosting a web application on something like Amazon Web Services, you might want port 80, so that hitting your website directly maps the request to port 3000 in the container. Port 3000 is what our React dev server runs on. The final thing is mounting volumes. A volume mount is basically a reference; you can think of it like a link or a pointer. If I mount a folder that exists on my laptop into the container and then change a file on my laptop's hard drive, the change is automatically reflected in the container, and that's how we'll get hot reloading in our development environment. So I map everything in my current directory into /app, the working directory in the container, and I add one extra entry, /app/node_modules, which ensures I don't try to map anything from my hard drive over node_modules: we already installed the dependencies inside the container, so we don't need to remap node_modules into it. And that's pretty much it. This is how we get Create React App running locally, but via Docker.

Let's see if it works and whether we run into any errors. In my project directory you'll notice the Docker files; to run this I do docker-compose, specify the compose file I want to use with -f docker-compose-development.yml, and then just do up, and that starts everything. You'll notice it's building "deployment", the name of the service, so give it a minute. A quick note while this runs: you may notice it's taking a while, and the reason is visible in the Dockerfile. We copy package.json, install all the dependencies, and then copy everything from the working directory into the container again, which copies all the dependencies from the host a second time, something we totally don't need to do. The fix is to remove node_modules locally: I run rm -rf node_modules (it takes a second to delete) and run the compose command again. It builds way faster than before, because the dependencies are already in the container; copying them from the local machine again is redundant and completely unnecessary. Looking at the output, it says the app is available on localhost:3000.
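For reference, the development compose file described above might look like this (a sketch; the service name and the 4500 host port are the arbitrary choices from the walkthrough):

```yaml
# docker-compose-development.yml
version: "3"
services:
  deployment:
    build:
      context: .
      dockerfile: Dockerfile.development
    ports:
      # host port 4500 -> container port 3000 (the CRA dev server)
      - "4500:3000"
    volumes:
      # keep the dependencies that were installed inside the image
      - /app/node_modules
      # mount the source tree into /app for hot reloading
      - .:/app
```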
So you may think: open a new window, go to localhost:3000, and... boom, it does not work. The reason is that this Create React App application is running inside a container on port 3000, but it's not exposed as port 3000; if we look back at the compose file, we're actually exposing it as port 4500. If I type in localhost:4500 instead, there it is, actually running inside Docker now. And this is how we get updates: if I go into my source code and modify something, say "test updates", and save, you'll notice the Docker version of the application updates immediately as well, because in the compose file we mounted the application source code into the container. We get immediate updates, which makes this a really great, very simplified setup for iterating locally with Docker.

Now you may be asking: how does this help me with production-grade deployments? That's what we're going to look at next. I'll kill this; you can also run docker-compose down to stop anything that's running, but because we named the file differently from the default docker-compose.yml, you need to specify it explicitly: docker-compose -f docker-compose-development.yml down. That ensures everything is cleaned up.

We're now going to build the same thing for our production environment, the stuff that actually gets deployed to Amazon Web Services. I'll create a new file named Dockerfile and a new compose file named docker-compose.yml. These are very, very similar to what we did before, with a few important differences. Let's look at the Dockerfile first. I again specify the base image, node:alpine, and do almost the same thing: set the working directory with WORKDIR /app, copy package.json into the working directory, and run npm install. From there, if you remember, I again copy everything over, but the difference is that we are not running the dev server. As I mentioned at the beginning, the dev server is great for local development but not for a production-grade environment. Instead I want the static assets: running npm run build bundles a minified version of all the code and all the dependencies we need into just a couple of files. These are static files that need to be served from a server; when requests come through, the server serves these static files back to the client. They're generated in the build directory after npm run build runs. So that's the first step.

Now we're going to do something a little bit weird, called a multi-stage build. We add a second build stage that uses the files produced by the first stage. The first stage exists only so we can build the static assets from React; now we need the actual server that will run our React application. I want nginx, a web server, to host those generated asset files out of a directory in this container. nginx has an image available on Docker Hub, and its page tells you how to use it, including a section on hosting simple static content, which is almost exactly what we're doing; it tells you where to put your static assets. We'll put our static React assets into that directory and see if they can still be accessed, much like with our development server. So I specify the nginx base image for the second stage, and then a single COPY instruction that copies from the first build stage. Everything is zero-indexed, so I copy from the zeroth build stage: from /app/build, where the static assets get built, into the directory nginx serves static assets from, /usr/share/nginx/html. That's where the static assets will be served and hosted within the server, and this is what will run on our application server in AWS: an nginx server inside a Docker container.

Now we build the corresponding compose file, again very similar to the first one: specify the version we're using, version 3 in this case, then the services. Previously we called the service deployment, so I'll be consistent and call it deployment again, then specify the build context (my current working directory) and, to be verbose once again, the dockerfile, which this time is just Dockerfile, not Dockerfile.development, so slightly different. The final change is the ports we're exposing: this time I map port 80 to port 80.
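The resulting multi-stage production Dockerfile might look like this (a sketch; stage 0 builds the static assets, and the nginx stage serves them):

```dockerfile
# Dockerfile -- production multi-stage build
# Stage 0: build the static React assets
FROM node:alpine
WORKDIR /app
COPY package.json .
RUN npm install
COPY . .
RUN npm run build

# Stage 1: serve the generated assets with nginx
FROM nginx
# Pull the build output from stage 0 into nginx's default static root
COPY --from=0 /app/build /usr/share/nginx/html
```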
nginx runs on port 80 by default, so this mapping means that if I hit, for example, localhost on port 80, the request gets remapped into the Docker container to the service running on port 80, and I get a response back that way. That's how the compose definition file looks, so I'll save it; you could even call this one "production" if you wanted. Now I run docker-compose up, and notice I don't need to specify a file this time, because docker-compose.yml is the default. After running it, you'll notice it seems to hang, but the point is that it's not actually hanging: we just haven't attached the -d flag, which would run it as a background process. It's running fine; the thing is that nginx doesn't give you a lot of logs until requests come in, so if I were making requests we'd see logs. I think I'm actually having trouble hitting port 80 here because I already have something running on localhost port 80; I'm already running nginx in a different container, so there's a conflict, and I don't think the request will go through. I can prove this: if I change the host port to, say, 8123 and run it one more time (a lot faster now, because the layers are cached), then hit localhost:8123, the logs appear, and this is the production-grade version of the application we're going to deploy. Again, you may be asking what makes this the production version. Well, we copied over only the static assets, and we're serving them through nginx; we're not relying on the React dev server and all of the fancy processes that run in the background there. This is the optimized production build.
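And the matching production compose file might be as small as this (a sketch; swap the host port to 8123, as in the walkthrough, if something else already occupies port 80 on your machine):

```yaml
# docker-compose.yml -- production compose file
version: "3"
services:
  deployment:
    build:
      context: .
      dockerfile: Dockerfile
    ports:
      # nginx listens on port 80 inside the container
      - "80:80"
```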
So that's what we have now. Next we need to actually set up the deployment pipeline through GitHub and deploy this code to AWS, in this case using Elastic Beanstalk. This requires an AWS account; we'll be using the free tier, which does require a credit card but will not cost you anything. The first thing we want to do is create a repository on GitHub that we can access this code from. I'll go to GitHub, create a new repository, and call it simple-engineer-aws-deployment-tutorial (I know that's a really long name), make it public, and create it. So now I have a blank repository on GitHub; let me make this a little bigger so you can see it. Next we upload the code we've written so far, or in this case generated, along with our Docker files. I don't actually have an SSH key set up on this computer right now, so I'll copy the HTTPS URL instead. I initialize a git repository in this code with git init, then run git remote add origin and specify this as the repo. If you see something like "origin already exists", you can do git remote remove origin and then re-run the command. Running git status shows all the things that differ from the branch we're trying to merge into; I add all of them (you'll notice they're staged now), commit them with the message "initial commit", and push directly to master. Almost instantly you should see all of the code reflected in the repository. This is really useful, because what's going to happen is that as you make commits or changes to the master branch, we're going to automatically package up our working application and deploy it to Amazon Web Services.
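The git setup above can be sketched end to end. To keep the example self-contained and runnable, a local bare repository stands in for the GitHub remote (with GitHub you'd paste the HTTPS URL instead), and HEAD:master is pushed so it works whatever your git default branch name is:

```shell
set -e
workdir=$(mktemp -d)

# A bare repository standing in for the empty GitHub repo
git init --bare "$workdir/remote.git"

# The project directory with some code in it
mkdir "$workdir/project"
cd "$workdir/project"
echo "hello" > README.md

git init
git config user.email "demo@example.com"
git config user.name "Demo"

# Point "origin" at the remote (git remote remove origin first if one exists)
git remote add origin "$workdir/remote.git"

git add .
git commit -m "initial commit"
git push origin HEAD:master

git log --oneline   # the commit now exists on origin's master branch
```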
package up our working application and deploy it to amazon web services so before we go any deeper in the world of github first thing we need to do is go over to amazon web services so i'm going to go into the aws management console and it's gonna probably ask me okay so i'm already logged in so once you log in you'll be granted with the main console here i'm just gonna go and type in elastic bean stock now elastic bean stock is an orchestration offering from aws and it's really good for basically handling the deployment of applications it does automated load balancing and scalability and it utilizes s3 to store some of your application code that it deploys is very nice overall you can see some of the things that it does so capacity provisioning load balancing automated scaling etc and you do have health monitoring as well uh which i believe is through uh cloudwatch so uh very very nice thing first thing that we want to do is go ahead and create the application so i'm gonna come in here and i'm going to call this uh simple engineer deployment tutorial again and i don't really need any tags or anything um the platform i am using node.js if you're using something like java then you would select java but i'm using node.js platform branch i'm going to use amazon linux 2. 
just to note if you are following along and you are familiar with docker and you choose to not use docker compose for whatever reason and you just want to use a docker file amazon linux 2 on elastic bean stock will by default look for a compose file so if you just want to use docker file go ahead and select amazon linux this is kind of the previous generation version of this but i am using docker compose so i'm going to be using linux 2 and i'm going to go ahead and create this application this process in its entirety takes probably five to ten minutes so i will just let it complete and we'll come back all right so just a few minutes later and you will be redirected to a page that looks like the following and you'll notice that basically you have an application and you have an environment and you'll notice that you're granted with a url that looks kind of like the name that you specified and when you click on this it'll basically give you a congratulations this is the default aws elastic bean stock website that they load up and no worries about this we're actually going to be overriding this through our github deployment but this is good we can see that the health check looks okay and then here are some of the event logs that we can see as we are kind of messing around with this so now what we want to do is set up through github we want to basically say anytime we make a commit on the master branch how can we deploy our application which in this case will be our docker file how can we run our most updated code off the master branch on elastic bean stock and this is the basically the server accessible all around the world and so that's what we're going to look through and we're going to be using github actions so if we go over here at the top of the repo we're going to click actions and i'm just going to see you know you can see that there's a lot of different things over here but i i'm just gonna do set up a workflow myself so i'm gonna actually just delete 
everything and i'm gonna name this main.yaml like so and we're gonna walk through a series of steps so like i said this this will actually be completely available on my blog and we're going to kind of go through this line by line so what i'm going to do is i'm going to copy the first chunk here and i'm going to explain each and every line here but it is a bit tedious so this is just a name and i can just call this like you know deployment from github to aws right and this is going to run anytime i push to the master branch okay so this is the action this is the branch that i want this to be relevant to um and so the next thing is basically uh i need to build a job okay and and make sure that you know you have no typos here because yaml is fairly fairly tedious um i'm gonna come over here and i'm gonna just kind of copy the first piece here so jobs will specify all the things that i want to run after my code gets merged to master so make sure the indentation looks okay again so actually this is supposed to be at the root level um i'm going to have a job in here called test and deploy and actually this is not running any test so i'm just going to delete this and do deploy and it's going to run on ubuntu so these jobs in github different than docker it's running the job on ubuntu the latest version and it's going to run a series of steps in sequential order so the first thing it's going to do is run this checkout action and the checkout action is actually an action built by the github team if we check it out no pun intended we can see that this action checks out your repository under the github workspace environment variable so your workflow can access it so really exactly as the name states this action is going to check out this code right this code here into the virtual machine running ubuntu's latest version so that's the first thing that we're going to do so nothing too fancy the next thing that we want to do in this workflow is we want to basically generate the 
deployment package that we want to use so generate deployment package now and i'm actually going to just copy this to my other monitor here and we will walk through this so generate deployment package is going to be the name and what we're going to do the way that elastic bean stock works at least for this action that we're going to be using is it takes in a zip file and in the zip file uh there's going to be a docker file that it will scan for her docker compose compose file in this case and it will use that to actually build our image and initialize an instance of our image so it'll run a container and all of these assets basically all the files in our repository get zipped up and placed in an s3 bucket which is essentially a virtual hard drive running on the aws ecosystem so i know that's a lot of stuff there but i'm going to come in here and i'm going to say i'm going to run and i'm going to zip recursively in a new zip file called deploy.zip and i'm going to just zip everything and i'm going to exclude anything that's related to node modules okay so we don't want to run or package up any node module stuff and that's kind of the first piece so now that i have my zip file um this i'm actually going to just copy and paste this these two steps that i'm going to have here are solely used just to get a unique timestamp value that i can add to the uh file that i upload to aws so first thing i'm going to do is have a step that gets me a timestamp so this will be uh basically the current time that this step runs uh and then what i'm gonna do is there's an action a github action that does string replacement um and this again could be a little bit overkill but i like the way that this works so string replacement what it'll do is it will take in the current time so if i go to steps dot current dash time which is the id that we are using here and i grab the outputs object and i grab time what i'm going to do is just run a regular expression replacement here and this just 
does some formatting on the time that i use in the final step so if that doesn't make sense i'll i'll show you one thing that you can do to kind of avoid these two things but i do like this to have a time stamp to kind of version when i deploy my software um okay so the final step is basically deployment to elastic or elastic bean stock so the action that we're using here is this beanstalk action the current version is version 14 and you can actually check this out it's pretty pretty cool so come in here and beanstalk deploy open source thing you can check it out basically as a github action that's used to submit your code to elastic bean stock in a github action so pretty straightforward you'll notice here that i have environment secrets uh here and you may be asking where do these come from well they come from the secrets that are defined on the repository so what we're going to need to do is set those up but in order to set them up we're going to need to actually get those secrets to begin with and the way that we do that is we need to build a new user so back to aws what i'm going to do is i'm going to search for iam and this is identity and access management so if i come here you'll notice a bunch of stuff what i'm going to do and this is this is kind of a tangential philosophy in aws is you don't really want to attach specific policies to a user itself typically what you do is you build a group that has policies defined like what it has access to does it have access to elastic bean stock does it have access to specific s3 buckets and then you put the user under that group so what i'm going to do is i'm going to build a new group here and i'm going to call it elastic bean admin group and what i'm going to do is i'm going to give it a policy so i'm going to search for bean stock and i'm going to isolate its access to full access for elastic bean stock here so it's going to be able to do everything that elastic bean stock can do in this ecosystem and i'm going 
So that's my group. Next I'll create a service account under it: under "Add user" I'll type "github-service-account," give it programmatic access, add it to my elastic-bean-admin-group, and continue. I don't need tags, so I'll create the user. What comes back may look familiar: an access key ID and a secret access key, which are what we'll be using. Do not share this information (I'll obviously be deleting this account afterwards). You'll never be able to retrieve the secret again, so make sure you download a CSV copy of the credentials. At this point we need to store these as secrets on GitHub. Back in the repository, under Settings you'll see Secrets, and I'm going to add two of them. The first is called AWS_ACCESS_KEY_ID, so I'll create a secret with that name and paste in the access key. Then I'll create a second repository secret, AWS_SECRET_ACCESS_KEY, and paste in that value as well; make sure there are no leading or trailing spaces. With both secrets added, the GitHub Action now has access to them. Next up are the application name and the environment name. Back in Elastic Beanstalk, under Applications, you'll see the application name with the environment name adjacent to it. I'll copy it over: "simple-engineer-deployment-tutorial" is the application name, and the environment is the one listed next to it. The version label is the whole reason we did the timestamp string-replace work: it has a prefix, "simple-engineer-deployment," to which I append the replaced time pulled from the previous step, format-time.
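As an aside, the IAM group and service account we just clicked through in the console could also be created with the AWS CLI. This is a sketch only: it assumes an already-authenticated CLI session with IAM permissions, and the managed-policy ARN for Beanstalk full access has changed over time, so verify the exact ARN in your account before using it:

```shell
# Create the group and attach the Beanstalk full-access managed policy
aws iam create-group --group-name elastic-bean-admin-group
aws iam attach-group-policy \
  --group-name elastic-bean-admin-group \
  --policy-arn arn:aws:iam::aws:policy/AdministratorAccess-AWSElasticBeanstalk

# Create the service account and put it in the group
aws iam create-user --user-name github-service-account
aws iam add-user-to-group \
  --user-name github-service-account \
  --group-name elastic-bean-admin-group

# Generate the key pair; the secret is shown only once, so save it securely
aws iam create-access-key --user-name github-service-account
```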
"format-time" is that step's id; I go into its outputs and pull out the value called "replaced," which is a nicely formatted date. This lets me track changes over time, since these files get uploaded to S3. The deployment package the action will reference is the deploy.zip file we packaged up earlier. One more note: the region must be the one where you created the Beanstalk application. If you look up here you'll see I'm in us-east-2, so make sure these values match or the deploy will fail. People typically pick a region based on where they are or where most of their users will be; whichever you chose, make sure it matches. And that's pretty much it. To recap: any time I push commits to the master branch, this job kicks off. The deploy job runs on the latest Ubuntu Linux, checks out the code, packages everything except node_modules into a zip file, gets a nicely formatted timestamp to use as the version label, and pushes the zip file (which contains my docker-compose file) over to Elastic Beanstalk, which automatically knows how to interpret it. The secrets are the credentials used to gain access to that environment, and once it's all done, the code should deploy directly to Beanstalk. I'm going to commit this new file, and once it's committed I can go back into Actions, where you'll notice a new workflow has been submitted: a new task is running and queued. "Deploy" is the name of our job, and it'll start setting up, so let's give it a minute to run. It almost instantaneously checked out the code, generated the zip file, and got the timestamp. You can actually see everything it's doing in the logs, which is pretty nice.
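To make the version-label formatting concrete, here's a tiny standalone illustration. The raw timestamp string is a made-up sample, and this bash substitution stands in for the regex-replacement action; the idea is just that spaces and colons get swapped for dashes so the label is safe to use as a file and version name:

```shell
#!/usr/bin/env bash
# Sample value standing in for the current-time step's output
RAW="2020-12-21 14:30:05"

# Replace every space and colon with a dash, mimicking the string-replace step
LABEL="simple-engineer-deployment-${RAW//[: ]/-}"

echo "$LABEL"   # prints: simple-engineer-deployment-2020-12-21-14-30-05
```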
The deploy-to-Elastic-Beanstalk step should take just a couple of minutes, so you can go back into Amazon Web Services and watch the changes happen: you'll notice Elastic Beanstalk is updating your environment, triggered by this GitHub Action. While that runs, if you go to Amazon S3 you'll find where the asset files get uploaded for this particular deployment. Beanstalk will have created a new bucket for you, and inside you'll see "simple-engineer-deployment-tutorial" plus the interpolated string value from the string replacement we ran in the GitHub Action; these are the actual files Elastic Beanstalk pulls. Going back to Elastic Beanstalk to check where we're at... it's still running, so we'll give it another couple of minutes. All right, looking back at our deployment job, everything was successful, which is always a good sign. In AWS, the health status check is OK and it says the environment update completed. Now if we go to our Elastic Beanstalk URL, we can see our React application, so all is well; it looks like we're done. One final note before we break off and summarize: if you want to delete all of this and not get charged, go into Applications, select the application, go to Actions, and hit "Delete application." It will ask you to type in the application name, so copy it in and hit Delete, and that's all you need to do. So that's all we've got. Thanks so much for watching! If you have any questions, feel free to drop a comment or tweet me at @thesimpengineer on Twitter and I'd be more than happy to respond. I know this was a long one; please like the video and subscribe if you enjoyed this content, and I will keep making more.
All right, take care!
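One last aside on cleanup: the application deletion described above can also be done from the AWS CLI, again assuming an authenticated session (sketch only; the flag force-terminates any running environments, so double-check the application name first):

```shell
# Deletes the Beanstalk application and force-terminates its environments
aws elasticbeanstalk delete-application \
  --application-name simple-engineer-deployment-tutorial \
  --terminate-env-by-force
```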
Info
Channel: Ryan Schachte
Views: 7,049
Keywords: aws, elastic beanstalk, amazon web services, docker, automated deployments, github, github actions, docker-compose, node, react, nodejs
Id: ssVQ7OKdXiM
Length: 45min 7sec (2707 seconds)
Published: Mon Dec 21 2020