Uploading Files to AWS S3 with GitHub Actions

Captions
Hey, in this week's video I'm going to show you how to upload files to S3 using GitHub Actions. The idea is that you have a pipeline set up as a workflow, and GitHub Actions goes through its stages, one of which is, for example, an upload to S3. This is great if you're using S3 to deploy something — your source code, a finished package like a jar, a zip file, a CLI binary, any sort of files you want to store on S3 — and you're using GitHub for source control management. This is how you get those files from GitHub to S3 without having to set up a server. On my screen right now is a lesson I've written highlighting everything I'm going to go through in today's video.

I'm going to make a few assumptions: that you have an S3 bucket set up like this one, and that you have an IAM role set up with access to that bucket. I have another demo — I'm noticing my website's a little broken, but I'll fix that — at a URL I'll also link in the description below, which tells you how to set up an S3 bucket with IAM access. I'm also assuming you have a GitHub account with GitHub Actions on it. If you don't really know what GitHub Actions is, think of something like Jenkins or CircleCI: you build workflows, or pipelines, to complete some sort of task, and they can be triggered by things like a push event to your codebase. I use mine for deploying my servers — essentially, every time I merge a change to master, I want it to go out to production, so I use GitHub Actions to go through the stages of testing and getting it back out to production.

The first step is simply setting up a new GitHub repository at github.com/new. I'm going to call mine demo-s3-gh-actions, make it private, scroll down, and hit Create. Right away, the first thing I do is go to Settings, then Secrets, and save the secrets for my IAM user: my access key and my access secret. I name the first one for the AWS access key and paste that in, then paste the secret in as well. With this, when I go to build my actual workflow, I can reference the secret values in the workflow, pull those credentials out, and authenticate so I have access to my S3 bucket.

Heading back over to Code, I clone the repository to my local machine: in my terminal I switch to my Desktop and run git clone for demo-s3-gh-actions, then clear the screen and zoom in a little. Next I open an editor to start building out the files I want — atom . — and the last thing I do here is create a README with a title, "Demo". I save that, head back to my terminal, commit it up to master, and refresh my actual GitHub page. Now that we have a README file in the GitHub repository, we can begin working through the steps of setting up the workflow, and afterwards we can verify that everything was successfully deployed to S3.

So the first step is setting up the workflow file. Back in my editor I create a new file at .github/workflows/release.yml — a .github folder with a workflows folder inside it, and release.yml is what I'm calling the file. I paste the workflow in, and let me explain it: on every push to master, it runs on an Ubuntu image. The first step checks out the code; the second step runs a bash script called release.sh. Into that bash script I pass five arguments: the bucket name, the local path, the access key, the secret key, and the filename on the S3 bucket — sorry, this one is the AWS S3 key and this one is the local path. I'll explain them all in a second. First things first, I create the folder and the file. I notice I never actually filled in my bucket name — rather than go back, I'll just hard-code it here, since my bucket name won't change. I have the access key and the access secret; I don't have a zip yet, so I would create one of those as well — actually, I don't need to create the zip manually, because the script creates it.

My last step is creating release.sh itself. Going back to the lesson, here's what it does with all those arguments: it deletes the local path, then zips up the local path, excluding the .git folder, the .github folder, the release script, and a Python file I'll explain in a second. Then it runs pip install boto3 — boto3 is the AWS Python SDK, which covers S3 — and runs a Python script called upload.py, passing similar arguments in. You may be wondering why the Python upload: unfortunately, you can't use the AWS CLI if you don't have it installed. You could do plain curl calls, but I find Python easier, because if you want to do other manipulations afterwards, Python makes that incredibly easy. So I copy and paste that in. What I mean by easy: there's literally a main method that grabs the five arguments, creates a session with the AWS access key and AWS access secret, creates a new S3 client, and then calls the upload using the local path, the bucket name, and the S3 key. It's that simple to actually do the upload.

With all of that in place, I close everything, head back over to the terminal, and commit it up — and we'll see the workflow run. Actually, before pushing, I add one more file just to demo it, called main.py: def main, print "hello world". I git add that, commit, and git push. As you can see, my release workflow is running, so let's check it out — it's doing the release to S3... and it's failed. Let's figure out why: invalid syntax. I think my website garbled the snippet — yes, right there — so I just fix that line, git add, commit a "bug fix", and it runs again. Okay, this run is done: it did the pip install, created the zip, and uploaded. If I head over to my S3 bucket and refresh the page, there's my zip. Let me download it and look at the contents. It has a bunch of files I didn't mean to upload — my excludes must be partially failing. The .git folder is empty, so I got some of it right; I don't know why the -x exclusions didn't fully work, and I'll have to look at my exclusion criteria, but that's not the point of the video. And that's how you upload files from your repository: here I'm just zipping up the whole repository and doing a full upload, but you can do the same with multiple individual files. Thanks for watching — I have a new video each week, so please subscribe.
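The captions describe the workflow file but never show it. A minimal sketch of what the `.github/workflows/release.yml` described above could look like — the bucket name, zip name, secret names, and argument order are all assumptions, not taken from the video:

```yaml
name: release

# Trigger on every push to master, as described in the video.
on:
  push:
    branches:
      - master

jobs:
  release:
    # Run on an Ubuntu image.
    runs-on: ubuntu-latest
    steps:
      # Step 1: check out the repository contents.
      - uses: actions/checkout@v2
      # Step 2: run the release script with the five arguments the
      # narration lists (bucket, local zip path, access key, secret
      # key, S3 key), pulling credentials from the repository secrets.
      - name: Upload to S3
        run: >
          bash release.sh my-demo-bucket app.zip
          ${{ secrets.AWS_ACCESS_KEY }}
          ${{ secrets.AWS_SECRET_KEY }}
          app.zip
```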
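The release script is likewise only described, not shown. A hypothetical reconstruction under the same assumptions (argument order, file names) follows; note that the `-x` patterns are quoted so the shell cannot glob-expand them before `zip` sees them — unquoted patterns are one common reason exclusions like the video's silently fail:

```shell
#!/usr/bin/env bash
# release.sh -- hypothetical sketch; the argument order is an assumption.
# Usage: bash release.sh <bucket> <local-zip> <access-key> <secret-key> <s3-key>
set -euo pipefail

BUCKET="$1"
LOCAL_PATH="$2"
ACCESS_KEY="$3"
SECRET_KEY="$4"
S3_KEY="$5"

# Delete any stale archive, then zip up the checkout, excluding the
# bookkeeping files the video mentions (.git, .github, the release
# script itself, and the Python upload script).
rm -f "$LOCAL_PATH"
zip -r "$LOCAL_PATH" . -x ".git/*" ".github/*" "release.sh" "upload.py"

# Install the AWS SDK for Python, then hand off to the upload script.
pip install boto3
python upload.py "$BUCKET" "$LOCAL_PATH" "$ACCESS_KEY" "$SECRET_KEY" "$S3_KEY"
```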
Info
Channel: Keith, the Coder
Views: 7,270
Keywords: S3, AWS S3, Github Actions, Continuous Delivery
Id: hGh7DkQKeq0
Length: 11min 27sec (687 seconds)
Published: Wed Feb 26 2020