Automate File Handling With Python & AWS S3 | Five Minute Python Scripts

Video Statistics and Information

Captions
Hey everyone, what's going on? Derrick here. In this video I want to show you how we can automatically upload files to Amazon S3 using Python and the AWS S3 API. Let's get started.

So what's the idea here? The idea is that we might have a directory that's very cluttered, and we need to organize it and back it up at the same time. We can do this by writing a script that checks file extensions and uploads the files to certain subdirectories within a bucket.

Getting started, we'll obviously need an AWS account. If you don't have one, go ahead and sign up for one; I'll just sign in to mine. Once you sign up or log in, this will be the home page. The first thing we need is a user that has API keys to access S3, so let's make one of those now. To do this we'll use IAM (manage access to AWS resources) and click on it. I already have a few users created, but we'll create another one for this example and just call it something like youtube-dummy-user. We want to be sure to give this user programmatic access, which will give us API keys that we can use to access AWS resources. Now we'll click on to permissions. Next we need to assign this user to a group with certain permissions attached. Permissions are just what allow your user to do certain things within the AWS platform, and in this example, since our user needs to access S3 buckets, they need full access to S3. So I'll add my user to a group that has AmazonS3FullAccess, click it, and go on. We'll skip the tags and just skip through and create the user. This gives us a page with the user's API keys on it. We need to copy these over to a Python script so we can save them for later. Copying over the access key ID, I'll move it into a script called secrets.py and place it there. I'll do the same thing for my secret access key; I won't show it in this video, but when you do this for yourself you'll place it there as well.

Once you have that, we'll go over to a new Python script; I'll call mine automatic_s3_uploader.py. In this script we need to import the variables we just created in the secrets.py file, so we'll write from secrets import access_key, secret_access_key. Next we need to install the packages we'll be using: AWS uses boto3, so we'll import that, and to walk through directories on the desktop we'll import os. Once we have our imports and our API keys, it's only a few lines to do this. Our first step is to create a client that interacts with the API so it can access our resources. We'll say client = boto3.client(), pass in the service we want to use (in our case 's3'), and set aws_access_key_id equal to our access key and aws_secret_access_key equal to our secret access key. These are just keyword arguments that the client method takes in, and the values are the variables we created in secrets.py. With this, we're now able to access all the S3 resources this access key has permission to, which was full access, so we can upload and download files using this client.
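For reference, a minimal sketch of what the secrets.py file described above might look like. The variable names access_key and secret_access_key follow the import used in the video; the values here are placeholders, not real keys.

    # secrets.py -- keep this file out of version control
    access_key = "YOUR_AWS_ACCESS_KEY_ID"             # placeholder
    secret_access_key = "YOUR_AWS_SECRET_ACCESS_KEY"  # placeholder

One caveat worth noting: a local file named secrets.py will shadow Python's built-in secrets module, so a name like aws_keys.py can be safer if you ever need that standard-library module.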
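And a rough sketch of the client setup just described, assuming those variable names in secrets.py; this is an illustration of the approach, not a verbatim copy of the video's code.

    import boto3
    from secrets import access_key, secret_access_key

    # Create an S3 client authenticated with the keys saved in secrets.py
    client = boto3.client(
        's3',
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_access_key,
    )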
Now we just need to walk a directory and, depending on each file's extension, upload it to S3. To do this we'll say: for every file in os.listdir(). This takes every file within your current working directory; I'm doing this on my desktop, so it will pick up all of these files. Then we'll say something like: if the string '.py' is in the file name, so it's a Python file because it has the .py extension, we'll specify what bucket to upload it to, upload_file_bucket, which is a bucket I created earlier called youtube-dummy-bucket. Then we need to pass in a key, which is just the path of the file inside the bucket. Since this if statement is for Python files, we'll say upload_file_key is equal to 'python/' concatenated with the file name cast to a string, so we're uploading the file into a python subdirectory and keeping the same name it had in our listed directory. Finally, we'll do the upload with client.upload_file() and pass in the arguments we need: the first one is the file name, so let's just say file; the second one is the bucket, upload_file_bucket; and the last one is the key, upload_file_key. We only have one if statement here, for .py files, but if you have a lot of different extensions, feel free to add more statements like this inside the loop over every file in your directory. If you're working with a ton of documents, I think this is a great way to back up your files, and you can organize them as you do it.

I don't have many files with .py extensions on my desktop, so let's create a few more to make sure this works: we'll make test.py, testing.py, and one more .py file. Now we'll go back to automatic_s3_uploader.py, open up a terminal, and type python3 automatic_s3_uploader.py. Once we hit execute, we get no output in the terminal, but that's okay because we didn't print or return anything. We'll go back to S3 and open it up. In our bucket, youtube-dummy-bucket, we see a subdirectory called python, which is what we created here in our file key. We'll open that up, and now we see that every one of those files was automatically uploaded into this S3 bucket. And that's pretty much it for this one. I hope it's helpful to you. If you have any questions or comments about this script, please let me know, and if you have any suggestions for future videos, please let me know those too. Until next time. [Music]
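Putting the steps from the video together, a minimal sketch of what automatic_s3_uploader.py might look like is shown below. The bucket name is a placeholder standing in for the youtube-dummy-bucket created in the video, and the variable names mirror the ones used above.

    import os
    import boto3
    from secrets import access_key, secret_access_key

    # S3 client authenticated with the keys from secrets.py
    client = boto3.client(
        's3',
        aws_access_key_id=access_key,
        aws_secret_access_key=secret_access_key,
    )

    # Walk every file in the current working directory
    for file in os.listdir():
        # Only handle Python files; add more branches for other extensions
        if '.py' in file:
            upload_file_bucket = 'youtube-dummy-bucket'  # placeholder bucket name
            upload_file_key = 'python/' + str(file)      # path inside the bucket
            client.upload_file(file, upload_file_bucket, upload_file_key)

As written, the '.py' check also matches the script itself and secrets.py, so they would be uploaded too; a stricter test such as file.endswith('.py') plus a small exclusion list is an easy refinement if that matters for your use case.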
Info
Channel: Derrick Sherrill
Views: 31,864
Rating: 4.9712524 out of 5
Keywords: Python, Python Automation, Tutorial, How To, Derrick Sherrill, File management, python file management, amazon web services, aws, s3, buckets, storage, amazon storage, python aws
Id: dqkoBrgFSus
Length: 6min 50sec (410 seconds)
Published: Mon Feb 10 2020