Upload, Download, and Delete Files From Google Cloud Storage Using Cloud Storage API In Python

Captions
Hey, how's it going guys. In this tutorial we'll learn how to upload, download, and delete files in Google Cloud Storage using the Google Cloud Storage API in Python. Before we dive in, I want to quickly cover the prerequisites. To follow along with this exercise you'll need some experience working with Python, and ideally some experience working with Google Cloud APIs; you'll also need an active Google Cloud account. Just those three things. If you don't have any experience with Google Cloud APIs, you can watch the video linked in the description below to get started with using Google Cloud APIs in Python. One more thing: before you can use the Cloud Storage API, make sure you enable the service first by going to your Google Cloud console, then APIs & Services, then Library; look for the Cloud Storage API and enable the service.

Now let's dive into the tutorial. Here in my Google Cloud Storage account I have two buckets created. Let me delete the first one, because that's the bucket I'm going to create with the API in this tutorial. In my Python script I've already written down the steps we'll follow for this exercise: step one is preparing the variables; step two is creating a class to hold the functions that perform the different tasks; step three is using the API to create a bucket, which we'll name gcs_api_demo; step four is using the API to upload files; and step five (give me a second to fix the numbering) is to download and delete files.

Let's start with the import statements. For this Python program I'm going to import the os module; for file management I'll use the pathlib library to manage files and directories; and to detect file MIME types, which the class will need, we'll import the mimetypes module.
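The import list here is small; the only third-party piece is the client library, installed separately (`pip install google-cloud-storage`, then `from google.cloud import storage`). The mimetypes call is worth a quick look on its own, since the upload step later leans on it:

```python
import mimetypes

# guess_type() infers a MIME type from the file name alone and
# returns a (type, encoding) tuple.
print(mimetypes.guess_type('data.csv')[0])   # 'text/csv' on most systems
print(mimetypes.guess_type('photo.png')[0])  # 'image/png'
```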
To work with the Google Cloud Storage API, from google.cloud we import the storage module; if you don't have the module installed, you can use the command pip install google-cloud-storage.

All right, let's prepare the variables first. I'm going to insert the initial template for the GCStorage class. Here we have a tuple object, storage_classes: this tuple stores the storage tier names, and you assign a tier based on your use case, referencing the storage classes table, which I'll show you in a second. As for the GCStorage class itself, here is the constructor: it takes an instance of the Cloud Storage client, and inside the constructor we create an attribute called client to store that instance.

Now, going into step one, we prepare the variables. Here I'm creating four different variables. My working directory variable points to my project folder. Inside my project folder I have two folders created: one is called my_files, and inside it are the two files I want to upload to my Google Cloud Storage account; the other is a downloads folder, where I'll download the files before deleting the ones I uploaded from Cloud Storage. Basically the flow looks like this: we upload the files, we download them to a different folder, then we delete the files we uploaded from Cloud Storage. Just a very simple workflow, but it's a pretty common use case in many places.

Now let me go ahead and create my Google Cloud Storage client instance and the GCStorage class object. Going into step three, we want to use the API to create a bucket, and we'll name the bucket gcs_api_demo. Going back to the GCStorage class, I want to insert three functions, and I'll go over them one by one.
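A minimal sketch of the class template and the step-one variables described here. The tier names are the real Cloud Storage classes; the folder names (my_files, downloaded_files) and the use of the current working directory are assumptions based on the walkthrough:

```python
from pathlib import Path

# The four Cloud Storage tiers; pick one based on access frequency.
STORAGE_CLASSES = ('STANDARD', 'NEARLINE', 'COLDLINE', 'ARCHIVE')

class GCStorage:
    def __init__(self, storage_client):
        # Keep a reference to the authenticated storage.Client instance.
        self.client = storage_client

# Step 1: prepare the variables (folder names are assumptions).
working_dir = Path.cwd()
files_folder = working_dir / 'my_files'
downloads_folder = working_dir / 'downloaded_files'
bucket_name = 'gcs_api_demo'
```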
The first function is create_bucket. This function takes three parameters: the bucket name, the storage class, and the server location where we want to host the files. Inside create_bucket we reference the storage client instance, and from the client object we call the bucket method with a bucket name to create a bucket reference; at this point the bucket object is just a reference. Then we attach the storage class attribute from the storage_class argument, and to actually create the bucket we call the create_bucket method, passing the bucket reference followed by the bucket location, i.e. the server location.

The other two methods are get_bucket and list_buckets. get_bucket takes the bucket name and returns a reference to the bucket object if the bucket exists in your Google Cloud Storage account, and list_buckets lists all the buckets in your account for the project you provide; in this case I'm using my "SQL for BigQuery" project.

Let me go ahead and create my GCStorage class and import the libraries, as well as creating the storage_classes tuple object. Let me also recreate the variables and the storage client instance, then create the GCStorage object, which I'll name gcs. Now, in step three, we create the gcs_api_demo bucket, and here I'm going to insert an if statement. If we look at this code: using the gcs list_buckets method, I check that the gcs_api_demo bucket hasn't been created already, and if it has, I use the get_bucket method to make my bucket object a reference to the existing bucket. Let me go ahead and run this if block, and that creates the gcs_api_demo bucket. If I refresh the page, here's the gcs_api_demo bucket. Now, if I simply run line 43 again to attempt to create the same bucket, it returns an error instead.
The error says your previous request to create the named bucket succeeded and you already own the bucket. That's why we need the if condition to check the bucket's availability first. If I run the if block again, because the first condition is no longer met it goes to the else block and creates the bucket reference.

Uploading files with the Google Cloud Storage API is actually pretty straightforward. Here I have a loop. We know the files_folder variable points to this my_files folder, and inside it I have two files: one is a data CSV file and the other is an image file. Since the files_folder object references a folder, I want to list all the files under that path, and we can provide a glob pattern to specify which files. My pattern looks at all the items in the folder and returns the ones that have an extension. If I run the loop and just print the file path objects, based on the output here it returns two strings, and those two strings are the file locations of the two files.

Now, to upload a file, going back to the GCStorage class I'm going to insert a method called upload_file. When we upload a file we need to provide the bucket reference we're uploading to, the blob destination, which is basically the object name, and the source file location of the file we're uploading.

One more thing. Let me go into the bucket folder: when you manually upload a CSV file to Google Cloud Storage and look at the MIME type of the object, Google Cloud Storage treats a CSV file as an Excel file type, and if you have worked with CSV files before you know that's the wrong MIME type; it should be text/csv. Let me delete this file. That's why, in this upload_file method, I have a condition to check the file type.
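The listing step can be sketched with pathlib's glob. The pattern '*.*' keeps only entries whose name contains a dot, i.e. files with an extension; the temp directory and sample files here are stand-ins for the my_files folder:

```python
from pathlib import Path
import tempfile

# Stand-in for the my_files project folder (assumption for the demo).
files_folder = Path(tempfile.mkdtemp())
(files_folder / 'data.csv').write_text('a,b\n1,2\n')
(files_folder / 'fusionai.png').write_bytes(b'\x89PNG')

# '*.*' matches only entries that have an extension in their name.
file_paths = sorted(files_folder.glob('*.*'))
for file_path in file_paths:
    print(file_path)  # full path string to each file
```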
If the file type is CSV, I want to override the MIME type to text/csv, and for Photoshop files I also noticed the engine doesn't interpret the file type correctly. Otherwise, I use the mimetypes module's guess_type function to return the file's MIME type, which we then pass to the upload call on the blob object.

Now, the bucket argument: let's go back; that's referencing the bucket_gcs object, so I'm just going to point it at this bucket here. From the bucket argument there's a method called blob, and the blob method takes a string, which is the name you want to assign to the blob object. From the blob object we use the upload_from_filename method to upload a file: we pass the file path from the file_path argument, followed by the file's MIME type, and that wraps up the blob upload.

Let's go ahead and run the upload files operation. Inside this loop I'm uploading the files from the files folder, and I want to upload them in two different forms: one without the file extension, and one with the file extension. If I go ahead and run this loop... oh, I know why it failed: I need to recreate the gcs object first. Now if I run the loop, it uploads these two files and creates two different versions of each. I'd recommend that every time you upload a file you always use the full file name, and you'll see why in a second. The second operation uploads the two files using the full file names, so the two objects are named data.csv and fusionai.png. If I go into the "without extension" folder: based on the blob destination, we can put objects in a subdirectory by simply prepending the directory name to the object name. In that subdirectory we have the same two files, data and fusionai, except that we didn't assign extension names to them.
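A sketch of the upload_file method with the MIME-type override described above. bucket.blob() and blob.upload_from_filename() (with its content_type parameter) are the client library's calls; the Photoshop branch and its exact MIME string are assumptions, since the video doesn't show that value:

```python
import mimetypes

class GCStorage:
    def __init__(self, storage_client):
        self.client = storage_client

    def upload_file(self, bucket, blob_destination, file_path):
        # GCS's own detection labels CSV uploads with an Excel type,
        # so set the MIME type explicitly for CSV (and PSD) files.
        if file_path.endswith('.csv'):
            content_type = 'text/csv'
        elif file_path.endswith('.psd'):
            content_type = 'image/vnd.adobe.photoshop'  # assumed value
        else:
            content_type, _ = mimetypes.guess_type(file_path)
        blob = bucket.blob(blob_destination)  # name the object in the bucket
        blob.upload_from_filename(file_path, content_type=content_type)
        return blob
```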
But if we look at the object details here, we can see that these two objects have the correct file type assigned in their metadata.

Now let's go to step five, which is to download and delete files. Going back to the GCStorage class, I want to add one more function, the last one of this class. The function is called list_blobs, and it takes a bucket name as its argument; basically we can use this function to return all the objects in the given bucket. In this example, from the gcs object I call the list_blobs function with my bucket name, gcs_api_demo. If I execute the function it returns an iterable object, so we can insert a loop to generate the output; let's just print each object's name. If I run the loop, it returns each object's location.

Now, to download the files and delete them after the download finishes, let's do this: I'm going to save the output as gcs_demo_blobs, and iterate over it with "for blob in gcs_demo_blobs". Because these two files sit in a subdirectory, I want to make sure their local destinations match what we have in Cloud Storage. Inside the loop I'll do this: I already created the downloads folder path object pointing to the downloads folder, so I can join the object's name onto it using the joinpath method of the pathlib object, passing blob.name, and I'll call the result path_download. From the path object I can check whether the subdirectory exists by referencing the parent attribute and calling exists, and here I insert an if condition: if the subdirectory does not exist, I want to create it in the downloads folder. From the path object we create the subdirectory by referencing the parent folder, then create the directory using the mkdir function with parents set to True.
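A sketch of list_blobs plus the local-path bookkeeping described here. client.list_blobs() is the client library's call; the blob name and the temp downloads folder are stand-ins for the walkthrough's values:

```python
from pathlib import Path
import tempfile

class GCStorage:
    def __init__(self, storage_client):
        self.client = storage_client

    def list_blobs(self, bucket_name):
        # Returns an iterator over every object in the bucket.
        return self.client.list_blobs(bucket_name)

def build_download_path(downloads_folder, blob_name):
    # Mirror the object's path (which may include a subdirectory,
    # e.g. 'without_extension/data') under the local downloads folder.
    path_download = downloads_folder.joinpath(blob_name)
    if not path_download.parent.exists():
        path_download.parent.mkdir(parents=True)
    return path_download

downloads_folder = Path(tempfile.mkdtemp())  # stand-in downloads folder
p = build_download_path(downloads_folder, 'without_extension/data')
print(p.parent.exists())  # True: the subdirectory was created
```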
Note that the argument name is the plural "parents". Now we can safely download the file to the right local location. From the blob object we reference the download_to_filename method and provide the download path, and because path_download is a Path object, not a string, we need to convert it with the str function to get the file path. Once we've downloaded the file, we can delete the object using the delete method.

So here's my gcs_api_demo bucket. If I go ahead and run this code block to download the files... I'm getting an AttributeError; let me take a look. Oh, I have a typo: this should be download_to_filename. Let me try again. Now we have the first two files downloaded, and inside the "without extension" subdirectory we have our two files without extensions. The reason I recommend always providing the full file name is that if your object doesn't include the extension, then when you download the file it's going to be difficult to figure out the right extension to use.

All right, that's everything I'm going to cover in this video. I hope you guys found it useful, and feel free to leave your feedback or any questions in the comment section below. As always, thank you guys for watching, and I'll see you in the next video.
Info
Channel: Jie Jenn
Views: 12,423
Keywords: google cloud storage, google cloud storage api, gcs
Id: 1cDqRrw3t9o
Length: 20min 16sec (1216 seconds)
Published: Tue Sep 20 2022